Showing posts with label standards. Show all posts

Tuesday, September 27, 2016

Party time for GovCMS as it hits 102 sites, well ahead of target

It's party time at the Department of Finance as GovCMS continues its growth surge, from 78 sites less than a month ago to 102 sites this week.

This means the Drupal-based platform is tracking 70% ahead of its 2016 targets, demonstrating how successful a well-engineered and well-supported digital platform can be in government.


While some of the growth may have come from agencies shifting away from GovSpace, which shuts down next year, part is also coming from state, territory and local governments who are beginning to consider the platform seriously.

While mandating a single webCMS and platform might be a step too far for Australian governments, the approach of providing a cheap and effective platform, with full standards support, a growing developer base and interoperability of plugins and modules (which can be reused across agency sites), is providing a strong 'pull' effect.

This 'pull', rather than a 'push' (mandated) approach to service design is one that government can also apply to citizen and business services, so I'm hopeful that the GovCMS experience is demonstrating to agencies how the carrot can be more powerful than the stick.

Given that even the Digital Transformation Office has now fallen into line, after the DTO initially considered building its own WebCMS for the Gov.au site, GovCMS has been a massive success for government in Australia, and for the Department of Finance in particular.

GovCMS is supported by Acquia, the commercial entity created by the developers of the open-source Drupal platform, with a variety of local development partners involved in the development of specific agency sites.

Read full post...

Thursday, August 30, 2012

Australia's first 3rd Generation open data site - from the ACT

The ACT government today announced the soft-launch of their new open data site, dataACT, through their equally new Government Information Office blog.

In my view this is now the best government open data site in Australia.

What makes it the best?
  • Data is available in a range of common reusable formats - from JSON and RDF through RSS and XML - as well as CSV and XLS for spreadsheet users.
  • Visualisation tools are built into the site, so data is not only useful to data scientists and programmers, but to the broader public who can chart and map it without having to leave the site.
  • The built-in embed tool allows people to take the data and rapidly include it in their own site without any programming knowledge.
  • Users can reorder the columns and filter the information in the site - again without having to export it first.
  • Discussions are built into every dataset by default.
It follows a 'generational' path for open data I've been talking about for a while.

Most open data sites start as random collections of whatever data agencies feel they can release as a 'quick win' to meet a government openness directive. They then progress through more structured sites with rigour and organisation, but still only data; then to data and visualisation sites which support broader usage by the general community; and finally to what I term 'data community sites', which become collaborative efforts with citizens.

In my view dataACT has skipped straight to a 3rd Generation data site at a time when other governments across Australia are struggling with 1st or 2nd Generation sites.

Well done ACT!

Now who will be the first government in Australia to get to a 4th Generation site?

Read on for my view of the generations of open data sites:

1st Generation: Data index

  • Contains or links to 'random' datasets, being those that agencies can release publicly quickly. 
  • Data is released in whatever format the data was held in (PDF, CSV, etc) and is not reformatted to web standards (JSON, RDF, etc).
  • Some datasets are released under custom or restrictive licenses.
  • Limited or no ability to discuss or rate datasets
  • Ability to 'request datasets', but with no response process or common workflow

2nd Generation: Structured data index

  • Some thought given to dataset selection, but largely 'random'
  • More standardisation of data formats to be reusable online
  • More standardisation of data licenses to permit consistent reuse
  • Tagging and commenting supported (as in a blog for the site), with limited interaction by site management
  • Workflows introduced for dataset requests, with agencies required to respond as to when they will release, or why they will not release, data
  • Ability to list websites, services and mobile apps created using data

3rd Generation: Standardised data index

  • Standardisation of data formats with at least manual conversion of data between common standard formats 
  • Standardisation of data licenses to permit consistent reuse
  • Tagging and commenting supported, with active interaction by site management
  • Data request workflows largely automated and integrated with FOI processes
  • Ability to filter, sort and visualise data within the site to broaden usage to non-technical citizens
  • Ability to embed data and visualisations from site in other sites
  • Ability to list, rate and comment on websites, services and mobile apps created using data
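
The format standardisation described in the 3rd Generation list above - making the same dataset available across common reusable formats - can be sketched in a few lines. This is an illustrative Python sketch using only the standard library; the sample dataset and field names are made up for the example.

```python
import csv
import io
import json

def csv_to_json(csv_text):
    """Convert CSV text (first row is the header) into JSON-ready records."""
    reader = csv.DictReader(io.StringIO(csv_text))
    return [dict(row) for row in reader]

# Hypothetical sample dataset, as an agency might publish it in CSV
sample = "suburb,population\nBraddon,5000\nDickson,2100"
print(json.dumps(csv_to_json(sample), indent=2))
```

The same records could equally be serialised to XML or RDF; the point is that one canonical dataset feeds every published format, rather than each format being maintained by hand.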

4th Generation: Data community

  • Strategic co-ordinated release of data by agencies to provide segment-specific data pictures of specific topics or locations
  • Standardisation of data formats with automatic conversion of data between common standard formats
  • Standardised data licenses
  • Tagging, commenting and data rating supported, with active interaction by site management and data holding agencies
  • Data request workflows fully automated and integrated with FOI processes with transparent workflows in the site showing what stage the data release is up to - (data requested, communicated to agency, considered by agency, approved for release, being cleaned/formatted, legal clearances checked, released/refused release)
  • Support for data correction and conversion by the public
  • Support for upload of citizen and private enterprise datasets
  • Ability to filter, sort and visualise data, including mashing up discrete datasets within the site to broaden usage to non-technical citizens
  • Ability to request data visualisations as a data request
  • Supports collaboration between hackers to co-develop websites, services and mobile apps using data
  • Integrates the capability to run hack events - potentially on a more frequent basis (form/enter teams/submit hack proposals/submit hacks/public and internal voting/Winner promotion)
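
The transparent data-request workflow above can be modelled as a simple ordered state machine. The stage names are taken from the post; the class itself is a hypothetical Python sketch, not any real system.

```python
# Workflow stages for a public data request, as listed in the post
STAGES = [
    "data requested",
    "communicated to agency",
    "considered by agency",
    "approved for release",
    "being cleaned/formatted",
    "legal clearances checked",
    "released/refused release",
]

class DataRequest:
    """A citizen's data request, with its current stage visible to all."""

    def __init__(self, dataset):
        self.dataset = dataset
        self.stage_index = 0

    @property
    def stage(self):
        return STAGES[self.stage_index]

    def advance(self):
        """Move the request to the next published stage, if any remain."""
        if self.stage_index < len(STAGES) - 1:
            self.stage_index += 1
        return self.stage

req = DataRequest("bus timetables")
print(req.stage)      # data requested
print(req.advance())  # communicated to agency
```

Publishing the current stage of every request on the site is what makes the workflow transparent, rather than a black box between request and release.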

5th Generation: Integrated data platform

  • A common platform for all national, state and local data, with the capabilities for each jurisdiction to make use of all Generation 4 features.
  • Integrated mapping environment for all levels of government, enabled with all available open data.

      Read full post...

      Tuesday, May 22, 2012

      Standardising content across government (or why does every agency have a different privacy policy?)

      Every government website serves a different purpose and a different audience; however, there is also standard content every site must have, and legislation and standardised policies every site must follow.

      This includes content such as a privacy policy, legal disclaimer, terms of use, accessibility statement, copyright, social media channels, contact page, information publication (FOI) pages and so on. It also includes the navigational structure and internal ordering of pages and the web addresses to access this content (such as for 'about us' pages).

      So is there a case to standardise the templates and/or content of these pages and where to find them in websites across government?

      I think so.

      From an audience perspective, there is a strong case to do so. Citizens often use multiple government websites and it makes their experience more streamlined and efficient if they can find what they need in a consistent place (such as www.agency.gov.au/privacy), written in a consistent format and, where possible, using identical or near identical language.

      It would also save money and time. Rather than having to write and seek legal approval for the full page content (such as for privacy information), only agency-specific parts would need writing or approval. Websites could be established more rapidly using the standard content pages and lawyers could focus on higher value tasks.

      To put a number on the current cost of individually creating standard content: if you assume it costs, in time and effort, around $500 to develop a privacy policy, and that there are around 941 government websites (according to Government's online info offensive a flop), it would have cost up to $470,500 for individual privacy policies for all sites. Multiply this by the number of potentially standardisable pages and the millions begin adding up.
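
As a quick check of the arithmetic above - a sketch only, in which the number of standardisable page types is a hypothetical multiplier, not a figure from the post:

```python
# Figures from the post: ~$500 per privacy policy, ~941 government websites
COST_PER_POLICY = 500
SITE_COUNT = 941

privacy_total = COST_PER_POLICY * SITE_COUNT
print(privacy_total)  # 470500

# Hypothetical: eight standardisable page types (privacy, disclaimer,
# terms of use, accessibility, copyright, contact, FOI, 'about us')
STANDARDISABLE_PAGES = 8
print(privacy_total * STANDARDISABLE_PAGES)  # 3764000
```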

      Standardisation could even minimise legal risks. It removes a potential point of failure from agencies who are not resourced or have the expertise to create appropriate policies and expose themselves to greater risks - such as over poorly written legal disclaimers which leave them open to being sued by citizens.

      In some cases it may be possible to use the same standard text, with a few optional inclusions or agency-specific variations - such as for privacy policies, disclaimers, accessibility statements, terms of use, and similar standard pages.

      In other cases it won't be possible to use the same content (such as for 'about us' pages), however the location and structure of the page can be similar - still providing public benefits.

      Let's take privacy policies specifically for a moment. There's incredible diversity of privacy policies across Australian Government websites, although they are all subject to the same legislation (the Privacy Act 1988) and largely cover the same topics (with some variation in detail).

      While this is good for lawyers, who get to write or review these policies, it may not be as good for citizens - who need to contend with different policies when they seek to register for updates or services.

      Many government privacy policies are reviewed rarely, due to time and resource constraints. This may place agencies at risk, as the use of new tools (such as YouTube, Slideshare and Scribd) to embed or manipulate content within agency sites can expose users unknowingly to the privacy conditions of third-party sites (see how we handled these in myregion's privacy policy with an extendable third-party section).

      So, how would government go about standardisation? Although effectively a single entity, the government functions as a group of agencies who set their own policies and manage their own risks.

      With the existence and role of AGIMO and the WebGuide, there is a central forum for providing model content reflecting the minimum standard agencies must meet. There are mandatory guidelines for agencies, such as for privacy, but limited guidance on how to meet them. A standard privacy policy could be included and promoted as a base for other agencies to work from, or even provided as an inclusion for sites that want a policy which is centrally maintained and auto-updated.

      Alternatively web managers across government could work together, through a service such as GovDex, to create and maintain standard pages using a wiki-based approach. This would allow for a consistently improving standard and garner grassroots buy-in, plus leverage the skills of the most experienced web masters.

      There are undoubtedly other ways to move towards standardised pages, even simply within an agency - which itself can be a struggle for those with many websites and decentralised web management.


      Regardless of the method selected, the case should receive consideration. Does government really need hundreds of versions of what is standard content, or only a few?


      Examples of government privacy policies (spot the similarities and differences):

      Read full post...

      Monday, April 02, 2012

      Remove PDFs from your site to save money and increase traffic by 160x - the experience of the Vic Department of Primary Industries

      While there may now be accessibility techniques for PDFs, this doesn't mean that the format is necessarily the most appropriate for displaying information on the web and attracting usage, as the Victorian Department of Primary Industries discovered when they removed all PDFs from their website and converted them to web pages.


      As reported in the case study, Unlock valuable content trapped in PDFs, from BriarBird (as brought to my attention by Gian Wild's blog), the Department of Primary Industries' Systems and Technical Manager Mark Bryant found that,

      “As we converted more and more PDFs to HTML/web format, the stats just kept going up and up until we reached around 1.6 million extra page views per year – it was fantastic.”

      Mark also said in the case study that,
      “Our users were telling us they wanted to do things in a different way, and when we converted a few PDFs to web pages we found the web pages outperformed PDF by as much as 160 to one.

      “Initially we tried to create a web page to match each PDF, but in the end we introduced a blanket rule – no PDFs as it was far too difficult to manage both formats,” Mark said.

      “There was some resistance, but the business case is pretty simple when you can show that a web page is being read around 160 times more often than a PDF.

      “If you are spending money preparing content for the web, then that money is essentially being wasted if that content is locked up in a format people are unwilling to use.”
      Over the last ten years I've also consistently noticed a ratio of 100:1 or more for views to webpages vs PDFs in the websites I've managed.

      While PDFs often suit content creators (who are used to MS Word), they are rarely the best format for online content recipients - your audience.

      If your organisation is focused on having the customer at the centre it is worth reviewing your content creation and distribution approach to ensure it aligns with customer needs.

      For example, where a printable version is required, it is possible to achieve this with a print template for web pages using style sheets (CSS) rather than with a PDF. In effect, when people click 'print', the web page is automatically reformatted for A4 printing. This makes updating much faster and easier, as you only have to maintain one version of the content.

      So why not save PDFs for when they are most needed and wanted, and ensure that the majority of your content is 'native' web pages? Your audiences will love you for it.

      Read full post...

      Wednesday, February 01, 2012

      One week left to comment on the Information Commissioner's issue paper on public sector information

      The Office of the Australian Information Commissioner has extended the deadline for commenting on their Information Policy Issue Paper 2: Understanding the value of public sector information in Australia until 8 February 2012.

      If you wish to comment on the paper, visit the Consultations page of the OAIC website.

      Read full post...

      Thursday, January 12, 2012

      New advice on publishing public sector information from AGIMO

      Last week AGIMO released new advice on publishing public sector information, tying together the 2010 Declaration of Open Government and the Office of the Information Commissioner's Principles on open public sector information, and introducing a five-step process for publishing and managing Australian Government public sector information.

      The advice also provides information on how to publish, considering accessibility, discrimination, open standards, metadata and documentation.

      The advice, available at the Webguide, is another plank supporting agencies to carry out the government's Government 2.0 agenda and has the endorsement of the Australian Government 2.0 Steering Group.

      The test of it will be how agencies adopt the process over the next year.

      I will be watching avidly.

      Read full post...

      Tuesday, January 10, 2012

      Why aren't Aussies using open government data to create value?

      This post was inspired by a comment by John Sheridan on Twitter,
      craigthomler what I'd like for  ? egs of use of  for service delivery innovation, value creation etc, not just curiosity
      It's a good New Years wish and highlights two questions that I have been pondering for a long time.

      1. Why aren't people making more use of publicly released government data?
      2. Does making government data publicly available have any value if people aren't using it to add value?

      Let's take them in order...

      1. Why aren't people making more use of publicly released government data?
      In Australia the data.gov.au catalogue contains 844 datasets (and growing). NSW's (data.nsw.gov.au) and Victoria's (data.vic.gov.au) catalogues are also quite large.

      By comparison, the US data.gov catalogue contains over 390,000 datasets, Canada's data.gc.ca over 265,000, the UK's data.gov.uk around 7,700, Singapore's data.gov.sg about 5,000 and New Zealand's data.govt.nz over 1,600.

      Across these six countries (excluding the two states), that is in excess of 670,000 datasets released publicly. However, if you search around, there aren't many apps listed as using the data. The US site lists around 1,150 and Australia's lists 16 - not many compared to the number of datasets.

      As Victoria's data blog asks, what has happened to all the apps produced in government-sponsored competitions? Are they actually worth holding?

      OK, let's work through a few possibilities. 

      Firstly it could be that these datasets are being widely used, but people simply aren't telling the catalogues. Data may be embedded in websites and apps without any communication back to the central catalogue, or it may be downloaded and used in internal spreadsheets and intranets. In this case there's no actual issue, just a perceived one due to lack of evidence.

      Secondly, to face facts, the majority of people probably are still not aware of these data catalogues - they haven't really been widely promoted and aren't of much interest to the popular media. Therefore there may be hundreds of thousands of people wishing to access certain government information but unaware that it is readily available.

      Thirdly, those people aware of these datasets may be daunted by the number released, unable to find the data they specifically want to use or simply aren't interested.

      Finally, perhaps simply releasing a dataset isn't enough. Few people are data experts or know what to do with a list of values. Could it be that we need simple and free analysis tools as well as raw data?


      There's steps governments can take to address all of these possibilities.

      If people aren't telling the government about their apps, why not establish light 'registration' processes to use them which capture information on why they are being used? Or if this is too invasive, offer people appropriate incentives to tell the central catalogue about their uses of the data.

      Secondly, there may be a need to promote these data catalogues more actively - to build awareness via appropriate promotion.

      Thirdly, perhaps we need to do more user-testing of our data catalogues to better understand if they meet the audience's needs. Combined with excellent mechanisms for suggesting and rating datasets, this could greatly inform the future development and success of these catalogues.

      And finally, governments need to consider the next step. Provide the raw data, but also provide sites and tools that can analyse them. Sure governments are hoping that the public will create these, and maybe they will, however that doesn't mean that agencies can't do so as well. There's also pre-existing tools, such as Yahoo Pipes, IBM's Manyeyes and analytics tools from Google which could be pre-populated with the government datasets, ready for users to play with.

      Alongside all these specific solutions, maybe governments need to start using some of the tools at their disposal to ask why people aren't using their data. Is it the wrong data? Presented in the wrong way? Too hard to use? Market research might help answer these questions.

      2. Does making government data publicly available have any value if people aren't using it to value add?

      Now to take the second question - does it really matter whether people are using open government data anyway?

      Are there other goals that releasing data addresses, such as transparency and accountability, intra-government sharing and culture change?

      If the mandate to release data leads to government culture change it may drive other benefits - improved collaboration, improved policy outcomes, cost-savings through code and knowledge sharing.

      Of course it is harder to provide a direct quantitative link between releasing data, changing culture and the benefits above. However maybe this is what we need to learn how to measure, rather than simply the direct ratio of 844 datasets released to 16 apps created.

      Read full post...

      Friday, October 14, 2011

      Treating bloggers right

      Many organisations still haven't cottoned on to the influence of a number of blogs or how to appropriately approach and engage with them - including PR and advertising agencies who should know better.

      I was reading an excellent example of this the other week, from The Bloggess, where a PR agency not only approached her with an inappropriately targeted form letter, which indicated the agency hadn't even read her blog, but then met her (relatively) polite reply with an annoyed response.

      The situation really escalated, however, when a VP in the PR agency, in an internal email, called her a "F**king bitch" (without the asterisks). This email was accidentally (by the VP) also CCed to The Bloggess.

      The Bloggess took a deep breath, and responded politely, however then received a torrent of abuse from the PR agency.

      At this point she published the entire exchange on her blog - in a post that has already received 1,240 comments, has been shared on Facebook 8,397 times and via Twitter 5,328 times.

      Her comments have also been shared widely and her post read by many of her 164,000 Twitter followers.

      The Bloggess's post is a good read - particularly for government agencies and their PR representatives - on how to behave appropriately when engaging bloggers, and the potential fallout when they don't.

      I'm also keeping a link handy to 'Here's a picture of Wil Wheaton collating papers' for those PR and advertising agencies who send me form emails asking me to post about their product or brand promotions on my blog (and yes there's been a few in the last six months - all Australian agencies).

      Read full post...

      Thursday, September 15, 2011

      "Last in first out" - is this a risk for social media expertise and channel use in government?

      I've seen (and spoken with colleagues about) a number of austerity measures taken in government agencies around Australia over the last few months.

      With various governments across the country looking to cut spending to balance budgets, or at least reduce debt levels, lower 2011-12 budgets require many agencies to look long and hard at what they can trim or where they can do more for less (without affecting services to the public).

      I wonder whether digital channels and expertise have been firmly enough established in many agencies to survive any cuts. Will management focus on their established infrastructure, maintaining their legacy IT systems and 'tried and true' communications and service channels, at the expense of newer, more cost-effective but less mature digital channels?

      In other words will we see the "last in, first out" rule apply for social media channels and expertise in many agencies?

      (this is slightly rhetorical as I'm already seeing this in action in a few places)

      I hope agencies will use any budget tightening as an opportunity to look long and hard at their operational effectiveness and select the channels which deliver the most 'bang for the buck' and long-term sustainability and viability.

      Of course even if this means cutting non-digital channels in preference to digital, there is still a loss of expertise and corporate knowledge - though potentially a more sustainable one into the future.

      Do you see signs that budget pressures are impacting on your agency's online capability? (feel free to respond anonymously & keep the relevant public service code of conduct in mind)

      Read full post...

      Wednesday, June 29, 2011

      Are telephones a natural medium for internet natives?

      I wanted to share this interesting post discussing the challenges faced by people used to online communications technologies when attempting to use old technologies like the telephone.

      Technology’s Child: Why 21st-Century Teens Can’t Talk On the Phone discusses how phone conversations are "both too slow and too fast" and don't provide mechanisms for thinking about and carefully editing what is said.

      Will telephone etiquette become a victim of the internet revolution, replaced by new skills?

      Time will tell.

      Read full post...

      Tuesday, May 17, 2011

      How much do your agency websites cost - and are they cost-effective?

      I have long struggled with techniques for costing websites in Government. Due to how resources and budgets are allocated - with program areas funding and conducting some content work, corporate areas other and infrastructure and network costs often rolled into a central budget in IT teams (which provides excellent economies of scale, but makes costing individual web properties harder) - it can be very hard to come to a complete and accurate figure on what any government website costs to launch or maintain.

      Regardless, we are all driven by budgets and must determine ways to estimate costs for planning new websites and set management, improvement and maintenance budgets for existing ones.

      A step further than costs is value, a necessary part of any cost-benefit equation. In order to assess whether a website is cost-effective - or at least more cost-effective than alternative tools - it is vital to be able to demonstrate how websites add value to an agency's operations.

      Unfortunately value is an even more nebulous figure than cost as it often has to measure qualitative rather than quantitative benefits.

      Sure you can count the number of website visits, visitors or pageviews, or in social media terms, fans and followers, however this is much like judging a meeting's success by the number of people who show up - the more people, the more successful the meeting.

      This metric works when you can place a commercial value on a visit - so this may work effectively for ecommerce sites, but not for most government sites.

      Another approach is to look at the cost per visit, with a presumption that a lower cost is better. However this relies on fully understanding the cost of websites in the first place, and also assumes that a cost/value ratio has meaning. For some websites a high cost might be appropriate (such as a suicide prevention site), whereas for other sites a lower ratio might be appropriate (such as a corporate informational site).

      Perhaps the key is related to that ecommerce site example, where the sales of goods is an outcome of a visit, therefore the commercial value of a visit is effectively a site outcome measure.


      The next challenge is assessing the outcomes agencies desire from their websites and giving them some form of quantitative value. Completing an online form, rather than an offline form might be worth $5 to an agency, reading an FAQ and therefore not calling or emailing an agency might be worth $30, reading FOI information online rather than making an FOI request might be worth $500, whereas reading emergency news, versus having to rescue someone might be worth $5,000.

      Of course this quantitative measure of values for outcomes is relative and has very large assumptions - however it does provide a model that can be tweaked and adjusted to provide a fair value of a site.
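
The outcome-value model above can be sketched as a weighted sum of outcomes. The dollar values per outcome are the post's illustrative assumptions, not real benchmarks, and the outcome names and visit counts are made up for the example.

```python
# Illustrative dollar value per desired site outcome, from the post
OUTCOME_VALUES = {
    "online_form_completed": 5,
    "faq_read": 30,
    "foi_info_read_online": 500,
    "emergency_news_read": 5000,
}

def site_value(outcome_counts):
    """Estimate a site's value from counts of each desired outcome."""
    return sum(OUTCOME_VALUES.get(outcome, 0) * count
               for outcome, count in outcome_counts.items())

# Hypothetical year of outcomes for one agency site
year = {"online_form_completed": 10000,
        "faq_read": 5000,
        "foi_info_read_online": 20}
print(site_value(year))  # 10000*5 + 5000*30 + 20*500 = 210000
```

The absolute figure matters less than the exercise: agencies must name their desired outcomes and assign each a relative weight before the model produces anything, which is precisely the discipline the post argues for.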

      It also has a far more valuable purpose - it forces agencies to consider the primary objectives of their website and how well their most important outcomes are satisfied by site design, content and navigation.

      If the main purpose of a site is to provide information on a program such that program staff aren't responding to calls from media and public all day, then the appropriate information needs to be front and centre, not hidden three levels deep in a menu. If the main purpose is to have people complete a process online, then the forms must be fillable online and back-end systems support the entire process without having gaps that force people to phone.


      Are there other more effective ways of measuring cost and value of websites? I'd love to hear from you.

      And for further reading, the posts from Diane Railton at drcc about UK government website costs are excellent, starting with How much does your website cost?

      Read full post...

      Tuesday, May 10, 2011

      Harper Collins limits library eBook use to 26 lends before repurchase

      There's lots of interesting debates going on about ownership at the moment.

      Are the products and content you buy and enjoy owned by you? Do you have the right to switch formats, modify hardware, install software or make a personal copy?

      Sony has been fighting for years to prevent customers from modding their Playstations, arguing that customers do not have the right to install unauthorised hardware or software (even accepting you void the warranty).

      Movie and music distributors have long held the position that if you bought a cassette tape or video you have no right to the DVD version of the movie or song at simply the cost of the medium. You must buy the content again. Equally, in moving from DVDs to online, people in Australia do not have a legal right to download a movie or music they have already bought.

      As more content is digitalised, this ownership debate is spreading, with the latest areas of contention being ebooks. It seems that at least one book publisher is arguing similarly that libraries may not enjoy unlimited lending rights to ebooks they purchase, despite being allowed to lend out a paper copy as many times as they like.

      In response to fears that people will simply borrow these ebooks online, thereby cutting into book sales (which are already heavily moving online), Harper Collins has locked ebooks sold (via the OverDrive service) to libraries in the US and Canada. After 26 lends each ebook becomes unusable and the library must repurchase it to keep lending it out.

      This move has prompted outrage amongst librarians across North America, and a number of libraries have already boycotted Harper Collins, refusing to buy any further books they publish, in any format, until the policy is changed.

      If Harper Collins' decision is upheld, it may have major cost implications for public libraries in the future - as well as for organisations that maintain their own libraries, that buy business books for staff training purposes or even for citizens.

      Imagine only being able to read a book, watch a movie or listen to music you'd purchased a publisher-designated number of times before being forced to re-buy it.

      Oh - and I didn't mention that Harper Collins also wants to collect information on all readers borrowing ebooks from public libraries, so it can better understand and market to them.

      That's not a particularly open or transparent world.


      There's also now a petition with over 60,000 signatures opposing the plan.

      Read full post...

      Tuesday, November 30, 2010

      Canberra Gov 2.0 lunch - 8 December

      It has been a big year for Government 2.0 in Australia, both at the federal and state levels. The Victorian Government in particular has committed to releasing the majority of public sector information under an open copyright license, continued to improve its whole-of-government intranet and released the Government 2.0 Action plan: a comprehensive strategy for guiding Victoria's government 2.0 efforts.


      To celebrate the close of the Gov 2.0 year, and to discuss the initiatives in Victoria, we're lucky to have Maria Katsonis, from the Victorian Government's Department of Premier and Cabinet, in Canberra.

      Maria is currently the Principal Adviser, Public Administration, in the Department of Premier and Cabinet, leading projects examining the issues that shape and influence the Victorian public sector. This has included the development and implementation of the Government 2.0 Action Plan released earlier this year and the VPS Innovation Action Plan released in 2009.

      Previously Maria was Executive Director of Public Policy and Organisation Reviews at the State Services Authority where she led reviews at the request of the Premier, Ministers and Secretaries. She has also held the role of Assistant Director, Social Policy in the Department of Premier and Cabinet.

      Maria has a Master of Public Administration from the Kennedy School of Government at Harvard University and is a Fellow of Leadership Victoria.


      I know this is short notice, however if you are able to join us for lunch at Café in the House at Old Parliament House, Maria will be providing an interesting and insightful glimpse into how one goes about establishing and executing a whole-of-government Gov 2.0 program.

      Register here

      Read full post...

      Thursday, November 18, 2010

      The danger of permanent internet exclusion to egovernment and Gov 2.0

      The internet is increasingly defining the 21st century.

      It has become the primary medium used to find and share information, the most commonly used news and entertainment medium, and has unleashed an outpouring of creativity which commentators such as Clay Shirky have described as "the greatest in human history".

      Equally there have been pressures to constrain aspects of the internet. Around the world a number of nations are blocking access to certain pages, websites and services - sometimes based on concerns about the appropriateness of content, sometimes due to economic or political pressure.

      There have even been attempts, spearheaded by significant copyright holders, to block internet access for significant periods of time - or even permanently - from households or individuals accused of repeated copyright violations.

      This last topic is worth debating in an eGovernment and Gov 2.0 context.

      As governments shift information, services and engagement activities online there is greater expectation - and hope - that citizens will use the internet to interact with agencies.

      By shifting services online, governments can close offices and employ fewer phone staff.

      In a country where all citizens have the right to access the internet this is not an issue. Anyone who can engage online is encouraged to do so and offline government services can be reconfigured to suit audiences who are unable or unwilling to use the internet. Everyone wins.

      However what happens in a nation where internet access can be denied to otherwise capable citizens, either for long periods of time or permanently?

      What is the commercial impact after television and telephony have migrated to (for instance) a national broadband network? How would this distort these people's access to government services? What additional costs (at taxpayer expense) would government be forced to incur to service these people effectively? Does it exclude them from democratic participation or from vital health and welfare information?

      I can't see any nation deciding to permanently cut access to an individual or household's telephony services because they used it to make a few abusive calls. Neither can I see any state denying a household access to electricity or water because one resident was convicted several times for growing illicit drugs via a hydroponic system in their bedroom.

      However there are real threats emerging around the world that some individuals or households may be permanently excluded from online participation based on accusations of, or convictions for, a few minor offences.


      An example is France, which enacted a 'three strikes' law in 2009. Reportedly record companies are now sending 25,000 complaints per day via ISPs to French citizens they are accusing of flouting copyright laws.

      Under the law French citizens receive two warnings and can then be disconnected from their ISP and placed on a 'no internet' blacklist - denying them access to the online world, potentially permanently.

      While this approach was designed to discourage illegal activity, early indications suggest it hasn't succeeded - piracy may actually have risen. It has also, apparently, annoyed US law enforcement agencies, as it may encourage greater use of freely available, industrial-strength encryption technologies, making it much harder to distinguish major criminal organisations from casual file downloaders and so hurting law enforcement activities.

      This is similar to an often-repeated storyline in Superman comics, when Superman can identify criminals as they are the only ones using lead shielding on their homes to block his X-Ray vision. If everyone used lead shielding, Superman couldn't tell the bad guys from the good guys (there's a future storyline for DC).


      Most importantly, a 'three strikes and you're off' approach - or equivalent law - risks permanently excluding people from the most important medium of the 21st century, simply for being accused three times of copyright violation. Arguably, in today's world, that's a much more severe judgement than people receive for multiple murders, rapes or armed robberies.

      I don't see the Australian government rushing to embrace a similar approach, however it still raises the question of whether we need to consider internet access as a right at the same level as access to electricity or telephones.

      Other nations are considering this as well. Several European countries have already declared internet access a fundamental human right, including France, which places the country in an interesting position.

      The European Union (of which France is a member) has rejected a 3-strike law and, as Boing Boing reported, progressive MEPs wrote a set of "Citizens Rights" amendments that established that internet access was a fundamental right that cannot be taken away without judicial review and actual findings of wrongdoing.

      As the internet has now moved from a 'nice-to-have' service to a 'must-have' utility for many people, even actual findings of wrongdoing may no longer be sufficient reason to permanently exclude people. In fact this may be legally impossible to enforce anyway, due to public access and mobile services.

      Given the potential negative impacts on democratic participation, the ongoing cost to government and the potential commercial and social impacts - should it be possible for a government to legislate, a court to dictate or for ISPs to refuse to connect some citizens to the internet permanently?

      Read full post...

      Wednesday, November 10, 2010

      Whether to reuse or build - government choices in a connected world

      There's been discussion on Twitter over the last day about whether Australian governments should be building online platforms, such as a video aggregation and distribution service, URL shortening tools (which Victoria has done) or collective infrastructure for hosting and developing all government websites.

      This has been an area of on-and-off discussion for over a year in the Government 2.0 context, with several Gov 2.0 Taskforce projects exploring potential opportunities for Australian governments to build systems such as these.

      I expect this to continue to be a debate for many years. Choosing whether to build a service, or tap into a commercial one, can be a tough decision - even tougher online than it is in the physical world.

      Why so tough a decision?

      For starters, many of the services which government could use are hosted overseas, therefore posing some level of sovereign risk - whether that be,

      • a concern over whether the service will continue to provide what Australia needs (when foreign laws and business policies may change),
      • that personal or secure data might be accessed and misused by another jurisdiction (especially all those people who only use one password), or
      • that it might provide an entry point for hackers seeking confidential and secret government information.
      On the other hand, existing online services are frequently cheap and fast to implement, plus several are the 'norm' that people use around the world (such as Google, YouTube and eBay).

      In many cases government-created systems would have to be developed to the point where they are commercially competitive in order to attract the level of user traffic needed to justify their continued existence.

      So how to reconcile these differing perspectives? There's no single answer in my view. Decisions need to be made case by case: what makes sense for some jurisdictions won't for others, and decisions that are right for one type of service won't be for another.

      In lieu of an easy answer, I offer up four tests that I believe these types of reuse or build choices need to consider.
      1. Will it reduce private sector competition?
        In other words, is the government competing directly against private enterprise? If so, there may be job and tax implications. Generally Australian governments shy away from entering commercial markets except when private enterprise is unwilling or unable to deliver services to the entire population at a fair price.
      2. Will government deliver a superior outcome?
        This tests whether a government-run enterprise will provide a better outcome than a private sector organisation. Strange as it may seem, governments are better at providing some services and outcomes than private industry - particularly where equity or public value is an issue. If the government can deliver a superior outcome there is a strong case for stepping in; if private sector companies miss out as a result, they need to ask whether they should have restructured to compete.
      3. Will it attract a significantly large and appropriate audience?
        It is very important to consider whether a government-run service will attract enough users to make it worthwhile. For example, Facebook has built its audience over a number of years, holding on to users by being so useful that people cannot abandon it without damaging their social networks. If the bulk of the audience uses Facebook, would they use 'Govbook' - a government equivalent - even if it were a superior product? The answer may not always be yes, and without an audience a government service may not achieve its goals.
      4. Is it sustainable?
        In asking this I mean: will a government continue to support and run the service over an extended period of time - perhaps even transitioning it to a private concern? Or is it possible that funds will be cut to a level where the service is unable to continue to innovate and improve, slipping into irrelevance? Funding maintenance alone is no longer sufficient given the rate of development online.
      Of course these tests are merely suggestions. As pointed out on Twitter they are more guidelines than rules.

      However I think that applying these tests will support more effective, evidence-based decisions - particularly in light of the large number of demands on government resources and time.

      Read full post...

      Thursday, August 12, 2010

      Victoria releases best-practice Gov 2.0 Action Plan

      Victoria has maintained its lead over other Australian states in the adoption of Government 2.0 through today's release of the Government 2.0 Action Plan - Victoria.

      The Plan outlines four priority areas for Gov 2.0:

      1. Driving adoption in the VPS > Leadership
      2. Engaging communities and citizens > Participation
      3. Opening up government > Transparency
      4. Building capability > Performance

      With 14 initiatives under these priorities, the plan was devised through extensive consultation and a wiki-based approach, engaging a wide range of stakeholders across government.

      This approach, previously used in New Zealand, the US and the United Kingdom, has proven effective in generating significant engagement and support for the eventuating plan.

      Rather than a 'big bang' approach - as used for many government initiatives - the Plan states that:
      Our approach to implementation is think big, start small and scale fast.

      In my view, Victoria's Gov 2.0 Action Plan is an example of best practice in how to prepare to systematically embed Government 2.0 techniques and tools into a government, taking the necessary steps to reform public sector culture, build capability, engage proactively and innovate iteratively to deliver the best outcomes for citizens.

      I believe that the effective execution of this Action Plan, ahead of Gov 2.0 efforts in other states, will give Victoria a substantial first-mover economic advantage, positioning the state as more innovative and better equipped to service citizens and businesses in the 21st Century.

      Read full post...

      Thursday, July 01, 2010

      Still on the Internet Explorer 6 web browser? Microsoft tells organisations to ditch it

      Microsoft has just released a beta version of Internet Explorer 9, yet is still having to ask organisations to stop using Internet Explorer 6 (IE6).

      Despite lacking the ability to fully render the modern web, IE6 - released nine years ago - is still used by a number of Australian organisations, including some government agencies.

      The Sydney Morning Herald, in the article Microsoft begs users to ditch IE6, quotes Microsoft Australia's chief security officer, Stuart Strathdee, as saying "IE6 has a lifecycle. We're well beyond its expiry date".

      The article also stated that,

      Strathdee said corporate users who haven’t yet upgraded to IE8 fearing the loss of customised ERP and CRM systems were probably running outdated versions of those and should look to upgrade them all. He said the company would be happy to help customers do so.

      “It’s only a very small number of queries on those systems that would be locked to IE6,” he said.

      “For us security and privacy are closely related. We’re really pleading with people to upgrade.”

      Is your agency still using IE6?

      If so the question becomes, are your senior management aware of the security and reputation risks they are taking by doing so?
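One low-effort way to put numbers in front of senior management is to tally browser share from your web server logs. A rough sketch - the User-Agent strings below are invented stand-ins for real access-log entries:

```python
import re
from collections import Counter

def classify(user_agent: str) -> str:
    """Crudely bucket a User-Agent string by Internet Explorer version."""
    match = re.search(r"MSIE (\d+)", user_agent)
    return f"IE{match.group(1)}" if match else "other"

# Invented User-Agent strings standing in for real access-log entries.
agents = [
    "Mozilla/4.0 (compatible; MSIE 6.0; Windows NT 5.1)",
    "Mozilla/4.0 (compatible; MSIE 8.0; Windows NT 6.1)",
    "Mozilla/5.0 (Windows NT 6.1; rv:2.0) Gecko/20100101 Firefox/4.0",
]
share = Counter(classify(ua) for ua in agents)
print(dict(share))  # {'IE6': 1, 'IE8': 1, 'other': 1}
```

Even a single day of logs, run through something like this, shows what percentage of your own visitors (and staff) are still on IE6 - a far more persuasive figure than a general warning.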

      Read full post...

      Friday, June 11, 2010

      Reinventing website perfection

      Traditionally, in my experience in both the private and public sectors, the way to build a 'perfect' website has been to:
      • invest a large quantity of resources, personnel and time at the start of the development process,
      • use this investment to build all the functionality the developers can dream up, write all the content the communicators can think of and test it with audiences,
      • launch the 'perfect' website and hope it works, and then
      • replace the website (fixing most of the bits that failed) after 3-5 years by repeating the process.

      Personally I've never liked this approach. It relies heavily on past knowledge to guess future (organisational and audience) needs, it involves investing a lot of resources upfront with limited ability to terminate or redirect projects until after they have failed, and it results in websites that degrade in effectiveness over time, creating progressively greater reputation and legal risks.

      I'd like to see the process for developing a 'perfect' website reinvented. The new process must involve a low upfront cost, the ability to be flexible and agile to meet changing needs quickly and be capable of making a website more and more effective over time, improving reputation and reducing legal risks.

      But how is it possible to achieve all these goals at once?

      The answer is actually quite simple and well understood by successful entrepreneurs.

      Rather than aiming for a perfect site on release day after an extended development period, the goal is to quickly build and launch a site that meets at least one critical audience need.

      Once the site has been launched, ensure there are tools for monitoring how it is used and identifying user needs. Then progressively build extra functionality and write more content, guided primarily by the needs of your audience.
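As a trivial illustration of letting usage data drive the backlog, failed on-site searches can be tallied to show what users want next. The queries below are invented for the sketch:

```python
from collections import Counter

# Invented on-site search queries captured after launch; queries that
# return no results are a direct signal of content users want but can't find.
failed_searches = [
    "payment dates", "payment dates", "office hours",
    "forms", "payment dates", "forms",
]
backlog = Counter(failed_searches).most_common()
print(backlog)  # [('payment dates', 3), ('forms', 2), ('office hours', 1)]
```

The point isn't the code - any analytics package can produce this report - it's that the next round of development is prioritised by observed demand rather than stakeholder guesswork.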

      This approach ensures the site has enough value at launch to be successful, albeit in a more limited fashion than a 'kitchen sink' website (with more functionality at launch). It also ensures that the website grows progressively more useful and relevant to the audience you aim to serve.

      In this way the site becomes increasingly perfect in a more realistic way - perfect for the audience who use it, rather than 'perfect' for the stakeholders who think they know what different audiences want.

      We see this approach taken with all kinds of websites and products - from Apple's iPhones through to online services such as Gmail.

      It's time to see more of this approach used with government websites as well.

      After all - don't we want to create the 'perfect' website for our audiences' needs?

      Read full post...

      Monday, April 26, 2010

      What would you do if you had unlimited funds to spend on your department's online presence?

      Everyone who runs a website dreams of what they would do if they had more funds to spend on improving their online presence.

      I've been doing some thinking around this lately as a thought exercise around building priority lists for what needs to be done to strengthen my department's online presence.

      I always come back to strengthening base infrastructure first: ensuring that our own staff have the best tools for their tasks, including high-powered computers, the right software, and effective, fully implemented content management and reporting systems; appropriate connections between data and publishing to enable a consistent approach to openness and transparency; and, very importantly, that all the staff concerned have the training and support to use these systems effectively and to their full potential.

      Next for me is strengthening governance and management - doing what is necessary to ensure my department has all the appropriate governance and standards in place to operate a current, flexible and responsive online presence, including outreach activities to third-party websites, blogs, forums and social networks.

      Third, I look at capability building: putting in place the systems and functionality that extend the basic infrastructure, allowing the department to manage emerging needs.

      Interspersed amongst the priorities above are the staffing required to deliver what is needed and redevelopment of websites and tools as required to ensure our online presence meets the needs of our audiences, stakeholders and the government.


      Given that funding is not unlimited for most online managers, the next step is to consider what can be done within budget constraints. It's also important to look at which pieces can be funded from other budgets (such as staff training), or whether additional funds can be requested to meet legislative or campaign requirements or as part of modernisation initiatives.

      While it's not possible to do everything you want, there is often quite a bit you can actually achieve if you're prepared to spend the time educating decision-makers, liaising with other business areas and building the business cases needed to source funds.

      So if you were given a blank cheque, what would you prioritise?

      And given that you are unlikely to have one, what will you choose to actually achieve?

      Read full post...

      Tuesday, March 30, 2010

      Australian public servants told three times - open (reusable) government data is important

      The Australian Public Service (APS) has now been told three times by three different reports in the last year about the importance of releasing much of its information openly to the community.

      This began with reforms to Freedom of Information which, once passed, will encourage a pro-disclosure environment within the APS and make it easier and cheaper for people to request information from government.

      Second was the Gov 2.0 Taskforce Final Report: Engage, which recommended managing public sector information as a national resource, releasing most of it for free and in ways that promoted reuse in innovative ways.

      Third is the report released yesterday by the Department of Prime Minister and Cabinet, Ahead of the Game: Blueprint for the Reform of Australian Government Administration. The report recommended that Departments should create more open government, with one of the detailed sub-recommendations being,

      Greater disclosure of public sector data and mechanisms to access the data so that citizens can use the data to create helpful information for all, in line with privacy and secrecy principles;
      The last two reports are yet to receive a formal response from the Australian Government, however I hope that Australian public servants at all levels are taking note.

      Once is chance, twice is coincidence, but three times is a strategy.

      Read full post...
