Showing posts with label information management. Show all posts

Wednesday, January 15, 2014

Rethinking government IT to support the changing needs of government

We recently saw a change in the federal government in Australia, with a corresponding reorganisation of agency priorities and structures.

Some departments ceased to exist (such as the Department of Regional Australia), others split (DEEWR became two departments, Education and Employment) and still others had parts 'broken off' and moved elsewhere (Health and Ageing, which lost Ageing to the renamed Department of Social Services).

This isn't a new phenomenon, nor is it limited to changes in government - departments and agencies are often reorganised and reconfigured to serve the priorities of the government of the day and, where possible, create efficiencies - saving money and time.

These adjustments can involve the movement of tens, hundreds or even thousands of staff between agencies, as well as regular restructures inside agencies that change reporting lines and processes.

While these reorganisations and restructures - Machinery of Government changes (or MOGs) as they are known - often look good on paper, in reality it can take time for efficiencies to be realised (if they are actually being measured).

Firstly there's the human factor - changing the priorities and allegiances of staff takes time and empathy, particularly when public servants are committed and passionate about their jobs. They may need to change their location, workplace behaviours and/or learn a new set of processes (if changing agency) while dealing with new personalities and IT systems.

There's the structural factor - when restructured, merged or demerged public sector organisations need to revisit their priorities and reallocate their resources appropriately. This can extend to creating, closing down or handing over functions, dealing with legal requirements or documenting procedures that an agency now has to follow or another agency has taken over.

Finally there's the IT factor - bringing together or separating the IT systems used by staff to enable them to do their work.

In my view the IT component has become the hardest to resolve smoothly and cost-effectively due to how government agencies have structured their systems.

Every agency and department has made different IT choices - Lotus Notes here, Microsoft Outlook there, different desktop environments, back-end systems (HR and Finance for example), different web management systems, different security frameworks, programming environments and outsourced IT partners.

This means that moving even a small group of people from one department to another can be a major IT undertaking. Their personal records, information and archival records about the programs they work on, their desktop systems, emails, files and more must be moved from one secure environment to another, not to mention decoupling any websites they manage from one department's web content management system and mirroring or recreating the environment for another agency.

On top of this are the many IT services people are now using - from social media accounts in Facebook and Twitter, to their email list subscriptions (which break when their emails change) and more.

On top of this are the impacts of IT service changes on individuals. Anyone who has worked in a Lotus Notes environment for email, compared to, for example, Microsoft Outlook, appreciates how different these email clients are and how profoundly the differences impact on workplace behaviour and communication. Switching between systems can be enormously difficult for an individual, let alone an organisation, risking the loss of substantial corporate knowledge - historical conversations and contacts - alongside the frustrations of adapting to how different systems work.

Similarly, websites aren't simply websites. While the quaint notion persists that 'a website' is a discrete entity which can easily be moved from server to server, or organisation to organisation, most 'websites' today are better described as interactive front-ends for sophisticated web content management systems. These web content management systems may be used to manage dozens or even hundreds of 'websites' in the same system, storing content and data in integrated tables at the back-end.

This makes it tricky to identify where one website ends and another begins (particularly when content, templates and functionality are shared). Moving a website between agencies isn't as simple as moving some HTML pages from one server to another (or reallocating a server to a new department) - it isn't even as easy as copying some data tables and files out of a content management system. There's enormous complexity involved in identifying what is shared (and so must be cloned) and ensuring that the website retains all the content and functionality required as it moves.

Changing IT systems can be enormously complex when an organisation is left unchanged, let alone when teams are changing agencies or when agencies merge. In fact I've seen it take three or more years to bring people onto an email system or delink a website from a previous agency.

As government increasingly digitalises - and reflecting on the current government's goal to have all government services delivered online by 2017 - the cost, complexity and time involved in completing these MOG changes will only increase.

This risks crippling some areas of government or restricting the ability of the government of the day to adjust departments to meet their policy objectives - in other words allowing the (IT) tail to wag the (efficient and effective government) dog.

This isn't a far future issue either - I am aware of instances over the past five years where government policy has had to be modified to fit the limitations of agency IT systems - or where services have been delivered by agencies other than the ones responsible, or simply not delivered due to agency IT restrictions, costs or issues.

Note that this isn't an issue with agency IT teams. These groups are doing their best to meet government requirements within the resources they have, however they are trapped between the cost of maintaining ageing legacy systems - which cannot be switched off and which they don't have the budget to substantially replace - and keeping up with new technological developments and the increasing thirst for IT-enabled services and gadgets.

They're doing this in an environment where IT spending in government is flat or declining and agencies are attempting to save money around the edges, without being granted the capital amounts they need to invest in 'root and branch' efficiencies by rebuilding systems from the ground up.

So what needs to be done to rethink government IT to support the changing needs of government?

It needs to start with the recognition at political levels that without IT we would not have a functioning government. That IT is fundamental to enabling government to manage a nation as large and complex as Australia - our tax system, health system, social security and defence would all cease to function without the sophisticated IT systems we have in place.

Australia's Prime Minister is also Australia's Chief Technology Officer - almost every decision he makes has an impact on how the government designs, operates or modifies the IT systems that allow Australia to function as a nation.

While IT considerations shouldn't drive national decisions, they need to be considered and adequately resourced in order for the Australian government to achieve its potential, realise efficiencies and deliver the services it provides to citizens.

Beyond this realisation, the importance of IT needs to be top-of-mind for Secretaries, or their equivalents, and their 'C' level team. They need to be sufficiently IT-savvy to understand the consequences of decisions that affect IT systems and appreciate the cost and complexity of meeting the priorities of government.

Once IT's importance is clearly recognised at a political and public sector leadership level, government needs to be clear on what it requires from IT and CIOs need to be clear on the consequences and trade-offs in those decisions.

Government systems could be redesigned from the ground-up to make it easy to reorganise, merge and demerge departments - either using common IT platforms and services for staff (such as an APS-wide email system, standard web content management platform, single HR or financial system), or by only selecting vendors whose systems allow easy and standard ways to export and import data - so that a person's email system can be rapidly and easily moved from one agency to another, or the HR information of two departments can be consolidated in a merger at low cost. User interfaces should be largely standardised - so that email works the same way from any computer in any agency in government - and as much code as possible should be reused between agencies to minimise the customisation that results in even similar systems drifting apart over time.
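
To make the 'standard ways to export and import data' idea a little more concrete, here is a minimal sketch (in Python) of what a vendor-neutral export record for a staff mailbox might look like. The format, class names and field names are entirely hypothetical - an illustration of the concept, not any actual government or vendor schema.

# Purely hypothetical sketch of a vendor-neutral export format for moving a
# staff member's mailbox between agency systems. All names are illustrative only.
import json
from dataclasses import asdict, dataclass, field


@dataclass
class Message:
    sent_at: str              # ISO 8601 timestamp
    sender: str
    recipients: list
    subject: str
    body: str


@dataclass
class MailboxExport:
    staff_id: str
    source_agency: str
    target_agency: str
    messages: list = field(default_factory=list)

    def to_json(self) -> str:
        # Serialise to plain JSON that any receiving system could import.
        return json.dumps(asdict(self), indent=2)


if __name__ == "__main__":
    export = MailboxExport("E1234", "Department A", "Department B")
    export.messages.append(Message("2014-01-15T09:00:00", "a@dept-a.example.gov.au",
                                   ["b@dept-b.example.gov.au"], "Handover notes",
                                   "Program files attached."))
    print(export.to_json())

Any pair of systems that could write and read a plain document like this, regardless of vendor, would turn the email side of a MOG change into a data transfer rather than a migration project.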

The use of these approaches would significantly cut the cost of MOGs, as well as free up departmental IT teams to focus on improvements rather than just meeting minimum requirements - a major efficiency saving over time.

Unfortunately I don't think we're, as yet, in a position for this type of significant rethink of whole of government IT to take place.

For the most part government still functions, is reasonably efficient and is managing to keep all the lights on (even if juggling the balls is getting progressively harder).

It took the complete collapse of the Queensland Health payroll project to prompt the government there to rethink its systems, and it is likely to take a similar collapse - of our Medicare, Centrelink or tax system - for similar rethinking to occur federally.

However I would not like to be a member of the government in power when (not if) this occurs.


Thursday, October 10, 2013

The road to public sector IT hell may not be paved with intentions at all

Something that scares me enormously is the house of cards that many (if not most) governments have built with their IT systems.

It can be witnessed every time government agencies get 'MOGed' - Machinery of Government changes where parts of agencies are shifted to other agencies to meet the latest political whim.

In these cases it's not simply a matter of moving tens, hundreds or even thousands of public servants to new offices - in fact in many cases they may not move at all - it is about extracting them from the secure environment, software and network systems of one agency and connecting them (including all their historical records, emails and files) to the network and software of another.

This is a hugely complex and increasingly expensive exercise that can have an enormous productivity and cost hit each time it occurs.

Why is it complex and expensive? Because every agency uses different systems - or different versions of systems - and agencies are now so wedded to these systems after a purchase decision many years earlier that, even though senior bureaucrats recognise the issue, they can not address it without a complete (expensive and time-consuming) overhaul of how government runs its information technology.

Another example is eTax. While I have a great deal of praise for eTax, and it has been very successful by most measures, when the system was originally procured and built it was done in a way that limited it to the IBM-PC platform. Certainly no-one can blame the ATO for not foreseeing the rise of Apple or the arrival of smartphones and tablets - however the decisions made at the time locked the system into a single platform, which has caused significant pain over the years.

Other examples include the Department of Finance and Deregulation's choice of a document management system as a Web Content Management System for www.australia.gov.au, an entirely appropriate decision at the time based on their well-governed procurement approach, but which led to delays and cost blowouts, constraining the site from what it could have become.

A better known example would be the failure of the Queensland Health payroll system several years ago, where an inquiry is still ongoing. It even has its own website - www.healthpayrollinquiry.qld.gov.au

Indeed, there are hundreds of examples, both big and small, where this has occurred - a decision has been taken with the best possible knowledge at the time, or small incremental decisions have been taken over time - all for the right reasons - which have inadvertently led into blind alleys or to very expensive remedial work years later.

And lest you think this is an issue only for the public sector, consider the disaster that was Telstra's bill payment system, the issues our largest banks have had keeping their systems operating, or Virgin's booking system.

With the pace of change accelerating and the increasing limits on public sector employment, the likelihood is that these types of issues will continue to grow and plague IT, becoming even more widespread and expensive.

Agencies could increasingly find themselves trapped into slow and inefficient systems, restricting staff productivity and absorbing more and more of their resources to maintain, with no funds to 'jump tracks' to more future-proofed solutions.

This can even affect the performance of elected governments - who may be forced to change their policies to fit IT limitations. I am already aware of government initiatives that have had to be abandoned (never having seen the light of day) not because they were bad ideas but because the IT constraints in government make them impossible to cost-effectively deliver.

This isn't the fault of public servants or of politicians - seeing that far into the future simply isn't possible anymore. Technology doesn't progress linearly, and the accelerating rate of change means left-field technologies can appear and radically transform people's expectations and strain existing IT systems within a few years (remember the iPhone).

There's many more of these technologies emerging around us. For example 3D printers, capable of printing anything from kitchen utensils to medical devices to firearms, disintermediating physical manufacturers, opening a new front in the ownership of intellectual property and providing access to deadly weapons. There's also unmanned aerial vehicles (UAVs), drones that are capable of live-streaming video, or even carrying weapons, that can be bought online for a few hundred dollars and flown with limited chance of detection by individuals or corporations.

Many other technologies, from Google Goggles to driverless cars, are in development and could, in increasingly shorter timeframes, radically transform societies.

So when government agencies are still struggling to manage and maintain their legacy green-screen mainframe systems and out-dated (insecure and unsupported) web browsers, and are locked into increasingly expensive proprietary technologies (due to the cost and resourcing required to migrate - even changing email systems can cost our largest agencies $100 million or more), what are they to do?

There's little time for innovation or for thinking of consequences - the majority of resources in an agency's IT team are committed to maintenance and quick patches on existing solutions.

The likely outcome over time is that we'll start to see more catastrophic IT failures - particularly across the most complex and most essential systems - such as welfare, payroll and grants management.

So how do we fix this? How do we break the cycle before the cycle breaks us?

There's no simple solution, but fortunately there are some trends which work for government agencies facing this challenge - if they're prepared to consider them.

A big area is open source software, which is increasingly being used by agencies in a variety of ways. While open source can run into the same issues as proprietary software, a platform with a large and diverse group of users can combine their IT assets to ensure the system is more useful to agencies and more rapidly updated as the world around it changes.

Another area is cloud-based solutions, which allow a government to more rapidly reconfigure itself to meet the needs of political masters. When software is independent of particular computer systems and there's a government-wide secure environment which can host approved software, it can be far faster and cheaper for people moving agencies to retain the files and applications they require.

There's open data - which when made available in machine-readable formats liberates the data from proprietary systems and simplifies how it may be discovered and reused by other agencies (as well as the public).

These trends do not allow governments to replace all their existing systems - however they allow agencies to contain the problem to critical systems, while allowing all other services to be delivered 'in the cloud'. Imagine a single email system and intranet across government. A web-based suite of office tools, graphic design tools, finance and HR tools - which can be managed centrally within a government, leaving agency IT teams to focus on the unique systems they can't share.

What does this vision take? Intention, planning and choice.

Governments that fail to proactively and intentionally plan their futures, who simply live on autopilot, will inevitably crash - not today, not tomorrow, maybe not in five years, but eventually - and the damage that their crashes cause may take decades to recover from.

So for agencies that see themselves as a continuous entity, with an existence that will last as long as the state they serve, it is imperative that they plan intentionally, and that they engage their Ministers and all their staff in understanding and addressing this issue.

It is not good intentions that will cause agency IT to fail, it is the lack of intention, and that is highly addressable.

CORRECTION: I have been advised by John Sheridan, the Australian Government CTO, there was no cost-overrun on australia.gov.au, it was a fixed price contract.



Thursday, July 25, 2013

Social media impacts on ICT teams - presentation from the Technology in Government conference

Over the last two days I've been down at the Technology in Government conference - an event I thought went very well, with a great group of speakers (including the UK Government's CIO Liam Maxwell).

I gave a presentation this morning, and chaired the afternoon, for the Connected Government stream and have uploaded my presentation for wider access.

In it I discussed the impact of social media on agency ICT teams and some potential approaches they can take to work with business areas to ensure that agency goals are met with a minimum of intra-agency friction.

Overall my message was that social media must be engaged with, not ignored, in government and agency ICT teams have a role to play.

There's several stances ICT teams can take - whether as a leader, supporter or observer of agency social media efforts and, depending on this stance, they could take on a greater or lesser involvement in the various roles required to implement a successful social media approach.

Social media offers benefits for ICT teams, as it does for other areas of agencies - it is simply up to ICT leadership to either step up and work with business areas in a closer ongoing way, or stay out of the way and allow other areas of an agency to move forward.




Wednesday, April 03, 2013

Opening the vault - Open data in Queensland - watch the livestream

Today the 'Opening the vault' event is being held in Brisbane, discussing open data in the context of the state.

Following from the Queensland Government's commitment to open data (with the appointment of Australia's first e-Government Assistant Minister), the event was opened by Premier Newman and is being livestreamed on the web - demonstrating the level of importance placed on this area in the state.

You may follow the event on Twitter using the #dataqld hashtag, and watch the livestream at data.qld.gov.au.

Keep an eye on the session after 11:30am Queensland time (12:30pm AEST) for the finalists in the latest data competition and to vote on who should win.

I've also embedded the livestream below.



Wednesday, March 27, 2013

The power of open data is often in serendipity

I often hear talk from government agencies about their wish to release more of their data openly, but their concern over how they allocate resources to ensure the most useful data is released first.

In several conversations I've had in different parts of Australia, the agency view was that they only wanted to release useful data, and were prepared to set up an internal review process to assess how useful data could be, then selectively release what they decided was valuable.

I strongly oppose this approach on the basis that it shouldn't be agencies who decide what data is useful, to whom, when or where.

There's no evidence that government agencies have the skills to successfully decide which data may be useful to particular groups in the broader community, or which won't be. There's also no evidence that they are good at predicting the future - which data will become useful at a later date.

My view is that agencies should simply release all the data they can without trying to assign levels of usefulness.

Decisions on usefulness should be left to the users - the community - allowing serendipity to thrive.


An example of this was featured at a Gov 2.0 Canberra lunch in November 2012, where Jake McMullin spoke about his use of an open dataset from the National Library to create a unique mobile app.

When he'd created the prototype app, he walked into the library and showed it to the first staff member he saw (who happened to be the project manager for their iPhone catalogue app).

As a result of this serendipitous meeting, the National Library funded the app, which has just been released in the iTunes store under the name Forte, with an accompanying event (on 25 March) and video (below).

Forte provides a way to explore the National Library's digitalised Australian sheet music catalogue by decade and composer.

The dataset Jake used had been released a year earlier by the National Library for a hack event, however it had not previously been used, as another National Library staff member, Paul Hagon, discusses in his blog.

Government agencies cannot predict these types of events - which, when, where or how a dataset will become useful if it is released as open data. And they shouldn't try.

The power of open data is often in serendipity.


Tuesday, March 26, 2013

South Australia consulting on ICT policy

The South Australian government has released its draft ICT policy, SA Connected, for public consultation via the SA Plan consultation site.

SA ICT draft position paper's five key perspectives - serving people, innovating now, securing resilience, working together and improving delivery
The five key perspectives in the SA draft ICT policy 
The position paper, which has already undergone industry consultation, presents five key perspectives for the future of South Australian government IT,

  • Serving People
  • Innovating Now
  • Securing Resilience
  • Working Together, and
  • Improving Delivery
In what may be a first, the plan is available in ePub format for eReaders, although there's no HTML version and consultation is only via email reply.

The plan emphasises the need for government to innovate in partnership with industry,
We want to embed a new culture of innovation between government agencies, and between government and industry. Using and improving technology allows us to break down barriers that have previously prevented us finding shared solutions to common problems. To improve our ability to innovate, we will work more closely with industry to develop a practical and sensible framework for introducing new technologies into government.
It also recognises the need for the public sector to work in a co-ordinated manner, not simply as agency silos, and to employ an agile and iterative approach to ICT.

SA Connected also neatly uses personas to portray the potential future uses of ICT in government by 2030 - presenting a very positive view of how it could enable citizens and agencies.

There's also some very positive short-term improvements outlined, with real-time Adelaide Metro information becoming progressively available in 2013 for buses, trams and trains. Also a whole-of-government collaboration platform, StateLink, is being rolled out, incorporating instant messaging, desktop videoconferencing, meeting spaces and desktop and application sharing.

The boldest goal in the plan is to move to digital by default and collaborative democracy - placing citizens at the centre of government and digital at the centre of the web of channels used by government to engage.

There is also a goal to move agencies from competing to sharing - although I believe this will continue to be a challenge for all Australian governments while budgetary approaches and Ministers remain competitive and focused on their own interests ahead of whole-of-government.

The plan also outlines the intent to move from risk averse to risk managed behaviour and from large monolithic projects to rapid prototyping, with a multi-disciplinary design approach rather than a technology driven one.

This is also a challenging change for governments due to cultural and structural reasons and I will be interested to see how South Australia intends to achieve this.

The paper also provides a commitment to the establishment of a government innovation lab, 'DemoLab', for conducting trials and experiments in collaborative democracy. DemoLab will,
coordinate multi‑disciplinary teams made up of staff seconded from agencies, and people drawn from industry, academia and the community. DemoLab will use the best technical, operational, and behavioural thinking to address specific challenges and opportunities. Project teams will spend no more than thirty days developing small‑scale, operational prototypes of their solutions. Lessons will be learned, connections made, and successes will be recorded and replicated across the public sector.
I think this is a great idea - a government, like any other organisation, that doesn't reinvent itself will be reinvented from the outside, a far more unpleasant and messy outcome.

The positioning paper is written in a very conversational style (unlike many government papers - or most ICT plans), and is well worth reading and commenting on.

So if you want to have some input and influence over the South Australian government's future ICT strategy and aspirations, visit the SA Connected consultation.



Wednesday, January 16, 2013

Infographics: How does Australia compare on government open data released?

I've developed several infographics (below) comparing the open data performance of nations, looking at which have national open data sites, how many sites they have across different government levels and how many datasets have been released through their national sites.

It's not a way to judge 'winners' and 'losers' - or even to compare the relative performance of countries. However it provides useful information on who is doing what and how deeply open government has been embedded in the thinking of agencies. This said...

There are 41 countries listed (by data.gov) as having open data websites, out of almost 200 nations.

In their national open data sites, in total, these nations have released at least 1,068,164 datasets (I was unable to get a count from China, Timor-Leste, Tunisia or Sweden's national open data sites), for an average of 28,869 datasets per nation and a median of only 483 - the gap being due to a few high-release countries (US, France, Canada).
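
For readers curious about why the average and the median sit so far apart, the quick Python sketch below reproduces the effect on a distribution shaped like this one. The listed figures are the national counts quoted in this post, while the filler value used for the remaining countries is invented purely for illustration.

# Illustrative only: the listed counts are the national figures quoted in this post;
# the 400-dataset filler for the remaining countries is invented to show the skew.
from statistics import mean, median

dataset_counts = [378529, 353226, 273052, 23361, 8957, 7754, 6460, 5193,
                  2265, 1655, 1124] + [400] * 26

print(f"average: {mean(dataset_counts):,.0f}")   # dragged up by the top three countries
print(f"median:  {median(dataset_counts):,.0f}") # closer to the 'typical' country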

How do Australia and New Zealand rank?
As people will look for this anyway, based on the number of datasets released as of January 2013, New Zealand is 9th (with 2,265 datasets) and Australia 11th (with 1,124 datasets).

Between us is Estonia, with 1,655 datasets.

The top nations above New Zealand are, in order: US (378,529), France (353,226), Canada (273,052), Denmark (23,361), United Kingdom (8,957), Singapore (7,754), South Korea (6,460), Netherlands (5,193).

Infographics




And finally, as a tree map showing the relative size of nations by datasets...




Raw data
The raw data is available in a spreadsheet at: https://docs.google.com/spreadsheet/pub?key=0Ap1exl80wB8OdFNvR3dja3E4UGtVVi1LMU11OFBmR1E&output=html

Caveats
When it comes to nations and states there are few absolute measures - there's simply relative performance, across jurisdictions or across time.

These comparisons are often flawed due to variations in data collection, lack of information or differences in approach, however there can still be value in 'placing' nations, identifying opportunities, challenges, flaws and risks.

My work above is not a measure of the success of open data itself, but provides a relative indicator of which governments have been more successful in embedding open government principles in agencies, and how deeply. It also provides insight into which nations are working in this space.

My data spreadsheet is also a useful 'point in time' reference to track changes over time.

Note that I was unable to count open data released outside of national open data sites - there's a lot more of this, however it can be harder to locate. Due to the sheer number of state-based open data sites (210), I've not yet done a tally of the datasets they've released, only of the 41 national sites. Watch this space :)

The data may not be 100% accurate due to differences in the approach to releasing data. data.gov provided the list of data sites and I drew specific information on datasets and apps from all 41 national open data sites, each with a different design and functionality and across over a dozen languages.

Please let me know of any inaccuracies and I will endeavour to correct them.


Tuesday, November 06, 2012

OpenAustralia Hack(s)fest on FOI - for hackers, media, activists & FOI gurus

The OpenAustralia Foundation will be holding the first Australian Hack(s)fest as part of the countdown to the launch of their new FOI assistance site, designed to make it easier for ordinary Australians to put in FOI requests to Commonwealth agencies.

The event will be held at Google's Sydney office on the weekend of 17-18 November.

For more details and to register, visit: www.openaustraliafoundation.org.au/2012/11/05/youre-invited-to-our-freedom-of-information-hacksfest/


Monday, October 29, 2012

Why do agencies struggle with FOI and open data so much?

The linked email conversation (in a blog post), Freedom of Information Request for Classification Data, provides an interesting insight into the struggles government agencies are having with FOI and open data, and into the difficulties applicants are having in accessing data which should be available in reusable formats.

In this case, information which is publicly available and searchable has been made less accessible by an agency on their site (breaking a site scraper). Then, after the developer asked the agency for access to the data under FOI, the agency (after several delays) offered to make it available for $4,000.

As far as can be determined from the information provided, the process for releasing the data - which is already in a database - simply requires a single SQL command.
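
To give a sense of how small that job usually is, here's a rough sketch of what such an export could involve, using Python's built-in sqlite3 module. The database, table and column names are invented for illustration - the agency's actual schema and platform are unknown.

# Rough sketch only: dumping a (hypothetical) classifications table to CSV.
# Database, table and column names are invented; the agency's real schema is unknown.
import csv
import sqlite3

conn = sqlite3.connect("classifications.db")
cursor = conn.execute("SELECT title, category, decision_date FROM classifications")  # the single SQL command

with open("classifications.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow([column[0] for column in cursor.description])  # header row from the query metadata
    writer.writerows(cursor)                                       # one line per record

conn.close()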

The appearance is that the agency is being badly let down by its IT systems or staff - or that it is unwilling to provide the data.

Either situation is a sad reflection on the agency and on the commitment of the government to openness.

I'll keep tracking this request - as are a number of other people in the open data space - to see how it is resolved, and how long it takes to do so.


Wednesday, October 24, 2012

Free our data - a great presentation from Pia Waugh

Open Data advocate Pia Waugh spoke recently on the topic of freeing government data at Ignite Sydney 9 (an event where speakers get five minutes and 20 slides to say their piece).

It provides a strong view as to why governments need to open up data to the community and is definitely worth viewing and sharing.


Tuesday, October 02, 2012

Making APIs for government data - should agencies do this or leave it to third parties?

APIs (Application Programming Interfaces) are a technique for interacting with data (usually on the web) which liberates users from relying on particular applications or having to do complex programming to reuse the data in interesting ways.

Unfortunately few government agencies go the extra distance to release their data with an API, instead using specific data formats which require specific applications to access them.

This is a real shame, as APIs essentially make data application-independent - great for accessibility, and both easier and faster for any web user or website to reuse the data effectively.

It is often relatively easy to create APIs from an agency's released data, as demonstrated by the Farmer Market API example from Code for America, which took less than an hour to convert from a spreadsheet into a map visualisation.
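
As a rough sense of the effort involved, a minimal read-only API over an already-published CSV can be a few dozen lines of code. The Python sketch below (using the Flask framework and a hypothetical data.csv with an 'id' column) is illustrative only - it is not the Code for America example or any agency's actual service.

# Minimal sketch of exposing a published CSV dataset as a JSON API.
# Assumes a hypothetical data.csv with an "id" column; illustrative only.
import csv

from flask import Flask, abort, jsonify

app = Flask(__name__)

with open("data.csv", newline="") as f:
    ROWS = list(csv.DictReader(f))  # load the released dataset once at startup


@app.route("/api/records")
def list_records():
    # Return every row as JSON; a real service would add paging, caching and versioning.
    return jsonify(ROWS)


@app.route("/api/records/<record_id>")
def get_record(record_id):
    # Return a single row by its id, or 404 if it doesn't exist.
    for row in ROWS:
        if row.get("id") == record_id:
            return jsonify(row)
    abort(404)


if __name__ == "__main__":
    app.run(port=5000)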

Agencies can certainly take the position that they don't want to do the extra work (however little it may be) to provide APIs for their public data and leave it up to third parties to do this - wherever and whenever they wish.

This is a choice, however, that comes with risks.

Where an agency simply 'dumps' data - in a PDF, CSV, Shapefile or other format online, whether via their site or via a central open data site - they are giving up control and introducing risk.

If a third party decides to create an API to make a dataset easier to access, reuse or mash-up, they could easily do so by downloading the dataset, doing various conversions and clean-ups and uploading it to an appropriate service to provide an API (per the Farmer Market API example).

Through this process the agency loses control over the data. The API and the data it draws on are not held on the agency's servers, or in a place the agency can easily update. The data may contain introduced (even inadvertent) errors.

The agency cannot control the data's currency (through updates), which means that people using the third party API might be accessing (and relying on) old and out-dated data.

The agency even loses the ability to track how many people download or use the data, so they can't tell how popular it may be.

These risks can lead to all kinds of issues for agencies, from journalists publishing stories to people making financial decisions relying on out-dated government data. 

Agencies might see a particular dataset as unpopular due to low traffic to it from users of their site, and thereby decide to cease publication of it - when in reality it is one of the most popular datasets they hold, which is exactly why a third party built an API for it and why all the users go there to access it.

As a result of these risks agencies need to consider carefully whether they should - or should not - provide APIs themselves for the data they release.

Open data doesn't have to mean an agency loses control of the datasets it releases, but to retain control they need to actively consider the API question.

Do they make it easy for people to access and reuse their data directly, retaining more control over accuracy and currency, or do they allow a third party with an unknown agenda or capability to maintain it to do so?

Agency management should consider this choice carefully when releasing data, rather than automatically jumping to just releasing that CSV, PDF or Shapefile, or some other file type.


Monday, October 01, 2012

Victorian Government launches consultation on draft 'digital by design' ICT strategy

The Victorian Government has announced it is seeking public feedback on a proposed ICT strategy, Digital by design, developed by the Victorian Information and Communications Advisory Committee (VICTAC).

The draft provides advice on the future management and use of ICT by government and how the Victorian Government can design and use information and technology to deliver better services.

The public consultation is for just over two weeks, finishing on 17 October.

The strategy sets out objectives and actions focused in three key areas and proposes eight principles to guide ICT decision making (per the chart below).

While not focused on Government 2.0, the draft strategy takes into account the increasing digitalisation of communications, the expectations of citizens and the need to increasingly co-design and co-produce policy and service delivery programs and to design code for reuse, as well as the need to embed innovation within ICT and release more public data.


To learn more and to leave comments, visit www.vic.gov.au/ictstrategy/


Tuesday, September 18, 2012

Mapping open data site generations

Over the last three years we've seen an increasing level of sophistication and capabilities in successive generations of open data sites.

To aid governments in their open data journey, I've mapped five generations for the progressive development of open data sites, detailed in the document below.

Please feel free to reuse the information within the bounds of the embedded Creative Commons license.

My next task is to release a view of open data sites around the world mapped against these generations to provide a view as to who is leading and who is lagging in the open data stakes.


Tuesday, July 03, 2012

Automating online activities without IT intervention - using web tools to make jobs easier

There's often lots of small - and not so small - activities that communications teams want to carry out online that would make their jobs easier, but aren't really tasks to give to IT teams.

For example, you may wish to update your agency's Facebook and Twitter profile pictures when your logo changes, automatically post your blog posts to LinkedIn and Facebook, be sent an email whenever someone tweets at you or receive an alert whenever your Minister is mentioned in a breaking news story.

This is where it is useful to get familiar with services like IFTTT and Yahoo Pipes.

IFTTT, or "IF This Then That", is a simple logic engine that allows you to string together a trigger and an action to create a 'recipe' using the format IF [trigger] THEN [action].

For example, below is a recipe used to automatically tweet new posts on this blog:
A recipe in IFTTT
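
To make the trigger/action idea concrete outside of IFTTT itself, here's a rough Python sketch of the same 'new blog post, then tweet it' logic. The feed URL and the post_tweet function are placeholders - IFTTT handles the polling and the Twitter side for you, so this is just the shape of the recipe, not how IFTTT is implemented.

# Rough sketch of the IF [new blog post] THEN [tweet it] recipe, outside IFTTT.
# The feed URL and post_tweet() are placeholders; IFTTT does all of this for you.
import feedparser  # third-party library: pip install feedparser

FEED_URL = "http://example.com/blog/rss"   # placeholder blog feed
seen_links = set()                          # posts we've already tweeted


def post_tweet(text):
    # Placeholder action: a real recipe would post to Twitter here.
    print(f"TWEET: {text} #gov2au")


def check_feed_once():
    # Trigger: look for blog posts we haven't seen before.
    for entry in feedparser.parse(FEED_URL).entries:
        if entry.link not in seen_links:
            seen_links.add(entry.link)
            post_tweet(f"New post: {entry.title} {entry.link}")


if __name__ == "__main__":
    check_feed_once()  # IFTTT runs this kind of check for you on a schedule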

This sounds very simple, but it can be a very powerful labour saving tool. Each trigger and action can be from different online services, or even physical devices.

A recipe in IFTTT (click to enlarge)
Recipes can be more complex, with various parameters and settings you can configure (for example the recipe above has been configured to append #gov2au to the tweets).

For example, at right is the full page for a recipe that archives your Tweets to a text file in your Dropbox.

Besides connecting the trigger (a new tweet from you) with the action (posting your tweet in Dropbox),  you can choose whether to include retweets and @replies.

You can set the file name where your tweets will be stored and the file path in Dropbox, plus you can set the content that is saved and how it will be formatted.

In this case the recipe is set to keep the text of the tweet (the 'Text' in a blue box), followed on a new line by the date it was tweeted ('CreatedAt') and then, on another new line, a permanent link to the tweet ('LinkToTweet'), followed by a line break to separate it from following tweets.

You can add additional 'ingredients' such as Tweet name and User Name - essentially whatever information Twitter shares for each tweet.

Rather than having to invent and test your own recipes, IFTTT allows people to share their recipes with others, meaning you can often find a useful recipe, rather than having to create one from scratch.

In fact I didn't create either of the recipes I've illustrated, they were already listed.

There's currently over 36,000 recipes to choose from, for the 47 services supported - from calendars, to RSS feeds, to email, to social networks, to blogs and video services, from SMS to physical devices.

All the online services that can be 'triggers' for IFTTT
It is even possible to string together recipes in sequence.

For example, if I wanted to update my profile image in Facebook, Twitter, Blogger and LinkedIn, I can set up a series of recipes such as,
  • If [My Facebook profile picture updates] Then [Update my Twitter profile picture to match]
  • If [My Twitter profile picture updates] Then [Update my Blogger profile picture to match]
  • If [My Blogger profile picture updates] Then [Update my LinkedIn profile picture to match]
  • If [My LinkedIn profile picture updates] Then [Update my Facebook profile picture to match]
Using these four recipes, whenever I update one profile picture, they will all update.

Also it's easy to turn recipes on and off - meaning that you can stop them working when necessary (such as if you want to use different profile pictures).

However there's limits to an IF THEN system, which is where a tool like Yahoo Pipes gets interesting.

Yahoo Pipes is a service used to take inputs, such as an RSS or data feed, webpage, spreadsheet or data from a database, then manipulate, filter and combine them with other data and provide an output - all with no programming knowledge required.

This sounds a bit vague, so here's a basic example - say you wanted to aggregate all news related to Victoria released by Australian Government agencies in media releases.

To do this in Yahoo Pipes you'd fetch RSS feeds from the agencies you were interested in, 'sploosh' them together as a single file, filter out any releases that don't mention 'Victoria', then output what is left as an RSS feed.

Building a Yahoo Pipe
Building a Yahoo Pipe (click to enlarge)
But that's getting ahead of ourselves a little... To the right is an image depicting how I did this with Yahoo Pipes.

Here's how it works...

First you'll need to go to pipes.yahoo.com and log in with a Yahoo account.

I started by creating a set of tools to fetch RSS from Australian Government agencies. These are the top five blue boxes. To create each, I simply dragged the Fetch feed tool from the 'sources' section of the left-hand menu onto the main part of the screen and then pasted each RSS feed URL into the text fields provided (drawing from the RSS list on Australia.gov.au).

Next, to combine these feeds, I used one of the 'operator' functions from the left menu, named Union. This allows you to combine the outputs of separate functions into a single output. To combine the Fetch feed RSS feeds, all I needed to do was click on the bottom circle under each (their output circle) and drag the blue line to a top circle on the Union box (the input circle).

Then I created a Filter, also an 'operator' function, and defined the three conditions I wanted to include in my final output - news items with 'Victoria', 'Victorian' or 'Melbourne'. All others get filtered out. I linked the Filter's input circle to the Union's output circle, then linked the output from the Filter to the Pipe Output.

Then I tested the system worked by clicking on the blue header for each box and viewing their output in the Debugger window at bottom.

When satisfied it worked (and I did have to remove the filter condition 'Vic' as it picked up parts of words such as "service"), I saved my pipe using the top right save button, giving it the name 'Victoria RSS', then ran the pipe and published it at http://pipes.yahoo.com/pipes/pipe.info?_id=0392f5ec8f7450abbf650056c22f1e5d.
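
For readers who prefer code to boxes and lines, the same pipe can be sketched in a few lines of Python - fetch the feeds, union them, filter on the keywords and keep what's left. The feed URLs below are placeholders standing in for the agency media release feeds listed on Australia.gov.au.

# Sketch of the 'Victoria RSS' pipe in Python: fetch feeds, union them, filter by keyword.
# The feed URLs are placeholders for the agency media release feeds on Australia.gov.au.
import feedparser  # third-party library: pip install feedparser

AGENCY_FEEDS = [
    "http://example.gov.au/agency-one/media-releases.rss",
    "http://example.gov.au/agency-two/media-releases.rss",
]
KEYWORDS = ("Victoria", "Victorian", "Melbourne")  # note: not 'Vic', which matches words like 'service'


def victoria_items():
    combined = []
    for url in AGENCY_FEEDS:
        combined.extend(feedparser.parse(url).entries)     # the 'Union' step
    for entry in combined:
        text = f"{entry.get('title', '')} {entry.get('summary', '')}"
        if any(keyword in text for keyword in KEYWORDS):   # the 'Filter' step
            yield entry


if __name__ == "__main__":
    for entry in victoria_items():
        print(entry.get("title"), "-", entry.get("link"))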


Note that pipes don't have to be published, you can keep them private. You can also publish their outputs as RSS feeds or as a web service (using JSON) for input into a different system. You can even get the results as a web badge for your site, by email, phone or as PHP for websites.

An IFTTT recipe built from the Yahoo Pipe above (click to enlarge)
Alternatively you can even combine them with IFTTT - for example creating a recipe that sends you an email every time an Australian Government agency mentions Victoria in a media release.

In fact I created this recipe (in about 30 seconds) to demonstrate how easy it was. You can see it to the right, or go and access it at IFTTT at the recipe link: http://ifttt.com/recipes/43242

So that's how easy it now is to automate actions, or activities, online - with no IT skills, in a short time.

There's lots of simple, and complex, tasks that can be automated easily and quickly with a little creativity and imagination.

You can also go back and modify your recipes and pipes, or turn them on and off when needed, and you can share them with others in your team or across agencies quickly and easily.

Have you a task you'd like to automate? 
  • Finding mentions of your Department on Twitter or Facebook
  • Tracking mentions of your program in the media releases of other agencies
  • Archiving all your Tweets and Facebook statuses
  • Receiving an SMS alert when the weather forecast is for rain (so you take your umbrella)
  • Posting your Facebook updates, Blog posts and media releases automatically on Twitter spread throughout the day (using Buffer)
The sky's the limit!


Tuesday, May 22, 2012

Standardising content across government (or why does every agency have a different privacy policy?)

Every government website serves a different purpose and a different audience, however there is also standard content every site must have, and legislation and standardised policies they must follow.

This includes content such as a privacy policy, legal disclaimer,  terms of use, accessibility statement, copyright, social media channels, contact page, information publication (FOI) pages and so on. It also includes the navigational structure and internal ordering of pages and the web addresses to access this content (such as for 'about us' pages).

So is there a case to standardise the templates and/or content of these pages and where to find them in websites across government?

I think so.

From an audience perspective, there is a strong case to do so. Citizens often use multiple government websites and it makes their experience more streamlined and efficient if they can find what they need in a consistent place (such as www.agency.gov.au/privacy), written in a consistent format and, where possible, using identical or near identical language.

It would also save money and time. Rather than having to write and seek legal approval for the full page content (such as for privacy information), only agency-specific parts would need writing or approval. Websites could be established more rapidly using the standard content pages and lawyers could focus on higher value tasks.

To put a number on the current cost of individually creating standard content: if you assume it costs, in time and effort, around $500 to develop a privacy policy and that there are around 941 government websites (according to Government's online info offensive a flop), it would have cost up to $470,500 for individual privacy policies for all sites. Multiply this by the number of potentially standardisable pages and the millions begin adding up.

Standardisation could even minimise legal risks. It removes a potential point of failure from agencies which are not resourced, or do not have the expertise, to create appropriate policies and so expose themselves to greater risks - such as poorly written legal disclaimers which leave them open to being sued by citizens.

In some cases it may be possible to use the same standard text, with a few optional inclusions or agency-specific variations - such as for privacy policies, disclaimers, accessibility statements, terms of use, and similar standard pages.

In other cases it won't be possible to use the same content (such as for 'about us' pages), however the location and structure of the page can be similar - still providing public benefits.

Let's take privacy policies specifically for a moment. There's incredible diversity of privacy policies across Australian Government websites, although they are all subject to the same legislation (the Privacy Act 1988) and largely cover the same topics (with some variation in detail).

While this is good for lawyers, who get to write or review these policies, it may not be as good for citizens - who need to contend with different policies when they seek to register for updates or services.

Many government privacy policies are reviewed rarely, due to time and resource constraints, which may place agencies at risk where the use of new tools (such as Youtube, Slideshare and Scribd) to embed or manipulate content within agency sites can expose users unknowingly to the privacy conditions of third party sites (see how we handled these in myregion's privacy policy with an extendable third party section).

So, how would government go about standardisation? Although effectively a single entity, the government functions as a group of agencies who set their own policies and manage their own risks.

With the existence and role of AGIMO, and the WebGuide, there is a central forum for providing model content to reflect the minimum standard agencies must meet. There are mandatory guidelines for agencies, such as for privacy, however there is limited guidance on how to meet them. A standard privacy policy could be included and promoted as a base for other agencies to work from, or even provided as an inclusion for sites that want a policy which is centrally maintained and auto-updated.
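
One lightweight way such a centrally maintained inclusion could work is sketched below: an agency site pulls the model policy text from a central source at publish time and fills in only its own details. The URL and placeholder names are hypothetical - this illustrates the inclusion idea, not any existing AGIMO service.

# Hypothetical sketch of a centrally maintained privacy policy inclusion.
# The central URL and the placeholder fields are invented for illustration only.
from string import Template
from urllib.request import urlopen

CENTRAL_POLICY_URL = "http://example.gov.au/model-privacy-policy.txt"  # hypothetical central source

AGENCY_DETAILS = {
    "agency_name": "Department of Example",
    "privacy_contact": "privacy@example.gov.au",
}


def build_privacy_page():
    # Fetch the centrally maintained model text, which contains placeholders
    # such as $agency_name and $privacy_contact for the agency-specific parts.
    model_text = urlopen(CENTRAL_POLICY_URL).read().decode("utf-8")
    # Substitute only the agency-specific details; the legal wording stays central.
    return Template(model_text).safe_substitute(AGENCY_DETAILS)


if __name__ == "__main__":
    print(build_privacy_page())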

Alternatively web managers across government could work together, through a service such as GovDex, to create and maintain standard pages using a wiki-based approach. This would allow for a consistently improving standard and garner grassroots buy-in, plus leverage the skills of the most experienced web masters.

There are undoubtedly other ways to move towards standardised pages, even simply within an agency - which can itself be a struggle for those with many websites and decentralised web management.


Regardless of the method selected, the case should receive consideration. Does government really need hundreds of versions of what is standard content, or only a few?


Examples of government privacy policies (spot the similarities and differences):


Tuesday, May 08, 2012

Participate in Melbourne Knowledge Week 2012

The City of Melbourne was recognised in 2012 as ‘Most Admired Knowledge City’ in an award from the World Capital Institute and Teleos, an independent management research firm.

The city is building on this with the annual Melbourne Knowledge Week, designed to engage both the knowledge community and the wider public in a range of events and opportunities that help promote Melbourne's identity as a global knowledge city.

I reckon there has to be a place for Gov 2.0 in this mix and wanted to flag to all my Victorian readers that an expression of interest is now open to businesses, organisations, educational institutions, networking groups, community groups and individuals who wish to showcase knowledge-related projects, thinkers and capabilities as part of this year's event.

Melbourne Knowledge Week runs from 26 November to 1 December. More details on the event, and the expression of interest, are at http://www.melbourne.vic.gov.au/enterprisemelbourne/events/KnowledgeWeek/Pages/KnowledgeWeek.aspx


Thursday, April 26, 2012

Patient Opinion launches in Australia

One of the UK's social media success stories, Patient Opinion, has now launched an Australian website at www.patientopinion.org.au.

Patient Opinion, which has been live since 2005, allows patients to rate and comment on their experience with health providers. It has been an amazing (if sometimes painful) success in the UK, leading to a number of care improvements across the health system and at individual providers.

Having worked in the area in government in Australia, I recognise the sensitivities that get raised around the idea of rating health providers, or allowing public comment on individual experiences, particularly from hospitals and health professionals.

However decisions are made every day by people based on their views and experiences - which product to buy or shop to visit. They are even made about health services in private conversations that health providers can neither see nor address.

Patient Opinion makes patient views and experiences visible in a central and public way, giving health providers the ability to access, review and even respond to comments. The site also provides a level of governance and safety through monitoring stories and comments to ensure they are not defamatory.

The approach allows health providers to view and address operational concerns and provides valuable insights for policy makers into the Australian health system which, after all, is supposed to maximise the outcomes for patients.

While fears of negativity are common amongst organisations and individuals when social media channels open, the Patient Opinion experience in the UK has been that there is a high level of positive feedback provided - people do have faith in many health providers.

A brief video about the site is below, and you can learn more about Patient Opinion in Australia at www.patientopinion.org.au/info/about


Friday, March 23, 2012

Don’t dumb me down! (guest post)

With the permission of Geoff Mason (@grmsn), I've republished his blog post Don’t dumb me down! from 21 March this year below.

I thought this was a very good post on a topic that, as increasing amounts of information and discussion only appear online, is increasingly affecting how effective public servants can be and the policy outcomes across government.

Don’t dumb me down!


There continues to be a fear of the unknown and misunderstanding across the Australian public service about the internets – which baffles me to be honest.

Agencies continue to block social media websites, cloud based email services, and restrict mobile access during business hours. At the same time the government is pushing for greater innovation, greater mobilisation and capability of staffing, and increased staff performance while seeking to make cost reductions across the breadth of the public service.

The two are one and the same in this modern age. Social media provides the first port of call, regardless of the industry, for professional development, access to innovation, and sharing how people work to increase productivity.

As a quick case study, Google+, while not a social media site in itself, provides a social layer which covers all its services from search through to document sharing and collaboration. The interlinked services include Google email groups, all of which require access not just to the platform but to a Google account. The service helps tailor search results and improves the breadth of information and opinion provided by adding Web 2.0 functionality, increasing a person's ability to undertake a critical analysis of the information being provided.

For example, Tim O'Reilly - a prominent person in many ways, including as a leader in facilitating discussion, direction and promotion of modern communications and of open and transparent government - uses Google+ as a key communication channel for engaging and sharing the ideas of many through an established community which actively engages in frank discussion on the merits and disadvantages of many key concepts attached to a public servant's work life.

Restricting access to this type of discussion during working hours means federal employees are required to actively engage in these environments during their down time - all the while trying to manage their families, their dogs, the gardening, and everything else which comes from having a life outside of the office. While I think that’s fine for myself, I don’t believe it should be expected of everyone.

As more and more key representatives access similar services as their communication channel of choice, it will be fundamental for public servants to not only have access to, but be encouraged to be a part of and monitor, the discussions on these platforms as a cheap and effective method of self-development and idea generation, not only for their team but for their agency as a whole.

Beats the hell out of spending $2,500 to send staff along to a workshop to hear other public servants talking about something that they could be getting for free online don’t ya thunk?

In short, government agencies need to soundly assess the short-term risks that access to these systems poses against the long-term benefits that being part of a global community could provide.


Monday, March 19, 2012

From open data to useful data

At BarCamp Canberra on Saturday I led a discussion asking how we can help governments take the step from open data (releasing raw datasets, not always in an easily reusable format) towards usable and useful data (releasing raw datasets in easily reusable formats, plus tools that can be used to visualise them).

To frame this discussion I like to think of open data as a form of online community, one that largely involves numbers rather than words.

Organisations that establish a word-based community using a forum, blog, wiki, Facebook page or similar online channel, but fail to provide context as to how and why people should engage, or to feed and participate in the discussion, are likely either to receive little engagement or to have their engagement spin out of control.

Equally, I believe that raw data released without context as to how and why people should engage, and without data visualisation tools to aid participation in a data discussion, is likely to suffer the same fate.

With no context and no leadership from the data providers, others will fill the informational gap - sometimes maliciously. There are also fewer opportunities for the data providers to use the data to tell good stories - how crime has decreased, how vaccination reduces fatalities, how the government's expenditure on social services is delivering good outcomes.

Certainly there will always be some people with the technical experience and commitment to take raw open data, transform it into a usable form and then build a visualisation or mash-up around it to tell a story.
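To give a sense of what that effort looks like at its most basic, here is a minimal sketch in Python of the first step: reading a raw dataset and turning it into something a reader can interpret at a glance. The file and column names (crime_by_year.csv, year, offences) are purely illustrative assumptions, not a real dataset.

import csv

# Read a hypothetical raw open dataset: one row per year with an offence count.
rows = []
with open("crime_by_year.csv", newline="") as f:
    for row in csv.DictReader(f):
        rows.append((row["year"], int(row["offences"])))

# Turn the raw numbers into a crude text 'visualisation' - one bar per year.
peak = max(count for _, count in rows)
for year, count in sorted(rows):
    bar = "#" * round(40 * count / peak)
    print(f"{year} {count:>8,} {bar}")

Even something this trivial assumes a machine-readable format, some programming knowledge and spare time.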

However these people represent a tiny minority in the community. They need a combination of skill, interest and time. I estimate they make up less than 5% of society, possibly well under 1%.

To attract the interest and involvement of others, the barriers to participation must be extremely low (the lesson taught by Facebook and Twitter), and the ability to get a useful outcome with minimal personal effort must be very high (the lesson taught by Google).

The discussion on the weekend seemed to crystallise into two groups. One felt that governments needed to do more to 'raise the bar' on the data they released - expending additional effort to ensure it was more usable and useful for the public.

The other view was that governments have fulfilled their transparency and accountability goals simply by releasing data to the community; that doing further work on the data redirects government funds from vital services and activities; and that there is little or no evidence of value in working further on open data beyond releasing it in whatever form the government holds it.

I think there's some truth in both views - but also some major perceptual holes.

I don't think it necessarily needs to be government expending the additional effort. With appropriate philanthropic funding, a not-for-profit organisation could help bridge the gap between open and usable data, taking what the government releases and reprocessing it into outputs that tell stories.

However I also don't accept the view that there is no evidence of value in doing further work on open data to make datasets more usable.

In fact, doing this work could add immense value in certain cases. Without sufficient research and evidence either way, that remains an opinion rather than a fact - although the evidence I've seen from the ABS through the census program (here's my personal infographic, by the way) suggests they achieved enormous awareness and increased understanding by doing more than releasing tables of numbers - by using visualisations to make the numbers come alive.

Indeed there is other evidence that taking raw data and doing more work on it is worthwhile in a number of situations. Train and bus timetables are an example. Why doesn't government simply release these as raw data and let commercial entities produce the timetables at a profit? Clearly there must be sufficient value in their production to justify governments producing slick, visual timetables and route maps.
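To make the comparison concrete, here is a minimal sketch in Python of the kind of work involved in turning raw timetable data into something a passenger could read. It assumes, purely for illustration, a GTFS-style stop_times file; the file name, stop ID and column names are assumptions, not a reference to any particular agency's feed.

import csv

STOP_ID = "4170"  # hypothetical stop of interest

# Collect every scheduled departure from this stop out of the raw data.
times = []
with open("stop_times.txt", newline="") as f:
    for row in csv.DictReader(f):
        if row["stop_id"] == STOP_ID:
            times.append(row["departure_time"])

# A bare-bones 'timetable': departures in chronological order.
for departure in sorted(times):
    print(departure)

Even this ignores route names, days of operation and service changes - precisely the polish that makes a published timetable genuinely usable, and that someone has to pay to produce.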

Some may argue that this is service delivery, not open data (as someone did in the discussion). I personally cannot see the difference. Whenever government chooses to add value to data it is doing so to deliver some form of service - whatever the data happens to be.

Is there greater service delivery utility in producing timetables (where commercial entities would step in if government did not) or in providing a visual guide to government budgets (where commercial interests would not step in)?

Either way the goal is to make the data more useful and usable to people. If anything the government should focus its funds on data where commercial interests are not prepared to do the job.


However this is still talking around the nub of the matter - open data is not helping many people, because openness doesn't make data useful or usable.

I believe we need either a government agency or a not-for-profit organisation to short-circuit the debate and provide evidence of how data can be made meaningful with context and visualisations.

Now, who would like to help me put together a not-for-profit to do this?


Monday, March 05, 2012

Who is your Marketing or Communications CIO?

I was struck by a comment from Dan Hoban (@dwhoban) at GovCamp Queensland on Saturday - one that resonated with me, and with others in the audience - that organisations now need a CIO (Chief Information Officer) in their marketing or communications teams.

This is a person who understands the technologies we use to communicate with customers, clients, citizens and stakeholders and can provide sound advice and expertise in a manner that traditional ICT teams cannot.

The role of this person is to understand the business goals and recommend the approaches and technologies - particularly online - that are the best fit. It may then be this person and their team, or an ICT team, who build and deliver the solutions needed.

When Dan named this role I realised it was exactly the role I had been performing during my five years in the public service, and for a number of years before that in the corporate sector.

Where ICT teams were focused largely on reactive management of large critical ICT systems - the SAPs, payment frameworks and secure networks - it has long been left to Online Communications, or similar teams or individuals in other parts of the organisation, to proactively introduce and manage the small and agile tools communicators use in public engagement.

No organisation I've worked in or spoken to has ICT manage their Facebook page, Twitter account, GovSpace blog or YouTube channel. Few ICT teams are equipped to cost-effectively and rapidly deliver a focused forum, blog, mobile app or data visualisation tool. They don't recruit for these skills or necessarily have experience with the right platforms and services.

When Communications teams seek advice on the online channels and tools they should use, they ask ICT - but are frequently told that ICT doesn't understand these systems (even when individuals within ICT are highly skilled with them), doesn't have the time or resources to commit within the timeframes required (due to the need to focus on critical systems), doesn't have the design skills, or that it would take months (sometimes years) to research and provide an effective opinion - plus it will cost a bomb.

So Communications teams, who have their own deliverables, have no choice but to recruit their own social media and online communications smarts.

It is this person's, or team's, role to understand communication needs, make rapid and sound recommendations on channels and tools, design the systems and interfaces, and integrate the technologies (or manage the contractors who do) to deliver relevant solutions quickly and on a budget.

So perhaps it is time to recognise these people for what they actually are for an organisation - a Marketing or Communications CIO.

I expect ICT teams will hate this. Information has long been their domain even though their focus is often on technology systems and they do not always understand the information or communication that feeds across these systems - the reason these systems actually exist.

Perhaps it is time for them to rethink their role, or to let go of the agile online and mobile spaces and focus on the big-ticket systems and networks - remaining the heart, but not always the adrenal glands or, indeed, the brains, of an organisation's ICT solutions.

