
Monday, March 06, 2023

Artificial Intelligence isn't the silver bullet for bias. We have to keep working on ourselves.

There's been a lot of attention paid to AI ethics over the last few years due to concerns that use of artificial intelligence may further entrench and amplify the impact of subconscious and conscious biases.

This is very warranted. Much of the data humans have collected over the last few hundred years is heavily impacted by bias. 

For example, air-conditioning temperatures are largely set based on research conducted in the 1950s-70s in the US, in offices predominantly occupied by men and by people wearing heavier materials than are worn today. It's common for many folks today to feel cold in offices where air-conditioning is still set for men wearing three-piece suits.

Similarly, many datasets used to teach machine learning AI suffer from biases - whether based on gender, race, age or even cultural norms at the time of collection. We only have the data we have from the last century and it is virtually impossible for most of it to be 'retrofitted' to remove bias.

This affects everything from medical to management research, and when used to train AI the biases in the data can easily affect the AI's capabilities. Consider, for example, the incredibly awkward period just a few years ago when Google's image AI incorrectly identified black people as 'gorillas'.

How did Google solve this? By preventing Google Photos from labelling any image as a gorilla, chimpanzee, or monkey – even pictures of the primates themselves - an expedient but poor solution, as it didn't fix the bias.

So clearly there's a need for us to carefully screen the data we use to train AI, to minimise the introduction or exacerbation of bias. And there's also a need to add 'protective measures' to AI outputs to catch instances of bias, both to exclude them from what is published and to use them to identify remaining bias to address.
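To make the idea of a 'protective measure' concrete, here's a minimal sketch (in Python) of the kind of post-hoc screen I mean: generated text is checked against a review list, withheld if it matches, and logged so the remaining bias can be examined and addressed. The patterns, function names and log destination are purely illustrative placeholders, not a description of any production system.

```python
# A minimal sketch of an output 'protective measure': screen generated text against
# a review list, withhold anything flagged, and log it so remaining bias can be
# identified and addressed. All patterns and names below are illustrative placeholders.
import logging
import re

logging.basicConfig(filename="bias_review_queue.log", level=logging.INFO)

REVIEW_PATTERNS = [
    re.compile(r"\b(illustrative biased phrase|another flagged term)\b", re.IGNORECASE),
    # ...extend with patterns agreed through your own bias-review process
]

def screen_output(generated_text: str):
    """Return the text if it passes screening, otherwise log it for human review."""
    hits = [p.pattern for p in REVIEW_PATTERNS if p.search(generated_text)]
    if hits:
        # Excluded from publication, but kept as evidence of bias still to be addressed
        logging.info("Flagged output (%s): %s", ", ".join(hits), generated_text)
        return None
    return generated_text
```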

However, none of this work will be effective if we don't continue to work on ourselves.

The root of all AI bias is human bias. 

Even when we catch the obvious data biases and take care when training an AI to minimise potential biases, it's likely to be extremely difficult, if not impossible, to eliminate all bias altogether. In fact, some systemic unconscious biases in society may not even be visible until we see an AI emulating and amplifying them.

As such no organisation should ever rely on AI to reduce or eliminate the bias exhibited by its human staff, contractors and partners. We need to continue to work on ourselves to eliminate the biases we introduce into data (via biases in the queries, process and participants) and that we exhibit in our own language, behaviours and intent.

Otherwise, even if we do miraculously train AIs to be entirely bias free, bias will get reintroduced through how humans selectively employ and apply the outputs and decisions of these AIs - sometimes in the belief that they, as humans, are acting without bias.

So if your organisation is considering introducing AI to reduce bias in a given process or decision, make sure you continue working on all the humans that remain involved at any step. Because AI will never be a silver bullet for ending bias while we, as humans, continue to harbour biases ourselves.


Thursday, February 02, 2023

It's time for Australian government to take artificial intelligence (AI) seriously

Over the last two and a half years I've been deep in a startup using generative artificial intelligence (AI that writes text) to help solve the challenge organisations face in producing and consuming useful content.

This has given me practical insights into the state of the AI industry and how AI technologies can be successfully - or unsuccessfully - implemented within organisations to solve common challenges in the production, repurposing and reuse of content.

So, with a little prompting from the formidable Pia Andrews, I'm taking up blogging again at eGovAU to share some of my experience and insights for government use of AI.

I realise that Australian governments are not new to AI. Many agencies have been using various forms of AI technologies, directly or indirectly, to assist in understanding data or make decisions. 

Some may even include RPA (Robotic Process Automation) and chatbots - which in my humble opinion are not true AI, as both are designed programmatically and cannot offer insights or resolve problems outside their programmed parameters and intents.

When I talk about AI, my focus is on systems based on machine-learning, where the AI was built from a body of training data, evolving its own understanding of context, patterns and relationships.

These 'thinking' machines are capable of leaps of logic (and illogic) beyond any programmed system, which makes them ideal in situations where there are many edge cases, some of which can't be easily predicted or prepared for. It also places them much closer to being general intelligences, and they often exhibit valuable emergent talents alongside their original reasons for being. 

At the same time machine-learning is unsuitable for situations where a decision must be completely explainable. As with humans, it is very hard to fully understand how a machine-learning algorithm came to a given conclusion or decision.

As such their utility is not in the realm of automated decision-making; rather, they assist by encapsulating an evidence base or surfacing details in large datasets that humans might overlook.

As such machine-learning has vast utility for government. 

For example,

  • summarizing reports, 
  • converting complex language into plain, 
  • writing draft minutes from an intended purpose and evidence-base, 
  • extracting insights and conclusions from large research/consultation sets, 
  • crafting hundreds of variants to a message for different audiences and mediums,
  • developing structured strategy and communication plans from unstructured notes,
  • writing and updating policies and tender requests, 
  • semantically mapping and summarizing consultation responses,
  • developing programming code, and
  • assisting in all forms of unstructured engagement and information summarization/repurposing.

As such machine-learning is best treated as an assistive and augmentation tool, extending the capabilities of humans by doing the heavy lifting rather than fully automating processes.

It's also critical to recognise that AI of this type isn't the sole purview of IT professionals and data scientists. Working with natural language AIs, as I do, is better supported by a strong business and communications skillset than by programming expertise. 

Designing prompts for an AI (the statements and questions that tell the AI what you want) requires an excellent grasp of language nuances and an extensive vocabulary.
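To illustrate what prompt design looks like in practice, here's a minimal Python sketch of a plain-language rewriting prompt. The wording of the template does most of the work; the generate function is simply a stand-in for whichever model or API an agency happens to use - its name and signature are assumptions of this sketch, not any particular product's API.

```python
# A minimal sketch of prompt design: the value is in the instruction's wording, not the code.
# generate() is a placeholder for whatever text-generation model or API is in use.

PLAIN_LANGUAGE_PROMPT = """Rewrite the following passage in plain language.
Keep every fact and figure, avoid jargon, and aim for a general reading level.

Passage:
{passage}

Plain-language rewrite:"""

def plain_language_rewrite(passage: str, generate) -> str:
    """Fill the template with the source passage and hand it to the text-generation function."""
    prompt = PLAIN_LANGUAGE_PROMPT.format(passage=passage)
    return generate(prompt)
```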

Finetuning these AIs requires a strong understanding of the context of information and what constitutes bias, so that an AI is not inadvertently trained to form unwanted patterns and derive irrelevant or unneeded insights.

These are skills that 'business' folks in government agencies often possess to a far greater degree than most IT teams.


So through my eGovAU blog, I'm going to be regularly covering some of the opportunities and challenges I see for governments in Australia seeking to adopt AI (the machine-learning kind), and initiatives I see other governments adopting.

I will also blog occasionally on other eGov (or digital government) topics, however as this is now well-embedded in government, I'll only do so when there's something new I have to add.


Monday, May 06, 2019

Mapping Canberra's startup ecosystem

I've had a continuing interest in start-up ecosystems across Australia, having been a member of several of these ecosystems and having helped mentor and support a range of start-ups over the years.

I've maintained a Canberra ecosystem map for about four years now, mostly for my own interest and to understand some of the relationships between different players and the startups they support.

This was inspired by work by BlueChilli on the defunct StartRail maps, which was based on some of the international work portraying startup ecosystems in the style of metro rail maps. Unfortunately they focused on Sydney and Melbourne, missing some of the smaller, yet equally vibrant, scenes in Perth, Brisbane and Canberra, all of which I am linked to in various ways.

Recently I've seen some sterling work by Gordon Whitehead mapping the startup ecosystem for the Hunter & Central Coast, which had been reinterpreted by Brian Hill of Laughing Mind.
As such I've decided to share my Canberra startup ecosystem map for anyone interested.

Also keep an eye out for the work by Chad Renando at StartStatus, who is engaged in a national effort as part of his PhD, which should provide a broader view of the Australian startup ecosystem as a whole (which tends to be city-based with a few cross-ties of varying strength).

Chad has also done some intensive work looking at models for measuring startup ecosystems and identifying their strengths & weaknesses that will be very valuable to government, not-for-profit and corporate interests in years to come.

As for Canberra - here's my humble contribution....







Monday, November 12, 2018

The #GovHack 2018 National Awards by the numbers

I shared this via a Twitter thread, but wanted to include it here for longevity.

To learn more about GovHack and the National Red Carpet event, which I attended as a representative of my team & the ACT Spirit of GovHack winner (and finalist for the National Spirit of GovHack), visit www.govhack.org

    




I've analysed the #opendata & here's the #GovHack 2018 Awards by the numbers:

There were 33 National Awards (including Spirit & Government Participation).
A total of 88 awards were issued: 33 First places, 18 Runners-up and 37 Hon. Mentions.

Excluding Spirit of #GovHack & Gov Participation 59 teams won at least one Award.

Two teams won 3 Awards each:
  • Tiny Happy People Hacking: 1 First place, 1 Runner-up, 1 Hon. Mention
  • in time: 2 Runners-up & 1 Hon. Mention
(Incidentally Tiny Happy People Hacking was my team)

Another 16 teams won two #GovHack Awards.
  • 5 teams won 2 First places! (as.numeric, Blockheads, Tartans-AU, insolvit & TeamTeam)
  • Another 5 won 1 First place, with 3 also winning a Runner-up (Big Orange Brain, Oakton, TechPreppers) and 2 also an Hon. Mention (DataCake & TeamX).

41 #GovHack teams won 1 award: 13 won a First, 5 a Runner-up & 23 an Hon. Mention.
  • Firsts:
    ARVIS, Bachmanns and Fulwoods, Get Active USC, Hack aPEEL, I’m Learnding, Living Spirit, Lucky Shot, Motley Crue, Team Marika, Team Rocket, Technotelecomnicon, The Ogrelords, This Place

By state/territory, inc. State/Local Government Participation & Spirit of #GovHack, National #GovHack Awards followed population size (except ACT which punched above its weight):
  • NSW won 21 
  • Vic won 19
  • Qld won 17
  • ACT won 14
  • SA won 9
  • WA won 4
  • NT won 2
  • Tas won 0 (sorry folks)

The #GovHack results look a little different in detail, with Victoria winning more First places than anyone else & Queensland tying with NSW! (the NA are the National Government Participation Awards, which I excluded as they give ACT an unfair bonus)


In fact, using a 3-2-1 scoring system for First Place, Runner-up and Hon. Mention, Victoria outscores NSW, and ACT comes even closer to the top three than it does in the raw #GovHack award numbers.
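For anyone wanting to reproduce the weighting, here's a small Python sketch of the 3-2-1 approach. The per-state counts shown are placeholders only - substitute the real tallies from the awards data (which I'll share via the Google sheet mentioned below).

```python
# A minimal sketch of the 3-2-1 scoring described above: 3 points per First place,
# 2 per Runner-up, 1 per Hon. Mention. The counts are placeholders, not the real tallies.
WEIGHTS = {"first": 3, "runner_up": 2, "mention": 1}

state_counts = {
    # state: (firsts, runners_up, mentions) - illustrative placeholder values only
    "NSW": (7, 5, 9),
    "Vic": (8, 4, 7),
    "Qld": (7, 4, 6),
}

scores = {
    state: sum(count * WEIGHTS[key]
               for count, key in zip(counts, ("first", "runner_up", "mention")))
    for state, counts in state_counts.items()
}

for state, score in sorted(scores.items(), key=lambda kv: kv[1], reverse=True):
    print(state, score)
```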


Finally, looking at #GovHack National Awards by venue, the central city venues did better than regional locations in all cases, except in Queensland - where the Sunshine Coast won more awards than anywhere else in Qld, including Brisbane... Amazing work guys!!!



And here's the table of National #GovHack Awards by venue...


And that's the wrap on the #GovHack 2018 Awards by the numbers.
My data is still a bit messy, but I'll clean it up and put it in a Google sheet at some point in the next week so others can access it.



Tuesday, August 01, 2017

Roundup from GovHack 2017

Starting in a single Canberra venue in 2009, GovHack is now the largest open data hacking competition for government worldwide, with over 3,000 participants, coaches, mentors and organisers across 36 venues around Australia and New Zealand.

Over a 46-hour period, participants including coders, creatives, data crunchers and facilitators redesign and reimagine citizen services and use open data to visualise fresh insights into government decision-making, taking part in a competition with over 80 prizes and a prize pool of over $250,000.

The event is organised and run by volunteers, but GovHack has support from the Australian and New Zealand Governments, all Australian state and territory governments and many local governments across ANZ, as well as a range of corporate sponsors. This was the first year that the Northern Territory became involved with the event.

Many senior public servants drop into the event over the weekend, and have a keen interest in using ideas from GovHack within their agencies.

This year Accenture was the Platinum Sponsor for GovHack, the first time a corporation has taken such a significant interest in the event - a trend I hope continues as these types of event gain steam as a creative way for companies and governments to innovate quickly.

Accenture sponsored two awards. The ‘Into the New’ award for Australia challenged participants to demonstrate innovation and new thinking in all forms. This could be new ways to experience and interact with public data or new approaches to citizen experiences that help citizens and governments journey into the new together. It attracted 138 entrants from around Australia, from a total of 373 projects submitted.

Accenture’s ‘Re:Invention’ award for New Zealand challenged participants to design a citizen experience that builds on something government already does to deliver a more effective and engaging way of interacting. It attracted 12 entrants from Wellington, Auckland and Hamilton, from a total of 66 New Zealand projects submitted.


GovHack by the numbers
While GovHack itself is over for 2017, state award events will be held in August, and an international Red Carpet event for National and International Award winners in October. You can view the closing video from GovHack 2017 here.

All the projects created this year are online in the GovHack Hackerspace, available for inspiration and learning – remaining online to provide hundreds of fresh perspectives on how government can deliver more value to citizens.

You can read more about GovHack 2017 in this LinkedIn post by a mentor, or on Twitter.


Wednesday, July 26, 2017

Get revved for GovHack across Australia & New Zealand (28-30 July)

As the world’s largest hackathon, GovHack is on at over 25 locations across Australia and New Zealand again this year from Friday 6pm this week until Sunday afternoon (28-30 July).
With over 3,000 participants and 437 completed projects in 2016, GovHack is an opportunity to develop prototypes of new services, visualisations and mashups with government open data and other datasets with the chance to be nationally recognised and win prizes at national, state and local levels.

Supported by all levels of Australian government, GovHack is not just for programmers. Some of the projects in previous years have included board games and jewelry (for instance 3D printed bracelets of climate data), alongside websites, mobile apps, wearable apps and APIs.

National awards are announced at a Red Carpet Event, which filled the PowerHouse Museum in Sydney in 2015 (the last one I attended).


While some people form teams before the event, you can also come along as a solo participant, or form a team on the day – providing an opportunity to rub shoulders with all kinds of talented people.

There’s still room to register for some venues if you want to participate.

I’m helping run the ACT local event this year, so will be onsite at Canberra Grammar all weekend. If you’re participating here, come and say hi!

For more information visit the GovHack website or read last year’s report.


Monday, November 28, 2016

Guest post from Henry Sherrell on access to open data for effective policy development

Henry Sherrell is a former Australian Public Servant who now works in policy research at the Australian National University.

As a researcher, open data has become an important input into his work. As such I thought it worth sharing (with his permission) this post from his blog, On The Move, as an example of some of the difficulties researchers still face in accessing data from the Australian Government for important policy work.

It is notable that since Henry published his post, only four days ago, the legislation regarding Henry's policy work is going back to parliament - still with no modelling of its impact on affected communities or any real public understanding of the potential consequences.

I've reproduced Henry's post as a guest post below in full. You can also view Henry's post here in On The Move.

My battle with the Australian Border Force Act: A small, but worrying, example

There are hundreds of interesting questions to ask when someone moves from one country to another. For as long as I can remember, Australia has been one of the best places to explore migration. There are two reasons for this: We welcome immigrants and the government and bureaucracy collect and make accessible robust migration data.
They are not household names but people like Graeme Hugo, the late Paul Miller, Deborah Cobb-Clarke and Peter McDonald have shaped global debates on migration. A new generation of scholars are now examining big, important questions about the intersection of migration and work, as well as any number of other themes, many of which will help us as a society in the future. Yet this tradition depends on access to Australian migration data from a number of sources, including the ABS, the Department of Immigration and various surveys funded by the government.
Until I received the following email from DIBP, I hadn’t realised just how uncertain this type of knowledge will be in the future:
“The data that was provided to Department of Agriculture was done so for a specific purpose in line with the Australian Border Force Act 2015 (ABF Act).  Unfortunately your request does not comply with the ABF Act and we are therefore unable to provide the requested data.”
I didn’t receive this email because I asked for something controversial. The reason this email stopped me in my tracks was I asked for something which was already largely public.
About a month ago I stumbled across the below map in a Senate submission to the Working Holiday Reform legislation. The Department of Agriculture and ABARES had produced the map to help show where backpackers worked to gain their second visa. This was an important part of a big public debate about the merits or otherwise of the backpacker tax (as I write, this legislation has just been voted on in the Senate and amended, a defeat for the government).
I’d never seen this information before and I’m interested in exploring it further as there are decent labour market implications stemming from backpackers and the results may shed light on employment and migration trends. As you can see below, the Department helpfully documented the top 10 postcodes where backpackers worked to become eligible for their 2nd visa:
[Map: the top 10 postcodes where backpackers worked to qualify for a second Working Holiday visa]
I get teased a little bit about the number of emails I send asking for stuff. But I’ve found you normally don’t get something unless you ask for it. So using the Department of Agriculture’s handy feedback form on their website, I asked for the data showing how many 2nd working holiday visas have been granted for each postcode.
The top 10 postcodes are already public but as the map shows, there is lots of other information about what you might term a ‘long tail’ of postcodes. One reason I wanted this information was to match up major industries in these postcodes and understand what type of work these people were doing. It would also be good to go back a couple of years and compare trends over time, whether employment activity shifts over time. All sorts of things were possible.
One thing I’ve learnt in the past is don’t ask for too much, too soon. In addition, there is always a potential privacy consideration when examining immigration data. For these reasons, I limited my request to the list of postcodes and number of second visa grants in each. That’s it.
This ensured I excluded information about individuals like age and country of birth which may compromise privacy. I also assumed if the number of backpackers in a postcode was less than five, it would be shown as "<5", as this is standard practice for other types of immigration data.
ABARES let me know they had passed the response to the Department of Immigration and Border Protection. After following up with DIBP twice, about a month after my initial request, I received the above email which prompted a series of internal questions roughly in this order:
  • You have to be f****** kidding me?
  • If the data was provided to the Department of Agriculture with the knowledge it would be at least partially public, why isn’t the same data available in a different format, i.e. a spreadsheet rather than a map?
  • How does my request not comply with the ABF Act? What’s in the ABF Act which prevents highly aggregated data being shared to better inform our understanding of relevant public debates?
And finally: why couldn’t someone work out a way to comply with the ABF Act and still provide me with data?
From what I can work out, the relevant part of the ABF Act is Part 6 pertaining to secrecy and disclosure provisions. Section 44 outlines ‘Disclosure to certain bodies and persons’ and subsection (1) is about ‘protected information that is not personal information’ disclosed to “an entrusted person”. This is the same process causing serious consternation among health professionals working in detention centres.
I am not “an entrusted person”. According to subsection (3), the Secretary of the Department has authority to designate this. Perhaps I should email and ask? Again from what I can work out, it looks like the person who created the data made a record now classified as protected information. This information is then automatically restricted to people who are classified as entrusted, including other bureaucrats, such as those in the Department of Agriculture.
Yet this begs the question. If the Department of Agriculture can publish a partial piece of a protected record, why can’t the Department of Immigration and Border Protection?
All I know is this stinks. And while this concern does not rank anywhere close to those faced by doctors and nurses who work in detention centres, the slow corrosion of sharing information caused directly by this legislation will have massive costs to how we understand migration in Australia.
Think about the very reason we're even having a debate about the backpacker tax. Not enough people knew about immigration policy, trends and behaviour. The wonks at Treasury didn't do any modelling on the labour market implications and the politicians in ERC and Cabinet – including the National Party – had no idea about what this might do to their own constituents. Outside the government, when I did a quick ring around in the days after the 2015 budget, the peak industry groups for horticulture didn't think the backpacker tax would be a big deal. If I was a farmer, I'd rip up my membership. People should have known from very early on this would have real effects in the labour market, as I wrote 10 days after the Budget. The fact no-one stopped or modified the tax before it got out of control shows we are working off a low base in terms of awareness about immigration.
The Australian Border Force Act is only going to make that more difficult. Hiding basic, aggregated data behind this legislation will increase future episodes of poor policy making and limit the ability of Australia to set an example to the world on immigration. Our Prime Minister is fond of musing on our successful multicultural society, yet alongside this sit decades of learning that have shaped communities, policy decisions, funding allocations and everything else under the sun.
I have no idea how I’m meant to take part in this process if access to information is restricted to bureaucrats and ‘entrusted persons’, who at the moment don’t seem able to analyse worth a damn, judging from the quality of public debates we are having. I don’t expect a personalised service with open access to immigration data. But I expect the public service to serve the public interest, especially when the matter is straightforward, uncontroversial and has the potential to inform relevant public debate.


Monday, November 09, 2015

Can an AI understand your online personality? How about your agency's online persona?

I've been having a play with IBM Watson's Personality Insights Service.

The service uses "linguistic analytics to extract a spectrum of cognitive and social characteristics from the text data that a person generates through blogs, tweets, forum posts, and more."

While this is quite a mouthful, the service provides an interesting external perspective on how individuals and organisations present themselves online.
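For readers who'd like to try something similar, here's a rough Python sketch of posting a body of text to a REST-based personality analysis service such as Watson's. The endpoint path, version parameter and authentication style are assumptions for illustration - check IBM's current documentation rather than treating this as the service's exact API.

```python
# A rough sketch of calling a text-analysis service such as Watson Personality Insights
# over REST. The endpoint path, version date and credential handling are assumptions
# for illustration only - verify them against the service's current documentation.
import requests

def personality_profile(text: str, api_key: str, service_url: str) -> dict:
    """Send a body of text (e.g. concatenated blog posts) and return the profile JSON."""
    response = requests.post(
        f"{service_url}/v3/profile",        # assumed path
        params={"version": "2017-10-13"},   # assumed version date
        auth=("apikey", api_key),           # assumed auth scheme
        headers={"Content-Type": "text/plain"},
        data=text.encode("utf-8"),
        timeout=30,
    )
    response.raise_for_status()
    return response.json()
```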

As a benchmark, this is how Watson sees me from my last dozen blog posts in eGovAU (excluding the guest posts):
You are shrewd, skeptical and tranquil. 
You are philosophical: you are open to and intrigued by new ideas and love to explore them. You are empathetic: you feel what others feel and are compassionate towards them. And you are imaginative: you have a wild imagination. 
Experiences that give a sense of prestige hold some appeal to you. You are relatively unconcerned with tradition: you care more about making your own path than following what others have done. You consider achieving success to guide a large part of what you do: you seek out opportunities to improve yourself and demonstrate that you are a capable person.



In a nutshell that's not too bad an analysis.

However what happens when we analyse a government agency's social media presence?

In this case I decided to analyse the Digital Transformation Office (DTO), by taking all their blog posts from July to November (excluding guest posts) and plugging them into the Watson Personality analyser.

So this is how Watson sees the personality of the DTO:

You are heartfelt and rational.  
You are self-controlled: you have control over your desires, which are not particularly intense. You are empathetic: you feel what others feel and are compassionate towards them. And you are proud: you hold yourself in high regard, satisfied with who you are.  
Experiences that make you feel high efficiency are generally unappealing to you. You are relatively unconcerned with tradition: you care more about making your own path than following what others have done. You consider helping others to guide a large part of what you do: you think it is important to take care of the people around you.


Now that's a pretty good result for an organisation (except maybe the pride bit). 

It seems to me that the personality that the DTO is projecting through their blog is fairly close to the approach and persona that the DTO wishes to portray within government and more broadly in the community.

Now how about another agency...

Here's a review of the Department of Immigration's Migration blog, taking all blog posts from October 2014 onwards (to provide a sample of the same size as the DTO and my blog).

Here's what Watson said about the Department of Immigration's blog's personality:

You are heartfelt, tranquil and skeptical. 
You are calm under pressure: you handle unexpected events calmly and effectively. You are self-assured: you tend to feel calm and self-assured. And you are philosophical: you are open to and intrigued by new ideas and love to explore them. 
Experiences that make you feel high well-being are generally unappealing to you. 
You are relatively unconcerned with taking pleasure in life: you prefer activities with a purpose greater than just personal enjoyment. You consider achieving success to guide a large part of what you do: you seek out opportunities to improve yourself and demonstrate that you are a capable person.

On the money, or off the mark?


Now of course this kind of process is flawed. An AI can only read the words it sees; it doesn't have a broader picture of an individual or organisation, and it has been programmed to respond in given ways to given words or phrases.

However it does highlight an important point about communicating online, and in every medium, in the way you wish to be seen.

Does your online persona represent how you wish to be seen? Is it consistent across platforms? Is it appropriate to your organisation's goals?

It's worth using tools like this to check how your organisation is communicating and identify if there's attributes you are portraying which are contrary to how you wish to be seen.

If your online persona isn't aligned closely with your goals it can create issues in how people see you and how they engage with you - leading to greater negativity in interactions and diminishing trust and respect.

So think carefully about every post, tweet and status update - do they represent and reinforce your organisational values, or do they damage your image in the public eye?


Tuesday, July 21, 2015

How does government manage the consequences of an imbalance in speed of transparency & speed of accountability?

One of the emerging challenges for governments in the online age is managing the discrepancy between the speed of transparency and the speed of accountability.

With digitalisation and the internet, the speed at which government information is made public is becoming faster, with it being easier to collect, aggregate and publish information and data in near or even real-time.

We see this particularly in public transit data, where many cities around the world now publish real-time data on the location and load of their buses, trains and trams, and in the health industry where a number of states have begun offering near real-time data on the congestion in emergency waiting rooms.

We're also seeing similar near real-time reporting on river levels, dams, traffic congestion and closures, and estimated real-time reports on everything from population to national debt levels.

This trend is expanding, with the Sense-T network in Tasmania pioneering an economy-wide sensor network and data resource. Similarly the Department of Finance in Canberra is working on a system to provide real-time budget information on government expenditure down to every $500 for internal management and public transparency purposes.

This trend is a leap forward in government transparency, providing citizens, bureaucrats and politicians with far greater visibility on how our governance systems are performing and far more capability to identify trends or patterns quickly.

We're seeing a similar transparency event at the moment, with the expenses scandal engulfing the Speaker of the House of Representatives, Bronwyn Bishop, related to her use of a helicopter and several charter flights to attend political fund-raising events.

What this event has also highlighted is that while Australia's governance systems are increasing the speed of transparency, our capability to apply that information to accountable decision-making isn't consistently accelerating at the same rate.

In other words, while we increasingly can obtain the information needed for rapid decision-making, the entrenched processes and methods for decision-making in government are lagging far behind.

We see this in the failure rate of IT projects, which can drag on for years after it's clear they will fail; in laws that fail to work as they should yet take months or years to amend; and in cases where the public has judged a politician's actions but parliament can take no formal action for months because it is out of session.

Of course many sound reasons can and are given by bureaucrats and politicians as to why decisions need to take lots of time.

Decision-makers from the pre-internet world will say that they need to ensure they have all the necessary data, have digested it, reflected on it, considered alternatives and consequences, consulted widely and only then are able to tweak or change a decision.

This is a fair position with many defensible qualities - it reflects the world in which these people grew up, when decision-making could be undertaken leisurely while the world waited.

However both management theory and the behaviour of our communities have changed.

Start-ups grow and become huge companies based on their ability to make decisions rapidly. They are continuously experimenting and testing new approaches to 'tweak' their businesses for greater success. This is underpinned by streams of real-time data which show the consequences of each experimental change, allowing the organisations to adjust their approach in very short time-frames, minimising their potential losses from sub-optimal decisions.

The community equally reacts very quickly to evidence of poor decisions and bad outcomes, with the internet, particularly social media, fueling this trend.

While this doesn't mean the community is consistently in the right on these matters, it does require decision-makers to respond and address concerns far more rapidly than they've had to in the past - 'holding the line' or 'depriving an issue of oxygen' are no longer effective strategies for delaying decision-making into the leisurely timeframes that older decision-makers grew up with.

This issue in the disparate speed of transparency (data release) and accountability (clear and unequivocal response) is growing as more organisations release more data and more of the public is collecting, collating and releasing data from their interactions with organisations.

The imbalance is fast becoming a critical challenge for governments to manage and could lead to some very ugly consequences if politicians and agencies don't rethink their roles and update their approaches.

Of course governments could attempt to sit back and 'tough it out', trying to hold their line against the increasing speed of transparency and accountability. In my view this would result in the worst possible result in the long-term, with increasingly frustrated citizens resorting to more and more active means to have government take accountability for their decisions in the timeframes that citizens regard as appropriate.

My hope is that government can reinvent itself, drawing on both internal and external capabilities and expertise to find a path that matches fast transparency with appropriately fast accountability.

I'd like to see governments challenge themselves to test all of their historic assumptions and approaches - reconsidering how they develop policy, how they consult, how they legislate and how they engage and inform the community, in order to address a world where 'outsiders' (non-public servants) are identifying issues and worrying trends at an accelerated rate.

Perhaps we need radical new ways to develop and enforce laws that provide scope for experimentation within legislation, allowing agencies to reinterpret the letter of a law in order to fulfil its desired outcomes and spirit.

Perhaps we need continuous online consulting processes, supported by traditional face-to-face and phone/mail surveys, which allow government to monitor and understand sentiment throughout policy development and implementation and allow a 'board' of citizens to oversee and adjust programs to maximise their effectiveness over time.

Perhaps we need mechanisms for citizens to put forward policies and legislation for parliament to consider, tools that allow citizens to recall politicians for re-election, or a citizen-led approach to determining what entitlements are legitimate for politicians and what they should be paid - with penalties and appropriate recourse for citizens to sack representatives who fail to uphold the values the community expects, at a far greater speed than the current election cycle allows.

There's sure to be many other ideas and mechanisms which may help deliver a stable and sustainable democratic state in the digital age of high-speed transparency and accountability - we just need governments to start experimenting - with citizens, not on them - to discover which work best.


Wednesday, July 08, 2015

No, it's not appropriate to load test on your citizens in production - particularly when it's a critical service

The last week has seen a range of major issues for the Australian Government's new MyTax service.

As reported across both traditional and social media, people using MyTax to file their tax returns have experienced shut-outs, had the process freeze when they were almost complete and had it fail to autofill their pre-saved details.

MyTax is an online version of the eTax software which had been the primary way for people to digitally complete their tax returns for the last fifteen years. eTax improved year on year and had enormous take-up. In all respects it was a major success for the ATO.

This is the first year the Australian Tax Office has deployed the MyTax system and integrated it with MyGov. While the intention was, and is, good - to give Australians a single way to validate themselves with multiple government agencies - the implementation in this case hasn't withstood the real world.

This isn't a unique experience and it isn't limited to government. We've seen it with certain banking services, with retailers (particularly on a certain contrived Australian online shopping day each year), with A-grade games (such as SimCity) and with a range of other online services such as Apple maps.

In fact this type of issue is relatively rare in government compared to the private sector, with the last major international example being healthcare.gov in the US, and the last I recall in Australia being the MySchools site launch.

This type of issue will happen from time to time. Unforeseen bugs or network issues, denial of service attacks or other environmental issues can bring down even the most robust service, particularly at launch.

In every one of these cases there's a backlash from customers - and in every one of these cases the organisation responsible is judged based on how they manage and recover from the disaster.

In the MyTax case, while the ATO were probably aware of the risks, and may even have learnt some lessons from several of the issues highlighted above, it appears they're still struggling to manage and recover from the situation.

When asked about the situation the CIO of the ATO, Jane King, wrote, as reported in the Sydney Morning Herald, that "Capacity planning and testing was completed as part of the rolling out of the new digital design, however due to the complexity of our environment, production is always the real test."

I read this as her saying that while they did conduct testing, they were actually relying on real citizens, at real tax time, to fully evaluate how the MyTax system would perform.

Just as the UTS professor John Leaney, quoted in the SMH article above, says - this type of statement just isn't good enough.

"We're not in the 1950s; we're not even in 1990s, we've learnt a lot and from what we've learnt we apply the techniques for proper capacity modelling," Leaney said. "There should have been much better testing; it's not something you should learn the hard way on a major government system."

The ATO needs to do better at risk planning around situations like this. It needs to test capability properly and not hide behind the 'too many users' defence.

Government agencies need to carefully watch and learn from this experience - and learn the right lessons.

The first lesson is to conduct appropriate capacity testing. Look at the ABS's implementation of eCensus and the level of testing and resilience it put in place the first time eCensus was used in 2006. The ABS gave a great presentation on the topic, which I attended, which highlighted the risk mitigation steps they'd taken - from capacity testing through to multiple redundant systems and real-time monitoring with developers on standby and fallback manual systems in place.
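As a trivial illustration of that first lesson, here's a minimal Python sketch of a capacity smoke test: hit a staging endpoint with concurrent requests and report failures and latency. The URL is a placeholder, and a real programme would use purpose-built load-testing tools and realistic traffic profiles - the point is simply that this testing happens before launch, not on citizens in production.

```python
# A minimal capacity smoke test: fire concurrent requests at a staging endpoint and
# report the error rate and average latency. The URL is a placeholder, and real capacity
# testing would use dedicated tooling and realistic traffic models.
import time
from concurrent.futures import ThreadPoolExecutor

import requests

STAGING_URL = "https://staging.example.gov.au/mytax/health"  # placeholder endpoint

def one_request(_):
    start = time.perf_counter()
    try:
        ok = requests.get(STAGING_URL, timeout=10).status_code == 200
    except requests.RequestException:
        ok = False
    return ok, time.perf_counter() - start

def load_test(concurrency=50, total=500):
    with ThreadPoolExecutor(max_workers=concurrency) as pool:
        results = list(pool.map(one_request, range(total)))
    failures = sum(1 for ok, _ in results if not ok)
    avg_latency = sum(t for _, t in results) / len(results)
    print(f"{failures}/{total} failures, average latency {avg_latency:.2f}s "
          f"at concurrency {concurrency}")

if __name__ == "__main__":
    load_test()
```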

The second lesson is to not release major systems at a time when they are going to come under a huge load. Release a new tax system in February or March, or after tax time in October, giving time to shakeout the production system and address issues before it hits peak load.

The third lesson is to avoid releasing major systems. Instead release smaller, but useful, services and progressively integrate them into a major new service, testing each carefully as they go. This is how Facebook totally replaced its back-end without any disruption to people's use of the service - modularly upgrading aspects of the service until it was completely done.

The final lesson is to plan your recovery before your system fails. Design a failover plan for what happens if the system doesn't work for people, a manual solution if required. The ATO should direct anyone with issues to a hotline where they can complete their tax return over the phone, or via screen sharing, so no-one is left waiting for days or in a position of financial distress due to not receiving a tax return fast.

I feel for the ATO (particularly their ICT team) and don't blame them for the issues they're having with MyTax, however I do hold the agency responsible for how the ATO recovers from this disaster.

They need to stop defending their implementation of MyTax and focus on ways to meet citizen needs - even outside the MyTax system - to ensure that the 'tax returns get through'.

Otherwise this issue could turn into another Apple maps-style disaster, or even worse, as there's no 'competitor' to the ATO that citizens can turn to to complete their tax returns. At least, not yet...


Wednesday, October 08, 2014

How current events play out in search requests - terrorism & related terms in Google trends

While agencies often invest significant money into tools for tracking trends on social media, one of the simplest ways to detect and monitor the rise and fall of key topics and issues online is through Google Trends.

Google Trends tracks the frequency of use of specific search words in Google searches. This represents the majority of online and mobile searches in countries like Australia (93%) and the US (68%).

As a free service, Google Trends has been used over the years to monitor trends in seasonal diseases, such as influenza and dengue fever, to track the relative level of attention paid to politicians, the number of mentions of sports during grand final seasons, and to understand the impact of advertising on product sales.

I used the service back in 2006-2007 to help track a government agency's rebranding program, and have used it subsequently, both with and outside government, to track the level of interest in particular issues and topics.

So today I decided to see what Google Trends can tell us about the level of interest or concern in terrorism, specifically related to ISIS and concerns about muslim extremists.

I chose five main words to track - 'Terrorist', 'ISIS', 'Islam', 'Muslim' and 'Burqa' - which told an interesting story.
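(For anyone who wants to reproduce this kind of comparison programmatically, the sketch below uses pytrends, an unofficial Python client for Google Trends. It post-dates this post and, being unofficial, its interface can change - treat it as illustrative rather than definitive.)

```python
# A sketch of pulling the same five terms for Australia via pytrends, an unofficial
# Google Trends client. Interface details may change; this is illustrative only.
from pytrends.request import TrendReq

terms = ["Terrorist", "ISIS", "Islam", "Muslim", "Burqa"]

pytrends = TrendReq(hl="en-AU")
pytrends.build_payload(kw_list=terms, geo="AU", timeframe="2004-01-01 2014-10-01")
interest = pytrends.interest_over_time()  # pandas DataFrame with one column per term
print(interest.tail())
```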



Until May 2010 the burqa does not appear to be a particular concern for Australians, with few searches of the term.

However since then it has become more topical, with some interest throughout 2011, then a sudden surge in September 2014 when the 'ban the burqa' movement began to receive significant political support and media coverage.

In contrast, terrorist was a term of interest to Australians in 2004 and particularly in the second half of 2005, with surging interest in July and November of that year. Following this, it settled down into a largely quiescent state, with only a small surge in November 2008 interrupting the mostly flat line.

This changed in August 2014, with a huge rise in searches for the term across Australia resulting in the highest level of searches for the term in the history of Google Trends in September this year.

The same trend can be seen for mentions of ISIS, which were flat until May 2014 and have rapidly escalated since. Early mentions of the term presumably relate to other uses of the term (such as the Egyptian god), with the sudden rise in searches only attributable to the rise of the Islamic State.

Searches for Islam and Muslim have also been rising this year after a long largely flat period. While these terms are the subject of many legitimate searches related to the culture and religion, the recent rise in searches does tend to suggest and correlate with the rise in searches for terror-related terms, indicating that people have linked the terms in some way, at least out of curiosity.

It's possible to compare and contrast these trends with global trends in Google Trends, per the chart below.



This chart provides evidence of growing global interest in terms such as Islam, Muslim and, particularly, ISIS. However it shows little international concern over the burqa or regarding terrorism.

This can be seen in detail when looking at individual countries.

For example, while similar trends of increased interest in searching the term ISIS are visible for the US, UK, Canada, Sweden, Japan, Thailand and many others, only a relative few see the burqa as a rising source of concern and many also are not experiencing heightened searches for terms such as Islam or Muslim.

This may be coincidental, or may reflect political statements and media reports on these topics - a more detailed review of coverage would be needed to confirm direct links.

However given that researchers have found that Google Trends can provide an accurate view of community concerns regarding infectious diseases and product trends, I believe there's sound reasons to suppose a correlation between what leaders say and what people search for.


Monday, September 01, 2014

92% of Australia's federal politicians now use Facebook and/or Twitter

I've been tracking the number of Australian Federal politicians using Australia's leading social channels for two years now, seeing the number using at least one of Facebook and Twitter grow from 79% in April 2012 to 90% in November 2013 to a current level of 92%.

What's even more interesting is in the details, which you'll find in my thoughts below.

 To access the raw data and statistics, go to my latest Google Doc at: docs.google.com/spreadsheet/ccc?key=0Ap1exl80wB8OdFhaQ1gzdzg3d1VWcFJpSl91bkQwbWc&usp=sharing

While the overall use of social media is at 92.48%, or 209 out of 226 federal politicians (150 Members of the House of Representatives and 76 Senators), the situation is very different between the houses.

The House of Representatives has a far greater level of social media use at 95.33% compared to the Senate at 86.84%. The gap has remained consistent with my last review in November 2013, where it was 92.67% compared to 85.14%. Senators, on average, are slightly older than Representatives (51.55yrs versus 50.19yrs), which is likely a factor, as is the consideration that Senators don't campaign in the same way as Representatives due to the difference in how they are elected and who they represent (states/territories, not electorates).

I still believe that Senators are missing a trick here - due to their different responsibilities in most cases they represent much larger electorates and thus social media can be of value for listening to and engaging with people they can't as easily drive out to see face-to-face.

This rationale also carries over to Representatives with geographically large electorates, such as in the NT, WA, SA and Qld.

I am particularly glad to see that the Nationals, who focus on rural and regional seats and were previously the party whose elected politicians were least likely to engage online, have picked up their act in this area, with 19 of their 20 elected members now using social media - a greater proportion than either Labor or the Liberals (and I count Qld LNP Nationals as Nationals federally).

Social media remains very important for smaller parties to get their message out, with the Greens maintaining their 100% use of social media and every other minor party (KAP, PUP, DLP, Family First) and independent politician using social to some extent.

I see the same trends we've seen in previous years from politicians, and the population in general, still hold true.

Similar to my past reviews, female politicians are more likely to use social channels than male politicians, though by a smaller margin (4% rather than 7%) than previously.


Equally, the older a politician is, the less likely they are to engage on social media, with a clear divide by decade.


This reflects the same phenomenon as we see in the community - with politicians aged 60+ far less likely to use online channels for engagement, and those aged under 45 likely to use it as one of their primary ways to communicate and collect information.

This tends to suggest that the maturity of political decision-making regarding the internet is likely to continue to improve as older politicians retire and younger digital natives take their place.

For your reviewing pleasure, below is an infographic with some of the key statistics, please have a play with the interactive elements - they provide an interesting view of how actively different groups of politicians engage via Twitter and Facebook.





Tuesday, July 08, 2014

The importance for government of respecting open source and open data copyrights

An interesting situation has arisen in Italy, with the country's Agenzia delle Entrate, the Italian revenue service and taxation authority, accused of copying OpenStreetMap without respecting the site's copyright license.

As documented on the Open Street Maps discussion list, Italy's OpenStreetMap community discovered a little over three months ago that the maps used by the Agenzia delle Entrate in the website of the Italian Observatory of the Estate Market (housing market site) closely resembled those from OpenStreetMap.

In fact, they were able to establish that the Agenzia delle Entrate had copied data from OpenStreetMaps, then superimposed other data on top.

Now given OpenStreetMaps is an open source project, crowdsourcing the streetmaps of the world, that shouldn't normally be a problem.

OpenStreetMaps' data is freely available to copy and reuse - that's the entire point of it.

However there was one factor that the Agenzia delle Entrate had ignored. That the copyright license to freely reuse OpenStreetMap data came with one condition - to credit the source.

Using a Creative Commons by Attribution license, which is also the default copyright for Australian Government information, OpenStreetMaps required only one thing of organisations and individuals reusing their data - to provide an attribution back to the source.

This the Agenzia delle Entrate had failed to do.

OK - this isn't a big issue, and the folk in Italy's OpenStreetMap community weren't that worried to start with. They simply emailed the agency to ask it to correct this omission.

No reply.

Three months later - with no formal response from the agency, and no rectification of the copyright on the site, the OpenStreetMap folk stepped up their criticism.

They created a website where Italians and others can view and compare OpenStreetMap with the Agenzia delle Entrate's site to see how the Italian government agency has violated copyright for themselves.

You can view the website here: http://agenziauscite.openstreetmap.it/

It's in Italian (naturally), so if you don't read the language an online translation tool can help, but isn't required to compare the maps.

I suggest that visitors use the search tool in the left-hand map to find 'Milan', which is the city recommended for comparison purposes. Note that the agency took its copy of OpenStreetMap a few months ago, so is not as up-to-date as OpenStreetMap itself.


The situation has grown from a simple omission into an active campaign, not only because the government agency ignored the community concerned, but also because that community now feels that if the government is prepared to ignore copyright requirements so blatantly, how is any other copyright in Italy safe?

Essentially, if a government agency won't do the right thing when reusing intellectual property, why should businesses or individuals trust them - or do the right thing themselves?

It's something that every government agency should ponder.



Monday, May 26, 2014

My Speech at the Sir Rupert Hamer Record Management Awards

On 22nd May 2014 the Public Records Office of Victoria hosted the 16th annual Sir Rupert Hamer Record Management Awards.

While Records Management is not often highly regarded by people outside the field, it plays a vital role for organisations in retaining a history of their activities and interactions and, when their actions and decisions have public impact, on the history of a state or nation.

I was honoured to be invited to be the keynote speaker and, despite having my iPad stolen at Melbourne Airport, forcing me to fall back on less well constructed notes, gave a speech about the challenges of records management in the digital age.

Unfortunately as my notes had partially been lost I don't have a full record of my speech, but what I do have is included below.

Ladies, gentlemen and distinguished guests,
 It is a great honour to have the opportunity to speak to you tonight on a topic I have become very passionate about – the importance of public records. I wanted to start by share my earliest workplace experience with record keeping. It was in my first job after university, working for a management consultancy as an analyst on Sydney’s north shore in 1992.
 Computers were just coming into offices and my new employer had paid roughly six thousand dollars on a brand-new Apple 2ci for me, with the very latest in word processing and spreadsheet programs – AppleWorks – which many of you have probably never heard of. My first week was spent climbing on desks to network my new computer to the only other computer in the office, operated by the Office Manager, using Appletalk cables. The Office Manager’s computer was probably the most valuable electronic device in the office. She used it to transcribe all of the work by the consultants into formal reports and documents. Every week she diligently backed up her computer to a tape drive. Each month the last four weeks tapes were driven to a bank a few suburbs away and stored in a safety deposit box.
 The IP in that box was the value of the company. One day, a few months after I arrived, I decided to test our backups to ensure that we could retrieve their contents. The Office Manager and I brought back several tapes from the bank and loaded each in turn into the tape recorder. For each we tried to restore all the files that had been stored – a process that took around half an hour. And in each case to her mounting horror, we found the tape was blank. It turned out that none of the tapes she’d been diligently recording for several years had stored any information, because the consultant who had set up the system had made a mistake in the settings, and no-one had ever tested the tapes before. The real value and importance of public records really didn’t strike home for me until three years ago at a conference in Perth where a representative of the WA State Records Office told us a story of how public records saved a man’s life. The story goes something like this – the man’s parents migrated to Australia in the 1940s, bringing with them the little that remained of their lives in Europe. Their names were changed on arrival and they settled in a rural region where they could continue farming as their family had for centuries. They had a son, who was born at home and baptized at the local church – which burnt down some years later, taking all its records with it. The son was never issued with a birth certificate, and as he never aspired to university or to travel, neither he nor his parents ever applied for a passport or other official papers. His parents never bothered to formally become Australian citizens and when setting up bank accounts, mortgages and businesses in the 1970s he was never required to provide a birth certificate or other official documents. Moving ahead to the 21st century, the man’s parents had died and he was still living on their farm. He found himself in financial straits and applied to the government for support for the first time in his life. This brought him to the attention of officials for the first time and, when it became clear he had no birth certificate, no passport, no living relatives and no proof that he was Australian, the government set about the process of having him deported to the country from which his parents came. He didn’t speak the language, had no living family there – even the country had disappeared following the fall of the Soviet Union.
He tried every avenue of appeal, and when they were all exhausted he went to the WA State Records Office to see if they had any evidence that he had been born in Australia and had lived there his entire life. The State Records Office went back through its archives and managed to locate the records of the small rural school he had attended as a child, providing physical documentation of his enrolment at age five. On the basis of that information, the Australian Government ended the deportation action and granted him citizenship.
He was able to access the benefits he was entitled to and to remain living on his farm. This story made me realise how important public records can be. They don't only capture a record of who we were and why decisions were made – they can also have real impacts on people's lives. As record managers, you're not just preserving a historic record of what happened, but building a living, breathing memory of Australian lives. And I commend you all for this. Record keeping may sometimes be undervalued, but it is never unimportant.

When I entered government I learnt of the record keeping principles that underpin many of the activities of agencies. Most of my colleagues were diligent record keepers – setting up files for every project and sending folders of printed documents to the warehouses where they were stored. It was harder in my work, which was largely online, as there wasn't always clear guidance on how online communications should be stored as records. At one stage we were instructed to print every page of our websites, and reprint pages every time they changed. Over the three months this was the agency's policy we printed over 20,000 pages – and even then I think we missed many of the changes.

I also recall the early concern and confusion over social media record keeping in the late 2000s. Should every tweet, post and update sent or received by an agency be captured and stored, or only those related to decisions? What tools were available to capture social media messages, and did the government even have the legal right to take copies of updates submitted by other people? I still encounter some concerns and lack of clarity over how to manage digital conversations – and fair enough.
For the last hundred years record managers have dealt largely in one record format – paper. Governments could easily legislate what was and wasn't a record. Paper could be captured, stored and controlled; it could be readily indexed, sought and found. Paper records could be preserved for hundreds of years and – barring long-term changes to language – could still be read by future generations.

However the world of paper records is now disappearing. Ever since the first Australian government websites went live in 1996 we've seen a gradual move from physical to digital records. Many documents no longer ever exist as paper records, unless there's a conscious choice to print them. Decisions are requested, discussed and resolved via email. Policy documents go through dozens of iterations before anyone thinks to print them. Citizen enquiries arrive via social channels and are resolved in the same way.

Suddenly, rather than a single format, record managers have had to contend with hundreds of formats, which can appear and disappear over a short time. From WordStar and WordPress to tweets and Facebook posts, Pinterest pins and Disqus comments – record keeping has fragmented. Each of these formats can individually be captured, as can their context – the platform and conversation thread of which each is a part. However, preserving many of them for later access is becoming a challenge.
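To make the idea of capturing a message together with its context concrete, here is a rough sketch only – the field names and structure are illustrative assumptions, not any agency's actual capture tool or schema – of how a single social media message might be stored as a structured record:

# A rough, illustrative sketch only: field names and structure are assumptions,
# not any agency's actual schema. It packages a message's content together with
# the context needed to interpret it later, plus a checksum for later verification.
import hashlib
import json
from datetime import datetime, timezone

def capture_record(platform, author, text, thread_id, in_reply_to=None):
    """Package a single social media message with its surrounding context."""
    record = {
        "content": text,
        "context": {
            "platform": platform,        # e.g. "twitter" or "facebook"
            "author": author,            # the account that sent the message
            "thread_id": thread_id,      # ties the message to its conversation
            "in_reply_to": in_reply_to,  # preserves the conversational chain, if any
            "captured_at": datetime.now(timezone.utc).isoformat(),
        },
        # A fixity checksum lets future custodians confirm the text is unaltered.
        "sha256": hashlib.sha256(text.encode("utf-8")).hexdigest(),
    }
    return json.dumps(record, indent=2)

# Example usage with made-up details:
print(capture_record("twitter", "@example_agency", "Our office reopens Monday.", thread_id="12345"))

The particular fields matter less than the principle: the platform, the conversation thread and the time of capture are preserved alongside the words themselves, so a future reader gets the context as well as the content.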
Right now it is hard to find working versions of many old word processing programs. In the future it is likely to be hard to find tools that can reproduce government records in a contextual form from messages on many of today's social media platforms. Beyond this moving feast of formats, we've seen a huge increase and fragmentation in the types of records that governments and the public are generating.
Alongside the white papers and reports, memos and Ministerial correspondence that governments continue to create, information is increasingly conveyed in shorter, faster and more frequent chunks through emails, tweets and SMS. I can see a future where rebuilding decision-making processes, or responding to Freedom of Information requests, increasingly involves the skills of a jigsaw master. Historians of the future will have an advantage in that so much information is captured and stored; however, the 'bones' of the past will increasingly need to be pieced together from powdered dust – thousands or millions of small pieces of information.

The other main challenge for record keepers into the future is the risk of a digital black hole. Other societies have already found that as information is digitised, more of it is kept only in a transitory way, or is stored in ways that are difficult to retrieve. When I worked in government, as soon as I left an agency my email address was deleted and all the emails lost – as were my folders and files on the computers I had been assigned. Yes, much of this was supposed to be backed up – however it required IT skills and time to restore, a cost impost that agencies could not bear in a wholesale way. Nominally these records were kept but, to be truthful, they could never be easily accessed – a point I'll come back to with a small practical sketch in a moment.

This digital black hole is probably the biggest challenge for record management today. While so many records are kept, they are kept in very different ways on different platforms and can be hard to translate into retainable formats while preserving the context and conversations. Records management professionals have to understand how best to preserve each type of record – not simply on paper or even in digital files, but in formats that will speak to future generations, providing not only the words but the meaning, the context and the broader environment. They need to do this amid an explosion of information and data, while file formats are constantly evolving, and within a world of increasing scrutiny.

This is an amazingly large challenge, and an important one for the history of the state, of Australia and of humanity. Fortunately, record managers in Victoria have the experience and expertise to take it on and succeed.
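On that point about records that are nominally kept but never easily retrieved: one practical defence – and the lesson I learnt the hard way with those blank backup tapes – is to routinely verify that what's in the archive can still be read and hasn't silently changed. The sketch below is purely illustrative (the manifest format, paths and file layout are assumptions, not any particular archive system), but it shows the kind of periodic fixity check that would have caught my consultancy's blank tapes years earlier:

# A rough, illustrative sketch only: the manifest format and paths are assumptions,
# not any particular archive system. It recomputes checksums for archived files and
# compares them against a previously stored manifest of expected values.
import hashlib
import json
from pathlib import Path

def sha256_of(path):
    """Compute the SHA-256 checksum of a file, reading it in chunks."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(65536), b""):
            digest.update(chunk)
    return digest.hexdigest()

def verify_archive(archive_dir, manifest_file):
    """Report any archived file that is missing or no longer matches its recorded checksum."""
    manifest = json.loads(Path(manifest_file).read_text())  # {"relative/path": "expected sha256", ...}
    problems = []
    for relative_path, expected in manifest.items():
        file_path = Path(archive_dir) / relative_path
        if not file_path.exists():
            problems.append(f"MISSING: {relative_path}")
        elif sha256_of(file_path) != expected:
            problems.append(f"ALTERED: {relative_path}")
    return problems

# Example usage (assumes an 'archive' folder and a manifest.json describing its contents):
# print(verify_archive("archive", "manifest.json"))

Run regularly, a check like this is what turns records that are nominally kept into records you know you can still retrieve.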

Read full post...
