Showing posts with label emetrics. Show all posts

Wednesday, August 27, 2014

Algorithm detected Ebola outbreak nine days before health authorities using internet posts

There's a great article over at TechRepublic by Lyndsey Gilpin on how the computer algorithm behind HealthMap detected the recent Ebola outbreak nine days before it was identified by health authorities.

In How an algorithm detected the Ebola outbreak a week early, and what it could do next, Gilpin describes how by tracking, collating and corroborating information published in online news sources and social media, the algorithm was able to identify the 'mystery hemorrhagic fever' over a week before official health agencies.

However the significance of the outbreak was not realised by HealthMap's founders until after health authorities became involved.

This type of use of internet 'chatter' and algorithms to make sense of the world offers enormous potential for organisations to better identify and understand underlying trends.

For government this means the ability to identify outbreaks of human, animal and crop diseases earlier, detect early indications of potential crises, and track trends in population views and behaviours.

In all these cases it gives government the opportunity - using only public sources of information - to react sooner and more appropriately, containing problems and getting ahead of issues.

Equally this capability can be used by commercial entities for marketing and product development, by financial organisations for faster and better informed investment decisions and by activists, lobbyists, foreign interests and terrorists to identify weak points for destabilising a nation or gaining advantage.

It remains early days in this area - not as early as when Google first released its flu map for Australia back in 2009 - but early enough that few organisations are, as yet, investing in it (giving early movers a huge advantage over rivals).

However with HealthMap's algorithms now successful at screening out over 90% of unrelated information, the value of using this type of approach in policy and service delivery has now reached the point of commercial viability, which should only accelerate investment and research into the area in coming years.
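HealthMap's actual algorithms aren't public, but the screening step - discarding the roughly 90% of collected items that aren't relevant - can be illustrated with a naive keyword filter. Everything in this sketch (the term list, the threshold, the sample items) is invented for illustration:

```python
# Naive illustration of screening news items for disease-related 'chatter'.
# The keyword list and threshold are invented for this sketch - HealthMap's
# real classifiers are far more sophisticated (and not public).
DISEASE_TERMS = {"fever", "outbreak", "hemorrhagic", "epidemic", "virus", "illness"}

def is_relevant(text: str, threshold: int = 2) -> bool:
    """Keep an item only if it mentions enough disease-related terms."""
    words = set(text.lower().split())
    return len(words & DISEASE_TERMS) >= threshold

items = [
    "Mystery hemorrhagic fever kills eight in southern Guinea",
    "Local football team wins regional championship",
    "Health ministry probes outbreak of unidentified illness",
]
flagged = [item for item in items if is_relevant(item)]
```

Real systems replace the keyword set with trained classifiers and add source corroboration, but the filtering principle is the same.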

If Australian governments aren't yet mining the public internet for intelligence to help improve decision-making, hopefully it won't be long until they do - at least to contend against others who might use this intelligence for less than positive purposes.


Wednesday, June 12, 2013

Sentiment analysis: where 'disabled' and 'disability' are often considered negative terms

It's come to my attention that a number of automated sentiment analysis tools include 'disabled' and 'disability' as negative terms.

This means that when calculating whether a particular statement in social media is positive or negative, these sentiment analysis tools treat the presence of these words as an indication that the statement is negative towards its subject - such as a topic, issue, individual or organisation.

I've checked a number of sentiment dictionaries online and found that both 'disabled' and 'disability' appear frequently as negative terms. However I have not yet been able to confirm whether any sentiment analysis products treat these words in this manner.

This disturbs me, given the efforts of governments and civic organisations in Australia and many other countries to remove negative stigma attached to the word 'disabled', even given its potential application in statements such as 'their system has been disabled'.
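To see how this happens mechanically, here's a minimal sketch of dictionary-based sentiment scoring - the tiny lexicon is invented, but most automated tools work on this same principle of summing per-word scores:

```python
# Minimal dictionary-based sentiment scoring. The lexicon is invented for
# illustration, but many tools work this way: sum per-word scores.
LEXICON = {"great": 1, "good": 1, "broken": -1, "disabled": -1, "disability": -1}

def score(text: str) -> int:
    """Sum the lexicon scores of each word; a positive total = positive sentiment."""
    return sum(LEXICON.get(word.strip(".,!?"), 0) for word in text.lower().split())

# A neutral-to-positive statement about services for disabled people still
# scores negative, purely because 'disabled' is listed as a negative term.
statement = "New portal makes services accessible for disabled Australians"
```

A human reader would call that statement positive; the scorer calls it negative.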

It also concerns me that agencies engaging online about disabilities or with disabled people might accept that the sentiment reported by their social media monitoring tools indicates negativity where in actuality no negativity exists.

I would caution government agencies using automated sentiment analysis tools to get to know how they work, and to check how terms such as 'disability' and 'disabled' are treated in these systems.

I'd welcome comments from makers of sentiment analysis tools to confirm how they treat these words, or from agencies using automated sentiment tracking if they've seen these words, or others, rated negatively or positively in ways which might be misleading and misrepresent the actual sentiment.


Monday, May 27, 2013

Australian academia beginning to learn to crawl in 2.0 social channels

I've long lamented the slow pace at which academia has embraced the internet, social channels and 2.0 approaches - with limited courses available on modern online techniques for undergraduates and postgraduates, and old-fashioned approaches to research and publication.

There have been hints of brilliance overseas - with US universities placing courses online and UK universities embracing social in a major way - however Australia has largely remained a backwater for higher education in a 2.0 world, with individual exceptions at specific universities, such as Dr Axel Bruns and Julie Posetti.

To demonstrate some of the impact of this Australian academic drought, a few months ago I was approached by a European professor about identifying an Australian academic working in the Gov 2.0 field to write a chapter in an upcoming book on Government 2.0. 

This professor, who I had previously worked with on a major report on global Gov 2.0 for the European Parliament (unfortunately not publicly available), had failed to identify anyone in Australia working in the Gov 2.0 space through her academic channels.

I made initial enquiries through a number of my Gov 2.0 contacts in government, as well as with a range of academics and universities, however I was unsuccessful in finding anyone through their systems. In the end I was very lucky to encounter an academic in South Australia with relevant expertise at an event I was speaking at in Adelaide. This academic is now working on the book project and I'm very interested to see how it turns out.

We have seen some recent stirring towards greater acknowledgement of 2.0 approaches in the ARC (Australian Research Council) moves towards open access publishing of publicly-funded research, however this is still a very small step.

We have also seen some good debates on the role of the public in science, and some pilots such as the Peer-to-Patent, which strike at the commercial end of the spectrum, and the Atlas of Living Australia, which involves citizens in mapping Australia's biodiversity.

We're also now seeing some steps to move beyond the traditional peer review process to consider new ways of measuring the reach and impact of academic research, with the 'altmetric' movement gaining steam.

What are altmetrics? I admit I hadn't heard about them until recently, and when I first encountered the term I found the name little more than marketing buzz.

Essentially the term describes the use of online social metrics to assist in measuring academic success - mentions on Facebook and Twitter, the level of reuse of raw research datasets via APIs, 'semantic publication' of specific passages and references to academic articles in blogs and forums, and more.
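Mechanically, a composite altmetric is little more than a weighted tally of these signals. The channels and weights below are purely my own invention to show the shape of the calculation - each vendor picks its own:

```python
# A composite 'altmetric' as a weighted sum of social signals.
# These channel weights are invented for illustration only.
WEIGHTS = {
    "tweets": 0.5,
    "facebook_shares": 0.25,
    "blog_mentions": 2.0,
    "dataset_api_calls": 0.125,
}

def altmetric_score(signals: dict) -> float:
    """Weighted sum of per-channel mention counts."""
    return sum(WEIGHTS.get(channel, 0.0) * count for channel, count in signals.items())

paper_signals = {"tweets": 40, "facebook_shares": 12, "blog_mentions": 3, "dataset_api_calls": 150}
# 40*0.5 + 12*0.25 + 3*2.0 + 150*0.125 = 20 + 3 + 6 + 18.75 = 47.75
```

The arbitrariness of the weights is exactly why the resulting number is hard to compare across vendors.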

The term altmetrics was developed by the founders of one of the first companies that is spruiking altmetrics solutions to academics, and the biggest supporters of the term are other companies seeking to profit from the same rush to web statistics. Therefore I am still inclined to regard the term itself as marketing buzz for the types of social metrics commercial and public sector organisations have been using for years (see the chart below on the growth of use of the term in Google searches).

However it does signify an important and major change in how academic research is measured and valued.

If academics begin measuring their success by how widely their work is discussed and commented on in the public sphere, they will likewise begin talking more about their research publicly in order to grow their buzz and their recognised academic prowess.

This will encourage academics to get out from their lecture theatres into the community, become more proficient at communicating their thoughts and work to a broader lay audience, and make research more accessible, interesting and influential in public debates and policy work.

I also hope more publicly available research will lead to more people interested in pursuing these careers, greater commercialisation of research work, improved scrutiny of findings and better social outcomes.

However I hope that at some point academics will realise that 'altmetrics' are simply metrics - ones that are already becoming business-as-usual in commercial and public sector spheres - and focus more on involving people in and sharing their research than on the marketing buzz.

For more information on altmetrics, see:


Tuesday, February 05, 2013

Infographic: The top government Twitter accounts in Australia

In January 2013 I found that the total tweets by all government agencies and councils in Australia I track had exceeded one million.

As a reflection of that achievement I've worked through the data I have on the use of Twitter by government agencies and councils in Australia to produce the following infographic (scroll for more).

I'll be producing state by state (including territories and federal), local and topic-based infographics as a follow-up over the next few weeks, with more detailed information.

I'm considering writing an academic paper on the use of Twitter by government in Australia, in case there are any academics out there who would be interested in co-authoring.


Wednesday, January 16, 2013

Infographics: How does Australia compare on government open data released?

I've developed several infographics (below) comparing the open data performance of nations, looking at which have national open data sites, how many sites they have across different government levels and how many datasets have been released through their national sites.

It's not a way to judge 'winners' and 'losers' - or even to compare the relative performance of countries. However it provides useful information on who is doing what and how deeply open government has been embedded in the thinking of agencies. This said...

There are 41 countries listed (by data.gov) as having open data websites, out of almost 200 nations.

In their national open data sites, in total, these nations have released at least 1,068,164 datasets (I was unable to get a count from China, Timor-Leste, Tunisia or Sweden's national open data sites), for an average of 28,869 and a median of only 483 - the gap due to a few high-release countries (US, France, Canada).
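That gap between mean and median is the story: a handful of huge release programs drag the average far above the typical country. The counts below are illustrative (not the full 41-country dataset), but have the same skewed shape:

```python
import statistics

# Illustrative national dataset counts: a few huge release programs plus a
# long tail of small ones - the same skewed shape as the real figures.
counts = [378_529, 353_226, 273_052, 8_957, 2_265, 1_124, 483, 400, 250, 120, 60]

mean = statistics.mean(counts)      # dominated by the top three countries
median = statistics.median(counts)  # reflects the 'typical' country
# The mean lands in the tens of thousands while the median stays near 1,000.
```

This is why reporting both figures, as above, is more honest than reporting the average alone.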

How do Australia and New Zealand rank?
As people will look for this anyway, based on the number of datasets released as of January 2013, New Zealand is 9th (with 2,265 datasets) and Australia 11th (with 1,124 datasets).

Between us is Estonia, with 1,655 datasets.

The top nations above New Zealand are, in order: US (378,529), France (353,226), Canada (273,052), Denmark (23,361), United Kingdom (8,957), Singapore (7,754), South Korea (6,460), Netherlands (5,193).

Infographics

And finally, as a tree map showing the relative size of nations by datasets...

Raw data
The raw data is available in a spreadsheet at: https://docs.google.com/spreadsheet/pub?key=0Ap1exl80wB8OdFNvR3dja3E4UGtVVi1LMU11OFBmR1E&output=html

Caveats
When it comes to nations and states there are few absolute measures - there's simply relative performance, across jurisdictions or across time.

These comparisons are often flawed due to variations in data collection, lack of information or differences in approach; however there can still be value in 'placing' nations and identifying opportunities, challenges, flaws and risks.

My work above is not a measure of the success of open data itself, but provides a relative indicator of which governments have been more successful in embedding open government principles in agencies, and how deeply. It also provides insight into which nations are working in this space.

My data spreadsheet is also a useful 'point in time' reference to track changes over time.

Note that I was unable to count open data released outside of national open data sites - there's a lot more of this, however it can be harder to locate. Due to the sheer number of state-based open data sites (210), I've not yet done a tally of the datasets they've released, only of the 41 national sites. Watch this space :)

The data may not be 100% accurate due to differences in the approach to releasing data. data.gov provided the list of data sites and I drew specific information on datasets and apps from all 41 national open data sites, each with a different design and functionality and across over a dozen languages.

Please let me know of any inaccuracies and I will endeavour to correct them.


Wednesday, October 03, 2012

Government tops the list of effective email marketers

For all the claims of government communication being expensive or ineffective compared to the private sector, government has topped the Vision 6 Email Marketing Metrics Report for January - June 2012.

Vision 6, an email marketing company based in Queensland, has reported on the email marketing effectiveness of Australian companies and agencies for the last five years.

Government has consistently performed well in these reports, well ahead of industries such as IT & Telecommunications, Insurance and Superannuation, Advertising/Media/Entertainment, Retail and Consumer Products, Hospitality and Tourism and other 'traditional' heavy email marketers.

In the January - June 2012 report, Government topped the list of 16 industries both for most email opens (33.64%) and most clickthroughs (8.89%).

Open rates for industries from Vision 6's Email Marketing Metrics Report
Looking across all industries, the average bounce rate for emails was around 5.5%. This varied slightly by size of list, the lowest for lists of 10,000 or more email addresses at 5.01% and the highest for lists of 500-9,999 email addresses at 5.81%, with Government averaging 5.38% across the board.

The lowest bounce rate was received by the Trade and Services industry at 1.98% and the highest by Science and Technology at 11.67%.

All days saw fairly even open and click-through rates, dispelling the myth that people prefer opening emails on Tuesdays. Thursday appeared to be the most popular day for sending emails, despite being average for opens and click-throughs.

Almost two-thirds of emails (64.65%) that were opened were opened within the first 8 hours (30.2% within one hour and another 34.45% between one and eight hours), four in five within 24 hours and 91.66% within 72 hours (three days) of sending.
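Those figures fall straight out of the per-bucket shares. Reconstructing the cumulative curve (the 8-24 hour share is inferred here by reading 'four in five' as 80%):

```python
from itertools import accumulate

# Share of all opens per time-since-send bucket, from the report's figures.
# The 8-24h share is inferred by reading 'four in five within 24 hours' as 80%.
labels = ["<1h", "1-8h", "8-24h", "24-72h"]
shares = [30.2, 34.45, 15.35, 11.66]

cumulative = [round(c, 2) for c in accumulate(shares)]
# -> [30.2, 64.65, 80.0, 91.66]: two-thirds of opens within 8 hours,
#    and almost everything within three days.
```

The steep front of that curve is why the first few hours after sending matter so much.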

Vision 6 says that with increasing use of mobile devices the time before emails are opened is falling - so with only about half of Australians using smartphones and 12% of households owning a tablet (compared to 18% in the US according to Pew Internet), there's plenty of scope for email open timeframes to continue to decrease.

Mobile has become so important already for consumers that Vision 6 also reported that the iPhone mail application has leapt into third spot (at 16.28%) behind Outlook 2003 (at 17.54%) and Apple Webkit (at 16.53%). In fact mobile accounted for 24.33% of all email opens.

To gain more insights on email marketing, and to view all of the reports back to 2006, visit Vision 6 Email Marketing Metrics centre.


Sunday, April 01, 2012

Australian government agencies achieving the highest click-throughs of all sectors for email marketing campaigns

I've been browsing the latest Email Marketing Metrics Australia report from Vision6 and it definitely has good news for government agencies.

This series of reports has been running since the second half of 2006 and has, for me, provided a very useful insight into the effectiveness of email marketing in Australia over the last five years.

The reports are based on data from Vision6, so there's a slight bias from being a single vendor (competitors such as CampaignMonitor don't yet release similar reports, or combine their information into a single industry report). However it is based on 259 million messages distributed via 112,000 separate campaigns by predominantly Australian companies (and they exclude all emails sent by stand-alone resellers and corporate networks) - so it is a large sample for reporting purposes.

Vision6's software (similar to its competitors) tracks email campaigns by sends, bounces, email opens and click throughs (to links in email messages).

This provides very useful ROI data for agencies. I have always tried to encourage agencies to use these types of tools to manage their email newsletters so they can properly report on them and detect user sentiment and trends (this also takes the load off the often-overburdened email systems used by government agencies).

The cost of these products is quite low considering their capabilities - particularly when looking at A/B testing to identify the most effective newsletter format and content (by sending differently formatted emails to several small subsets of your email list, comparing open rates/click-throughs and then distributing the most effective email format to the full list).
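As a concrete sketch of that A/B process (the list, sample size and open counts here are all hypothetical; in practice your email tool reports the opens for you):

```python
import random

def ab_test(subscribers, sample_size, opens_a, opens_b):
    """Split off two random samples, compare open rates, and return the
    winning format plus the remainder of the list to send it to.

    opens_a/opens_b are the open counts the email tool reported for each
    sample - the 'send' step itself is outside this sketch.
    """
    random.shuffle(subscribers)
    sample_a = subscribers[:sample_size]
    sample_b = subscribers[sample_size:2 * sample_size]
    remainder = subscribers[2 * sample_size:]
    rate_a = opens_a / len(sample_a)
    rate_b = opens_b / len(sample_b)
    winner = "A" if rate_a >= rate_b else "B"
    return winner, remainder

# e.g. format B opened by 42 of 100 vs A's 30 of 100 -> send B to the rest
winner, remainder = ab_test(list(range(1000)), 100, opens_a=30, opens_b=42)
```

With samples this small you would also want a significance test before trusting the difference between the two rates.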

I'm not aware of any agencies that currently use A/B testing for either email or websites, though this is widely used by business to maximise ROI - however I live in hope.

Back to the Vision6 survey and its results - the latest July-December 2011 survey reports that government agencies and defence have retained their position as achieving the highest open rate of any industry sector in Australia, with 31.66% of emails opened by recipients (an increase of 0.97% from the last survey).

This means that if, as a government agency, you send out an email to a 10,000-person list, on average 3,166 of those emails will be opened. The others will end up deleted, ignored, blocked or bounced (where email addresses are full or closed).

While this doesn't sound great, it's actually a much higher exposure level than achieved through other mediums. It's also a much better rate than for many other industries, such as construction (20.99% open rate) or sales and marketing (14.79% open rate).

It is also important to consider that smaller lists tend to achieve higher open rates - perhaps due to the additional effort in managing the integrity of larger lists.

By send volume, on average across all industries, lists with under 500 subscribers achieve a 33.17% open rate, dropping to 19.76% for lists with more than 10,000 subscribers.


Government also topped the unique clickthrough rates for all sectors, with 8.42% of subscribers clicking through from the email to further information on a website. This compares to the bottom-place IT and Telecommunications sector, which only received a unique clickthrough rate of 2.25%.

The average clickthrough rate for all sectors was 4.22%, although this also declined by list size (from 7.31% for up to 499 subscribers down to 4.07% for lists of 10,000 or more).

Government also did well on bounce rates, with only 4.43% of emails not getting through. Whilst not the lowest rate, which is held by the Call Centre/Customer Service sector with 3.29%, government's was the third lowest and much, much better than the 15.27% bounce rate suffered by the Science and Technology sector, or 10.47% by the Manufacturing/Operations sector.

The average bounce rate was 5.45% and, interestingly, bounce rates didn't consistently increase with larger lists.

Vision6's report indicated that lists with under 500 subscribers received, on average, a bounce rate of 5.28%. However lists with more than 10,000 subscribers received a marginally lower 5.26%. There was a bump in the middle however, with lists of 5,000-9,999 receiving 6.07% bounces and lists with 500-999 and 1,000-4,999 reaching 5.90% and 5.70% respectively.

The time taken to open emails appears to be falling, with 29.46% opened in the first 24 hours and 90.72% in the first 72 hours. Vision 6 reports that this last figure has increased consistently over the last five years.

So, finally, what about the email clients used by people? This is important as emails can be distorted, or even unreadable, if the email client doesn't correctly display it.

While the majority of government agencies use Outlook or Lotus Notes email, this isn't the case in the broader world.

When looking at the email clients used by people opening received emails (an average of 21.83% of emails sent), Outlook accounted for 43.54% of clients (22.24%, 14.90% and 6.40% for Outlook 2003, 2007 and 2010 respectively).

Hotmail accounted for another 16.21% and iPhone Mail accounted for 15.14% of email clients (and iPad Mail for another 3.7%) - demonstrating how strong mobile email has become - followed by Apple Mail at 11.98%.

'Other' received 20.11% - which included a range of services such as Gmail, Lotus Notes and others. I would like to see Vision6 really break this out further - however individual agencies can do this if using this type of email management platform.

There's clearly a strong need for organisations to understand how their subscribers receive and view emails, as there can be important design differences depending on the client - even between different versions of the same product (such as Outlook).

In conclusion, government in Australia already appears to be using email marketing well - at least when agencies are using email management systems such as Vision6; it's harder to judge email lists that don't use a management and reporting tool.

However there always remains room to learn from the figures and further improve the design and cut-through of email newsletters - particularly as mobile email continues to strengthen.

Email is still a very strong channel for reaching people with information, particularly in older demographics where social media engagement is less, and should be a core plank of any government communication strategy.

Remember that an email list is an organisational asset. People who have agreed to receive information from you are far more likely to engage and influence others. Don't squander and destroy this asset through poorly considered email strategies, which may include too frequent, too irregular or too 'boring' email updates.

Use approaches like A/B testing to determine what layout and headlines get the most cut-through, improving your ROI, and keep an eye on what people click on to see what types of information or stories hit the mark.

Email marketing is a science: there's plenty of evidence available on what works, and cost-effective quantitative measurement tools exist for tracking and tweaking your own email newsletters.

Don't waste the opportunity by ignoring the evidence, or destroy the ROI by not measuring, reporting and adjusting.

Monday, March 26, 2012

Is online influence measurable or meaningful?

Online influence is a hot topic right now, with companies such as Klout, PeerIndex, Empire Avenue and Kred all building online services that aim to measure the influence of internet users, in order to better target advertising dollars.

But how effective are these services really?

Does the number of followers, retweets or likes or some form of combination really identify those most likely to influence decisions and behaviours on a large scale?

Would any of these services have identified Janis Krums as an influencer of millions, before he tweeted a photo and message to his 170 Twitter followers about the plane that had landed on the Hudson River?

Would they have identified QLD Police Media as an important and influential account a few weeks before the Brisbane floods?

Would any of them have identified Rebecca Black, singer of 'Friday', as influencing an entire generation?

Influence online can ebb and flow rapidly. People go from virtually unknown to globally famous to unknown in a matter of weeks, days - even hours.

Therefore I was interested, but perplexed when I received the following email from PeerIndex a few days ago.

PeerIndex email:
I work at PeerIndex and we have a group on Australia top Twitter influencers and was wondering if I could get your feedback because you are on the list. PeerIndex measures interactions across the web to help people understand their impact in social media.

I was wondering if you could look over the list and let me know if you felt it was accurate? Do you recognise the other people on this list?  Is it missing people that you think are important?  

We would like to open up a dialogue with people in your field and think this would be useful to them (or at least start a conversation) if it was accurate and interesting.

Thanks very much for your time,

I had a bit of a think about this and realised that I am an influence sceptic.

I am interested in sentiment online - whether people believe/perceive and say good or bad things about a topic. I think there's a strong future in this as a way to judge a general mood, supported by other more refined techniques.

However influence is just too hard to measure if only one dimension - online - is taken into account.

Hence my reply, below:
Hi ,
 
I would love to help, however I don't think I honestly can.

I just do not understand how influence on Twitter, or on other online or offline social networks or situations, can be calculated in any effective manner.

Interactions online don't necessarily translate into actions offline and influence is generally a subtle and cumulative process - which requires multiple sources over a period of time.

For example, you tell me something on Twitter, I see something related from someone else in a forum, it gets discussed at work, I do some research as my interest is raised, then it appears in the traditional media and then I see others I trust taking a position and then I do.

The interlockings between topics and influence are incredibly complex and related to individual mental models and worldviews. Something that would influence one person will have no impact on another, people weight influence based on source, channel, frequency and relationship - and every individual has their own influence model - what will or will not change their view.

For an example (or study) of this, just watch the classic movie '12 Angry Men'. It is a brilliant look at how varied the influencers for different people may be.

I don't think there is a reliable way to identify influencers or put people in boxes for influence.

I find your, and other similar services, amusing, but do not see how your algorithms have accurately modelled my, or anyone else's, levels of influence at the micro or macro topical level.

Your models are far too simple, working on a subset of observable influences with no characterisation of the individual influenceability of different people in different environments at different times - nor how long-lasting that influence will be.

Behavioural psychology is an extremely complex and poorly understood science. About the only way we can reliably detect influencers at any specific time or on any micro topic is in hindsight.

Humans are lousy at determining what is likely to be influential, other than by 'gut instinct', or through sledgehammer techniques, such as mass repetition (show the same message enough times to a broad enough group of people and some will be influenced).

So sorry, I don't know what makes people influential - chance, chemistry, repetition, a match with a particular mental model, a combination of influencers all working in alignment, or a reaction against a 'negative influencer' (a de-influencer? Someone we love to disagree with).

I certainly don't see how dividing people into boxes by arbitrary topic helps define their broader influence, or specific influence across other topics. The amount they talk about a topic isn't a good judge either, and it is always unclear whether someone 'heard' the message on a service such as Twitter.

So I don't think I can help you. Nor am I sure if your service, or Klout or the others in the space, have a real business model. Though I do hope that your collective efforts expand our understanding of how connections between people can sometimes influence them.

Cheers,

Craig

What do you think?



Monday, September 13, 2010

What does it cost to build and run a government website?

The Guardian reported in July that the UK government has released details on the costs of developing, staffing and hosting their major government websites.

The data includes web traffic, accessibility and user opinions on the websites.

This type of data is very useful when modelling the costs of developing and operating government sites, allowing agencies to more accurately forecast costs and staffing needs. It allows agencies to compare their web operations with other agencies, providing a view on who is most - and least - efficient.

The approach also allows hard-working, poorly resourced and funded web teams to more effectively argue for a greater share of the agency pie.

I would love to have such data available here in Australia - down to being able to derive a total cost per visit (which for UK sites ranges from 1 pence up to 9.78 pounds - see the Google spreadsheet below). It would significantly assist web teams and agencies in their planning and activities.
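Cost per visit is the simplest of these derived measures - total site cost over total visits for the same period. A trivial sketch with invented figures (not drawn from the UK data):

```python
# Cost per visit: total site cost divided by total visits over the same period.
# Figures here are invented for illustration, not drawn from the UK dataset.
def cost_per_visit(total_cost: float, total_visits: int) -> float:
    return round(total_cost / total_visits, 2)

# e.g. a $1.2m-per-year site receiving 4m annual visits costs ~$0.30 per visit
example = cost_per_visit(1_200_000, 4_000_000)
```

The hard part, of course, is agreeing on what counts in 'total cost' (staff, hosting, development amortisation) so the figure is comparable between agencies.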


The UK website data can be downloaded here.

Or see the data visualised (using IBM ManyEyes) and a Google spreadsheet of the costs below.

Download the full list as a spreadsheet


Wednesday, December 02, 2009

Getting serious about web analytics in Australian government - join the new group

Last week the Australian Bureau of Statistics ran a free event for government website managers to discuss web analytics - how different agencies were doing it and what, collectively, we would like to see happen in the area.

There were a number of excellent presentations and plenty of time for group discussion. In fact it's the best such event I've seen run to date within government, and it was better than many of the (more costly) commercial conferences.

Some of the outcomes of the day included a recognition that while there are many different tools and reasons for measuring public websites, there are some standards we should have in place across government to define and agree on appropriate metrics - beginning with the basics like page-views, visits and unique visitors.
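Those three basics can be defined unambiguously from a raw request log. A sketch, assuming each page request is recorded as a (visitor_id, session_id, url) tuple - the log format here is hypothetical:

```python
# Computing the three basic metrics from a raw request log, where each
# request is a (visitor_id, session_id, url) tuple. Format is hypothetical.
log = [
    ("v1", "s1", "/home"), ("v1", "s1", "/about"),  # one visitor, one session
    ("v1", "s2", "/home"),                          # same visitor returns later
    ("v2", "s3", "/home"),                          # a second visitor
]

page_views = len(log)                          # every request counts
visits = len({(v, s) for v, s, _ in log})      # distinct sessions
unique_visitors = len({v for v, _, _ in log})  # distinct people
# -> 4 page views, 3 visits, 2 unique visitors
```

The cross-agency standards question is precisely how each tool defines the session and visitor identifiers, which is where counts diverge between products.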

There was also a good discussion around the prospect of a whole-of-government web reporting system which would allow agencies to directly benchmark and compare against appropriate peers. The Victorian government has made great strides towards this already, as has the NSW government.

To continue the conversation, and to begin recommending some firm ideas for how to proceed in the web analytics space at all levels of Australian government, a Web Analytics For Australian Government group was established on Google Groups at the end of last week, and is already beginning to see some discussion of the topic.

If you're involved or interested in website management and measurement - or simply wish to understand how to measure the effectiveness of websites alongside other communications channels - please join the Web Analytics For Australian Government group.


Wednesday, October 28, 2009

Short takes for public sector management - Shift happens & Did you know?

If you're having difficulty getting across to your management the magnitude of the impact of the internet and changes in society, try showing them one or more of these videos - each is only around eight minutes long.

They provide a snapshot (in figures) of the changes taking place in the world.

In case you experience resistance, mention that Did you know 3.0 was used by New York State's CIO, Dr Melodie Mayberry-Stewart, in her presentation at the recent CEBIT Gov 2.0 conference.

Did you know 4.0 (2009)


Did you know 3.0 (2008)


Did you know 2.0 (2007)


Shift Happens (Did you know 1.0) (2006)

Read full post...

Tuesday, October 27, 2009

What's the median age of social network users?

Often it's assumed that teenagers are the main users of social networking tools from Facebook to Twitter.

However, research conducted over the last few years indicates that the real situation is a little different.

Based on the most recent Pew Internet research (of US internet users aged 18+), the median user ages of popular social networks are as follows:

  • Twitter median user age 31yrs (stable from May 2008),
  • Facebook median user age 33yrs (up from 26yrs in May 2008),
  • MySpace median user age 26yrs (down from 27yrs in May 2008),
  • LinkedIn median user age 39yrs (down from 40yrs in May 2008).
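As a side note for the data-minded: the median is the middle value of the distribution, so a handful of much older users can't drag the figure upward the way an average would. A quick sketch with made-up ages (not the actual Pew sample) illustrates the difference:

```python
import statistics

# Made-up sample of user ages for illustration; real figures come from
# weighted survey data such as Pew's.
ages = [19, 24, 27, 31, 33, 38, 45, 52, 61]

median_age = statistics.median(ages)  # middle value: half older, half younger
mean_age = statistics.mean(ages)      # pulled upward by the oldest respondents

print(median_age)  # 33
```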
Looking at Twitter specifically, Comscore reported that while 12-17 year olds made up only 12% of visitors to Twitter's website in June 2009, this was double the percentage reported at the same time the previous year, and 18-24 year olds increased to 18% of visitors, compared to 11% the previous year.

Nielsen data from February also suggests that Twitter is most popular among older demographics, with adults aged 35-49 having the largest representation on Twitter in February 2009, comprising nearly 42% of the site's audience.

Pew Internet's profile of a (US) Twitterer also provides useful information on who is Twittering - and why.

Age Distribution of Twitter users (Comscore - April 2009)
Source: http://blog.comscore.com/2009/04/twitter_traffic_explodesand_no.html

Read full post...

Thursday, October 22, 2009

Register now for the ABS's Web Analytics in Government forum

The ABS's Customer Insights Team, in conjunction with AGIMO, is holding a Web Analytics in Government forum on Tuesday 24 November in Belconnen.

To quote the ABS,

Our aim is that the forum will allow participants to learn from others and share practical knowledge and experience in:
  • pitfalls of implementing web analytics in a government environment;
  • understanding online behaviour and experience of users;
  • developing performance indicators for websites; and
  • knowing which reporting metrics to use and when.
An outline of the programme is available online.

There are only around 50 places available at the forum, so if you're interested in attending register now.

Read full post...

Wednesday, August 12, 2009

Introducing a common web reporting platform across federal government

Over the last few years I've often thought about the value of having a complete picture of web traffic to the Australian government.

This would require a common way to track and report on the usage of each discrete government website and the ability to track and measure the traffic between them over time (using anonymous user data).

I see enormous value in this approach. Firstly it would help government departments holistically understand how citizens see the inter-relationships between different government services and information across agency boundaries.

Secondly it would support smaller agencies to cost-effectively develop appropriate reports and access the data they need to improve their online presence and demonstrate ROI for online initiatives. Rather than reporting sophistication being a function of agency size, it would become a consistent core whole-of-government capability, regardless of budget, technical skills and in-house web expertise.

Thirdly this approach would help executives and web professionals moving between government departments as they could expect a consistent level of reporting for the online space no matter where they worked. This would cut down learning curves and help improve the consistency of online channel management across government.

Finally, having standardised and consistent web reporting would lead to consistent and more accurate reporting to parliament of the overall size of the government's online audience, and the share held by each department, supporting decision making for the use of the online channel.

So could this be done?

I think it could.

We have precedents for whole-of-government licenses in the use of technologies such as Funnelback for search (which crawls all government sites for Australia.gov.au and is available for departments to use for their web search) and Adobe Smartforms for business forms (via business.gov.au).

The technology for whole-of-government online reporting is readily available without requiring major changes to how any department operates. The reporting could be deployed simply by adding a small piece of code to every web page on every site, as is done by systems like Google Analytics and WebTrends On-Demand. Departments could continue to use their existing in-house tools if they so chose, or exclude websites where special circumstances applied.
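For readers unfamiliar with how such page tags work: each page embeds a snippet that requests a tiny 'beacon' from a central collection server, passing the page details along. A minimal sketch follows; the endpoint and parameter names are hypothetical, not any actual government system:

```python
from urllib.parse import urlencode

# Hypothetical central collection endpoint for the shared reporting service.
COLLECTOR = "https://analytics.example.gov.au/beacon.gif"

def page_tag(agency_id: str, page_url: str) -> str:
    """Return the HTML snippet an agency would add to every page.

    The 1x1 image request carries the agency and page identifiers to
    the central collector, where traffic is aggregated and reported.
    """
    params = urlencode({"agency": agency_id, "page": page_url})
    return f'<img src="{COLLECTOR}?{params}" width="1" height="1" alt="" />'

print(page_tag("finance", "https://www.finance.gov.au/about/"))
```

Because the tag is identical across agencies apart from its identifiers, adding it is a one-off change for each site rather than an ongoing integration burden.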

Through aggregating the reporting function, more funds and expertise could be focused on producing more meaningful and useful reports. Standard report templates could be developed for departments to use - or not - as they preferred.

Finally, this approach would provide cost and procurement efficiencies for government. Only one procurement process would be necessary to select the product, rather than individual processes being conducted by various agencies. The scale of the federal government means that government could purchase and maintain the tool at a much lower cost per department than it would cost a department to purchase an appropriate tool.

Read full post...

Thursday, July 02, 2009

US Federal government launches public IT Dashboard

I've been known to say, from time to time, that what you cannot measure you cannot manage. This is especially true in IT-based projects, which often involve significant investments and where deadlines and budgets can easily slide.

Given it has been estimated that 68% of IT projects fail to realise the benefits or outcomes they set out to achieve, it is vitally important that good measurement be in place to manage these investments and ensure that the responsible parties are accountable for the outcomes.

The US government has taken a major step towards public accountability over government IT investments with the release of the IT dashboard website.

Speaking to the Washington Post in the article, Government Launches Web Site to Track IT Spending, US Federal CIO Vivek Kundra stated that,

"Everyone knows there have been spectacular failures when it comes to technology investments," Kundra said. "Now for the first time the entire country can see how we're spending money and give us input."
Featured at the Personal Democracy Forum in New York on Tuesday, the IT Dashboard provides information on US$76 billion of US Federal IT spending, breaking it down by agency and into individual projects.

The site is more than a list of numbers. It provides interactive graphics and charts which allow visitors to identify which projects are running behind schedule or over budget - as well as those on time and on budget.

The site also makes the underlying data available in open formats, able to be reused in citizen applications and cross-referenced with other information sources to generate new insights.

While the site is undoubtedly a nightmare for CIOs who have inadequate cost accounting systems or a high proportion of late and over-budget projects, it plays an extremely valuable role in enforcing accountability for public spending and supporting both citizens and elected officials to visualise, understand and ask the right questions about government IT investment decisions.

In other words the site aids the democratic process and encourages Federal Departments to ensure that they are running their IT projects effectively - which Kundra has already seen happen in practice,
"I talked to the CIO Council and saw the data change overnight," Kundra said. "It was cleaned up immediately when people realized it was going to be made public."
Consider the benefits to the US if the government IT failure rate could be cut significantly - potentially doubling the value of every public dollar invested in IT.

I would love to see a similar site in Australia as I believe there would be similar benefits to the democratic process, transparency, accountability and improved ROI for the taxpayer dollar.

Below is a video explaining the site.

Read full post...

Tuesday, June 02, 2009

IAB publishes social advertising best practices guidelines

The Interactive Advertising Bureau (IAB) has published guidelines for best practice in social advertising, prefacing them with the statement,

Social media has overtaken email as the most popular consumer activity, according to a recent Nielsen study
If your department or agency is looking to use any of the online social channels that now exist, from Facebook to YouTube to Twitter, to reach your audiences, these guidelines are a must-read.

To quote the guidelines,
This document outlines recommendations for these key social advertising topics and is intended for social networks, publishers, ad agencies, marketers, and service providers delivering social advertising. These best practices were developed via a thorough examination of the critical consumer, media and advertiser issues to help social media further realize its advertising potential.
They are available at the IAB website's social advertising section as a PDF download.

The IAB also provides definitions of social metrics for measuring social media sites, blogs, widgets and social media applications.

Read full post...

Friday, March 20, 2009

The power of raw government data

In the US President Obama's newly appointed (and first) Federal Government CIO Vivek Kundra has committed to finding new ways to make government data open and accessible.

The Computer World article, First federal CIO wants to 'democratize' U.S. government data, discusses how,

In a conference call with reporters, Kundra said he plans to create a Web site called Data.gov that would "democratize" the federal government's vast information resources, making them accessible in open formats and in feeds for developers.

He also said he hopes to use emerging technologies like cloud computing to cut the need for expensive contractors who often end up "on the payroll indefinitely."
These are not idle words from a political appointee - Kundra, who I have mentioned previously, is well-known amongst egovernment practitioners around the world for his innovative work in pushing the boundaries of egovernment as the District of Columbia's CTO.

Politicians often have reservations about releasing raw data, despite it being collected using public funds, due to concerns that the data might be used to damage their reputations politically.

Similarly government departments often restrict the release of raw data due to concerns over how it may be reused or misused.

In Australia we even go to the extent of copyrighting government data. In the US most data, publications and other tools created by their Federal government are copyright free.

However with the US's moves the debate will soon shift to the disadvantages of not allowing free access to most raw government data.

As history has recorded, countries that remove barriers to the free flow of ideas and information develop faster, are economically more successful and their people enjoy higher standards of living.

Fostering innovation directly leads to national success.

So in a world where some countries make data freely available, how do other nations continue to compete?

To draw an analogy from the publishing world, Wikipedia disrupted the business model of Encyclopedia Britannica. By providing free 'crowd-sourced' information of greater depth and about the same accuracy as a highly expensive product, it has left Britannica struggling to survive for years.

After trialling a number of protective business models designed to sustain its existence while protecting its data, Encyclopedia Britannica has finally adopted one that might work - it has opened its articles up to 'crowd-sourcing', accepting suggestions which are then reviewed and acted on by its professional editors - a step towards openness. Visit the Britannica blog to learn how to suggest changes to the encyclopedia.

In other words, you cannot beat openness with secrecy - the only way to remain successful is to step towards openness yourself.

This really isn't news. Many have talked about the need for greater openness of government data before. I've even mentioned it myself once or twice.

To finish, I thought I'd flag this recent talk given by Tim Berners-Lee (the father of the world wide web) at TED on the need for open data. It has some points worth reflecting on.

Read full post...

Wednesday, January 28, 2009

Australia's 2008 eGovernment Survey results released

It has taken me a little while to post about this report, as when I first saw the media release in my webfeed I mistakenly thought this was news about the previous survey.

However having finally caught up, thanks to the eGovernment Resource Centre, I'm pleased to see that the 2008 eGovernment survey shows the same trends as previous years of increasing internet usage by Australians and increasing online engagement with government.

It also bursts a few of the prevalent myths about internet users, such as all internet users being young and hip (ok so they are all hip, but some of them are also older).

Some of the key findings included,

  • 79% of Australians use the internet; usage decreases with age, with 94% of those 18-24 years old, 93% of those 25-34 years old, 90% of those 35-44 years old, 81% of those 45-54 years old, 74% of those 55-64 years old and only 44% of those over 65 years
  • Nearly two-thirds of people had contacted government by internet at least once in the previous twelve months
  • More than three in ten now use the internet for the majority (all or most) of their contact with government
  • The internet has replaced contact in person as the most common way people had last made contact with government
  • Those who use the internet to contact government have the highest levels of satisfaction followed closely by those who made contact in person. Those who used mail to contact government had the lowest levels of satisfaction.
  • Over two-thirds of people use broadband at home
  • More than four in five people use newer communication technologies at least monthly. The most common are email, SMS, news feeds, instant messaging, social networking sites and blogs
So, in short, 
  • most people can contact government online, 
  • more people are choosing online as their most preferred way to contact government, and
  • those that contact government using the internet are more satisfied.
That's a great story to sell to senior public servants!

The full report is available for download from the Department of Finance as, Interacting with Government - Australians' use and satisfaction with e-government services—2008

Read full post...

Tuesday, January 06, 2009

Does your department adequately manage your website?

As a former and current business owner, I'm constantly reflecting on the numbers showing how well my agency's online channel is performing. My goal is to maintain an ongoing awareness of our performance and the reasons behind it.

To that end my agency has multiple web reporting systems in place and I use them regularly - given that I've been measuring websites now for over ten years and have a good feel for valid and invalid web metrics.

For me measurement leads to effective management. Without the information from measurement over time I cannot make good decisions regarding our online channel, provide expert advice to senior decision-makers, advocate for appropriate development of the channel or prioritise the content updates most important to our audience.

However this doesn't appear to be the experience for all website managers across the Federal public service.

Per a report in the Canberra Times, the Commonwealth Auditor-General says many Federal Government agencies inadequately manage their websites, are unaware what they cost to run, and risk providing the public with outdated or inaccurate information.

The ANAO report, available as a PDF at Government Agencies' Management of their Websites was published on 16 December and involved a survey of 40 federal agencies, followed by an audit of five.

It found that agencies were increasingly relying on websites to provide information and services to the public and that,

This increased reliance by agencies on websites to provide information and services, brings with it a greater need for agencies to have sound approaches to manage their sites. Poorly managed websites not only increase the risk that information and services are not provided to website users at reasonable cost to government, but can have adverse impacts on other service channels such as extra work loads for call centres and inquiry outlets.
It also commented that,
All of the audited agencies monitored website user activity and satisfaction. However, none of the audited agencies reported specifically on how their websites were meeting their respective purposes and how they were contributing to agency business goals. Also, most agencies had little information on the costs of operating and maintaining their websites. Agencies with websites that pose significant risks to service delivery or that have multiple websites would benefit from an improved understanding of their website user activity, performance, and cost information.


In the forty agencies surveyed, only six maintained firm website cost data - meaning that the other 34 did not have a clear idea how much their online channel cost relative to other channels.

In another case the ANAO reported that one agency simply provided raw weekly hit data to management as a performance tracking tool, with no explanation of what 'hits' meant, nor what a good or bad outcome would be.

The ANAO followed up with four recommendations for agencies,
  • develop a clearly stated purpose for each website;
  • strengthen agency decision making through improved risk management;
  • review content management processes and practices; and
  • strengthen performance monitoring and reporting.


I agree with all of these recommendations. They're an important basis for the management of any type of channel, program, project or product - and a website is certainly no exception.

Read full post...

Wednesday, November 12, 2008

The 2008 Australian Web Analytics survey is now open

If you're interested in web metrics, pop over and complete the 3rd annual Australian Web Analytics survey at Bienalto's website.

Respondents will receive a copy of the survey results, which should provide insight into how your organisation compares to others in its use and prioritisation of web analytics.


The 2006 and 2007 survey results are also available from their site.

Some of Bienalto's key findings from the 2007 survey included:
  • 89% of businesses actively measure website performance;
  • 77% of respondents were satisfied with web analytics data 75% of the time or more; and
  • Google Analytics was the most popular web analytics tool.
Learn more about and complete the Web Analytics survey

Read full post...
