Showing posts with label navigation.

Thursday, April 05, 2012

Governments need to ensure their websites work for modern users

I went to the Australian Business Register site (www.abr.gov.au) this afternoon to set up an ABN (Australian Business Number) for a company.

This is a very common step, taken by hundreds, if not thousands, of Australians every week.

However I immediately hit a speed bump.

The site's online ABN registration process threw up an error message (image below) stating:

Browser not supported
The Australian Business Register currently supports the following browsers:
  • Internet Explorer 5.0 and above
  • Netscape 6.0 and above
You should update your browser version before you continue using the Australian Business Register. If you believe your current browser is suitable to use, please continue.

Refer to Technical Information for details on how to configure for your browser for the Australian Business Register.
This was confusing and off-putting, as I was using Firefox 11.0 - one of the most modern web browsers available.

Fortunately I had Internet Explorer 9 on my system and gave this a try - no error screen appeared.

Now if you read far enough into the error message it does state that 'If you believe your current browser is suitable to use, please continue.' - however I was in a hurry at the time and, like many users, didn't read the error message all the way through.
The error message visible at the Australian Business Register site, together with the 'About' information window for the web browser in use

Regardless of whether this counts as a user error, I believe government agencies have an obligation to ensure their websites are accessible and usable in modern web browsers, without unnecessary and confusing error screens.

Essentially, when I have Firefox 11.0, I don't expect to receive an error stating I need 'Internet Explorer 5.0 and above' or 'Netscape 6.0 and above' - my web browser is "above" both and, in fact, neither of those browsers has been current for more than ten years!
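
While I can't see the ABR's actual code, this kind of message usually comes from an old-style browser whitelist check: the site only recognises the browsers it knew about when it was built and warns on everything else. A minimal sketch of that (assumed) logic shows why a modern browser trips the warning:

```python
# Hypothetical sketch of a legacy 'supported browser' check - not the ABR's
# actual code. It only recognises browsers on its old whitelist and treats
# everything else, including modern Firefox, as 'unsupported'.

def is_supported(user_agent: str) -> bool:
    # Old-style checks for IE 5 to 9 ("MSIE 5" .. "MSIE 9") and Netscape 6+
    if any(f"MSIE {v}" in user_agent for v in range(5, 10)):
        return True
    if "Netscape6" in user_agent or "Netscape/6" in user_agent:
        return True
    return False  # everything else falls through to the warning screen

firefox_11 = "Mozilla/5.0 (Windows NT 6.1; rv:11.0) Gecko/20100101 Firefox/11.0"
print(is_supported(firefox_11))  # False - triggers 'Browser not supported'
```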

For such an important and common business process as registering an ABN the responsible agency needs to take a little more care in its online delivery of services.

Otherwise its online services will damage trust in the government's ability to deliver and push customers back to slower and (for agencies) higher cost channels.

I'll bring this issue to the attention of the responsible agency, the Australian Taxation Office, and check back in six months to see if anything has changed.

For all other government agencies out there, please check that your public online systems aren't needlessly damaging your credibility in this way. Please make sure your websites work for modern users!


Thursday, January 19, 2012

Is it time for government to take Google Plus seriously?

Often in government there are only two social media networks discussed and considered for community engagement and communications: Facebook and Twitter.

MySpace is a distant memory, LinkedIn is used just for resumes and services like FourSquare, Plurk, Ning and others are not well-known.

Also not that well known is Google Plus - and perhaps rightly so. It is very new and still quite small in social media terms, at only around 62 million users, although it is predicted to grow to over 293 million by the end of 2012, or so Google believes.

However with the recent integration of Google Plus into Google search, it may be time for governments to consider establishing Google Plus channels alongside Facebook and Twitter, due to the impact on search results.

With Google's search tool holding close to 90% of Australia's search market, it is a more dominant 'publisher' than News Limited - and remains the number one website in Australia. Search engines are also the primary source of traffic for Australian government websites, with an average of over 40% of visitors reaching government sites from a search engine (according to Hitwise) - and therefore around 36% coming direct from Google.

So what has Google done? According to Gizmodo, they've integrated Google Plus into their search product in three ways,
First, it now provides "Personal Results" which include media—photos, blog posts, etc—that have been privately shared with you as well as your own stuff. Any images you've set to share using Picasa will also be displayed. Second, Google Search will now auto-complete queries to people in your circles and will display people who might also be interested in what you're searching for in the search results. Finally, it simplifies the process of finding other Google+ profiles for people or specific interest groups based on your query. So if you search for, say, NASA, it will display Google+ profile pages for NASA and space-related Google+ interest groups in addition to the normal results.
Whether you believe this is a good move, a legal move, or not, it does provide opportunities for organisations to leverage Google Plus to improve their overall presence in Google search by operating a Google Plus account.

It's certainly something to keep an eye on, if not actively consider. 


Tuesday, September 14, 2010

NSW launches live traffic monitoring online

The NSW Roads and Traffic Authority (RTA) has launched RTA Live. This new website provides live updates on road conditions across NSW, including roadworks, fires, floods and accidents, as well as feeds from 67 traffic cameras across Sydney.

There's also a widget embeddable on blogs and websites to provide traffic information.

Displaying the data on Google Maps, the site is an excellent example of the use of Web 2.0 technologies in a government context.

My only suggestion for the site would be to include data for Canberra to fill the annoying ACT-sized hole in the map.


Friday, April 30, 2010

The street as a platform, what's government's role?

Darren Sharp has brought to my attention an extremely thought-provoking post, The street as platform, written by Dan Hill in February 2008.

The post explores the virtual life of a city street - all the digital data changing hands between systems, infrastructure, vehicles and people in the street, unseen to human eyes.

While condensed into a single street, the post is based entirely on current technologies and practices. It could easily represent a real street in any major city anywhere in the world today.

The question for me is: what is government's role in building the infrastructure and in managing and effectively using the data collected?

Streets are generally infrastructure created and maintained by governments and the systems that 'power' a street are often installed and managed by public concerns (roads and pavements, water, sewage, electricity and telecommunications) or at least guided by government planning processes (the nature of the dwellings and commercial services provided on the street). So there's clearly a significant role for government in the virtual aspects of streets as well.

There has been some work done internationally on precisely what the role of government is (some articles and publications are listed at the Victorian Government's eGovernment Resource Centre), but have we done enough here in Australia?

Given we have a national broadband network planned, with pilot rollouts already in preparation, ensuring that it enables, rather than limits, the vision of our digital streets in a managed and well-thought-out manner is clearly moving up the priority list.


Wednesday, February 03, 2010

Google to end support for Internet Explorer 6 during 2010

Google has announced that it will progressively end support for Microsoft Internet Explorer 6 during 2010 - beginning with Google Docs and Sites in March. YouTube, another Google property, is also phasing out support.

Announcing the change in a Google Enterprise blog post last week, Modern browsers for modern applications, Google Apps Senior Product Manager Rajen Sheth said that the web had evolved over the last ten years from simple text pages to rich interactive applications, and that very old web browsers cannot run these new features effectively.

This approach isn't limited to Google. A number of companies have already dropped support for Internet Explorer 6.0 in their online applications and more, including Facebook and Digg, plan to drop it in the near future.

Microsoft (up to CEO level) has also advocated dropping IE6 in favour of its latest version, Internet Explorer 8.


EDIT at 8:10AM 3/2/10:
Nick Hodge, a Microsoft staff member, has commented on this post that Microsoft is also progressively dropping IE6 support, saying that Microsoft has,
dropped support for IE6 in Sharepoint 2010 and the forthcoming web versions of Word, Excel, Powerpoint and OneNote 2010; plus live@edu and other web properties. 
END EDIT

However, to support its customers, as there are a number of major corporations still tied to the ageing browser, Microsoft recently extended support for IE6 until April 2014, when all support for Windows XP ends.

Given the recent severe security issues reported with IE6 and the increasing proportion of the internet unavailable to those using the 2001 vintage web browser, I hope to see the remaining organisations migrating away from the browser in the near future.

It is estimated that only 20% of web users - predominantly workers in large organisations - still use IE6; however, up to 50% of Chinese internet users are still on the browser.

Reportedly, Microsoft's Internet Explorer web browser has been losing market share since at least 2004, when it held 90% of the market. According to Wikipedia's Usage share of web browsers article, it is now estimated (through tracking subsets of internet users) that only about 60% of internet users are on one of the Internet Explorer variants, with Firefox 3.5 having overtaken IE8 as the most popular browser by version.

Some commentators expect to see Microsoft's share of the web browser market fall below 50% by mid-2011.


Thursday, December 03, 2009

New quick start beginners guide for government Twitter use released

Dave Briggs of Learning Pool in the UK has written a quick start guide to Twitter for those working in and around government (although it's equally applicable for other people as well).

The guide particularly targets Twitter newcomers and is written in a very readable and conversational style.

David spent more than five years working in government and has a good understanding of how to approach the topic in order to make this guide useful.

I see this guide as a companion guide to the UK Government's Template Twitter Strategy. Like the Template Strategy, just about all of this guide is immediately usable in an Australian context.


Sunday, March 22, 2009

Delivery of a website 'realignment'

Last year I posted about redesigning sites to put customers at the centre of the universe.

At the time we were reviewing my agency's primary site based on usability research and surveys. Through these our customers had indicated that the site was perceived as about us rather than about them (the tools and information they wanted to access quickly).

I'm pleased to say that, after working through a redesign process to align the site more closely with agency goals and styles, and making some tough decisions on which content to feature, the new design is now live, largely reflecting the original wireframe concept.

I think we managed to meet the rules I set for my team,

  • put customer needs first
  • use less words
  • minimise disruption
  • lift the look


You can view the site at www.csa.gov.au.

Feedback is welcome.


Wednesday, November 05, 2008

US satisfaction with egovernment services rising

The US government has recorded the second consecutive increase in satisfaction, to an average 73.5 percent in the latest E-Government Satisfaction Index, part of the broader American Customer Satisfaction Index (private sector website satisfaction is at 80 percent).

As reported in CRM Buyer, 25 percent of sites achieved a rating over 80 percent.

The feature constituents were least satisfied with was navigation (37 percent were satisfied), whilst 96 percent were satisfied with search functionality.

Commentators are expecting the upward trend to continue as a result of the ongoing US financial crisis.

This upward trend will likely continue, Freed [Larry Freed, president and CEO of ForeSee Results] said, if for no other reason than current budgetary constraints. With the U.S. government now committed to a US$700 billion financial rescue plan, money will be tight in all other categories. "E-government can deliver a huge payback because it is so much more efficient," he observed.






Wednesday, September 17, 2008

A glimpse at the future of the semantic web

Fresh+New, a blog written by Seb Chan from the Powerhouse Museum, has brought to my attention Aza Raskin's Ubiquity, a very interesting look at a possible web of the future, using semantic browsers to provide a more connected experience.


More details are in Seb's post, More powerful browsers - Mozilla Labs Ubiquity, or on Aza's blog.

Below is the video introducing Ubiquity.

Ubiquity for Firefox from Aza Raskin on Vimeo.


Monday, September 08, 2008

Getting the basics right - US presidential hopefuls fail website navigation

Forrester Research has released a report critiquing the navigation of the websites of John McCain and Barack Obama, claiming that both fail basic navigation tests by potential voters.


Nextgov reported in the article, Web sites of both presidential candidates fail to connect with users, that,

Forrester used five criteria in its evaluation: clear labels and menus; legible text; easy-to-read format; priority of content on the homepage; and accessible privacy and security policies. McCain's site passed two of those benchmarks: clear and unique category names and legible text. Obama's site succeeded in one area: straightforward layout making it easy to scan content on the homepage.

Neither site gave priority to the most important information on the homepage, or posted clear privacy and security policies, Forrester concluded.
This came on the back of another report by Catalyst, which tested seven criteria. The Nextgov article reports that,
Catalyst asked individuals to perform seven tasks while evaluating each campaign site, including donating money, reading the candidates' biographies and finding their positions on specific policy issues. Obama's site stood out for its design and navigation, but users were confused about certain labels on the homepage, such as "Learn," which contained links to information about the Illinois senator's background and policy positions.

What were the lessons for all government sites?
  • A modern professional look is critical for drawing in users and making them want to use the site.
  • Effective prioritisation of information (most important at top) and clear, simple navigation are important for the success of a website, but if the look isn't right users won't stay long enough to use it.
  • Focus on the most important information and reduce the clutter; direct users to the most useful information, activities and tools for them.


Monday, September 01, 2008

How Internet Explorer 8 beta performs - new features to add to toolkit

I installed the public beta of Microsoft's Internet Explorer 8 on my personal laptop last week to look at how well my agency's sites were reflected in the browser, and to get more of a feel for the new features it adds to the mix.

I'm pleased to say that just about all the sites I looked at using the browser performed well, with only a few minor issues with form field lengths and div handling.

The browser has certainly played catch-up, taking on all of the great features I am already using in Firefox 3 (such as the smart address bar), making them available to a broader audience who have not tried other browsers before.

A couple of new features may also provide benefits to organisations innovative enough to use them. I'll be feeding some ideas back into my agency to see where we can take them, as outlined below.

Web Slices
This feature is a way for websites with frequently updated content to let users subscribe and be notified when content changes.

The user benefit is that they do not have to scan through their favourite sites regularly to see whether anything has changed; they can get on with higher priority activities and let a visible notification in their web browser tell them when content on a favourite site has been updated.

It works well for news items, stock quotes and other frequently updating content, providing a soft in-browser alternative to RSS feeds (which remain underutilised by the broader online community).
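
For the technically minded, a Web Slice is just a region of annotated HTML within the page. The sketch below is illustrative only - the hslice, entry-title and entry-content class names reflect my understanding of the published Web Slice convention, so check Microsoft's documentation before relying on them:

```python
# Rough sketch of the markup a page might emit for an IE8 Web Slice.
# Class names follow the published hslice convention as I understand it
# (hslice, entry-title, entry-content) - verify against Microsoft's docs.

def render_web_slice(slice_id: str, title: str, body_html: str) -> str:
    return f"""
<div class="hslice" id="{slice_id}">
  <h2 class="entry-title">{title}</h2>
  <div class="entry-content">
    {body_html}
  </div>
</div>
"""

print(render_web_slice("latest-news", "Latest agency news",
                       "<p>Office hours over the holiday period...</p>"))
```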

Time will tell how popular this function becomes, but as a way to push website content out to users, rather than relying on people coming back, it may benefit organisations that need to distribute information rather than passively wait and hope their audience returns.


Accelerators
Accelerators are tools to allow users to right click on website content and access specific functionality from third-party online providers. For instance, right-click an address and choose to view it in Google maps, or right-click on text and translate the language using another website.

This adds to current right-click functionality that supports functions from local applications, extending the user's operating system onto the web.

Organisations can add specific functionality, such as legal definitions, purchase information or who to contact for more information.

I would be particularly interested in functionality that can be added on a site-specific basis - such as providing links and definitions from the Child Support Act when on the CSA website, and from Centrelink legislation when on the Centrelink website. This may not be supported in Internet Explorer 8, though there might be ways around it.


Use the right online metrics for the job

One of my mantras in professional life is 'you can't manage what you don't measure'.

Therefore it always worries me when I encounter organisations or individuals with a less than firm grasp on how to measure the success or failure of their online properties.

Depending on the type of web property, different metrics are most important for regular tracking and I believe it's the responsibility of top managers to understand the online metrics they use - just as they need to understand business ratios or balance sheets.

After more than twelve years of trial and error, below are the metrics I most and least prefer to use to track different types of online media.

What are the best metrics to use?
Standard websites
Visits
This tracks the total number of visits by users to a website over a period of time (month, week, day). This can include the same unique visitor returning to the site multiple times - which is the same way calls are commonly tracked for call centres.

Visits gives you an overall view of website traffic and, when divided by Unique visitors, provides a measure of 'stickiness' - how often people return to your site.

Note that for an unauthenticated site, a visitor is essentially an IP address - a computer. Multiple people can use a single PC, a single person can use multiple PCs, and the count may also include search spiders and other bots, so visits doesn't provide a perfect measure of human traffic - but it's good enough for trend analysis over time.

In addition, caching by ISPs or organisations can also influence visits - reports based on AOL from a few years ago indicate that visits reports may under-report website traffic by as much as 30 percent due to caching - though this is less important today.

In comparison 'readership' is a much looser metric, but is often held in high regard in the print trade.

Unique visitors
Unique visitors tracks the individual IP addresses used to visit a website and as such provides a rough count of the number of actual users of a site, no matter how many times they visit.

This equates to 'reach' for a site - with growth in unique visitors indicating more people are coming to a website.

This is affected by the same IP-versus-human issue as visits; however, it is still far more accurate than 'readership' figures provided by the press or 'viewer' figures provided by TV and radio, which are based on a sample rather than a population (as unique visitors is).

Pageviews
Pageviews measure the views of specific pages within a website and are most useful for tactical website tracking, allowing the identification of high and low traffic pages and the impact of different navigational or promotional approaches.

Looking at pageviews also provides a psychological view of your audience's top interests - allowing you to quickly prioritise which content to expand and which to downplay.

Pageviews are becoming less important as technologies such as AJAX are more widely used to load part of a page's content automatically or in response to user actions. In these cases a single pageview may not capture everything the user views on the page.
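
To make the relationship between these measures concrete, here's a small illustrative sketch - the log format and figures are invented - that derives visits, unique visitors, 'stickiness' and pageviews from a simplified access log, treating a visit as all requests from one IP address on one day:

```python
from collections import defaultdict

# Invented, simplified access log: (ip_address, date, page) per page request.
log = [
    ("10.0.0.1", "2008-09-01", "/home"),
    ("10.0.0.1", "2008-09-01", "/payments"),
    ("10.0.0.1", "2008-09-02", "/home"),
    ("10.0.0.2", "2008-09-01", "/home"),
    ("10.0.0.2", "2008-09-03", "/forms"),
]

visits = len({(ip, date) for ip, date, _ in log})   # one visit = one IP per day (rough proxy)
unique_visitors = len({ip for ip, _, _ in log})     # one 'visitor' = one IP (rough proxy)
pageviews = len(log)                                # every page request counts
stickiness = visits / unique_visitors               # average visits per visitor

pages = defaultdict(int)
for _, _, page in log:
    pages[page] += 1                                # pageviews per page, for content priorities

print(visits, unique_visitors, pageviews, round(stickiness, 2))
print(sorted(pages.items(), key=lambda kv: kv[1], reverse=True))
```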


Authenticated website (transactional services)
Active users
Active users tracks actual use by authenticated users (real humans) over a time period.

This is the best measure of an authenticated site's success as it tells you how well you've encouraged ongoing use of a website, rather than simply how good a job you've done at getting people to sign up.

Many authenticated sites prefer to talk about registered users, as this is a much larger number; however, if a user registers but never returns, your organisation gains no value from them.

A low ratio of active users to registered users can indicate site problems, and should prompt website managers to ask the question: why don't people come back?

Transaction funnels
Transaction funnels track the completion of transactions step-by-step in a service - and aren't necessarily only for authenticated sites.

This provides a website manager with tactical insights into any issues in a transactional process (or workflow), allowing them to diagnose which steps have the greatest abandonment rate and redevelop the process to improve completion.

Generally improving transaction funnels results in more transactions and more active users, which means greater utilisation of the service.
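
As an illustration (the step names and figures below are invented), a funnel report is essentially a count of users reaching each step, from which the abandonment rate at each point can be read off:

```python
# Invented example of a transaction funnel: users reaching each step of
# an online claim. The drop between steps shows where people abandon.
funnel = [
    ("Start claim", 1000),
    ("Enter personal details", 820),
    ("Upload documents", 430),
    ("Confirm and submit", 410),
]

for (step, users), (_, next_users) in zip(funnel, funnel[1:]):
    drop = 1 - next_users / users
    print(f"{step} -> next step: {drop:.0%} abandoned")

completion = funnel[-1][1] / funnel[0][1]
print(f"Overall completion rate: {completion:.0%}")  # 41% in this made-up example
```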


Multimedia (video/audio/flash)
Views
For any type of rich media, the number of views of the media is critical in determining success. However, it has to be weighed against the duration of views to determine whether users spent long enough viewing to take away the message, or just watched the first few seconds.

Duration of views
The duration of media views is a more granular measure of the effectiveness of the presentation - tracking whether the media actually communicated its message to users.

Looking at the average duration viewed, compared to the actual duration of the media (where such exists) provides a very strong effectiveness measure.
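
A quick illustrative calculation (the figures are invented): comparing the average duration watched with the length of the clip gives a simple completion measure.

```python
# Invented example: how long viewers watched a 120-second video.
video_length_seconds = 120
view_durations = [118, 30, 120, 95, 12, 120, 60]   # seconds watched per view

views = len(view_durations)
average_viewed = sum(view_durations) / views
completion_rate = average_viewed / video_length_seconds

print(f"{views} views, average {average_viewed:.0f}s watched "
      f"({completion_rate:.0%} of the clip)")
```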

Shares
One of the keys with the success of media content is how much it is shared with others online - the word of mouth factor. For media with a 'refer to a friend' tool, tracking the use of this will provide a strong indication of how positively users view the material, and therefore how viral it will become. Media that is rarely shared is probably not getting the message across in a memorable way, whereas highly shared material is correspondingly highly memorable - at least for a short time.

Documents (pdf/rtf/docs)
Views
Often 'downloads' is used to track documents. Personally I prefer views, as there are some technical issues with tracking downloads of files such as PDFs. In effect the two measures should be identical, but because PDFs - and sometimes other documents - are downloaded in segments, downloads can be significantly over-reported (which becomes almost as useless as 'hits'), whereas views is a more accurate measure.

There are ways to fix this within reporting systems - which I've largely done in my Agency's system - however this is not possible in all systems.

Social media
Activity by user
As with authenticated sites, the goal of social media is to encourage participation - whether it be forum posts and replies, wiki edits or social network updates and messages.
Each of these represents activity, though how it is measured may need to be tweaked for the type of social media.

The more activity by users, the more engaged they are with the site and the greater the prospects of longevity.

Views
The other useful measure is views, measuring the passive involvement of users with a social media site. Not all users will actively post, however if they return regularly to view, they are still engaged to some extent with the site.

Commonly the breakdown between active and passive participants is divided as 1/9/90 (Very active/active sometimes/passive observer), however in practice this varies by medium and community.

While that 90 percent doesn't add to the content of the site, they are vital for the other ten percent to participate.


Search
Top searches
Search is also an important part of most sites, with the top searches providing another insight into what people want from your site - or what is not easily found through its navigation.

Tracking this over time provides another perspective on the psychology of your website users. It helps you understand their terminology for navigational purposes and can help prioritise the content you should modify or add to in the site.

Zero results
Any search terms that return zero results on your site should be investigated as a high priority.

Generally this reflects areas where your website lacks content, uses the wrong context, or uses different language to its audience.
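
Here's a sketch of the kind of report this implies (the internal search log format and terms are invented): list the most common search terms and flag any that returned nothing.

```python
from collections import Counter

# Invented internal search log: (search_term, number_of_results) per search.
searches = [
    ("change of address", 42), ("payment rates", 17), ("appeal decision", 0),
    ("change of address", 42), ("login", 120), ("salary sacrifice", 0),
]

top_terms = Counter(term for term, _ in searches).most_common(5)
zero_results = sorted({term for term, results in searches if results == 0})

print("Top searches:", top_terms)          # what people want from the site
print("Zero-result terms:", zero_results)  # content gaps or terminology mismatches
```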


What are the wrong metrics?

Hits
Probably the least useful metric of all time, Hits is still the best known measurement for websites, despite having no practical business uses.

Hits measures the number of files called from a web server, with each separate file accounting for a single 'hit' (on the server).

On the surface this doesn't sound so bad - however webpages consist of multiple files, with the base page, style sheets, graphics and any database calls or text includes each accounting for a separate hit.

A webpage might consist of a single file, or it might consist of 20 or more - meaning that there is no clear relationship between hits and actual pageviews or user visits to a website.

To increase the number of hits to a website, the website owner simply needs to place more file calls in the page - potentially calling extremely small (1 pixel square) images - so hits can be easily manipulated with no effect on the actual number of website users.

So while hits figures are frequently impressive - even small websites can easily reach millions each month - they don't provide any useful business information whatsoever.
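
To illustrate why hits inflate so easily (the request list below is invented): a single pageview pulls down the page plus every stylesheet, script and image, and each of those requests counts as a separate 'hit'.

```python
# Invented example: the server requests generated by ONE view of one page.
requests_for_one_pageview = [
    "/about-us.html",          # the page itself
    "/styles/main.css",        # stylesheet
    "/scripts/menu.js",        # script
    "/images/logo.gif",        # plus every image on the page...
    "/images/banner.jpg",
    "/images/spacer.gif",      # ...including tiny decorative ones
]

hits = len(requests_for_one_pageview)                       # 6 hits
pageviews = sum(1 for r in requests_for_one_pageview
                if r.endswith(".html"))                     # 1 pageview

print(f"{hits} hits, {pageviews} pageview")  # same single visitor either way
```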


Monday, August 18, 2008

Making website error pages helpful - 404 no more

If you've ever mistyped the name of a webpage, or used a hyperlink to visit a page that has been removed, you've probably seen a website's 404 - page not found 'error' page.

This code is meant to communicate that the web server hosting the website could not find the page you requested.

The default 404 error page for websites, as illustrated below, is generally not very helpful for users.

The default is largely a dead-end page, without clear pathways to the site's homepage, top content, search, sitemap or other navigational aids.

There is no mechanism to provide feedback and alert the website's owner to the issue, and the page uses codes and terminology which many internet users would not understand.

If your website error page looks like this, you may want to consider creating a custom error page - one that provides a more effective message and navigation options for your audience.

My personal preference is to remove all mention of '404' or 'error' - the numerical code can alienate non-technical users, and is largely meaningless to them anyway.

Calling the page an 'error' could be construed as it being the user's fault that they reached this page. This is neither relevant nor helpful. The goal is to get the user to the content they need, not to tell them that they are at fault.
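
As a rough sketch of what this can look like server-side - a minimal Flask example, illustrative only and not how any particular agency implements it - the idea is to keep the 404 status code for machines while showing people a friendly page with search, sitemap and feedback links:

```python
# Minimal sketch of a custom 'page not found' page using Flask.
# Illustrative only - not any agency's actual implementation.
from flask import Flask, render_template_string

app = Flask(__name__)

FRIENDLY_PAGE = """
<h1>We couldn't find that page</h1>
<p>The page may have moved, or the address may have been mistyped.</p>
<ul>
  <li><a href="/">Go to the homepage</a></li>
  <li><a href="/search">Search the site</a></li>
  <li><a href="/sitemap">Browse the sitemap</a></li>
  <li><a href="/feedback">Tell us about this broken link</a></li>
</ul>
"""

@app.errorhandler(404)
def page_not_found(error):
    # Keep the 404 status for browsers and search engines,
    # but show a helpful page rather than a technical error.
    return render_template_string(FRIENDLY_PAGE), 404
```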

Many government agencies have already made these types of changes to their 404 error pages. Below are several examples of them in action.

  • A very helpful page is the ATO site error page, which provides ample navigation to the top sections in the site, plus routes back to the homepage and to leave feedback.
  • Another example is the Australia.gov.au error page, which directs the user to the homepage, sitemap and FAQ page, plus provides quicklinks to three of the top current government campaigns.
  • Centrelink's error page is also helpful, with links to their homepage, search and A-Z list, plus a way to provide feedback on the site.
  • The CSA website error page (which my team manages) is a simple but communicative page. We've renamed it from being an 'error' to simply reporting that the page could not be found, and provide some avenues to reach the correct content via search and the sitemap.


Thursday, July 31, 2008

US releases eGovernment satisfaction results - useful benchmark for Australian sites

ForeSee Results has just released the findings of the latest quarterly US eGovernment satisfaction survey, looking at citizen satisfaction with over 100 US government websites.

Available as a PDF download, the E-Government Satisfaction Index (PDF 1.2Mb) uses a uniform system to compare satisfaction across US sites and was selected as the US government's standard measure in 1999.

Based on the results of this latest survey, there has been a small increase in average satisfaction to 72.9 percent, the first rise in a year.

The report does a good job of identifying the US government sites with the highest level of citizen satisfaction, which can be used by Australian government as good benchmarking examples.

It identifies the major priorities for improvement across agencies, with search topping the list (88% of agencies identified it as a top priority) followed by functionality at 59% and navigation at 41%.

The benefits of higher satisfaction have also been identified in the report, being that highly satisfied customers (scores of 80 or more) are;

  • 84% more likely to use the website as a primary resource
  • 83% more likely to recommend the website
  • 57% more likely to return to the site


The use of a standard government website satisfaction methodology, as I have previously suggested, makes it much easier for government agencies to compare their performance, identify and learn from successes and address issues. It is also an excellent accountability tool for Ministers and agency heads.


Monday, July 21, 2008

Is a busy website really that bad?

A theme I often hear in Australian web design circles is "make the website less crowded".

It's accepted wisdom that a website should have plenty of white space, clearly separated parts - and as little text as possible - particularly on the homepage.

Similar to Google's 28 word limit, Australian communicators seem to consider the best homepage design as the one with the least on it.

Certainly in the user testing I've done over the years with Australians I've heard the terms 'too busy' and 'too crowded' come up frequently.

Those are, however, perceptual measures. What about actual usage?

I have never specifically tested for the 'busyness limit' (the theoretical limit when text, link or graphical density begins to negatively impact on user task completion) - nor am I aware of any testing that has ever been done on this basis.

I am aware, however, of cultural differences in website design and use.

Look at the difference between US or Australian and Chinese or Japanese websites, for example. In China and Japan, as well as other Asian countries, the density of graphics, links and text is up to five times as high as in the US or Australia.

These high-density website countries also have high populations for their geographic size - which may form part of the difference in approach. Perhaps the amount of personal space people expect is related to the amount of whitespace they want to see in a website - although some high density European nations do not exhibit quite the same trend.

With the changing demographics in Australia it's important to keep an eye on what our citizens are looking for - our communicators and graphic designers may not always represent the cultural spread of the public.

So is anyone aware of research undertaken to look at the differences in expected information and graphical density of websites across different countries or cultural groups?

It could be an interesting (and useful) thesis project for someone.


Friday, July 18, 2008

What's next for your agency's search tools? Google testing user rated search

Some readers may be aware of Digg, a site where the users vote on news stories and those with the most votes get listed on the homepage.

It's an approach based on a news site's users knowing more about what they want to see than the professional news makers - and it has been relatively successful to date (valued north of US$100 million).

Google has been testing similar features, allowing individuals to rate search results and make comments, then in future searches only see the results they prefer.

This would also be an interesting feature within websites and intranets, providing a human way to validate the search algorithms in use and ensure that the most relevant result - as determined by a person - is displayed at the top.

Now this is still in 'bucket' testing at Google - meaning that a small select group of their users get to see the function. However TechCrunch has provided a video on what users see and how the system works.

Take a look below, or read the article Is This The Future Of Search?



Can you see uses in this for your website or intranet?


Friday, June 27, 2008

Review: Funnel Back's new search feature - Fluster

To provide a little background, Funnel Back is a search technology developed and commercialised by CSIRO.

It has been deployed in Australia.gov.au as their whole-of-government search technology, as my agency's website search tool (as a hosted solution), and in many other agencies and companies across Australia and other countries.

It's a reasonably good search engine if some time is spent configuring it and I've been happy with the search success levels we achieve (though always trying to improve them).

AGIMO recently invited my agency to participate in the live pilot test of Funnel Back's new search feature - Fluster (50kb PDF).

In brief Fluster helps users find what they are looking for by offering alternative phrases to refine their search terms.

An example of this in action is visible in Australia.gov.au - simply use the search and look at the Related Search area at the right of the page.

We've been trialing this feature within our site for a little over a month now and I have an initial view on how Fluster has been performing.


How Fluster is doing
Initially I was concerned about the relevancy of the topics and phrases that Fluster would choose to display. This hasn't proven to be an issue; Fluster is providing highly relevant results.

However, I'm not convinced that people are using the tool effectively. We've seen no measurable change in the search success rate, and I do not have evidence that visitors to our site are using the Fluster Related Search area when searching.

This could be an education issue. We currently present Fluster in the search results page without any form of help, meaning that our visitors are not guided to the tool.

It could also reflect that improvements are necessary in the reporting of Fluster use, so we can determine whether the tool is helping people find what they need. These reports are still being refined by Funnel Back.

Another factor I keep in mind is the trend towards more sophisticated internet users.

A large proportion of people are very familiar with Google and other 'generic' search engines and have learnt to use phrases rather than individual words to increase the relevance of results.

In fact, the average length of a search term in Google exceeded four words at the end of 2007 - at least according to WebProNews which reports that People Are Finding More Words To Search With.

This means that people are already refining their own search terms, potentially reducing the value in having a search engine do it for them.

In conclusion
So my preliminary conclusion is that Fluster can add value to search results.

However more time will be required to really understand the impact it is having and test ways to help people use it effectively.

While internet users are becoming more sophisticated, this doesn't negate the value of Fluster. There are always new people coming into the user pool and even experienced users may on occasion find that Fluster suggests a topic or phrase that they had not considered but leads them to a relevant result.


Thursday, May 22, 2008

Managing taxonomies

I found this an interesting article on a systematic way to manage taxonomy creep.

Managing taxonomies

