Categories
Folksonomies Web 2.0

Steve.museum update as a podcast

I’ve just finished a presentation to art museum folk at the Sites of Communication 3 conference at the National Gallery of Victoria, and true to form there was quite a bit of interest in social tagging. There now seems to be widespread awareness of the problem of the ‘semantic gap’ between the language of art museum audiences (especially as those audiences diversify) and that of art curators and researchers, and there is increasing interest in addressing this problem.

Thus when museum people ask about collection tagging projects other than our own, I send them off to the Steve.museum project website. Invariably they come back, having dipped their toes into some of the research material, with more questions. Jennifer Trant has produced a rather excellent podcast summary of the project to date and some of the preliminary results emerging from it. The podcast is a good example of turning what is otherwise a time-consuming and text-heavy task into an easy-to-digest and informative 12-minute presentation – complete with a few slides. (It uses the M4A format so you will need Quicktime or iTunes.)

Steve.museum is doing some excellent and very considered research that will reassure many tag skeptics and no doubt lead to more and better tagging implementations down the track. Whether the Steve results can be applied directly to collections outside of visual art – social history and natural history collections especially – remains to be seen.

Categories
Conceptual Social networking Web 2.0 Web metrics Young people & museums

Social production, cut and paste – what are kids doing with ‘your’ images?

It has been one of the worst kept secrets of web statistics – deep linked image traffic. While this has been going on for years, since the beginning of the WWW actually, it has increased enormously in the past few years. On some cultural sector sites such traffic can be very substantial – a quick test is to look at exactly how much of your traffic is ‘referred’ from MySpace. It is also one of the main reasons why Photobucket has traditionally reported so much higher traffic than Flickr – its deep linking and cut-and-paste engagement with MySpace. With the move away from log file analysis to page tagging in web analytics, some, but not all, of this deep linking traffic is fortunately being expunged from analytics reporting.
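
If you want to run that quick test yourself against raw server logs, here is a minimal sketch of the idea – it assumes an Apache/nginx ‘combined’ log format, and the log path and image extensions are placeholders rather than anything specific to our setup:

```python
# A rough sketch: count image requests referred from MySpace in an access log.
# Assumes the 'combined' log format; the log path below is a placeholder.
import re
from collections import Counter

LOG_PATH = "access.log"  # placeholder path
LINE = re.compile(r'"(?:GET|HEAD) (?P<path>\S+)[^"]*" \d{3} \S+ "(?P<referrer>[^"]*)"')

hotlinked = Counter()
with open(LOG_PATH) as log:
    for line in log:
        m = LINE.search(line)
        if not m:
            continue
        path, referrer = m.group("path"), m.group("referrer")
        if "myspace.com" in referrer and path.lower().endswith((".jpg", ".jpeg", ".gif", ".png")):
            hotlinked[path] += 1

# Show the ten most deep-linked images and their hit counts
for path, hits in hotlinked.most_common(10):
    print(hits, path)
```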

Two Powerhouse examples include a Chinese news/comment portal that deep linked a Mao suit image (from an educational resource on our site), sending us 51,000 visits in under 24 hours in August 2005, and an A-grade Singaporean blogger who deep linked an image of Gollum (from our archived Lord of the Rings exhibition pages) to describe an ugly celebrity, which generated over 180,000 visits over 8 days in January 2007. (In both of these examples the visits were removed from the figures reported to management and funders.)

What is going on here sociologically?

At the recent ICA2007 event in San Francisco danah boyd and Dan Perkel presented an interesting look at the subcultural behaviours that are, in part, producing this effect. Although they look specifically at MySpace there are threads that can be drawn across many social sites from forums to blogs. Drawing on the work of many cultural theorists, they argue that on MySpace what is going on is a form of ‘code remix’. That is, young people’s MySpace pages are essentially ‘remixes’ of other content – but unlike a more traditional remix in audio and video cultures, these code remixes occur through the simple cut and paste of HTML snippets. By ‘producing’ both their MySpace pages as well as their online cultural identity in this way, they are reshaping concepts of ‘writing’ and digital literacy. They are also, importantly, not in control of the content they are remixing – a deep linked image can easily be changed, replaced or removed by the originating site.

There are plenty of examples – boyd and Perkel give a few – where the content owner changes the linked image to disrupt the deep linker. In the case of our Singaporean blogger we renamed the linked image to prevent it from appearing on her site (and in our statistics).

Revealingly, Perkel’s research is showing that many MySpace users have little, if any, knowledge of or interest in website production – that is, CSS and HTML. Instead, what has formed is a technically simple but sociologically complex ‘cut and paste’ culture. This is what drives the ‘easy embedding’ features found on almost any content provider site, YouTube included – it is in the content providers’ interest to allow as much re-use of their content (or the content they host) as possible, because it allows for the insertion of advertising and branding, including persistent watermarking. Of course, the museum sector is not geared up for this – instead our content is being cut and pasted, often without anyone outside the web team having a deep understanding of what is actually going on. There are usually two reactions – one negative (“those kids are ‘stealing’ our content”) and the other overly positive (“those kids are using our content therefore they must be engaging with it”). Certainly Perkel’s and others’ research deeply problematises any notion that these activities are in large part about technical upskilling – they aren’t – instead those involved are learning and mastering new communication skills and emerging ways of networked life.

One approach that some in the sector have advocated is the widget approach – create museum content widgets for embedding – to make repurposing of content (and code snippets) easier. There have been recent calls for museum Facebook apps, for example. But I’m not sure that this is going to be successful, because a great deal of embedded content is of the LOLcats variety – perhaps trivial and superficial, but highly viral and jammed full of flexible and changing semiotic meaning. Our content, by contrast, tends to be the opposite – deep, complex and relatively fixed.

Categories
Digitisation Web 2.0

How to do low cost transcription of hand written and difficult documents

So your museum has already done the easy part of digitisation – taking digital photos of your objects – but now you have complex hand-written materials you need to digitise . . . what can you do?

This is a question that has popped up in several meetings over recent months.

Our Curator of Information Technology, Matthew Connell, came up with a brilliantly simple solution – and there is no need for the original material to leave your organisation.

With the low cost of MP3 recorders it is now very easy to record large amounts of audio into a single, already compressed file. Take one of these MP3 recorders and ask the expert who is familiar with the document or material requiring digitisation to read it clearly into the recorder. This can be done over an extended period of time – there is no need to do it all in one go.

When completed, upload the MP3 of clearly spoken audio to a web server. Then use one of several online audio transcription services to transcribe the audio. We have been using such services to get quick, low cost transcriptions of public lectures and podcasts, and have been impressed with their timeliness and accuracy.

Even factoring in the cost of reading time, this will almost certainly be cheaper and less error-prone than scanning and transcribing directly from the written original. It also provides significantly more flexibility in terms of pricing, as there is a high level of competition amongst audio transcription services at the moment – a level of competition that may not exist amongst specialist written-transcription services.

Categories
Interactive Media Web 2.0

The new Google Maps, Google Earth and Google Sky

Everyone is buzzing about the new features that have popped up with the easily embeddable Google Maps today. This is a big step towards making map mashups completely mainstream – increasing the popular acceptance of the map as a user interface.

For a look at how things might work for the museum and cultural sector take a look at this query. Scroll to the bottom and you will see a map showing all the places mentioned in the book, together with pop up page references! There’s obviously been a lot of parsing of OCRed text to pull out the place names but the result is pretty incredible.

Something a few have missed are the astronomy features now available in Google Earth, called Google Sky.

Download the new version of Google Earth and you will find a new toolbar icon that toggles between Earth and Sky. Once in Sky mode you can find galaxies, constellations and planets – all of which link to data from NASA and other sources including Hubble telescope pictures. It is very impressive and lots of fun.

The next task is to look into making KML files to accompany our monthly night sky guide podcasts at the Sydney Observatory . . .
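
As a first rough sketch of what one entry in such a file might look like – the object and its coordinates are placeholders, and the Sky-mode conventions assumed here (a hint="target=sky" attribute on the root element, with right ascension mapped onto longitude) would need checking against Google’s KML documentation:

```python
# A hypothetical sketch: write a one-placemark KML file for Google Sky.
# The hint="target=sky" attribute and the RA->longitude mapping
# (longitude = RA in degrees - 180, latitude = declination) are assumptions
# about Google's Sky conventions; the object itself is just a placeholder.

def sky_kml(name, ra_hours, dec_degrees, description=""):
    longitude = ra_hours * 15.0 - 180.0  # right ascension mapped onto longitude
    latitude = dec_degrees               # declination mapped onto latitude
    return f"""<?xml version="1.0" encoding="UTF-8"?>
<kml xmlns="http://www.opengis.net/kml/2.2" hint="target=sky">
  <Placemark>
    <name>{name}</name>
    <description>{description}</description>
    <Point><coordinates>{longitude},{latitude},0</coordinates></Point>
  </Placemark>
</kml>"""

with open("night_sky_guide.kml", "w") as f:
    f.write(sky_kml("Example object", ra_hours=5.6, dec_degrees=-5.4,
                    description="Placeholder entry for a monthly sky guide"))
```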

Categories
Museum blogging Web 2.0

Blogs as a ‘community strategy’

New Matilda has a short but interesting piece by Kevin Anderson, blogs editor at The Guardian. In the article he stresses that blogging is about generating and engaging a community, not just a new means of publishing. Rather than seeing blogging as a threat to traditional publishing, it should be viewed as a new strategy for engaging audiences and readers.

This has strong resonances with experiences of museum blogging. Blogs aren’t replacing traditional forms of official communication, but they are engaging audiences in new and effective ways.

Neil McIntosh and Jack Schofield launched The Guardian’s first blog in 2001, realising it was better to be part of the conversation than listen to it from a lofty perch. The Guardian now has blogs covering everything from current affairs — on ‘Comment is Free’ — to sport, arts and culture, and most recently food and gardening.

But blogging is not a publishing strategy, it’s a community strategy. Being one of the world’s bloggiest newspapers has led to bloggers linking to our stories, helping us grow a grass-roots following in the United States, so that The Guardian now has more online visitors outside of the UK than inside.

One of The Guardian’s stated goals is to become the world’s leading liberal voice. And our website’s ‘Head of Communities and User Experience,’ Meg Pickard, has said that we also need to enable the world’s liberal voices.

The art of blogging is about building a community and coaxing people out from behind their keyboards.

Categories
Web 2.0 Wikis

Wikipedia, Wikiscanner, revealing the hidden power struggles over knowledge production

Last week featured a rather robust debate in the office about whether museums should encourage the use of Wikipedia, and perhaps even participate in adding and editing entries themselves. Now most Fresh + New readers will be familiar with the arguments – they’ve been around since Wikipedia began.

Of course what most anti-Wikipedians, if they don’t dismiss it outright, claim is that ‘Wikipedia is only as good as its last edit’. But to me that is missing the point. Wikis, and Wikipedia as an example of a wiki, are interesting because they reveal the history of edits, changes, revisions and re-versions. They reveal the collaborative and argumentative nature of knowledge production.

Well, almost as if to prove my point, along comes Virgil Griffith’s Wikiscanner, which has received coverage in Wired and is struggling under the burden of the resultant high traffic load.

Wikiscanner basically matches the IP addresses of those doing edits with information about their network provider – known IP address ranges of government departments, corporations and the like. By doing this Wikiscanner is beginning to reveal the complex web of individuals, and increasingly, corporations that are using Wikipedia to argue and dispute versions of the ‘truth’. You can start to get an idea of the otherwise hidden agendas and power struggles over knowledge and information quite quickly . . . .
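
The core mechanism is simple enough to sketch. Something along these lines – the organisational IP ranges and the edit address below are invented placeholders (Wikiscanner itself draws on large databases of real ranges):

```python
# A toy sketch of the kind of matching Wikiscanner does: check whether an
# edit's IP address falls inside a known organisational range.
# The ranges and the IP below are placeholders, not real Wikiscanner data.
import ipaddress

KNOWN_RANGES = {
    "Example Government Department": ipaddress.ip_network("192.0.2.0/24"),
    "Example Corporation":           ipaddress.ip_network("198.51.100.0/24"),
}

def attribute_edit(edit_ip):
    ip = ipaddress.ip_address(edit_ip)
    for organisation, network in KNOWN_RANGES.items():
        if ip in network:
            return organisation
    return "unattributed"

print(attribute_edit("198.51.100.42"))  # -> 'Example Corporation'
```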

Griffith says he launched the project hoping to find scandals, particularly at obvious targets such as companies like Halliburton. But there’s a more practical goal, too: by exposing the anonymous edits that companies such as big pharmaceutical firms make in entries that affect their businesses, it could help experts check up on the changes and make sure they’re accurate, he says.

Categories
Conceptual Web 2.0 Web metrics

Valuing different audiences differently – usability, threshold fear and audience segmentation

It is important to realise that to deliver more effective websites we need to move away from a one-size-fits-all approach not only when designing sites but also when evaluating and measuring their success. We know that some online projects are specifically intended to target specialist audiences – a site telling the histories of recent migrants might require translation tools, and a site aimed at teenagers might, by design, specifically discourage older and younger audiences in order to better attract teenage usage.

Remembering, too, that some key museum audiences (regional, remote, socially disadvantaged) may have no representation in online visit figures, and others may have limited and sporadic online interactions because of unequal internet access, it is important to look at the overall picture of museum service delivery. Some audiences cannot be effectively engaged online. Others still may only feel confident engaging in online conversations about the museum using non-museum services – as I’ve written before – on their own blogs, websites, and social media sites.

If we acknowledge ‘threshold fear’ in our physical institutions, then we need to realise this applies online as well. The difference is that in the online world there are many more, less ‘fearful’ options to which potential visitors and users can easily flee. The ‘back’ button is just a click away.

The measure of the ‘value’ of visitors therefore needs to differ across parts of the same website. We may need different measures for a user in the ‘visiting the museum’ part of the website than for one in the ‘tell us your story’ section, even though in one visit they might explore both areas. Likewise, a museum visitor who blogs about their positive experience of a real world visit on their own family blog might be considered a valuable outcome. And a regionally-oriented microsite that gets discussed on a specialist forum might be more valuable – to that particular project – than a posting on a more diffused national discussion list.

Visit-oriented parts of the website should be designed and created with known target audiences in mind, understanding that not everyone can visit the museum, and their success measured accordingly. It might be sensible to attempt to address ‘threshold fear’ by using images of the museum that are more people-oriented than object-oriented, in order to promote the notion that the museum is explicitly a place for people.

When we were building our children’s website we specifically decided against creating a resource for ‘all’ children – that would have resulted in a too generic site – and targeted the pre- and post- visit needs of a known subset of visitors with children. We don’t actively exclude other visitors (other than through language choice, visual design, and bandwidth requirements), but we have actively attempted to better meet the needs of a subset of visitors. This subset will necessarily diversify over time, but we also understand that out on the internet there are plenty of other options for children.

The problem with traditional measurements is that every visitor to our online resources is homogenised into single figures – visits, time spent, pages viewed. Not only does this reduce the value of the web analytics, it does the visitor a great disservice. Instead, good analytics is about segmentation. This can be segmentation based on task completion and conversions, and on understanding visit intentions.

So who is a ‘valuable’ visitor?

It depends on context.

For our children’s site we place a greater internal value on those who complete one of two main site conversions – first, spending a particular amount of time on the visit information areas; and second, browsing, finding and, most critically, downloading an offsite activity. Focussing on these subsets of users allows us to implement evaluation and tracking. For those who complete the visit-related tasks we might offer discount coupons for visiting and track virtual-to-real-world conversions. What proportion of online visitors who look at visit information actually convert their online interest into a real world action? And in what time frame (today, this week, this month)? Of the second group we may conduct an evaluation of downloader satisfaction – did they make the craft activity they downloaded? Was it too hard, too easy? Did they enjoy the experience?
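
As a rough illustration of this kind of segmentation – the session fields, the time threshold and the numbers below are invented for the example, not our actual analytics configuration:

```python
# A hypothetical sketch of segmenting children's-site sessions into the two
# 'conversions' described above. The session records and threshold are
# invented; a real implementation would read from your analytics package.

sessions = [
    {"id": 1, "seconds_on_visit_info": 150, "downloaded_activity": False},
    {"id": 2, "seconds_on_visit_info": 20,  "downloaded_activity": True},
    {"id": 3, "seconds_on_visit_info": 0,   "downloaded_activity": False},
]

VISIT_INFO_THRESHOLD = 120  # assumed 'enough time on visit info' threshold, in seconds

visit_intenders = [s for s in sessions if s["seconds_on_visit_info"] >= VISIT_INFO_THRESHOLD]
downloaders = [s for s in sessions if s["downloaded_activity"]]
unconverted = [s for s in sessions if s not in visit_intenders and s not in downloaders]

print(f"visit-intent conversions: {len(visit_intenders) / len(sessions):.0%}")
print(f"download conversions:     {len(downloaders) / len(sessions):.0%}")
print(f"unconverted (potential) sessions: {len(unconverted)}")
```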

What of the others who visit the children’s site? They are a potential audience who have shown an interest but for many reasons haven’t ‘converted’ their online visit. We can segment this group by geography and origin – drill down deeper and really begin to examine the potential for them to ever ‘convert’.

For other parts of our website – say our SoundHouse VectorLab pages – we may see as valuable those users who simply use and link back to our ‘tip of the day’ resources. Although these pages are primarily an advertisement for onsite courses run in the teaching labs, we see great value in having the ‘tip of the day’ resources widely read, the RSS feed subscribed to, and articles linked back to. However, this has to be a secondary objective to actually taking online bookings for courses.

Postscript – I’d also suggest reading the 2004 Demos report ‘Capturing Cultural Value’ for some important philosophical and practical caveats.

Categories
Collection databases Web 2.0

OPAC2.0 – Latest features update

We’ve added a whole range of new features to our OPAC that we think further enhance its usability.

Tooltips

Each ‘feature’ on the search results and object view pages now has an explanatory tooltip. Given the OPAC has become quite complex and there is a lot going on on the screen now, we felt CSS tooltips offered a more practical solution than a ‘help’ screen or more text in the form of user documentation. More tooltips will be added this week to explain museum-centric language like ‘statement of significance’.

Failed search suggestions

Now when a search term is misspelled or returns no results our system generates a series of possible ‘alternatives’. These are generated on the fly using a calculation called Levenshtein distance: the misspelt word is compared, letter by letter, against our table of successful searches, possible matches are ranked, and the top 8 variants are presented to the user. In order to make this reasonably quick we have had to rebuild quite a bit of our search technology.
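
A minimal sketch of this kind of suggestion mechanism – the list of past successful searches is a stand-in for the real table, and the distance function is the textbook dynamic-programming version rather than our production code:

```python
# A sketch of 'did you mean' suggestions ranked by Levenshtein distance.
# The list of past successful searches is a stand-in for the real table.

def levenshtein(a, b):
    """Textbook dynamic-programming edit distance between two strings."""
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, start=1):
        curr = [i]
        for j, cb in enumerate(b, start=1):
            cost = 0 if ca == cb else 1
            curr.append(min(prev[j] + 1,          # deletion
                            curr[j - 1] + 1,      # insertion
                            prev[j - 1] + cost))  # substitution
        prev = curr
    return prev[-1]

successful_searches = ["locomotive", "lace", "ceramics", "meteorite", "telescope"]

def suggest(term, limit=8):
    """Return past successful searches closest to the failed term."""
    ranked = sorted(successful_searches, key=lambda s: levenshtein(term, s))
    return ranked[:limit]

print(suggest("locmotive"))  # -> ['locomotive', ...]
```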

Opensearch RSS with thumbnails

About two months ago our Opensearch feed was updated to include thumbnails in search results. We added the thumbnails to ensure that our feed delivered optimal results to the National Library of Australia’s Libraries Australia search. We also use this modified RSS to drive the search results of Design Hub.
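
For readers wondering what ‘thumbnails in the feed’ looks like in practice, here is a rough sketch of building a single RSS item that carries a thumbnail via the Media RSS namespace – one common way of doing it, not necessarily our exact markup; the URLs and titles are placeholders:

```python
# A rough sketch of an RSS item carrying a thumbnail via the Media RSS
# namespace. URLs and titles below are placeholders.
import xml.etree.ElementTree as ET

MEDIA_NS = "http://search.yahoo.com/mrss/"
ET.register_namespace("media", MEDIA_NS)

rss = ET.Element("rss", version="2.0")
channel = ET.SubElement(rss, "channel")
ET.SubElement(channel, "title").text = "Collection search results"

item = ET.SubElement(channel, "item")
ET.SubElement(item, "title").text = "Example object record"
ET.SubElement(item, "link").text = "http://example.org/opac/object/12345"
ET.SubElement(item, f"{{{MEDIA_NS}}}thumbnail",
              url="http://example.org/images/12345_thumb.jpg")

print(ET.tostring(rss, encoding="unicode"))
```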

Categories
Social networking Web 2.0 Web metrics

Social media measurement – brand awareness and trust in the cultural sector

There has been a flurry of activity amongst web analytics companies and in the marketing world to devise complex ways of measuring social media activity. As much of this interest in devising a way of measuring and comparing social media ‘success’ comes down to monetising social media activity through the sale of advertising, these measures don’t easily translate to the cultural sector. Advertisers are after a ‘ratings’ system to compare the different ‘value’ of websites but as we know from old media (TV and radio), ratings don’t work well for public and community broadcasters who don’t sell advertising and have other charters and social obligations to meet.

We know that visits, page views and time spent aren’t the best ways of understanding our audiences or their levels of engagement with our content, and with social media it is all about engagement. If we aren’t selling advertising space to all those eyeballs focussing their attention on our rich and engaging content, then what are we trying to do?

I’d argue that it is about brand awareness. Not just brand awareness in terms of being top of mind when geographically close audiences are thinking of a cultural activity to do in their leisure time, but about linking the perceived authenticity of the information contained on your website to your brand. There is a growing body of research into how museums are perceived as ‘trusted’ and, importantly, politically impartial information sources. But this perception relies upon an awareness on the part of the online visitor that they are indeed on a museum website.

This user awareness is, I argue, not a given, especially now that such a large proportion of our online traffic comes via search. Looking into the future, search will be an even greater determinant of traffic, even if your real-world marketing prominently displays your URL (as it should be doing by now!). Looking at your real-world marketing campaigns built around your URL, you will probably find a spike in direct traffic but a similarly sized spike in brand-name searches – we are finding this with the Sydney Design festival at the moment. The whole of Sydney is covered with street advertising, from bus shelter posters to street banners, all promoting the URL. The resulting traffic is a mix of direct visits and brand-name searches.
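
A toy sketch of splitting campaign-period traffic along those lines – the referrer and query fields, and the brand terms themselves, are hypothetical:

```python
# A hypothetical sketch of classifying visits into direct, brand-name search
# and other referred traffic. The records and brand terms are invented.
from collections import Counter

BRAND_TERMS = {"sydney design", "powerhouse museum"}  # assumed brand queries

visits = [
    {"referrer": "", "search_query": ""},                                          # typed-in URL
    {"referrer": "http://www.google.com/search", "search_query": "sydney design"},
    {"referrer": "http://www.google.com/search", "search_query": "design exhibitions"},
    {"referrer": "http://someblog.example.org/post", "search_query": ""},
]

def classify(visit):
    if not visit["referrer"]:
        return "direct"
    if visit["search_query"].lower() in BRAND_TERMS:
        return "brand-name search"
    return "other referred"

print(Counter(classify(v) for v in visits))
```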

The problem is that the brand is no longer represented online solely on our own websites.

One of the first things I talk about in my workshops and presentations is that even if your organisation is not producing social media about yourself, then your audiences almost certainly are. If you aren’t aware of what your audiences are saying about you, what they are taking photos of, or recording on their camera phones, then you are missing a unique opportunity to understand this generally highly engaged tip of your audience.

It is possible that those who blog about their experience of your organisation, or upload their photos and videos, are potentially your most (commercially) ‘valuable’ customers – high disposable income, high levels of interest, and a desire to participate and communicate/advocate to others about your organisation.

They are probably the most likely to climb the ‘ladder of engagement’ from potential visitors through regular visitors to members and finally donors/sponsors. They may not always have positive things to say, but by hearing their gripes and grizzles, you are able to understand and address issues that impact how your organisation is going to be promoted through word-of-mouth. And word-of-mouth is going to almost always be the most ‘trusted’ type of marketing recommendation.

So how do we track these conversations that occur publicly but not on your organisation’s website?

Mia Ridge recently pointed to a great summary of the easiest to use ‘ego search’ tools and methods by which you can easily keep track of your audience conversations. Another favourite of mine for small scale tracking is EgoSurf.

Sixty Second View has compiled an ‘index’ of how these kinds of ego search results might be combined to generate a figure to compare with competitors and other organisations. Their methodology, whilst very complex, focusses on assessing how connected the people who are talking about you actually are – this allows for a determination of effective reach, and of the trust that may accrue to those in the conversation.

(top level summary mine only)

a) Blogs that are talking about you – what are their Technorati rankings, how high are their Google PageRanks, how many Bloglines subscribers do they have, etc.

b) Multi-Format conversations – how popular/connected are the Facebook and MySpace people who are talking about your organisation

c) Mini-Updates – frequency and reach of Twitters

d) Business Cards – LinkedIn connectedness

e) Visual – Flickr influence and popularity can be used to determine how connected and visible the posters of images of your organisation are. This can be applied to YouTube as well.

f) Favourites – Digg, Del.icio.us connectedness

This approach is useful as it provides a detailed analysis of the spectrum of social media in which your organisation is probably already represented. It can reveal areas where your users aren’t talking about you, and it can illuminate areas of your own site that receive unexpected user attention. Not only that, it focuses on who is talking about you. On the downside, it is a lot of work – but undertaking even a cut-down version of this methodology will force you to examine the different impacts of different types of social media.

For example, are all blog posts about your organisation equal? When you check the Technorati rankings of the commenting blogs you will find that some have greater reach and authority than others. The real world equivalent here is the different weightings your marketing team probably already gives to print media mentions in national broadsheets versus local weeklies; or the difference between a TV editorial and a local radio mention.
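
To make the weighting idea concrete, here is a toy sketch of rolling mentions into a single comparative figure – the channels, weights and scores are entirely invented, and Sixty Second View’s actual formula is considerably more involved:

```python
# A toy sketch of rolling per-channel 'reach' scores into one comparative
# figure. Channels, weights and scores are invented for illustration only.

CHANNEL_WEIGHTS = {          # assumed relative importance of each channel
    "blogs": 3.0,
    "social_profiles": 2.0,
    "mini_updates": 1.0,
    "photos_video": 2.0,
    "bookmarks": 1.0,
}

mentions = [
    {"channel": "blogs",        "reach": 0.8},  # e.g. a high-ranking blog
    {"channel": "blogs",        "reach": 0.2},  # a low-traffic personal blog
    {"channel": "photos_video", "reach": 0.5},
    {"channel": "mini_updates", "reach": 0.3},
]

def social_media_index(mentions):
    """Weighted sum of mention reach across channels."""
    return sum(CHANNEL_WEIGHTS[m["channel"]] * m["reach"] for m in mentions)

print(f"index this month: {social_media_index(mentions):.1f}")
```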

Is this really the job of the web team?

Unless your organisation has a marketing team that is expert in online marketing, the answer must be yes. Web analytics in five years’ time will be all about measuring offsite activity.

Categories
Conceptual Interactive Media Social networking Web 2.0

Open vs closed

As I have been thinking about my upcoming presentation at Web Directions South there have been a lot of interesting maneuvers out in the commercial web space.

First, a while back Facebook opened their platform to developers, allowing content from other providers to interact with profiles inside Facebook. This, coupled with the enormous media coverage that the move received, is in part the driver of Facebook’s phenomenal growth of late. ‘Facebook as a platform’ has made Facebook ‘sticky’ and given everyone who uses it more reasons to go back and look at their profiles on a very regular basis. Whilst most Facebook applications are only marginally interesting (do we really need another chain letter-style application?), the best ones are those that turn a Facebook page into an aggregator of personalised content from other sites – Flickr, travel maps, Last.fm, RSS feeds and so on. People who have completely tricked-out Facebook profiles could (and this is what Facebook hopes) feasibly use Facebook as their home page and access everything they need via Facebook.

Now, Netvibes has come along and flipped this. Netvibes is a personalised aggregation portal a bit like iGoogle (formerly MyGoogle). We have been experimenting with it for the Culturemondo network (see the public Culturemondo Netvibes aggregation as an example).

Netvibes has developed a very nice Facebook widget to bring Facebook’s own data into its network, meaning that Facebook-specific data – notifications, friends and data from your profile – can now be aggregated into Netvibes, making Netvibes the one-stop ‘attention’ shop for tricked-out Netvibes/Facebook users. As Mashable points out –

Facebook is now one of Netvibes’ biggest rivals. Before Facebook, who offered to aggregate your friends’ Flickr photos, YouTube videos, blogs and the rest? Netvibes, of course. In fact, Facebook profiles are now a lot like a Netvibes startpage.

So now that Facebook has stolen some of that sheen, they’d [Netvibes] obviously like to create a mini-Facebook within Netvibes, rather than losing users in the other direction. They want you to use Netvibes as your homepage, and visit Facebook only incidentally, rather than aggregating all your stuff at Facebook and never returning to Netvibes.

The tension is indicative of what’s happening with aggregators: they’re all motivated to keep you on their own platforms for as long as possible, rather than giving you absolute freedom to take your identity wherever you like. Right now, it’s hard to make money without owning the user’s identity in some way; user lock-in remains the strongest business model, even though superficially they exist to hand more control to you.

What is interesting here is that this is much more than a battle over attention between two competitors. Facebook can close access to Netvibes, but would then risk a small proportion of users leaving its network (mostly the super tech-savvy, bleeding edge experimenters). On the other hand, Facebook’s own survival likely relies on it further opening its network – the initial steps towards openness, coupled with usability, are at least part of the reason why some users have started migrating profiles from MySpace, which remains defiantly closed.

The reality is that, in the future, Facebook, Netvibes and MySpace are all better off letting their users move freely between networks – that way they remain at least partially relevant, although in deep competition. Otherwise the new audiences which are so critical to their business models and desirable to their advertisers will, as Fred Stutzman points out, go to whichever network is ‘coolest’ at the time. Ross Dawson comments on this too, spotting a trend ‘towards openness’ and pulling recent moves by Plaxo into the picture as well.

I’ll be coming back to this theme over the coming weeks. There are enormous opportunities for the cultural and non-profit sector here – if we can all adapt fast enough. Ideas of attention and brand are just as relevant for us as for anyone else – possibly more so, given the limited budgets in our sector.