Categories
Collection databases, User behaviour, Web metrics

Actual use data from integrating collection objects into Digital NZ

Two months ago the New Zealand cultural aggregator Digital NZ ingested metadata for roughly 250 NZ-related objects from the Powerhouse collection and started serving them through their network.

When our objects were ingested into Digital NZ they became accessible not just through the Digital NZ site but also through all manner of widgets, mashups and institutional websites that have integrated Digital NZ’s data feeds.

So, in order to strengthen the case for further content sharing in this way, we used Google Analytics’ campaign tracking functionality to quickly and easily see whether users of our content in Digital NZ actually came back to the Powerhouse Museum website for more information on the objects beyond their basic metadata.
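(For anyone wanting to replicate this: campaign tracking here just means appending Google Analytics’ standard utm_* parameters to the object URLs handed over to Digital NZ, so inbound visits show up as a distinct source and campaign in our reports. A minimal sketch in Python – the parameter values and object URL below are illustrative, not the exact ones we used:)

# A minimal sketch of tagging collection object URLs for Google Analytics
# campaign tracking. The utm_* values and the object URL are illustrative.
from urllib.parse import urlencode

def tag_for_digitalnz(object_url):
    params = {
        "utm_source": "digitalnz",              # where the visit came from
        "utm_medium": "referral",               # how it arrived
        "utm_campaign": "collection-sharing",   # which experiment it belongs to
    }
    sep = "&" if "?" in object_url else "?"
    return object_url + sep + urlencode(params)

print(tag_for_digitalnz("http://www.powerhousemuseum.com/collection/database/?irn=12345"))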

Here are the results for the last two months.

Total collection visits from Digital NZ – 98 (55 from New Zealand)
Total unique collection objects viewed – 66
Avg pages per visit – 2.87
True time on site per visit (excluding single page visits) – 11:57min
Repeat visits – 37%
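(A note on that ‘true time on site’ figure: Analytics can’t time the last – or only – page of a visit, so single-page visits register as zero seconds and drag a simple average down, hence excluding them. A rough sketch of the arithmetic, with made-up durations:)

# Average visit duration excluding single-page visits, which Google
# Analytics records as 0 seconds. The durations below are made up.
visit_durations_secs = [0, 0, 541, 903, 0, 1277, 344]        # one entry per visit
engaged = [d for d in visit_durations_secs if d > 0]          # drop single-page visits
avg = sum(engaged) / len(engaged)
print(f"True time on site: {int(avg // 60)}:{int(avg % 60):02d} min over {len(engaged)} engaged visits")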

From our perspective these 55 NZ visitors are entirely new visitors (well, except for the 8 visits we spotted from the National Library of NZ, who run Digital NZ!) who probably would never otherwise have come across this content, so that’s a good thing – and very much in keeping with our institutional goals of ‘findability’.

For the same period, here are the top 6 sources for NZ-only visitors to the museum’s collection (not the website as a whole):


Remember that the Digital NZ figure covers only around 250 discrete objects, so we are looking at just under one new NZ visitor a day arriving via Digital NZ (55 visits over roughly 60 days), whereas the other sources cover any of the ~80,000 collection objects.

However, I don’t have access to the overall usage data for Digital NZ so I can’t make a call on whether these figures are higher, lower, or average. But maybe one of the Digital NZ team can comment?

Categories
Mobile, User experience

On augmented reality (again) – time with UAR, Layar, Streetmuseum & the CBA

Jasper Visser from the Nationaal Historisch Museum in the Netherlands has nailed some of the problems with augmented reality in his recent blogpost – ‘Charming tour guide vs mobile 3D AR‘.

Jasper compares the analogue-world experience of a guided architectural tour with the digital experience of using the Netherlands Architecture Institute’s UAR application to plot a similar ‘tour’. This isn’t really a fair comparison, but it does raise some serious questions about the appropriateness of technology and the kind of user experience we are trying to adapt/adopt/create.

The Netherlands Architecture Institute’s UAR application, built on Layar, is perhaps the best augmented reality application by (or for) a museum I’ve seen and tried thus far. It narrowly beats out the Museum of London’s Streetmuseum – largely because it looks to the future in terms of content as well as in technology.

As I laboured in my presentation at Picnic ’10, the problem with a lot of these augmented reality and mobile apps that museums are doing is that they face a huge user motivation hurdle – ‘why would you bother?’. Further, many of the ‘problems’ they try to solve are solved more effectively and effortlessly in other, more analogue ways.

Our very own Powerhouse AR experiment with Layar is clunky and, honestly, beyond the technological ‘wow’ it doesn’t offer much incentive to boot it up that important second time. That might sound critical, but it needs to be put in the context of it being a) an experiment, and b) having no budget allocation.

Earlier in the year in London I couldn’t get the MOL’s Streetmuseum to work properly on my iPhone 3GS, but on my last visit, now with some updates and an iPhone 4, I was able to get some serious time in with it.

Streetmuseum has been a brilliant marketing campaign for the Museum of London. It has generated priceless coverage in global media and in so doing has associated the Museum of London with notions of ‘experimentalism’, ‘innovation’ and ‘new technology’. And the incorporation of Streetmuseum into the campaign strategy for the launch of the excellent new galleries has been very effective and synergistic.

It has also demonstrated that there can be an interest in heritage augmented reality – even if it doesn’t quite work the way you’d hope it would.

However, like all of these apps, from a user experience perspective Streetmuseum is clunky, and aligning the historic images with ‘reality’ in the 3D view is an exercise in patience. The promotional screenshots don’t convey the difficulty of real-world use. As a result the 3D view, the most technically innovative part of the app, ends up being a gimmick.

The 2D map view, however, is far more useful and, for the most part, very rewarding. And for the committed, walking around London and revealing the ‘layers of history’ can be compelling.

Compared to our Powerhouse layer in Layar, though, Streetmuseum is, excuse the pun, streets ahead (not surprising given the investment). Streetmuseum’s decision to eschew a platform approach like Layar and build its own system might not be the most sustainable long-term strategy, but it certainly delivers a far better experience than Layar does. Of course, it is such early days in this space that Layar isn’t exactly a long-term strategy either.
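(For readers unfamiliar with the trade-off: building on a platform like Layar essentially means your server just answers geo queries from the platform’s client with a list of nearby points of interest, and the app, rendering and distribution are someone else’s problem; Streetmuseum owns that whole stack itself. A toy sketch of the server side of the platform approach – the endpoint name and JSON fields are illustrative, not Layar’s actual specification:)

# A toy geo web service of the kind that sits behind a platform-based AR layer.
# Endpoint name and response fields are illustrative only, not Layar's spec.
from flask import Flask, request, jsonify

app = Flask(__name__)

# A couple of hypothetical geotagged collection records.
OBJECTS = [
    {"id": 1, "title": "Tram, Sydney, 1906", "lat": -33.8785, "lon": 151.1005},
    {"id": 2, "title": "Observatory Hill, 1870s", "lat": -33.8587, "lon": 151.2048},
]

@app.route("/get_points")
def get_points():
    lat = float(request.args["lat"])
    lon = float(request.args["lon"])
    radius = float(request.args.get("radius", 0.01))   # crude radius in degrees
    nearby = [o for o in OBJECTS
              if abs(o["lat"] - lat) < radius and abs(o["lon"] - lon) < radius]
    return jsonify({"hotspots": nearby, "errorCode": 0})

if __name__ == "__main__":
    app.run()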

Mac Slocum over at O’Reilly raises some similar issues.

That’s the problem with app-based AR: even when the app is interesting and the implementation is notable, it’s hard to get people (like me) to use it consistently. AR ambivalence is also tied to the bigger issue of app inertia. A company that pours resources into a custom app doesn’t get much return if that app is rarely launched; the user doesn’t develop an affinity for the brand, and that same user certainly doesn’t buy associated products. The app and its AR just sit there, waiting to be uninstalled.

In my Picnic ’10 presentation I briefly showed the CBA’s Property Guide app. Although this is far from a novel idea (in fact property prices were one of the first things in Layar), the implementation is rather good and points to several things for the cultural heritage sector to take note of.

First it addresses something with a clear existing demand – Australians’ obsession with property prices. Second, it manages to surpass your expectations of the available data – by providing, free of charge, access to ‘good enough’ data for almost every house in the street.

When I first booted up the CBA app I expected to get patchy data for my chosen area. Properties near me sell reasonably frequently but also many people stay in the same place for a long time. So you can imagine my surprise when I was able to see that the last time a place near me sold was in 1984 and for ‘between $30,000 and $40,000’ – as well as every single property up my street. That sort of data usually isn’t available – even in tabular form for purchase.

So how might that play out for cultural heritage AR?

Well, I think for a start it means cross-institutional applications and cross-institutional data. There is no technical reason why the same level of data that the CBA app has access to isn’t available for heritage.

Just thinking of the existing rudimentary ideas about these kinds of apps – the ‘Then & Now’ – local council archives are probably a good place to start, working up the food chain to the big institutions. ‘A photo and a title deed of every property’ … it is only a matter of time.

But addressing the ‘demand’ issue is another matter altogether.

Categories
Social media

On chocolate cakes, journalism and co-curating museums

Here’s a great piece from the Nieman Journalism Lab on the New York Times’ community-sourced recipe book – dug out of tens of thousands of records in their archives.

If you change your working relationship to your audience, you will understand that audience in a new way. The tools that support those two steps also support collaborations that produce insights not likely to be found any other way, framed in genres altered by collaboration and by the social tools that made it possible. Tools, genres, partnerships, models of authority and active citizenship all change, and so does the community’s understanding of itself and its history at the same time.

For those who have learned how to look, the Internet reveals layers of inventive food culture liberated from traditional limitations — including the journalist’s earlier understanding of audience — by new speed of publishing, connectivity, innovation . . . Hesser’s team saw need, opportunity, and tools in place to create a new genre of participatory cookbook writing, too, on the Internet …an online platform for gathering talented cooks and curating their recipes…a new community-building venture…It would be democratic and fun…and together they would produce cookbooks without giving all the authority back to experts. Once again, Hesser had the experience of asking people to join in and finding that they loved being invited.

The parallels to the changes in museums – first the rise of education and public programmes, and more recently the rise of the social web and co-curation – are obvious.

It reminded me of John Fiske’s comments, predating the social web, way back in 1989, in Reading the Popular (Routledge):

The resources – television, records, clothes, video games, language – carry the interests of the economically and ideologically dominant; they have lines of force within them that are hegemonic and work in favour of the status quo. But hegemonic power is necessary, or even possible, only because of resistance, so these resources must also carry contradictory lines of force that are taken up and activated differently by people situated differently within the social system. If the cultural commodities or texts do not contain resources out of which the people can make their own meanings of their social relations and identities, they will be rejected and will fail in the marketplace. They will not be made popular.

(emphasis mine)

Categories
Policy

Mike Edson talk at Powerhouse Museum from 15/10/10

Mike Edson, Smithsonian Institution at the Powerhouse Museum 15/10/10 from Powerhouse Museum on Vimeo.

Last week we held a public talk by Mike Edson. Here’s the video from the night. In roughly 70 minutes Mike talks about the Smithsonian Commons and the organisational change imperatives and initiatives behind the Smithsonian’s web strategy. I was reminded of Ivan Chtcheglov’s much-used quote from Formulary for a New Urbanism:

You’ll never see the hacienda. It doesn’t exist.

The hacienda must be built.

About the talk:

Mike Edson, Smithsonian Institution’s Director of Web and New Media Strategy, talks about his work and the Smithsonian Commons, a new part of the Smithsonian’s digital presence dedicated to catalyzing learning, innovation and creativity through open access to Smithsonian resources, communities, and expertise. The Smithsonian Commons project is just beginning, but the commons concept and the strategy behind it reveal important ideas about reputation, risk, and the changing work of public institutions in the 21st century.

Michael’s talk is followed by a Q&A session.

Michael Edson was in Australia supported by the Powerhouse Museum with thanks to the New Zealand National Digital Forum.

Categories
Digital storytelling, Geotagging & mapping, open content

Sharing with SepiaTown – historical images re-mapped

Early in the year when I visited Josh Greenberg and the digital team at the New York Public Library, I was told about SepiaTown.

One of quite a few ‘Then & Now’ web projects (see also History Pin), SepiaTown puts historic images back on the (Google) map, using Google Street View to connect the photography of yore with that of today.

We figured that we’d give SepiaTown a full collection of the geotagged images of Sydney from the sets we’d uploaded to our repository in the Commons on Flickr, and after a bit of to-ing and fro-ing we uploaded a datafile and waited.

We knew that quite a few of the geotags on these images were ‘approximations’, and that to do Then & Now properly you also need to know the direction in which the photographer was facing. And we knew that neither the metadata in Flickr nor that in our own system was enough.
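(In record terms, what we held was a point; what a proper Then & Now alignment wants is a point plus a direction of view. Something like the difference between these two – the field names and values are illustrative:)

# What our geotag metadata amounted to (field names and values illustrative):
supplied = {"image_id": "tyrrell_00123", "lat": -33.8715, "lon": 151.2006}

# What a Then & Now alignment really wants: camera position plus a bearing
# (direction of view, in degrees from north), ideally with a stated accuracy.
needed = {"image_id": "tyrrell_00123", "lat": -33.8715, "lon": 151.2006,
          "bearing": 270, "accuracy_m": 20}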

So you can imagine our surprise when Jon Protas at SepiaTown popped into our inbox advising us –

We quickly discovered when we first dug into your collection’s geo-locations that many of them were mapped in a fairly general way, and fell short of our quality control levels. We spent the summer spot-checking each one and correcting the locations for almost all of the images you provided, and we are now confident that the vast majority of the images are now mapped within a 20 yard radius of the exact camera location (and are facing the right direction).

Some were quite tricky, but fortunately, site designer Eric Lehnartz, who is also our main uploader (and a bit of a geo-locating savant), was able to deduce even the more obscure and rural locations.

Wow. They’ve been improved, fixed, tweaked!

Not only that, SepiaTown are sending the corrected dataset back to us for ingestion into our collection management system, and from there into Flickr.
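(For the curious, the Flickr end of that round trip is straightforward: the Flickr API has a photos.geo.setLocation method for overwriting a photo’s geotag. A minimal sketch with the Python flickrapi library – the keys, photo id and coordinates are placeholders, and the write-permission authentication step is omitted here:)

# A minimal sketch of pushing corrected coordinates back to Flickr using
# the flickrapi library. Keys, photo id and coordinates are placeholders.
import flickrapi

flickr = flickrapi.FlickrAPI("API_KEY", "API_SECRET")
# (an authentication step granting write permission is needed before this)

corrected = [("5056398226", -33.8715, 151.2006)]   # (photo_id, lat, lon) rows

for photo_id, lat, lon in corrected:
    flickr.photos.geo.setLocation(photo_id=photo_id, lat=lat, lon=lon)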

Big thanks to Jon & Eric!

Here’s a few to try from the Powerhouse set (follow the link then click the Then/Now option):

Blaxland’s Tree
Sussex St, North from Market St
Erskine St, West from Kent St
The Spit, Middle Harbour

Check out their blog for other highlights. They have some fantastic images mapped in there.

Categories
Collection databases, Developer tools

Launch of the Powerhouse Museum Collection API v1 at Amped

Powerhouse API - Amped

This weekend just gone we launched the Powerhouse Collection API v1.

For the uninitiated the API provides programmatic access to the collection records for objects that are on the Powerhouse website.

For the technically minded, Version 1 returns JSON, JSONP, YAML and XML through a RESTful interface – chosen mainly so that interested people can “make something useful inside an hour”. Upcoming versions of the API are planned to return RDFa. (Already Allan Shone has independently added YQL!)
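(Purely by way of illustration – the base URL, path and parameter names below are placeholders rather than the documented routes, so check the actual documentation linked further down and use your own access key – a first call from Python might look something like this:)

# A hypothetical first call against a REST/JSON collection API of this kind.
# Base URL, path and parameter names are placeholders; consult the real
# Powerhouse API documentation for the actual routes.
import requests

API_KEY = "your-access-key"
BASE = "https://api.powerhousemuseum.com/api/v1"    # placeholder base URL

resp = requests.get(f"{BASE}/objects", params={"api_key": API_KEY, "format": "json"})
resp.raise_for_status()

for obj in resp.json().get("objects", []):
    print(obj.get("title"))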

Now you may be asking why this matters, given that we’ve been offering a static dataset for download for nearly a year already.

Well, the API gives access to roughly three times the volume of content for each object record – as well as structure and much more. Vitally, the API also makes internal Powerhouse web development much easier and opens up a plethora of new opportunities for our own internal products.

The main problem with APIs from the cultural sector thus far has been that they are under-promoted, and, like the cultural sector in general, rather invisible to those who are best placed to make good use of them. Having had experience with our dataset being used for GovHack, Mashup Australia (one of the highly commended entries was a Powerhouse browser) and Apps4NSW last year, we rushed the launch to coincide with Amped – the Web Directions free ‘hack day’ that was being held at the Powerhouse.

And, despite the stress of a quick turnaround (hence the minimal documentation right now!), we could not have had better timing.

Amped Sydney

Amped provided the perfect road test of the API. Carlos and Luke were able to see people using the product of their work and talk to them about their problems and suggestions. Nothing like combining user testing and stress testing all in one go!

Amped Sydney

Out of the 250 people who attended Amped, 24 teams submitted prototype projects – and 13 of these used the new Powerhouse API!

So, what did people do?

Amped Sydney

The winning project for the Powerhouse challenges, developed by Cake & Jar (Andrea Lau & Jack Zhao), was a collection interface that pivoted around an individual visitor’s interests and existing personal data, aimed at being deployed as an entry experience to the Museum.

Honourable mentions and runners-up went to a Where in the World Is Carmen Sandiego?-style game using the museum objects as the key elements in a detective story, built with multiple APIs and entirely without a backend; a quite spectacular social browsing game/chat client built using the Go language; an accessibility-enhanced collection browser for the visually impaired; a collection navigator that emphasised provenance over time and space; and an 80s dungeon crawl-style graphical adventure collection organiser loosely inspired by (the magical) Minecraft.

Amongst the others were a very entertaining ‘story generator‘ that produced Brion Gysin-esque ‘automatic writing’ using the collection documentation written by curators; a lovely mobile collection suggester using ‘plain English’ sentences as an entry point; and several collection navigators optimised for iPads using different types of interface and interaction design models (including My Powerhouse).

Judging

Now over to you.

Register for a free account and then create your access keys.

Read the (ever-growing) documentation. Then make stuff!

We’ll be watching what you do with great interest. And if you have any suggestions then email api [at] phm [dot] gov [dot] au.

Thank you for the inspired and pioneering work of our friends at the Brooklyn Museum, Digital NZ, and Museum Victoria especially. Their work has been instrumental in informing our decisions around the API.

(All photos by Jean-Jacques Halans, CC-BY-NC.)