Developer tools

Coming back around to colour in 2013 via 2009 via 2005

The online collection experiment that changed a lot of the way I looked at museum collections was the Electronic Swatchbook that we launched at the Powerhouse way back in 2005. The first version took a selection of high resolution images of fabric swatches from the Powerhouse collection of swatchbooks and made them available for free download, asserting that their copyright had lapsed and that they were now in the public domain. One twist was that, because they were not individually catalogued, we enabled user tagging. The experience with that project led directly to the development and launch of the Powerhouse’s then influential “OPAC 2.0” the following year.

Giv Parvaneh, who worked as a developer on the Swatchbook and OPAC 2.0 as part of my team back then, left the Powerhouse and went to work in the UK. A few years later, while he was at the BBC, we circled back and he added some long discussed ‘colour search’ features to the Electronic Swatchbook in 2009.

Fast forward to today: at Cooper-Hewitt we have released colour browsing on the prototype online collection site. Aaron Cope took a look at the colour analysis code that Giv eventually released on GitHub and made a few modifications and enhancements, building upon that good work, and now it is live.

In keeping with the generous spirit of Giv’s code release, Cooper-Hewitt also released its code and method to the world.

There’s little value in keeping useful code, useful tools, and useful methods to yourself in the museum sector. As I’ve said many times before, it ends up just keeping everyone from moving forward.

Developer tools Interactive Media Interviews Mobile User experience

Interview with Rob Manson on Layar, streetARt and the AR Dev Camp

A little while back at the beginning of June we hosted the Sydney AR Dev Camp. Organised by Rob Manson and Alex Young, the AR Dev Camp was aimed at exposing local Sydney developers to some of the recent developments in augmented reality. A free event sponsored by Layar and the Powerhouse, it filled the Thinkspace Lab on a Saturday with developers keen to network and ‘make stuff’. Rob and Alex also launched their new buildAR toolkit for content producers to quickly make and publish mobile AR projects using an online interface.

AR Dev Camp Sydney
(AR Dev Camp Sydney by Halans)

AR Dev Camp generated many discussions.

Some of these are covered and expanded on by Suse Cairns and Luke Hesphanol.

The ARTours developed by the Stedelijk Museum and presented as incursions into other spaces – including a rumoured temporary rogue deployment at Tate Modern – really demonstrate the way that AR opens up some interesting conceptual arenas. Indeed, just walking down Harris St that morning, booting up Layar and seeing a giant Lego man hovering over the Powerhouse was not something you see every day. Margriet Schavemaker, Hein Wils, Paul Stork and Ebelien Pondaag’s paper from Museums and the Web 2011 this year explores these in detail.

I spoke to Rob Manson in March, as the event was being planned, about some of the changes in AR.

F&N: A lot has changed in both AR and Layar since we last spoke, way back when MOB released the PHM images in a Layar in 2009. Can you tell me about some of the changes to the Layar platform and other AR apps as you’ve seen them mature?

RM: I can’t believe how quickly that time has passed! But in a lot of ways we haven’t even started and the path in front of us is starting to get a lot clearer now.

Layar has continued to play to its main strength, which is massive adoption (and those figures are just for Android!). It’s now the dominant platform in the whole AR landscape. And just this week they announced Layar Vision, their natural feature tracking solution. Layar has become the default AR app that everyone refers to.

With this new version 6 of Layar you can now add image based markers, animation, higher resolution images and a much simpler improved user experience. And of course it supports a lot more interactivity than it did way back when we created the first Powerhouse layer – it now includes layer actions and proximity triggers. Our buildAR platform makes it easy for you to customise all of these settings and we’ve already announced full support for the new Layar Vision features.

Despite launching so early, the Powerhouse layer was loaded 2384 times by 853 unique users in 13 countries in just under 18 months. Whilst that may not sound like a lot, we’ve also had heavily promoted layers run by advertising agencies for major brands that did almost exactly the same numbers as the PHM layer. So on the whole I think the PHM layer has performed pretty well. Especially considering it was created quite early on and there’s not really a lot of reasons for people to return to the layer or share it with their friends.

Layar has also now released the Layar Player SDK, which allows us to embed the Layar browser within our own iPhone applications. This has opened up a world of new opportunities and means we can wrap layers in even richer interactivity and allow users to create and share media like photos, audio and videos. This is what led us to create streetARt.

F&N: Obviously your streetARt App is indicative of some of these new changes – the ability to separate off as an App in its own right and have interactions.

Yes, we’ve created an App framework around the Layar Player SDK that integrates with our buildAR platform.

The response has been great. We’ve done very little promotion except for Twitter, a blog post and being featured as a promoted layer, and in our first month we attracted over 25,000 unique users from 166 countries. Our total count is now well over 200,000 unique users from over 194 countries.

We’ve engaged with street art and graf communities through twitter and the response has been really good. We’re really outsiders that just enjoy the art and really wanted an easier way to find it ourselves. The artists that have used it have given us really positive feedback and seem happy to spread the love.

F&N: What happens to the aggregated dataset of geolocated works?

This is part of our new features road map. The first phase of social sharing with multi-device permalinks has been released. We’re now working on ways for people to import/manage photo sets from Flickr and to be able to map out and share their own sub-sets of the streetARt locations to create walking tours, etc.

Plus we want to focus on specific artists’ works, publish interviews and bubble up more dynamic content to make the whole platform feel more alive.

F&N: How do you see it complementing non-AR graf apps like All City and others?

There’s quite a few actually. There’s Allcity, which was sponsored by Adidas; another which was sponsored by Red Bull; and most recently Bomb It, an app based on, or supporting, a movie. And there’s also the paid Street Art iPhone App.

We think there’s plenty of room for all of these apps and I’m sure there will be a lot more soon too. However, I think there’s a bit of a backlash building around the sponsored apps as some people in the scene see this as just an exploitation of the graf/streetart community.

We considered this a lot when we built streetARt. In some ways people could point the same finger at us but we don’t charge for the App and we don’t sell sugary drinks or expensive sports clothes/shoes. We just want to find out what happens when you mix cool content with cool technology and so we hope people see our good intentions.

And of course we were the first to do it with AR!

F&N: One thing I’ve been finding challenging with AR, despite all the talk of ‘virtual and physical worlds merging’, is that the public awareness of the data cloud that surrounds everything now is still very low. I’d be interested on your thoughts as to how to make people aware that AR content exists out in the world at large.

I think that’s a critical point. Recently some artists published what they called the ARt Manifesto but David Murphy posted a really valid critique.

There IS an interesting debate to be had around “control” of the digital layers and where they can be overlaid onto the physical world. But the digital layer is an abundant, effectively infinite resource where the cost to create is continually dropping. The really scarce resource that we should all really be focused upon is “attention”.

Getting people’s attention, keeping it and then getting them to engage on an ongoing basis is the real challenge. That’s why we’re so happy with the results that streetARt has created too. Not only have we attracted tens of thousands of users from all around the world, we’ve also been able to attract hundreds of really engaged users that return on a regular basis, many of them almost daily. The key to this was populating streetARt with enough Creative Commons-licensed content to kickstart it. This made sure that most people would see some cool art right from their first experience. In locative media [getting the first experience right] can be a real challenge – so we started with over 30,000 images from over 520 regions around the world, and now the users are helping us grow that further. But the 90/9/1 [participation] ratio is a reality and you have to plan for it.

Developer tools Web metrics

Fixing document download and link tracking with the Google Analytics asynchronous tracking code

If you’ve been using the gatag.js from Good Web Practices in conjunction with your Google Analytics code for the past few years you may have noticed that it stopped working when you updated to the newer, better asynchronous Google Analytics tracking code.

What was nice about the gatag.js code was that it was quick and easy to implement and tracked downloads of PDFs and other file types as well as traffic following any outgoing links. For cultural institutions which are full of such PDFs and external links, tracking these as distinct ‘EVENTS’ in Google Analytics was very useful for understanding user behaviour on your site.

There hasn’t been a clean and simple fix for this problem, and until I saw Stephen Akins’ solution using jQuery I thought we’d have to go back to other methods.

Our developer, Carlos Arroyo, made some minor modifications to Stephen’s code so that the way in which downloads and external links appear in the reports stays the same as with gatag.js, allowing for historical comparisons.

First remove your references to gatag.js then place this after your Google Analytics asynchronous tracking code. (If you already load jQuery on your site then you probably want to check the version and you can omit the first section.)

(And of course you use this at your own risk!)

<script type="text/javascript">
	// Load jQuery from the Google CDN only if the page doesn't already have it.
	// (The path shown is the standard Google AJAX Libraries copy of jQuery.)
	if(typeof jQuery != 'function'){
		document.write('<script type="text/javascript" src="//ajax.googleapis.com/ajax/libs/jquery/1.7.1/jquery.min.js"><\/script>');
	}
</script>

<script type="text/javascript">
	jQuery(document).ready(function($){
		$('a').click(function(){
			var href = $(this).attr('href');
			if(!href){ return; }
			var href_lower = href.toLowerCase();
			// Track file downloads as 'Downloads' events, keyed by file
			// extension, using the same categories gatag.js used.
			if(href_lower.substr(-3) == "pdf" || href_lower.substr(-3) == "xls" || href_lower.substr(-3) == "doc" ||
			   href_lower.substr(-3) == "mp3" || href_lower.substr(-3) == "mp4" || href_lower.substr(-3) == "flv" ||
			   href_lower.substr(-3) == "txt" || href_lower.substr(-3) == "csv" || href_lower.substr(-3) == "zip") {
				_gaq.push(['_trackEvent', 'Downloads', href_lower.substr(-3), href]);
			}
			// Track links that leave our domain as 'Outbound Traffic' events.
			if(href_lower.substr(0, 4) == "http") {
				var domain = document.domain.replace("www.",'');
				if(href_lower.indexOf(domain) == -1){
					href = href.replace("http://",'');
					href = href.replace("https://",'');
					_gaq.push(['_trackEvent', 'Outbound Traffic', href]);
				}
			}
		});
	});
</script>
Collection databases Developer tools

Behind the Powerhouse collection WordPress plugin

Yesterday we went live with the first version of the Powerhouse Museum collection WordPress plugin. Rather than clutter that launch blogpost up with the backstory and some of its implications, here’s the why, the how, and what’s next.

The germination of the WordPress plugin was the aftermath of the Amped Hack Day run by Web Directions at the Powerhouse where we launched the Museum’s collection API.

Whilst the API launch had been a success, Luke (Web Manager/Developer) and Carlos (Developer) and I were a little disappointed that although we’d launched a REST API, we had actually made it more difficult for the ‘average interested person’ to do simple programmatic things with our collection data.

Of course, we’d primarily built the API to make our own lives easier in developing in-museum applications and the next wave of online and mobile collection projects that you will be hearing about over the coming 12 months. But we’d also aimed to have the API broaden the external use of our collection data and solve some of the ‘problems’ with our existing ‘download the database‘ approach.

In fact, ‘download the database’ had worked well for us. Apart from the data being used in several projects – notably Digital NZ and one of the highly commended entries in 2009’s Mashup Australia contest – we’d found that the database as a whole was being used to teach data visualisation and computer science in various universities both in Australia and overseas. We’d also found that people in the digital humanities were interested in seeing the ‘whole view’ that the data dump provided.

None of these groups were well catered for by the API and one of our team, Ingrid Mason, ended up convincing us to retain the ‘download the database’ option alongside the API, rather than forcing everyone through the API. Her argument revolved around the greater, and hitherto underestimated value of being able to ‘see the whole thing’.

At the same time, WordPress had become a de facto quick and dirty CMS for most of the Museum’s web projects. We’ve run annual festival websites (Sydney Design), whole venue websites (Sydney Observatory), exhibition microsites (The 80s are back), and experimental pilots (Suburb Labs) on WordPress over the past few years, building up both internal skills and external relationships to the point where the graphic designers we work with supply designs conscious of the limitations of WordPress. In each of these sites we’ve had a need to integrate collection objects and this has usually meant ugly PHP code in text widgets.

(Don’t be concerned – for larger and complex projects we have been migrating to Django)

[phm-grid cols=4 rows=1 v_space=1 h_space=1 thumb_width=120 thumb_height=120 random=true parameters="title:computer"]

So in the weeks after Amped, Carlos spent time developing a WordPress plugin based entirely on the API. This would serve two purposes: firstly, to allow us to embed the collection quickly into our own WordPress websites; and secondly, to give interested non-programmers a simple way to start using our API in their own sites.

Late last year we sent the alpha version out to some museum web people we knew around the world for feedback, and Carlos tweaked the plugin in between working on other projects, before its first public outing in the WordPress plugin repository.

So where now?

The WordPress plugin is definitely a work-in-progress.

We’re keeping a keen eye out for people implementing it on their blogs and WordPress sites. (If you’ve implemented it in something you’ve done then tell us!)

Carlos has several features and fixes already on his radar that have come out of our own uses of the plugin – some of these are tied to limitations in the data currently available through the API.

If you’ve got feature requests then we’d love to hear them – and we’re secretly hoping that those of you who are deeply into Drupal or Expression Engine might port the plugin to those platforms too.

Send your feedback to api [at]

(Luke is also presenting a paper on the API experience at Museums and the Web in Philadelphia this year)

API Collection databases Developer tools Museum blogging Tools

Powerhouse Museum collection WordPress plugin goes live!

Today the first public beta of our WordPress collection plugin was released into the wild.

With it and a free API key anyone can now embed customised collection objects in grids in their WordPress blog. Object grids can be placed in posts and pages, or even as a sidebar widget – and each grid can have different display parameters and contents. It even has a nice friendly backend for customising, and because we’re hosting it through WordPress, when new features are added it will be able to be auto-upgraded through your blog’s control panel!

Here it is in action.

So, if you have a WordPress blog and feel like embedding some objects, download it, read the online documentation, and go for it.

(Update 22/1/11: I’ve added a new post explaining the backstory and rationale for those who are interested)

Developer tools User experience

Playing with Google’s reading age tool

Google just released a new ‘reading level filter‘ in the Advanced Search section of their search – the part that probably only librarians actually regularly use.

I’ve run it on a few of our domains with interesting results.

Here’s our main

After seeing that I went off and ran it over a slew of other museums to see if I could spot any patterns. It seems that natural history museums have the highest proportion of ‘advanced’ whilst art museums bias towards the ‘intermediate’.

I also tried it on one of the new ‘events calendar’ sites we’ve been involved in building (behind the scenes post coming soon). Being a calendar site aimed at parents looking for holiday activities, we want to make sure that it has the broadest possible appeal. Fortunately we seem to do rather well – 100% Basic!

I’m not sure how valuable this really is in the long run but it is another tool to experiment with. There’s been some fun analysis of different news (and other) sites using the tool over at Virtual Economics.

Collection databases Developer tools

Launch of the Powerhouse Museum Collection API v1 at Amped

Powerhouse API - Amped

This weekend just gone we launched the Powerhouse Collection API v1.

For the uninitiated the API provides programmatic access to the collection records for objects that are on the Powerhouse website.

For the technically minded, Version 1 returns JSON, JSONP, YAML and XML through a RESTful interface – chosen mainly so that interested people can “make something useful inside an hour”. Upcoming versions of the API are planned to return RDFa. (Already Allan Shone has independently added YQL!)
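To give a flavour of what “make something useful inside an hour” means in practice: a RESTful call is just a URL with your key and a format parameter tacked on. The sketch below only illustrates the general shape – the base URL, method name and parameter names are hypothetical, so check the actual documentation for the real ones.

```javascript
// Illustrative sketch of building a request URL for a RESTful collection API.
// The endpoint, method and parameter names here are hypothetical examples,
// not the real Powerhouse API -- consult the actual documentation.
function buildApiUrl(base, method, params) {
  var pairs = [];
  for (var key in params) {
    if (params.hasOwnProperty(key)) {
      // URL-encode both keys and values so spaces etc. are safe
      pairs.push(encodeURIComponent(key) + '=' + encodeURIComponent(params[key]));
    }
  }
  return base + '/' + method + '?' + pairs.join('&');
}

var url = buildApiUrl('https://api.example.org/v1', 'getObject', {
  api_key: 'YOUR_KEY', // issued when you register for a free account
  format: 'json',      // or 'jsonp', 'yaml', 'xml'
  id: '12345'          // the object record you want
});
// Fetch that URL (with $.getJSON, cURL, or anything else that speaks HTTP)
// and you get the object record back in your chosen format.
```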

Now you may be asking why this matters, given we’ve been offering a static dataset for download for nearly a year already?

Well, the API gives access to roughly three times the volume of content for each object record – as well as structure and much more. Vitally, the API also makes internal Powerhouse web development much easier and opens up a plethora of new opportunities for our own internal products.

The main problem with APIs from the cultural sector thus far has been that they are under-promoted, and, like the cultural sector in general, rather invisible to those who are best placed to make good use of the API. Having had experience with our dataset being used for GovHack, Mashup Australia (one of the highly commended was a Powerhouse browser) and Apps4NSW last year, we rushed the launch to coincide with Amped – the Web Directions free ‘hack day’ that was being held at the Powerhouse.

And, despite the stress of a quick turnaround (hence the minimal documentation right now!), we could not have had better timing.

Amped Sydney

Amped provided the perfect road test of the API. Carlos and Luke were able to see people using the product of their work and talk to them about their problems and suggestions. Nothing like combining user testing and stress testing all in one go!

Amped Sydney

Out of the 250 people who attended Amped, 24 teams submitted prototype projects – and 13 of these projects used the new Powerhouse API!

So, what did people do?

Amped Sydney

The winning project for the Powerhouse challenges was a collection interface which pivoted around an individual visitor’s interests and existing personal data – aimed at being deployed as an entry experience to the Museum – and developed by Cake & Jar (Andrea Lau & Jack Zhao).

Honourable mentions and runners up went to a Where In The World Is Carmen San Diego?-style game using the museum objects as the key elements in a detective story built with multiple APIs and entirely without a backend; a quite spectacular social browsing game/chat client built using the Go language; an accessibility-enhanced collection browser for the visually impaired; a collection navigator that emphasised provenance over time and space; and an 80s dungeon crawl-style graphical adventure collection organiser loosely inspired by (the magical) Minecraft.

Amongst the others were a very entertaining ‘story generator‘ that produced Brion Gysin-esque ‘automatic writing’ using the collection documentation written by curators; a lovely mobile collection suggester using ‘plain English’ sentences as an entry point; and several collection navigators optimised for iPads using different types of interface and interaction design models (including My Powerhouse).


Now over to you.

Register for a free account and then create your access keys.

Read the (ever-growing) documentation. Then make stuff!

We’ll be watching what you do with great interest. And if you have any suggestions then email api [at] phm [dot] gov [dot] au.

Thank you especially to our friends at the Brooklyn Museum, Digital NZ, and Museum Victoria for their inspired and pioneering work, which has been instrumental in informing our decisions around the API.

(All photos by Jean-Jacques Halans, CC-BY-NC.)

Developer tools QR codes

Roll your own URL shorteners for your museum

Last week I got a tweet from Te Ara asking about URL shorteners as their favoured one had stopped accepting URLs.

So I’m happy to announce that we’ve implemented our own URL shortener – for internal use only. Luke had been thinking about this for a couple of months and we’d been lining up all the ducks before making it live. It is based on a modified version of Yourls, an open source PHP-based URL shortening solution. Implementing it was pretty straightforward and you’ll start seeing our shortened URLs popping up from time to time if you encounter Powerhouse links out in the world.

In fact the biggest challenge was finding a sensible domain to use. Some of the best options we had were stymied by registrar requirements but I think we’ve found a good one that makes sense to human readers – whilst still being short enough to be useful.

All our collection records are now accessible in the form –[object number]. For example, the 3830 steam locomotive can now be reached quickly and easily via

This is especially useful as we rethink the way in which we continue to roll out URLs in the galleries. Not only does the shortened URL make their inclusion on labels a little less intrusive, it also makes for simpler 2D barcodes (QRs etc).
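Under the hood, a Yourls-style shortener is little more than a counter plus a lookup table, with the counter written in a higher base to keep the codes short. A minimal sketch of the idea (Yourls itself is PHP backed by MySQL; this is just the scheme, not its code):

```javascript
// Sketch of the core of a sequential-ID URL shortener: each new URL gets
// the next counter value, rendered in base-36 so codes stay very short.
var nextId = 0;
var codeToUrl = {};

function shorten(longUrl) {
  var code = (nextId++).toString(36); // '0'-'9' then 'a'-'z', then '10', ...
  codeToUrl[code] = longUrl;
  return code; // prepend your short domain to get the final short URL
}

function expand(code) {
  return codeToUrl[code]; // what the shortener does on an incoming redirect
}
```

The scarce resource is the domain, not the codes: three base-36 characters already give you 36³ (46,656) distinct short URLs.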

Our upcoming Frock stars exhibition can be tweeted as and The 80s are back is simply

And of course this blog is now easily reached at

Developer tools User experience

Multi-lingual machine translation from the footer

There’s been a fair bit of chatter about machine translation of late and so when we noticed that the Museum of London team had rolled out the new Google Translate widget on their website we figured we’d give it a try and follow suit.

So lo and behold, now on the Powerhouse Museum main site you can skip to our persistent footer and be presented with a machine translated version of whatever page you are on – menus, titles and all in any of 39 languages from Afrikaans to Yiddish. It is all rather neat and even with the imperfections of the translation the speed and ease of implementation is hard to resist.
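For reference, the widget embed is only a few lines dropped into the footer template. This is the generic snippet Google’s setup wizard generated at the time – the exact options you get (language list, layout) may differ from ours:

```html
<!-- Google Translate widget: the element it renders into, plus the loader -->
<div id="google_translate_element"></div>
<script type="text/javascript">
  function googleTranslateElementInit() {
    new google.translate.TranslateElement(
      { pageLanguage: 'en' },       // the language your pages are written in
      'google_translate_element'    // id of the div to render the widget into
    );
  }
</script>
<script type="text/javascript"
  src="//translate.google.com/translate_a/element.js?cb=googleTranslateElementInit"></script>
```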

English version –

(Traditional) Chinese version –

Developer tools

Integrating Twitter tweets into blog comments

Backtype has just released the very first 0.1 version of a WordPress plugin that integrates tweets and retweets as well as comments on other blogs into the comment stream of your original WordPress posts.

I’ve been trialling an install and you can see it in action on a post like this one. Notice that the tweets are interleaved with comments on the blog itself – it even deciphers shortened URLs. (And in case you were wondering which URL shortener is the best check out this article from Searchengineland – hat tip Chloe Sasson!)

This sort of cross-site conversation tracking is becoming increasingly important in a world where tweets are easier and more common than on-blog comments. I’ll be watching with interest to see how the plugin evolves.

A word of caution before you go and roll it out on all your blogs – consider the additional moderation that seeing every public tweet and offsite comment is going to create for you!