Exhibition technology

What does a student-curated digital/physical exhibition look like? Museums and the Network 2013

So tonight the students brave enough to take the class that Aaron Cope and I have led at Pratt this semester opened their exhibition. I say ‘brave enough’ because this was always going to be a seat-of-your-pants experimental class broadly titled “Museums and the Network: Caravaggio in the age of Dan Flavin lights”. It ended up covering everything theoretical from digital culture, media art theory, surveillance, and startups through to the more prosaic intricacies of map making, databases, web scraping, object labels and networked project management.

But graduate students in the information and library sciences are an eager and very talented bunch. And the chaotic tendencies of both Aaron and me were tempered by a stellar set of guests who imparted their professional wisdom – Sherri Wasserman, John Powers, Dan Phiffer, Fiona Romeo, Virginia Gow, George Oates, Nicole Cama, Matt Knutzen, and John Allspaw.

After their first class project collected data from cultural institutions around New York to build network maps of philanthropy – something very aligned with the ‘digital’ nature of the course – their main project forced them to start again and build a physical exhibition with tangible objects, informed by their growing understanding of “the affordances of the networks that surround and envelop them”.

The exhibition, its topic, its objects, and its argument were all their responsibility, and the theme they ended up choosing to explore was ‘Commuting and Communing’. The exhibition “explores several facets of the act of commuting on the NYC subway … we have organized an exhibition that explores the subway’s sights and sounds, the interactions that occur with people as well as objects and the virtual communities that come together as a result of their commuter experience.”

Here are some photos from the opening.

Hand-recorded visualisation of happenings on a single end-to-end train journey

Some found objects and the hardware running the MTA.WIFI backchannel

Overheard conversations on Japanese fans with hyperlinks to computer-voiced conversations

Fan detail and hyperlink

Array of found objects with geospatial metadata

Found objects detail and hyperlinks

More found objects and hyperlinks

Text panel for sound clips and video loops

Backchannel label

Aaron Cope visits the exhibition ‘over the network’ from a hotel room in Rotterdam (DISH2013)

Of course, this course was about ‘the Network’ so the students have used Tumblr as their collection management system and exhibition catalogue. The ‘archive’ view of Tumblr provides a great way of visually browsing the objects and other media assets, whilst the standard view gives a more linear look complete with auto-playing subway soundtrack. The catalogue includes all the found objects, nicely accessioned and photographed with location metadata, as well as documentary and process evidence. There’s a Twitter account too.

The exhibition also included short URLs for every object, bringing visitors back to additional information and, in the case of the fans, supporting media. The commuter video loops were accompanied by audio soundtracks that can be downloaded for playback on your own subway journeys too. A final AV component was a subway supercut! More of this content is going up to the Tumblr over the next few days.
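A scheme like those per-object short URLs can be sketched very simply: number each accessioned object sequentially and encode the number in base 62 to get a compact code. This is a hypothetical illustration, not the students’ actual implementation, and the base URL is a placeholder.

```python
import string

# Base-62 alphabet: 0-9, a-z, A-Z
ALPHABET = string.digits + string.ascii_lowercase + string.ascii_uppercase

def short_code(n: int) -> str:
    """Encode a sequential object ID as a compact base-62 code."""
    if n == 0:
        return ALPHABET[0]
    digits = []
    while n:
        n, rem = divmod(n, 62)
        digits.append(ALPHABET[rem])
    return "".join(reversed(digits))

def short_url(object_id: int, base: str = "http://example.org/o/") -> str:
    # The base domain is a placeholder; a real label would use the
    # exhibition's own short domain.
    return base + short_code(object_id)
```

Three or four characters of base 62 cover tens of thousands of objects, which keeps the printed label text short.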

For the exhibition backchannel, a public wifi darknet was set up using Dan Phiffer’s Occupy.Here project as its basis. This allowed visitors to post comments and images anonymously whilst in the exhibition.
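The core idea – a message board that lives only on the local network, with no accounts and no internet – is compact enough to sketch. This is not Occupy.Here’s actual code (which runs as a web app on an OpenWrt router); it’s a minimal stand-in showing the anonymous post/read loop.

```python
# Minimal sketch of an anonymous local-network comment board, in the spirit
# of (but much simpler than) Occupy.Here. Comments are held in memory only.
import json
from http.server import BaseHTTPRequestHandler, HTTPServer

COMMENTS = []  # the real thing persists posts to the router's storage

class BoardHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        # Return all comments as JSON
        body = json.dumps(COMMENTS).encode("utf-8")
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

    def do_POST(self):
        # Accept a raw text comment; no author field, so posts stay anonymous
        length = int(self.headers.get("Content-Length", 0))
        COMMENTS.append(self.rfile.read(length).decode("utf-8"))
        self.send_response(204)
        self.end_headers()

    def log_message(self, *args):
        pass  # keep the gallery console quiet

def run(port: int = 8000):
    """Serve the board on the local wifi network (never reaches the internet)."""
    HTTPServer(("0.0.0.0", port), BoardHandler).serve_forever()
```

Because the access point has no upstream connection, everything posted stays inside the room – which is exactly what makes it a darknet.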

If you’re in New York and would like to pop in and see it, drop me a line and I’ll see what can be done.

And great work, class of 2013!

Exhibition technology

On robot guides

The robot guides are coming.

This isn’t entirely new – Japanese museums have, unsurprisingly, been experimenting with this for over a decade. One of my fondest science museum memories was stumbling upon a ‘bipedal robotics conference’ inside the Miraikan in Tokyo sometime in the early 2000s.

But this is slightly different: it combines the potential of the ‘Roomba-Curator’ (hat tip to Aaron Straup Cope for that phrase) with the growing trend for ‘school incursions’ (rather than ‘school excursions’), delivered over high-speed broadband.

The robot is in preliminary design but expected to be the height of an average adult, have a motorised base with wheels and a “head” that is a 360-degree, panoramic camera.

It will find its way around the museum and avoid bumping into visitors and objects using sensors and a sort of global positioning system.

The robot is initially for the use of school students, who will each control the robot’s camera head using computers as if in a video conference.

The camera can transmit many views of an object simultaneously – from above or the sides and zooming in and out – so each user can control what they see.

I like that this lets multiple students control their view and zoom on objects of their own choosing.

But I’d really like this if it were deployed to the collection stores – the behind-the-scenes areas where museums keep the vast numbers of objects they don’t have on exhibition.

Imagine an informational overlay using a collection API to pull up data on shelves and shelves of objects.
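That overlay could be as simple as querying a collection API for whatever shelf the robot is currently facing and rendering a caption per object. The endpoint and field names below are hypothetical – a real deployment would target the museum’s own collection API.

```python
# Sketch of a "Roomba-Curator" overlay: given the robot's current shelf
# location, pull matching object records from a collection API and render
# a compact caption for each.
import json
import urllib.parse
import urllib.request

API_BASE = "https://api.example.org/objects"  # placeholder endpoint

def fetch_objects_at(shelf: str):
    """Query the (hypothetical) API for objects stored at a shelf location."""
    url = API_BASE + "?" + urllib.parse.urlencode({"location": shelf})
    with urllib.request.urlopen(url) as resp:
        return json.load(resp)

def overlay_captions(records) -> list:
    """Render one short overlay line per object record."""
    return [
        f"{r['accession']}: {r['title']} ({r.get('date', 'date unknown')})"
        for r in records
    ]
```

The interesting part is that the overlay needs no exhibition labels at all – the records already exist in the collection database; the robot’s camera view just gives them somewhere to appear.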

It won’t be far off.

Exhibition technology Interactive Media User behaviour Young people & museums

The honeypot effect: more on WaterWorx, the Powerhouse Museum’s iPad interactive

Photography by Geoff Friend, Powerhouse Museum. CC-BY-NC-ND

Week one of our iPad interactive – WaterWorx – and the feedback has been great from visitors and teachers alike.

Just to prove how much of a honeypot the iPads are, here’s a time-lapse from the day that the exhibition was soft launched. You can see the early morning final touches being added to the space, followed by the flurry of the first school visitors, and so on.

You can see for yourself the significant dwell times and people coming back for another go. And that’s awesome.

We’ve been deploying minor fixes as we go, and the OtterBox Defender cases that we have adapted to protect the iPads are being pushed to their limits!

(If you missed our first post that describes the game itself then you need to travel back in time a few days)

Exhibition technology MW2009

MW2009 – Multi-touch: what does this technology hold for future museum exhibits?


Hi I’m Paula Bray and I usually blog over at Photo of the Day.

Today, whilst Seb was slaving away giving two workshops in a row at Museums and the Web 2009, I spent the day with Jim Spadaccini and Paul Lacey in a great, full-day workshop called ‘Make It Multi-touch’ that showcased their custom-built 50″ touch table. You can view it over at Ideum.

We got inside information on how this technology was developed from the initial prototype back in September 2008, which featured a dual-mirror and two-camera solution that needed to process complicated gestures quickly. Two prototypes later came the final product you can see here. The table detects reflected finger points known as ‘blobs’, which are fed to software that processes touch, drag-and-drop, pinch-and-expand, drawing, rotate and double-tap gestures, all of which become intuitive to the user within a short time-frame. The aim is to provide an interactive social experience that is very different to the traditional computer-based interactive exhibits, which can tend to isolate the experience to one visitor.
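The gesture side of this is surprisingly small once the blobs are tracked. A rough sketch (not Ideum’s implementation) of how two tracked finger points become pinch-and-expand and rotate gestures:

```python
# Sketch: turning two tracked "blobs" (finger points) into pinch/zoom and
# rotate gestures. Each point is an (x, y) tuple; p* are the positions in
# the previous frame, q* in the current frame.
import math

def pinch_scale(p0, p1, q0, q1) -> float:
    """Ratio of finger separation now vs before: >1 means expand, <1 pinch."""
    before = math.dist(p0, p1)
    after = math.dist(q0, q1)
    return after / before

def rotation_angle(p0, p1, q0, q1) -> float:
    """Change in the angle of the line between the two fingers, in degrees."""
    a0 = math.atan2(p1[1] - p0[1], p1[0] - p0[0])
    a1 = math.atan2(q1[1] - q0[1], q1[0] - q0[0])
    return math.degrees(a1 - a0)
```

The real work, of course, is upstream in the computer vision – reliably segmenting blobs from the camera feed and matching them frame to frame.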


What can we learn from the public about using museum collections and content through technology such as multi-touch? This form of technology may be a novelty for some at this stage, but the future design of this product holds potential for change across many museum applications.

Scenario: multi-touch tables are available in a museum exhibition for the public to use and interact with exhibition content. Images of collection objects can be moved across the table, and details can be zoomed in on through simple ‘blob’ (finger) movements. Descriptive information about the object can be shown from XMP metadata stored in the file. Location data can be retrieved, and users can create their own exhibit and learning experience. This is a very different application, one that can change visitors’ experience. Do we need to compete with devices that are currently available at home and make it social and educational in the museum? Does fixed navigation work anymore?
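Pulling that descriptive information out of the image file itself is plausible because XMP metadata is stored as an XML packet embedded directly in the image bytes. A real project would use a proper XMP library; this minimal sketch just scans for the packet and reads the Dublin Core description field.

```python
# Sketch: extract a descriptive caption from XMP metadata embedded in an
# image file, so a touch-table app could label an object from the file alone.
import re

# An XMP packet is an XML island delimited by <x:xmpmeta> ... </x:xmpmeta>
XMP_PACKET = re.compile(rb"<x:xmpmeta.*?</x:xmpmeta>", re.DOTALL)
# dc:description usually wraps its text in an rdf:Alt / rdf:li structure
DESCRIPTION = re.compile(
    r"<dc:description>\s*(?:<rdf:Alt>\s*<rdf:li[^>]*>)?(.*?)<", re.DOTALL
)

def xmp_description(image_bytes: bytes):
    """Return the dc:description text from an embedded XMP packet, or None."""
    packet = XMP_PACKET.search(image_bytes)
    if not packet:
        return None
    match = DESCRIPTION.search(packet.group(0).decode("utf-8", "replace"))
    return match.group(1).strip() if match else None
```

Because the metadata travels inside the file, an object image dropped onto the table carries its own label – no separate lookup needed.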


Multi-touch technology has the potential to change the museum experience, and it will be interesting to watch this technology develop. Will the public start to expect to come to museums to interact with exhibits in this new way?

This is definitely more than a “big-ass table”.

Post & photography by Paula Bray