Had a really interesting meeting with Tom Melamed from Calvium about what happens when the heritage app moves indoors and the issues around interior localization. This will be the second phase of my research project, and will start in earnest once the new Curzon Collection opens to the public after Christmas, probably late January. At the moment it is a building site, so it is hard to map anything in detail. Nevertheless, I want to start thinking about the possibilities now, whilst I’m developing the first phase around the exterior of the building.
We’re discussing piloting three or four different technologies, or rather pairings of a triggering method (RFID, QR codes, motion sensors, or manual instructions via the phone’s interface) and what is triggered (audio, video on the iPhone, hidden micro projectors, a website).
The simplest approach is to have the user interact with the iPhone directly: for example, a message saying “when you are standing next to the cinema organ, press here…”. However, I’m keen to limit interaction with the phone as much as possible, so that the user can experience the physical environment and engage with the exhibits in the Curzon Collection rather than be immersed in the phone. A slightly different use of the phone interface is to instruct the user to press something in the real world that manually triggers something…
Another idea is to have an RFID chip (the technology in the London Underground Oyster card) inserted inside a cinema ticket, which then triggers things when you hold the ticket up to, say, the ‘usher’ logo.
The QR code option is interesting, but it again draws the user’s attention back to the phone screen by linking to a website – it might be something to offer at the end of the experience, so they can explore further on the website…
Other ideas centre around motion sensors which trigger things via simple USB switches, programmed in something like Flash (Tom said to have a look at Phidgets), or alternatively using some sort of Arduino setup.
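As a very rough sketch of what the switch side of this might look like in software (the hardware read is stubbed out here, since the real call would come from whatever API the Phidget or Arduino board exposes, and the trigger action is just a placeholder):

```python
# Hypothetical sketch: poll a motion-sensor switch and fire an action on
# each rising edge (off -> on). read_switch() stands in for the real
# hardware API; here it just replays a pre-recorded sequence of states.

def read_switch(samples):
    """Stub for a hardware read; yields 0/1 states in order."""
    yield from samples

def watch_for_motion(states, on_trigger):
    """Call on_trigger once per off->on transition; collect the results."""
    fired = []
    previous = False
    for state in states:
        if state and not previous:
            fired.append(on_trigger())
        previous = state
    return fired

# Two rising edges in this sequence, so the trigger fires twice.
events = watch_for_motion(read_switch([0, 0, 1, 1, 0, 1]),
                          lambda: "play localized audio")
```

In a real installation the polling loop would run continuously against the sensor hardware, but the edge-detection logic would be much the same.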
We explored the idea that the same triggering method could trigger different things, so a motion sensor could trigger a projection one time and some localized audio another time, for example. I want to secrete a micro projector into the body of one of the projector exhibits and have it trigger a projection that makes it look as if the projector has whirred into life – or actually trigger the mechanism of the projector itself to move, or both!
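The one-trigger-many-outputs idea could be sketched as a simple dispatcher that cycles through a list of outputs, so each visit to the same sensor does something different (the output names here are placeholders, not anything we’ve built):

```python
import itertools

# Hypothetical sketch: map one physical trigger to several outputs,
# rotating through them so the same motion sensor produces a different
# effect on each activation.

def make_trigger(outputs):
    """Return a zero-argument trigger that cycles through outputs."""
    cycle = itertools.cycle(outputs)
    return lambda: next(cycle)

organ_trigger = make_trigger(["micro-projection", "localized audio"])
first = organ_trigger()   # "micro-projection"
second = organ_trigger()  # "localized audio"
third = organ_trigger()   # back to "micro-projection"
```

A randomized or state-dependent choice (time of day, how many visitors have passed) would slot into the same shape.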
I’m also keen to try to do a digital Pepper’s Ghost.
Anyway, at the moment we’re just batting ideas around, and I’ve asked Tom to think about any issues that he and Calvium would like to explore through the project. His initial thoughts were that he’d be interested in exploring the interaction between the phone screen and the real world, trying to make a seamless relationship between the content on and off the device. He’s also interested in exploring the transition from exterior to interior localization within one app. I think a little field trip to the Curzon is in order…