I spoke at the National Library in Pretoria this morning about Augmented Reality and Libraries. [Disclosure: I’m a bit of a fraud. Neither a librarian nor a seasoned AR guru]. As a heutagogist, I went to YouTube, because that’s where we go when we don’t know, and found three examples of augmented reality from films.
The first example I found was Terminator 3, where Arnie’s heads-up machine vision comes in useful for tracking down John Connor. The second was the meta-commentary on product placement at the start of Fight Club, where the narrator (Edward Norton) describes his IKEA-furnished apartment while each item is visually named, described and priced, very much like a catalogue. The third was Tom Cruise’s character in Minority Report, who uses a gesture-based interface (and a glove) to manipulate and interact with a mass of multimedia content on screen.
The three films allowed me to gently introduce three types of augmented reality (without sounding too geeky): a common understanding of location-based (T3), pattern-based (Fight Club) and surface-based (Minority Report) augmented reality was established. Hollywood was unable to provide any ready examples of holographic AR or outline AR.
So, now – the crux of the presentation. How can we use AR in a library? The three films allowed me to suggest some ideas. My campus has 15 libraries. With location data and a built-in GPS, a phone could direct a lost student to the nearest library, or to an appropriate person or section within it. We could consider applying pattern recognition to books, so that when a user looks through their “layer” (whether a phone or a pair of goggles), they see associated tags, reviews and comments. Or maybe we need to create augmented library desks and install augmented catalogues, so that placing a book under a camera virtually stamps out your borrowed book. Holograms of librarians may be taking things a bit too far.
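The nearest-library idea above is essentially a distance lookup over known coordinates. As a minimal sketch (the library names and coordinates below are invented for illustration, not real campus data), a phone's GPS fix could be matched against a table of library locations using the haversine great-circle distance:

```python
import math

# Hypothetical campus libraries with (latitude, longitude) pairs.
# Names and coordinates are illustrative assumptions, not real data.
LIBRARIES = {
    "Main Library": (-25.7545, 28.2314),
    "Law Library": (-25.7560, 28.2290),
    "Science Library": (-25.7530, 28.2350),
}

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points on Earth, in kilometres."""
    r = 6371.0  # mean Earth radius in km
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def nearest_library(lat, lon):
    """Return the name of the library closest to the student's position."""
    return min(LIBRARIES, key=lambda name: haversine_km(lat, lon, *LIBRARIES[name]))
```

A real service would then hand the result to a mapping layer for turn-by-turn directions; this sketch only shows the lookup step.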
Sadly, our expectations are running far ahead of reality. Mobile batteries, cameras and the associated applications are not sufficiently standardised to support such visions. AR markup is fragmented, and there are no common standards to ensure that metadata is searchable, location-aware and federated. If we want to look at AR in libraries, we should be looking at other invisible but present nodes on the network: the students using social media, like Twitter or Foursquare, to create a networked blanket of media and place. Foursquare, for example, collects experiential and location-based data. A library with a Foursquare account could encourage students to record their book activities and reward repeated visits to the library. While this realtime data cannot yet be visualised through the camera’s lens, it could allow a librarian, for example, to answer a general query about the library.
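The reward-repeated-visits idea is, at bottom, just counting check-ins per student at a venue. A minimal sketch, assuming check-ins arrive as (student, venue) pairs from a Foursquare-style feed (the names, log and threshold below are all invented for illustration):

```python
from collections import Counter

# Toy check-in log as a location service might expose it:
# (student_id, venue) pairs. Entirely hypothetical data.
checkins = [
    ("thandi", "Main Library"),
    ("thandi", "Main Library"),
    ("sipho", "Main Library"),
    ("thandi", "Main Library"),
]

def reward_candidates(log, venue, threshold=3):
    """Return students whose check-ins at `venue` meet the reward threshold."""
    counts = Counter(student for student, v in log if v == venue)
    return sorted(student for student, n in counts.items() if n >= threshold)
```

With the toy log above, only the student with three check-ins qualifies; a real scheme would pull the log from the service's API and decide what the reward actually is.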
Drawing the net out into the phenomenal world appeals to our nomadic sensibilities. Hollywood visions of students taking out their cell phones and interrogating their surroundings are compelling. But if AR is going to go beyond sci-fi & cyberpunk visions of our expected futures, then the personally informative overlay available in the library needs to move beyond novelty and offer seamless, minimally intrusive data.