Since the Internet began to be used to disseminate academic research (PubMed is now 16 years old), it has been frequently asserted, "Print is dead." However, many readers, this author included, still cling to hard copy. I continue to receive several publications in the mail, even those simultaneously available on the iPad. There remains something deeply satisfying in the experience of holding paper in our hands and turning the page.
One of the advantages of digital media, however, is its ability to move beyond static text to display video and play audio. Online comments and article linking also provide a level of immediate interactivity and connectivity that print cannot match. Nevertheless, print and electronic media need not be mutually exclusive, and technologies are emerging that seek to replicate these digital advantages in print. For example, in April 2011, Neurosurgery introduced QR codes to its printed pages.1 By scanning a QR code with a smartphone, a print reader can access digital video, supplemental text, or tables without going to a desktop or opening a laptop.
Augmented reality (AR) has existed in various forms since attempts to enhance the film-watching experience in the 1950s.2 In 1998, ESPN introduced the "yellow line" first-down marker during telecasts of American football, an element of the game now so prevalent that not seeing it when attending a game in person borders on disconcerting.3 With the advent of smartphones, AR entered a second stage of evolution. Using the GPS location of the user, an augmented reality application could display contact information for local businesses, restaurant reviews, or traffic updates. The addition of an AR iPhone application to Yelp's social review service in 2009 allowed users to display markers for local businesses on top of the camera view.4 This represented a true breakthrough in the use of AR within an existing social platform.
The revolution for AR in print was marked by the development of a mobile application triggered by visual recognition, rather than relying on GPS location. In 2011, Layar released an update to its mobile AR browser application that incorporated Layar Vision, ie, AR triggered solely by the detection of an image.5 Now AR could be activated in any physical location, as long as a defined reference image was visible.
In keeping with our desire to push the technological boundaries of Neurosurgery, we are pleased to announce that the journal has developed its own augmented reality layer. Using the Layar AR browser, a reader can access supplemental digital content, hear audio commentary, interact with the journal's social media networks, or visit the full-text mobile version of www.neurosurgery-online.com.
The cover of this issue and a number of images within have been activated for scanning. When you see the Layar icon in this journal (Figure 1), the page has been augmented with digital links and content viewable via the Layar mobile AR browser, a free application for iOS and Android platforms.
Step 1. Download the Layar browser for iOS or Android.
Step 2. Launch the Layar browser application.
Step 3. Point the browser at the cover of this month's (September 2012) issue of Neurosurgery, and tap the "Tap to View" button.
Step 4. After several seconds, the AR overlays will be projected onto the phone's camera view. The projections link to a variety of digital content and communication tools.
Duncan A. MacRae
Managing Editor, Neurosurgery
1. MacRae DA. Introducing QR codes: linking print and digital content via smartphone. Neurosurgery. 2011;68(4):854–855.
2. Rheingold H. Virtual Reality. New York, NY: Simon & Schuster; 1992.
3. Lake M. When the game's on the line, the line's on the screen. New York Times. January 27, 2000.