Published in 1993, CyberGeneration was a radical departure from R. Talsorian Games’ previous successful Cyberpunk line. Whereas Cyberpunk focussed on style over substance, high-calibre firearms and heavily armoured Solos backed up by geeky NetRunners, CyberGeneration focussed on the kids of these embittered mercenaries and endowed them with nanotech-derived superpowers.
In CyberGeneration, the Net became actualised: Virtual Reality turned into Augmented Reality. Individuals would see augmented reality objects as readily as they would see real-world objects – 3D objects in realspace. At the time I knew that IP addresses were ‘geotagged’, but this was long before we realised that GPS units could be embedded into superslim phones that were always net-connected.
This week we also saw Metaplace, one of the many 3D Virtual Worlds with User Generated Content, fall by the wayside. “It raised $9.4 million over two rounds of funding with that goal in mind, managing to get the buy-in from new investors Marc Andreessen and Ben Horowitz last October,” yet the closure has just been announced for January 1st, 2010. I think this failure is typical: Virtual Worlds require you to sit in front of a computer and limit your interaction to a keyboard and mouse pointer. In a 3D world, the mouse pointer becomes a single fingertip by which you interact with the world. For Augmented Reality, we have to avoid the mobile phone screen becoming a keyhole through which we view the world. We have to be able to touch it and to hear it.
Earlier this week, Edo Segal wrote a guest post on TechCrunch describing a cyberpunk story he wrote 16 years ago which involved augmented reality, and I’d hesitate to link this with CyberGeneration (despite the identical publishing year).
Edo reckons the building blocks of an augmented reality system have to be more than we currently have, which amounts to little more than search. He sees the four main blocks as being:
- Realtime Web (Twitter, news flows, world events, and other information which relates to changes in the world)
- Published Information (sites, blogs, Wikipedia, etc.)
- Geolocation Data (your location and information layers related to it, including your past locations and that of your friends, as well as geo-tagged media)
- Social Communications (social graph updates, IMs, emails, text messages, and other forms of signal from your friends).
and he handily provides a diagram.
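To make the four blocks concrete, here is a minimal sketch of how they might be fused into a single location-anchored feed for an AR overlay. All of the names and the crude proximity filter are my own illustration – Segal describes the building blocks, not an API.

```python
# Hypothetical sketch: items from the four stream types (realtime,
# published, geolocation, social) merged and filtered by proximity.
from dataclasses import dataclass
from typing import List

@dataclass
class StreamItem:
    source: str       # "realtime", "published", "geo" or "social"
    text: str
    lat: float
    lon: float
    timestamp: float  # higher = more recent

def nearby(items: List[StreamItem], lat: float, lon: float,
           radius_deg: float = 0.01) -> List[StreamItem]:
    """Keep items within a small lat/lon box, newest first - a
    stand-in for the geolocation layer tying the streams together."""
    hits = [i for i in items
            if abs(i.lat - lat) <= radius_deg
            and abs(i.lon - lon) <= radius_deg]
    return sorted(hits, key=lambda i: i.timestamp, reverse=True)

# Usage: blend a realtime update, a published fact and a friend's
# message near one spot; a distant item is filtered out.
feed = [
    StreamItem("realtime", "Street market opening now", 54.597, -5.930, 3.0),
    StreamItem("published", "City Hall, built 1906", 54.596, -5.930, 1.0),
    StreamItem("social", "Meet you at the fountain?", 54.597, -5.929, 2.0),
    StreamItem("realtime", "Traffic jam on the M1", 54.500, -6.100, 4.0),
]
overlay = nearby(feed, 54.597, -5.930)
print([i.text for i in overlay])
```

The point of the sketch is the blending: no single stream is the product, it’s the context created by laying them over one place at one moment.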
but he says something in the TechCrunch post which resonates:
One only needs look at a teenager today as they do their homework, watch TV, play a game, and chat while watching their Facebook stream to get a sense for humanity’s expanding affinity to consume ambient streams. Their young minds are constantly tuning and adapting to an age of hypertasking.
and I reckon that this is being unfair to some of us oldies. In March of this year, I met with Ewan McIntosh, one of the 4IP commissioners, and part of our round-table chat included his admission that he watches TV with a laptop on his lap and a mobile phone on the arm of the chair beside him. This is how my household watches TV. The concept of not being connected while consuming information is alien to me. I want to look around the periphery of it, I want to dig deeper and, at the moment, technology is failing me.
I’ve said this before and I’ll say it again. Augmented Reality at the moment is a sham. It’s all search and toys. Either you’re pulling geotagged information from one of the search engines or content silos (and I include Wikipedia here) or you’re using geotags and fiducial markers to drop toys here and there. It’s all smoke and mirrors. It’s not enough to be just tagging stuff and re-presenting it as your own. There has to be something novel coming out of it – even if it’s just the presentation of context.
The CyberGeneration example is a good one. Its solution, Virtuality, included the presentation of interactive tools – whether those be musical instruments or even keyboards and computers. We’re touching the edges of a world where hardware itself is virtualised and made into software. We’ve already seen this done in something as simple as the ‘Compass’ app on the iPhone – it’s a virtualised hardware solution projected onto a multi-purpose handheld display. At some point we’ll figure out how to internalise that display and I reckon that MIT’s Sixth Sense technology is probably one way of doing it (though I guess that having a projector on your chest is a limitation of the technology we’re currently using rather than a potential prototype).
In early 2010, I’d like to invite some of the other players in AR-related technology in Northern Ireland, like Awakin, Filmtrip, the Design Zoo, ReDisc0very, Ulster MediaScapes and others to have a “DevDays”-type event where we talk about the very possibilities of augmented reality solutions. Some of them read this blog, some don’t – but I’d like to hear from other folk in the province (and beyond) who are interested in talking and/or presenting in a BarCamp-esque situation.