Augmented Reality has been around as a concept for a long time. It has very often been described in visions of the future as enhancing the real world with additional layers of digitally created information. As more people become aware of virtual worlds and seek to build within them, they generally start by trying to recreate some element of real life. This may be representations of themselves as avatars, existing buildings and offices they frequent, or real world metaphors such as chairs, tables and presentation screens. This is something I have observed as the willingness to engage with virtual worlds has extended past gamers and early adopters. The representation is focussed on the boundaries of the environment being used and on how to manipulate the building tools to create that vision, crafting for that environment.
We are seeing more things from the real world crossing over into non-game metaverse environments, e.g. tennis ball trajectories and scores from Wimbledon flowing into Second Life.
Is this augmented mixed reality? Are we creating Augmented Reality for virtual worlds? Is there a continuous circle feeding real things and virtual things into representations of one another?
This circle of real to virtual and back again has become increasingly easy to demonstrate.
I have been impressed recently with things like ARTag. The simplicity with which we were able to explore augmenting the real and the virtual leaves scope for many ideas to be tried out. With companies like Fabjectory turning virtual objects into real objects on a commercial level we are able to glimpse a local fabrication future. Creating the physical objects you need, when you need them, from virtual resources has quite a future I think. Many of these things have existed before, but not with this ease of access for the general population.
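As an illustration of how approachable this kind of marker-based augmentation has become, here is a minimal sketch in Python. It uses OpenCV's ArUco module as a stand-in for ARTag-style fiducials rather than ARTag's own SDK, and assumes a recent opencv-contrib-python build (4.7 or later); the image file name and dictionary choice are purely illustrative.

```python
import cv2

# Minimal fiducial-marker sketch, using OpenCV's ArUco module as a stand-in
# for ARTag-style markers (assumes opencv-contrib-python 4.7 or later; the
# file name and dictionary choice are illustrative only).
dictionary = cv2.aruco.getPredefinedDictionary(cv2.aruco.DICT_4X4_50)
detector = cv2.aruco.ArucoDetector(dictionary, cv2.aruco.DetectorParameters())

frame = cv2.imread("webcam_frame.png")              # one captured camera frame
gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
corners, ids, _rejected = detector.detectMarkers(gray)

if ids is not None:
    for marker_id, quad in zip(ids.flatten(), corners):
        # Each quad is the marker's four corner points in the image; that is
        # enough to anchor a virtual object over the physical marker.
        print(int(marker_id), quad.reshape(4, 2))
```

Once you know which marker is in view and where its corners sit in the image, you have an anchor for drawing a virtual object over the live camera feed.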
The ability to render information and data in multiple ways, with multiple transformations and augmentations, seems very exciting. We seem to have many of the pieces of the puzzle commercially available. We have mobile phones with colour screens, video cameras, GPS and wi-fi/3G connectivity. We do not have to rely solely on underlying programming interfaces; we are able to use other devices to instrument and understand a physical environment, such as cameras, GPS locations and many other types of real world sensor.
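To make the sensor side concrete, here is a small sketch of reading one very common real world sensor: converting a GPS receiver's NMEA GGA sentence into decimal degrees. The sentence used is a standard-format illustrative example, not captured data.

```python
# Minimal sketch of reading one kind of real-world sensor: a GPS receiver's
# NMEA "GGA" sentence, converted to decimal degrees. The sentence below is
# an illustrative example, not real captured data.
def parse_gga(sentence: str):
    fields = sentence.split(",")
    if not fields[0].endswith("GGA"):
        raise ValueError("not a GGA sentence")

    def to_decimal(value: str, hemisphere: str) -> float:
        # NMEA packs position as ddmm.mmmm (latitude) / dddmm.mmmm (longitude).
        degrees = int(float(value) / 100)
        minutes = float(value) - degrees * 100
        decimal = degrees + minutes / 60.0
        return -decimal if hemisphere in ("S", "W") else decimal

    lat = to_decimal(fields[2], fields[3])
    lon = to_decimal(fields[4], fields[5])
    return lat, lon

print(parse_gga("$GPGGA,123519,4807.038,N,01131.000,E,1,08,0.9,545.4,M,,,,"))
# -> roughly (48.1173, 11.5167)
```

A position like that, combined with a camera and a data service, is already most of what a phone needs to start overlaying information on its surroundings.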
Taking real world properties and enhancing a real world experience with those has been done. From cave paintings on a wall to speedometers in cars we have been able to express things about the world to enhance our view.
Head-up displays and projections of additional information have been used in a military context and by some high-end car manufacturers, taking information and representing it to augment reality.
The Wimbledon Second Life project was in part an extension of an idea to determine what was needed to rebuild the live experience in a virtual world from the technical data available. This extended the experience in a virtual world to people who were not able to be at the live event, which is the reason the championships website exists too. Projecting forward, it is possible to imagine re-rendering a sports event from the data captured using virtual world and game technology. Television does this for us already, except it is not something that individual users have control over; it is a re-rendering of a live event. Choosing from multiple TV camera angles goes some way to providing control for the viewer, but a complete photo-realistic virtual rendition powered by real life events does not seem that far fetched.
Considering how to completely recreate an event, re-render it and provide additional information to those in a virtual world is a sliding scale of detail: more data, more pixels, more people. The fidelity of the experience, both technically and socially, starts to increase as more effort is applied and as a market grows that expects a richer experience.
This gives us some more options to consider. If you are at the real world event and there is more to add to your experience, we should look to enhance the real. Using sporting events as the example, we already see things like the Hawk-Eye ball trajectory replay rendered on a big TV screen for the crowd to experience. That is physical data turned virtual and then re-injected into the real world again. The same data is used on the web and in the virtual world. It is, however, just television, the same view for all the people at the venue. Someone in the crowd could have had a wireless laptop open, accessed the web page or the virtual world and watched the replay, just as I tend to have the F1 live timing website open at home when watching the grand prix on television.
During the Wimbledon fortnight I experienced an unusual mix of virtual and real life. Firstly there was the technology: we had the data and we had a virtual build. The data is about matches, about points in the game of tennis. Everything is geared around the things that happen in playing the point in real life. We took some of the data that we use on the website and pushed it into a virtual world to show points and ball trajectories. I was physically at Wimbledon with my colleagues in the media bunker to meet our visiting corporate customers, and I was also present at the virtual Wimbledon venue to meet, greet and discuss.
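As a purely hypothetical sketch of the kind of step involved (the actual Wimbledon feed and build may well have been structured differently), the core of pushing a ball trajectory into a virtual build is a coordinate mapping from court-relative metres to in-world region coordinates; the origin, scale and sample values below are made up.

```python
# Hypothetical sketch of the coordinate step in pushing ball-trajectory data
# into a virtual build: map court-relative samples (metres) onto positions in
# a virtual region. The origin, scale and sample values are made up for
# illustration only.
COURT_ORIGIN_IN_WORLD = (128.0, 96.0, 22.0)   # where the court's corner sits in-world
WORLD_UNITS_PER_METRE = 1.0                   # 1:1 scale assumed

def court_to_world(sample):
    """sample = (x, y, z) in metres relative to one corner of the court."""
    ox, oy, oz = COURT_ORIGIN_IN_WORLD
    s = WORLD_UNITS_PER_METRE
    x, y, z = sample
    return (ox + x * s, oy + y * s, oz + z * s)

# A handful of made-up trajectory samples for one point, as a data feed might
# deliver them, converted ready for an in-world object to render.
trajectory_m = [(0.0, 11.9, 2.8), (6.0, 11.0, 1.9), (11.9, 10.2, 0.3)]
trajectory_world = [court_to_world(p) for p in trajectory_m]
print(trajectory_world)
```

An in-world script or object can then render the converted samples as a moving ball or a drawn arc.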
I was augmenting the data in the virtual world with some personal experience and descriptions of atmosphere and explaining behind the scenes things in live context. I did realize that in trying to recreate the event we did not yet have the additional elements from the real world that added to the social experience, because up until now there had been no need to instrument them. We did not know which chair a player sat on during a break, or that someone was lying on the floor suffering from cramp. Those sort of atmosphere elements are the ones that add to the experience as opposed to reporting the data facts.
It is much easier to augment the real event with an overlay of data about the match than it is to overlay the atmosphere onto the virtual. OK, so TV works: it shows the pictures and sounds and overlays TV graphics. At the event people can start to use commonly available technology to enhance their experience: mobiles, TV screens and so on. It does work as it is today for sporting events, but there is always room for improvement.
One of the web trends is for mashups. People provide data and services, and other people combine them in ways that make sense for them. That tends to be software based, e.g. maps with geotagged photos. Are we heading for a more sensory set of mashup concepts, mixing the real and virtual in all sorts of ways?
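The arithmetic behind the classic "geotagged photos on a map" mashup is a good example of how small the software glue can be: converting a photo's latitude and longitude into slippy-map tile coordinates at a given zoom level, which is how most web map stacks decide where to pin it. The photo location below is illustrative.

```python
import math

# Sketch of the arithmetic behind a "geotagged photos on a map" mashup:
# convert a photo's latitude/longitude into slippy-map tile coordinates at a
# given zoom level. The photo location below is illustrative.
def latlon_to_tile(lat_deg: float, lon_deg: float, zoom: int):
    lat = math.radians(lat_deg)
    n = 2 ** zoom
    x = int((lon_deg + 180.0) / 360.0 * n)
    y = int((1.0 - math.log(math.tan(lat) + 1.0 / math.cos(lat)) / math.pi) / 2.0 * n)
    return x, y

# e.g. a photo geotagged near Wimbledon, at zoom level 15
print(latlon_to_tile(51.434, -0.214, 15))
```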
Will we see things like slateit emerge for the real world as well as the virtual world?
Metaverses have opened up 3D, live online interaction and game style technology to a whole group of people who have never been near any of this. It is these people, who are not burdened with the pureness of gaming environments, or who do not use digital environments for escapism, that start to ask the questions about integration. How does this technology help my business? How does this social change alter my customer service? How can I get my existing business working in a virtual world? How can my virtual world presences fit into my existing business? Can I make my brand experience seamless across all channels?
Helping people understand that a metaverse does not have to be a stand alone channel, just as real life and the web do not have to be stand alone channels, sounds like the next big challenge. They are not just games, and they are also not just stand alone.
The BBC recently reported on some work done by the University of Edinburgh on mixing the real and virtual worlds. Their Spellbinder project allows you to take a photo of a well-known building which is then returned to you with additional information laid over the top - such as artwork adorning the walls.
Would be interesting to see if this could provide a basis for a real-life slateit.
Posted by: Nick O'Leary | Aug 16, 2007 at 10:03
@nick There are certainly a good few projects that are more public in how to capture and model things.
It is almost how to make the real world a service provider.
Wolf had posted this link to a BBC article on the BT futurologists.
I think we will see a long list of bits and pieces and I reckon we can build a real world slateit from some easily available bits :-)
Posted by: epredator | Aug 16, 2007 at 11:22
Here at Georgia Tech, we've created a full-blown AR interface to Second Life. We don't have a web page for the project yet (soon!) but have integrated live video capture and various kinds of spatial trackers into our modified SL client, along with developing a set of in-world conventions to define the relationship between a piece of land in SL and an area in the real world.
Our original goal was to create AR Machinima; the project is a collaboration between myself (Blair MacIntyre, in the School of Interactive Computing), Michael Nitsche and Jay Bolter (both in the School of Literature, Communication and Culture), along with Tobias Lang (a student visiting my lab from Ludwig-Maximilians-Universitaet in Munich; he's the one who actually built it!).
However, now that we've created it, we are starting to explore a whole bunch of things you might do when you can directly merge an area in SL with an area of the real world (e.g., games, distributed collaboration, mixing remote and local visitors to historic sites, etc.) As you point out in this article, the possibilities are endless! There are limits to using SL, of course, but it really gives one a taste of what might be possible.
We have an island (Augmented Reality) we are using for this work; feel free to drop by. (My avatar is "Blair Potluck")
Posted by: Blair MacIntyre | Aug 16, 2007 at 11:48
@blair That indeed sounds very cool work.
We have had a number of projects under our Extreme Blue programme (with IBMers and students) looking at similar sorts of ideas, but for a different goal.
Once the projects are done I will be able to post some links that may be of interest to you too.
The live action machinima falls nicely into my previous post about metaverse being as much about people and live performance as being about good 3d buildings :-)
Posted by: epredator | Aug 16, 2007 at 14:51
@Blair
I actually signed up for a Second Life account and fumbled my way through the settings (Janmar Kayo) so I could visit your island. I noticed there are a couple of floor plans with an XYZ pointer attached; do these reflect a real-world office somewhere? What are you doing on the island?
Posted by: Syntheticist | Aug 16, 2007 at 17:37
One of the goals with SLateIt was to experiment with large scale social augmented reality applications: you can build similar projects with RFID tags and HMDs, but you can't easily cover entire cities with RFIDs and give everyone an HMD for free (yet). Of course, even when you get the infrastructure for free in VR, you still need to make the application useful, easy to use and let people know about it. If only I had more time, or maybe some helpers...
Posted by: Jim Purbrick | Aug 16, 2007 at 18:11
Great to see a post in TN about Mixed Reality, which for me will be one of the most exciting areas driving uptake of virtual worlds from a commercial perspective. It is also (as Blair said) a new form that affords creativity in collaborative play, distributed narrative and a rich layer in the cross-media mix.
With my Director of The Laboratory of Advanced Media Production hat on, we have been helping teams create several Mixed Reality projects such as City Games, Inworld and Thursday's Fictions, the first of which looks at live players in a real city using locative mobiles playing with virtual players in Second Life. We also create MR collaborative play, and our ARG/Quest wiki pages cover some of the recent parallel Second Life, real location based narrative/quest/ARG type games. LAMP also ran a series of seminars on Mixed Reality and there are some enhanced podcasts starting here on the topic featuring Tony Walsh and myself.
With my commercial Head of Virtual Worlds hat on for The Project Factory, who created the most visited brands in SL like BigPond and ABC, we are taking on more and more projects that start to move into the uncanny valley from an environmental perspective: spaces that feel real vs models of icons. Personally I have a real problem with over-representation, but I love capturing, as the original post said, the organic essence of a place or event and removing the shackles of being overly detailed on the visual side. I concentrate more on making social spaces that echo the kinds of interaction in the real ones. We have just recently completed a small section of Melbourne, for example (to be launched soon), that feels real and could be used as a Mixed Reality base, but the accuracy comes not from data feeds but from the layers of emotional triggers in the sound, 'narrative' and movement. The Project Factory is now developing many formats through its Format Factory incarnation, many of them projects that cross over between virtual worlds and TV/Film, and sees that as one of the key areas that will reinvigorate TV experiences. It points back, as I said in the podcast above, to my BBC days when we were already piloting what we called at the time 'inhabited TV'; now the mechanisms are in place to make a lot of those ideas come to life.
A great area for TN to explore more I think.
Posted by: Gary Hayes | Aug 16, 2007 at 19:38
@jim I brought your slateit up as it does show the way on being able to experiment in an environment rich with sensors and information in a suitably digital form. We have a few Hursley projects using virtual world technology to explore sensing and determine patterns of use whilst the world catches up and invents the sensors to match. This is all good stuff :-) Many people will experience slateit and may well say "how do we do that in real life?".
@gary thank you for the links and the great comment. I love the phrase "inhabited TV". I do agree also that there is a danger of over-representing information for the sake of it. Providing people are able to dial their immersion up or down at the boundary between real and virtual, I don't think this will be a problem.
For example, we experience stock market information as a simple news report on the BBC in the morning, derived from the same depth of information a market trader sees across their multiple screens.
Very often we see attempts to deliver every piece of information in one go. Those overly busy, badly designed websites that we sometimes see overwhelm many casual users. It may be, though, that engaging other senses and moving to a 3D model fits better with the human brain and allows us to use much of the processing power we lose when using ordinary screens and text. Peripheral vision is one example. I force myself to have virtual peripheral vision with a second monitor just out of sight, with feeds ticking past, to allow immersion in whatever it is I am doing and to get a sense of what else is happening.
Also well done on getting ahead of IBM with the Pond. We had better go and invent some more things :-)
Posted by: epredator | Aug 17, 2007 at 03:53
@epredator, I'll be interested to hear more about what you are up to. I've bounced ideas off a variety of IBMers in the past year, about how to use AR and MMOs for "real" things, but haven't gotten much traction. :) Some other companies are showing interest, though, so perhaps we'll be able to start pursuing those ideas soon! It is an exciting place to play!
Posted by: Blair MacIntyre | Aug 17, 2007 at 16:48
@Syntheticist, sorry I missed you when you came to visit the island. I saw your "friend" request when I logged in, but didn't know who you were! :)
Yes, the floor plans reflect two spaces in our building (the lounge outside my office, and the lab in which we do the AR/Facade project). Both had high accuracy trackers in them, so we are using them for samples.
Most of the rest of the island is "in progress". I would like to use it for teaching and lectures, as well as housing an AR/VR educational exhibit. I plan on having students in my Virtual Worlds class next semester build exhibits explaining AR/VR concepts in SL as one assignment; it seems like a good pedagogical tool ("you have to understand it to explain it to others" and "using 3D to explain 3D means you REALLY need to understand 3D").
We want to use the space for more AR/SL things. I plan on doing an outdoor project this fall, for example, using GPS and some of our good orientation sensors. We are also exploring things with some historic sites in Atlanta.
Posted by: Blair MacIntyre | Aug 17, 2007 at 16:53
@Jim Purbrick, I think the ideas behind slateit are cool; exploring AR inside of SL is a fun way to mock up AR ideas. Doing large scale stuff is hard, as you point out, since things like RFID (even if it could be widely deployed) don't have the accuracy to support 3D registration. GPS is still too inaccurate, too. But, eventually, we'll get there! :)
Posted by: Blair MacIntyre | Aug 17, 2007 at 16:58
How's this for augmented reality? UCF's School of Film and Digital Media put on a play performance that went between the real world and the virtual world. Cool stuff!!
http://news.com.com/Virtual+art+heist,+vivid+impressions/2100-1043_3-6190449.html
Posted by: kg | Aug 22, 2007 at 11:36
Ian, there's only one reality. It's our physical world, bodies, wealth, health, interests. Your job, my income, and what I'm spending my money and my time on. Engaging in VWs for more than the pure fun of playing a fantasy means two parties: a scammer and a loser. Unless you pay me to play your game.
Posted by: Amarilla | Aug 23, 2007 at 12:15