Wednesday, September 2, 2009

Virtual Reality and Out of Body Experiences

Although our 'memestreme RCW' project is creating an Augmented Reality (AR) app running on an iPhone, research into the field necessarily encompasses exploring what's going down in the Virtual Reality (VR) camp. AR and VR have much in common: many of the techniques, interfaces, user experiences and content styles overlap, and, as we found out last year, each can serve to image the other's reality.
In last year's project [http://pancultural-e.blogspot.com], where we constructed a proof-of-concept AR tour of the Alice Springs Telegraph Station (ASTS), it was challenging to present our reports so that people could readily understand and visualise how our AR app operated from the end user's perspective.

So we built a structural analogue of the ASTS within the Virtual Reality confines of Second Life (2L) and programmed it so that when folk came within range of a space or object that had supporting media attached to it, they were offered the option of playing that media (there's a rough sketch of the idea just below). Apart from 'flying', most of what you can do in 2L can be modeled across into AR applications.

That being said, cop an optic below at some of the research going on in medical and neurological establishments exploring the vagaries of what we call 'Out of Body Experiences'. Due to the nature of the enquiries, and their stunning results, computer game designers are closely watching developments, as it seems there may well be profitable and engaging research spinoffs that can be picked up to enhance the user experience big time. Read on.
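For the technically curious, before we get to that research, here's a minimal sketch of the proximity-trigger idea. It's written in Python for readability rather than the LSL that actually scripts objects in 2L, and the hotspot names, coordinates, radii and media URLs are made up purely for illustration:

```python
# Hypothetical sketch of a proximity-triggered media offer, a la our 2L build.
from dataclasses import dataclass
from math import dist

@dataclass
class MediaHotspot:
    name: str
    position: tuple        # (x, y, z) in region coordinates, metres
    trigger_radius: float  # metres
    media_url: str         # the audio/video attached to the space or object

# Placeholder hotspots -- names, positions and URLs are invented for this sketch
HOTSPOTS = [
    MediaHotspot("Telegraph Office", (128.0, 92.0, 23.5), 5.0, "http://example.org/telegraph-office.mp3"),
    MediaHotspot("Blacksmith Shop",  (140.0, 70.0, 23.5), 4.0, "http://example.org/blacksmith.mp3"),
]

def offers_for(avatar_position):
    """Hotspots whose media the visitor should be offered at this position."""
    return [h for h in HOTSPOTS if dist(avatar_position, h.position) <= h.trigger_radius]

# As the avatar wanders past the old telegraph office...
for hotspot in offers_for((126.0, 93.0, 23.5)):
    print(f"You are near the {hotspot.name}. Play its media? {hotspot.media_url}")
```

In 2L itself the equivalent would be a sensor or collision event on the scripted object; the essential logic is just a trigger radius plus an offer to play.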

Hurt my Avatar and I feel pain.
[New Scientist 15 August 2009 page 15]
The dream of many computer game designers has come one step closer to reality with the demonstration of a technique that allows people to identify more fully with a virtual body or avatar. It builds on previous research in which neuroscientists gave subjects the sensation that they were having an "out of body experience", and tricked people into experiencing the sensation that their avatar was being touched.
In the latest experiment, a camera filming each subject's back produced an image that was projected through a head-mounted display to generate a virtual body 2 metres in front of them. Repeated stroking of the participant's back, combined with the sight of the doppelganger being stroked, created the sensation that their virtual body was being touched (PLoS ONE, DOI: 10.1371/journal.pone.0006488).
Vibrating pads with flashing lights were positioned on the subjects' backs, so that they saw flashes on their virtual bodies at the same time as the vibrating pads on the real bodies were activated. The two did not always coincide, and when participants were asked to indicate at which point on their virtual body they felt the vibrations, some reported that the vibrations were at the site where the flash appeared, rather than where the pad was activated. The system could potentially help people get a feel for prosthetic limbs.


Out of Body Experience: ABC Radio National's Science Show with Robyn Williams, 18/04/09
Here is the 9-minute segment broadcast earlier this year on ABC Radio National. Naomi Fowler interviews BBC journalist Valentine Low, who was sent to the Karolinska Institute in Stockholm to meet a bunch of neuroscientists. So, as well as pushing the envelope to uncover whether these OBEs can be integrated into the computer realities of the living, there is also the lingering unanswered scientific question of all those folks who are resuscitated in hospital after having technically died. You've probably heard the stories from people who were pronounced clinically dead on the table but were still able to witness their trauma surgeon rushing around, knocking over equipment, say, in the rush to breathe life back into the patient. Now here's the deal: no blood flow in the body, none in the brain, a flatline on the EEG as well, so no neuronal activity. So how do such minutely accurate reports of operating-room views and events emerge from some of these resuscitated patients when their brain is flatlined and they are dead? The implication is that some form of sensory absorption is occurring, in the 'mind', as the patient looks down from, say, 2 metres above the operating table, and is obviously being remembered/recorded in unnerving detail. Dude, where and what is the storage medium while this is happening? Yeah, I know it's a little off track, but it's part of the interview. FYI the cybernetic OBE stuff is at the beginning of the show.
Go here to read the transcript and/or download the podcast:
http://www.archive.org/details.php?identifier=Outofbodyexperience_1
+++++++++++++++++
Seriously though, Augmented Reality and Virtual Reality are twins, or siblings at least. Both are examples of powerful cybernetic augmentations/amplifications of our human sentient and sensory capacity. AR superimposes 'just in time' information as a result of sensing our physical geospatial environment; VR does the same within a digital analogue of some real or totally fictitious reality, e.g. Second Life or a full-immersion gaming simulation. We are even extending this down to real nano levels: check out the stunning tour of the AlloSphere from www.ted.com, where scientists walk inside a huge sphere that is an extension of various high-end imaging technologies, such as scanning tunneling microscopes, and study such small-scale phenomena as protein folding and electron spins from the perspective of a nanoscale human observer. Fantastic Voyage.
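To make that 'just in time' sensing concrete, here's a toy Python sketch (not our memestreme RCW code) of the basic geospatial test an AR app runs before superimposing a label: how far is the phone's GPS fix from a tagged point of interest, and is that close enough to bother the user? The POI label, coordinates and radius are invented placeholders:

```python
# Toy sketch: gate an AR overlay on distance from the phone's GPS fix to a POI.
from math import radians, sin, cos, asin, sqrt

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in metres between two lat/lon points."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = sin((lat2 - lat1) / 2) ** 2 + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 2 * 6371000 * asin(sqrt(a))

# A single hypothetical point of interest -- label, coordinates and radius are placeholders
POI = {"label": "Telegraph Station repeater room", "lat": -23.670, "lon": 133.887, "radius_m": 75}

def overlay_for(gps_lat, gps_lon):
    """Text to superimpose on the camera view, or None if the user isn't near the POI."""
    d = haversine_m(gps_lat, gps_lon, POI["lat"], POI["lon"])
    return f"{POI['label']} ({d:.0f} m away)" if d <= POI["radius_m"] else None

print(overlay_for(-23.6704, 133.8874))  # close by -> overlay text
print(overlay_for(-23.7000, 133.8700))  # a few km off -> None
```

A real app layers compass bearing and camera pose on top of this so the label lands on the right pixel, but a distance gate like this is the first filter.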
++++++++++++
