Wednesday, October 14, 2009

iPhone accuracy

We ran some field tests in Darwin with Bill Wade at CDU and noted there were discrepant readings across our various GPS sensors: an iPhone 3GS, a Garmin 60CS, a Garmin 60CSx, and a bolt-on GPS unit for a mid-range digital SLR camera.

I discussed this last week with an ongoing supporter of our project, Andy Roberts from NT Land Information Services, and was able to get very accurate survey point readings on our street to 'calibrate' the accuracy of our GPS sensing units. Armed with plans of the survey points, accurate down to the centimetre level, my son and I went out and crunched the numbers across the Garmin 60CSx and the iPhone 3GS.
Cutting to the chase: we were able to get below 3 metres with the Garmin. The iPhone averaged out around 20 metres, got as accurate as 10, and twice was about 50 metres out [12 readings].  The Garmin has an Average function that lets you leave the unit reading over time to sharpen its fix, and we achieved around <3 metres after a 3 minute sit. I don't think the iPhone has this higher-end 'average over time' ability.  All things considered the iPhone does a pretty good job; the GPS transceiver is tiny from the pics I've web-sourced, and this is its first iteration [it will either get better next version and/or someone will offer an accurate GPS bolt-on].  Now if they offered that bolt-on as Wi-Fi connected it would be ideal, fingers crossed.
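For those playing along at home, here's a minimal Python sketch of what that kind of averaging function is doing; it's our own illustration with made-up readings, not Garmin's firmware.

```python
# A minimal sketch (not the Garmin firmware) of what a position-averaging
# function does: collect repeated lat/lon fixes while the unit sits still,
# then report the mean position. Random fix errors tend to cancel out,
# which is why a 3-minute sit tightened our reading to under 3 metres.
import math

def average_fix(fixes):
    """fixes: list of (lat, lon) tuples in decimal degrees."""
    lat = sum(f[0] for f in fixes) / len(fixes)
    lon = sum(f[1] for f in fixes) / len(fixes)
    return lat, lon

def spread_metres(fixes, centre):
    """Rough RMS scatter of the fixes around the averaged position."""
    lat0, lon0 = centre
    m_per_deg_lat = 111_320.0                      # approx metres per degree of latitude
    m_per_deg_lon = m_per_deg_lat * math.cos(math.radians(lat0))
    sq = [((f[0] - lat0) * m_per_deg_lat) ** 2 +
          ((f[1] - lon0) * m_per_deg_lon) ** 2 for f in fixes]
    return math.sqrt(sum(sq) / len(sq))

# Example with made-up readings near Alice Springs (hypothetical numbers):
fixes = [(-23.69800, 133.88070), (-23.69803, 133.88065), (-23.69798, 133.88073)]
centre = average_fix(fixes)
print(centre, round(spread_metres(fixes, centre), 1), "m scatter")
```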
The other thing to be aware of is the manufacturer's accuracy specification, which goes something like this for Garmin: for 95% of the time it will get you within a 10 metre radius, for 4.5% of the time you'll get within 30 metres, and for 0.5% of the time you may be as much as 300 metres out.  Nice waiver, eh?
Another option we intend to interrogate is reverse geocoding the route entirely; that is, laying the route out along the roads and tracks depicted in the map imagery, without having to log the route with a GPS navigator first.
As mentioned in earlier posts, 60 to 78% of Australia, depending on who you talk to, is without 3G signal.  Here in the Alice, surrounded by a desert the size of Europe... we lose our 3G connectivity 10 to 15 km out.  Hence our work and research on a system that will overlay Augmented Reality information for the outback/remote tourist or professional.

As well as the tests we ran in the Alice, here's a London-based scenario that backs up the conclusions we have come to.  Remember also that the iPhone gains extra position-fix information from cell phone towers.
I've just started testing the fidelity of the iPhone GPS in outback locations [no 3G signal, ergo no cell phone tower triangulation assists]. Early results are confusing!
[Embedded YouTube video]

Purple line: route taken. White line: iPhone GPS track. The cycle computer indicated total distance travelled as 3.56 miles; the iPhone track indicated 3.4 miles.
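As an aside, here's roughly how a total track distance gets pulled out of a GPS log: sum the great-circle (haversine) distance between successive fixes. This is a minimal Python sketch with invented coordinates, not the actual London track data.

```python
# A minimal sketch of how a track log's total distance is calculated:
# sum the great-circle (haversine) distance between successive GPS fixes.
# The coordinates below are purely illustrative.
import math

def haversine_m(p1, p2):
    """Great-circle distance in metres between two (lat, lon) points."""
    R = 6_371_000.0  # mean Earth radius in metres
    lat1, lon1, lat2, lon2 = map(math.radians, (*p1, *p2))
    a = (math.sin((lat2 - lat1) / 2) ** 2 +
         math.cos(lat1) * math.cos(lat2) * math.sin((lon2 - lon1) / 2) ** 2)
    return 2 * R * math.asin(math.sqrt(a))

def track_length_miles(points):
    """Cumulative length of a track, point to point, in miles."""
    metres = sum(haversine_m(a, b) for a, b in zip(points, points[1:]))
    return metres / 1609.344

# Illustrative three-point track:
track = [(51.5007, -0.1246), (51.5033, -0.1195), (51.5055, -0.1150)]
print(round(track_length_miles(track), 2), "miles")
```

A jittery track can read shorter or longer than a wheel-based cycle computer depending on whether the logged points cut corners or wander, which fits the 3.4 versus 3.56 mile gap above.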

Friday, October 9, 2009

We are obviously in the early days of mobile phone AR. How do you see it developing?

Rob Manson and Alex Young from BuildAR had this to say (a great description; the bold emphasis is mine).

"Well, I’m working on a broader research project on Pervasive Computing and I think this is a core part of that evolution. The interfaces are still quite clunky and having to hold up and wave around your phone is still quite a clumsy experience.

I think quite soon we’ll see more immersive display devices start to spread. I’m running a session on this at Web Directions South and we use this underlying theory to inform most of our business/product strategy development.

Basically the distance between the network and the user is collapsing. The distance between the display and the user is collapsing. And the distance between the physical interface (e.g. think of gestures) and the user is also shrinking. This means our overall experience of space and even who we are is changing.

This all seems a bit futuristic, but glasses with displays built-into them should start to spread quite soon, all powered by mobile devices. And there’ll be even more interesting options too. Just think how quickly iPhones and Bluetooth headsets have become common everyday objects.

The opposite side of this is the spread of wireless digital cameras.

Combine the two and you open the door to rich and immersive Augmented Reality where you can shift your perspective constantly and freely.

I think this is the start of something really fascinating!"

Taken from an interview listed on the Sydney Powerhouse Museum's joint project. Read the full article here.

Thursday, October 8, 2009

Traveling iClass

 The new breed of mobile phones such as the iPhone 3GS and Android handsets have three base characteristics that set them apart from their lessers:
  • Sensor Supported
    • Vision via camera and camcorder
    • Audio via microphone
    • Gravity/acceleration via accelerometers
    • Touch via the GUI
    • Electromagnetic: everything from GPS signals, Wi-Fi transceivers, Bluetooth and cell phone signals to compass bearings
      • This area is evolving rapidly as large companies and home coders alike apply 'collective intelligence' to assess what the market wants, and how to go about providing it.
      • I also envisage a new breed of sensor apps coming online as wired and wireless sensor 'bolt-ons' are connected to the phones to provide perception across, say, temperature, microscopy, smell and even taste.
  • Internet Connected
    • Within a given 3G signal umbrella you can source the web.
  • Location Aware
    • Via GPS signal processing and cell phone triangulation (a rough sketch of the triangulation idea follows this list).
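As flagged in the last item above, here's a rough sketch of the cell tower idea: treat the handset's position as a weighted centroid of the nearby towers, with stronger signals counting for more. This is just the concept in Python with invented numbers; the handset vendors' actual algorithms are their own.

```python
# A minimal sketch of the idea behind cell-tower assisted positioning
# (not Apple's actual implementation): estimate position as a weighted
# centroid of nearby towers, weighting stronger/closer towers more heavily.
# Tower coordinates and signal strengths are invented for illustration.

def weighted_centroid(towers):
    """towers: list of (lat, lon, signal_strength), strength in arbitrary units."""
    total = sum(s for _, _, s in towers)
    lat = sum(la * s for la, _, s in towers) / total
    lon = sum(lo * s for _, lo, s in towers) / total
    return lat, lon

towers = [
    (-23.700, 133.880, 5.0),   # strong signal, nearby tower
    (-23.690, 133.870, 2.0),
    (-23.710, 133.895, 1.0),   # weak signal, distant tower
]
print(weighted_centroid(towers))
```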

I'd been trying to get a suitable acronym from SS, IC and LA. Lassic, Ssicla... and finally iClass appeared out of the mix...

 just like the dyslexic guy who walked into a bra... and said Ouch! It was an iron bar.

Albert gives the nod to memestreme


Couldn't resist it!
http://www.hetemeel.com/einsteinform.php

Wednesday, October 7, 2009

BuildAR and Layar heads up.

With reference to the next blog down, we discussed potential VET training and employment outcomes for "Geo-Locative Media Studies". Given that most job descriptions making use of such training don't currently exist, my blog entry may appear a trifle tenuous, but read on. I mentioned Layar, which is:
"Layar Reality Browser adds 3D to its Platform {Oct 6 09}
Layar announced the addition of 3D capabilities to its augmented reality browser platform. With 3D, developers can tag real-life objects with 3D text, place 3D objects in real-world space, and create multi-sensory experiences. The addition of 3D enables Layar developers to create more realistic and immersive augmented reality experiences for mobile devices.
More information at www.layar.com/3D
Layar is GLOBAL, with lots of content layers for everyone. All the news is on the blog and in the press section."


Anyway, stop the press, contemplate some emergent VET job opportunities and cop an optic on this Aussie start-up, BuildAR.
"Click here for the full blog transcript of the excerpt provided below.

"On Saturday night at our (very rainy) Common Ground meetup in Sydney, Rob Manson and Alex Young from BuildAR demonstrated the first version of their augmented reality mobile toolkit using images from the Powerhouse’s geocoded photographs in the Commons on Flickr.
This work riffs around the early mashup from Paul Hagon where he combined the historic photos with Google’s Street View; and the ABC’s Sydney Sidetracks project.
But then makes it mobile – replacing the Street View with the actual view through the camera of your mobile phone.

I asked Rob a few questions -
F&N – What is this Augmented Reality thing you’ve built? What does it do?
The first service is BuildAR and it is a service built upon the Mobile Reality Browser called Layar.
Layar uses the GPS on your mobile to work out where in the world you are, then it uses the digital compass to work out which direction you’re facing (e.g. your orientation). From this it can build a model of the objects and places around you. Then as you hold up your mobile and pan around, it can overlay information on the live video from your camera that you see to highlight where these objects and places are.
BuildAR let’s you quickly and easily add, search and manage your own collection of Points of Interest (POIs) to create your own Augmented Reality layer. You can do this via a standard PC web browser, or you can do it via your mobile phone. You can create a free personal account and get started straight away creating your own private POIs or you can make public POIs that other people can view too. All it takes is a few clicks and they are shared or published in real-time.
You can also use the service to create fully branded and customised layers.
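Stepping outside the quoted interview for a moment, the core calculation Rob describes is easy enough to sketch: from the GPS fix and compass heading, work out the bearing to each POI and whether it sits inside the camera's field of view. The Python below is our own guess at the concept with invented coordinates, not Layar's or BuildAR's code.

```python
# A minimal sketch of the Layar/BuildAR-style idea (our guess at the concept,
# not their code): from the phone's GPS fix and compass heading, compute the
# bearing to each POI and check whether it falls inside the camera's field
# of view, so a label can be overlaid on the live video at that spot.
import math

def bearing_deg(user, poi):
    """Initial bearing from user (lat, lon) to poi (lat, lon), 0-360 degrees."""
    lat1, lon1, lat2, lon2 = map(math.radians, (*user, *poi))
    dlon = lon2 - lon1
    x = math.sin(dlon) * math.cos(lat2)
    y = (math.cos(lat1) * math.sin(lat2) -
         math.sin(lat1) * math.cos(lat2) * math.cos(dlon))
    return (math.degrees(math.atan2(x, y)) + 360) % 360

def in_view(user, heading_deg, poi, fov_deg=60):
    """True if the POI lies within the camera's horizontal field of view."""
    offset = (bearing_deg(user, poi) - heading_deg + 180) % 360 - 180
    return abs(offset) <= fov_deg / 2

user = (-23.698, 133.881)               # illustrative coordinates only
telegraph_station = (-23.670, 133.888)
print(in_view(user, heading_deg=10, poi=telegraph_station))
```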
Follow the links for more info; very interesting. Coupled? Entangled? with the research scurrying apace at Microsoft's Photosynth labs, Washington University's CrowdFlow [similar to an app from a Georgia university which uses live CCTV data to populate virtual landscapes with real-time demographics {and Google algorithms to auto-blur faces and licence plates}] and Sameer Agarwal's 'Building Rome in a Day' project [machine-driven suction of pics from public sites to auto-compose 3D VR scapes], to name but a few, it is just getting a whole lot easier to envisage Geo-Locative career options for our erstwhile student population.
Consider how many times, from what angles and from what altitudes our very own iconic Ayers Rock has been photographed... In many ways it will be easier to attain VR/AR 'models' from the timeless terrain of our sunburnt centre.  The unsealed roads are about the only things on the terrain that shift on a decade-by-decade timeline, as floods, termite mounds etc. urge road deviations.
The 'granularity' [the volume and density of media available for a given coordinate] of most of the locations along the Red Centre Way will be much less than, say, the roof of the Sistine Chapel, and it will be a while before Google Street View arrives, but the research efforts and initiatives above are talking about building Rome and Dubrovnik models from 50,000 pics, in a day.  I'd say right now there's already a helluva lot of pics from the Red Centre online.
And this brings us back to memestreme: it's great having 3D models of the outback, and now the scene, sans meme, is becoming set; all we need to do is lay stories and ideas onto this magnificent backdrop. Bring it on.
===========
Click here for the fascinating "ABC Innovation's Sidetracks" info page, where you'll be able to read about:
ABC Innovation has launched their Sydney Sidetracks project.
This is a lovely experiment in developing a mobile heritage application which takes some of the archives of ABC TV and Radio and combines them with static imagery and research from the cultural heritage partners – Powerhouse Museum, State Library of NSW, National Film & Sound Archives, Museum of Contemporary Art, the City of Sydney Archives, and the Dictionary of Sydney.
ABC have sensibly hedged their bets so the diverse content is available as an interactive website with a simple map interface, and as a multi-platform mobile Java application.
Whilst the mobile application is not yet location-aware, it does provide a simulation of the potential experience that awaits in a future version. The phone version can be ‘sideloaded‘ to a huge range of different devices. Being out and about with the content changes your experience of it greatly but suffice to say, mobile is still in a very immature phase – with significant usability issues to be overcome. Partially to get around these, a whole lot of the ABC Archives content can be downloaded, separately, to your phone to be accessed as podcasts.

Monday, October 5, 2009

Copyright conundrums [R]






During the MobilizeThis09 conference run by Bill Wade in Darwin last week, the question came up: "What is 'legal media content' these days?"... when, as part of our uni studies and daily life, we want to use Web 2.0 mashups to get our points across.

Great question, still no clear answer. The following three articles attest to the fact that change is afoot and will hopefully provide a good overview of the current state of play for copyright, copyleft, digital commons, patent rights, IP etc.  All that 'stuff' we're used to thinking of as belonging to someone is now more or less dead in the water unless it's being viewed and freely altered over the grid. This is what I refer to in later posts as memes [ideas with attitude] that culturally evolve as they are replicated, with variation, on the grid.

It's mid-2007 and I've been in touch with ABC Radio National to ask if it was okay to use whole or excerpted bits of their Radio National program podcasts.  I made four inconclusive phone calls asking if I could use their content, and I never really felt the question was clearly answered until the fifth call.  I explained again how I wanted to populate an Augmented Reality tour with some ABC/rn content, pointed him to the 2007-2008 blog and continued to describe what we were doing. Finally he said, "Are you going to on-sell this?" - "No" - "Then go ahead and use whatever you want, and please don't alter the ABC metadata".  That was it: leave the meta intact and don't make money off it.

Other places like ted.com and youtube expect you to share, alter and evolve their content.  So long as their content is being viewed and/or morphed, they are doing good business.  Economics 101 in the weightless economy.

Anyway, the following three ABC/rn podcasts make for excellent listening; I have listed them in what I consider priority order. The Alfred Deakin lecture is by three gents who really know what they are talking about, and although it was broadcast in December 2007, it's a really nice piece of work.

TASTE:
Alfred Deakin Innovation Lecture 29 December 2007
Are we missing out on the full benefits of science and technology because of outdated ideas about copyright and patenting? Could the key to feeding the world be locked up in a company fridge somewhere? Open-source software has transformed the internet, underpinning the phenomenal growth of Google, Ebay and YouTube. What can science learn from this revolution? In our rush to protect intellectual property, have we damaged our capacity to deliver solutions for the critical issues of the 21st century?
In this lecture, John Wilbanks, Executive Director of Science Commons at Harvard Law School, will describe how existing social and legal infrastructures are choking science, and how we can create new ways to share research. Brian Fitzgerald, Head of the Law School at Queensland University of Technology, will discuss the success of open source in the information technology world, and the lessons for other fields of science.
FULL:
++++++++++++++++

))))))))))))))))))))))))))))))))))))
This is a great overview of advertising, and how it is dragging itself, kicking and screaming, into digital paradise. Three wonderful speakers; the first, Professor Veran from Murdoch Uni WA, is well worth listening to.
TASTE:
The changing face of advertising
A look at the way in which the Australian advertising industry is adapting to the challenges of the modern age of communication.(This program was first broadcast on 11 October 2007)
Antony Funnell: Welcome to another edition of the Media Report on ABC Radio National ... I'm Antony Funnell. Today's highlight program looks at future trends in advertising.
Like journalism, the industry is much disparaged...and we often overlook its importance in terms of underpinning the viability of much of the media we consume.
That is -- in blunt terms -- without the revenue generated by advertising there'd be very little to watch, hear or read (leaving public broadcasting aside of course).
So, today on the program, we'll look again at the way the advertising industry is adapting to the rapidly changing media environment of the 21st century.
FULL:
++++++++++++++++
))))))))))))))))))))))))))))))))))))
Internet Piracy Oct 2009
TASTE:
Oscar McLaren: The copyright industries say their enemies are everywhere, from multi-billion dollar internet companies to the millions of people around the world who pirate films and music on the net.
But also in the cross-hairs is a growing band of mash-up and remix artists and everyday computer users for whom the internet has sparked a wave of creativity.
Around the world, some say the copyright industry's war is already lost. At a conference in Canberra earlier this year, here's Harvard University's Professor Lawrence Lessig.
Lawrence Lessig: We have to recognise we can't kill this form of creativity, we're only going to criminalise it. There's no way we can stop our kids from engaging in this form of creativity, we can only drive their creativity underground. We can't make them passive, the way at least I was growing up, we can only make them pirates. And the question we have to ask is, Is that any good?
Oscar McLaren: Today, Background Briefing explores this question. Hello, I'm Oscar McLaren on ABC Radio National.
The battle is really about how copyright law should adapt in an age when everyone can be a pirate.
Should it crack down on every download of a film or song? Should it stop every unauthorised remix? Should it stand aside? Or should it find another way of regulating intellectual property?
There are enormous corporate interests involved on all sides, and the laws are complex. But at their most basic level they're meant to encourage creative people to produce work and release it to the public.
FULL
++++++++++++++++

))))))))))))))))))))))))))))))))))))


Friday, September 18, 2009

What would a VET Certificate IV Geo-Locative Unit look like?

With help from Simon Lismann at AFLF we have been researching how we might go about designing a unit of study that would serve as an intro to Geo-Locative media work.  Our final report will go into sufficient detail, but at this stage we are looking at drawing on competencies and skills taken from the following three VET study areas: New Media Studies, Surveying and Spatial Imaging, all of which have their own established VET course structures.  We would then add the research from this project to complete a more fleshed-out [possible/potential] curriculum offering in Geo-Locative Media Studies.
Now the thing about making up a media unit from a composite skills checklist drawn from Surveying, Spatial Imaging and New Media curricula is that there is no readily apparent occupational skeleton, as it were, to hang these skills on.

Surveyors do roads, buildings and home improvement schedules. Spatial Imagers, great sounding job description, are linked in one way or another to cartographers, helping enhance the veracity of all sorts of maps.

New Media graduates do, well, web pages, graphic art and shit-hot personal stationery.

So, where does that leave students schooled in the amorphous arts of Geo-Locative Multimedia?  The answer, I feel, is that it places them firmly astride the Augmented Reality and Virtual Reality streams.

As I mentioned in a recent blog post, I feel that AR and VR are, if not non-identical twins, then definitely siblings. They represent the Yin and Yang of digitally altered states of reality, really. And I feel they are both about to take off, big time.
  • The 22 August 2009 edition of New Scientist ran a great article called 'Welcome to Appland' (pp 32 to 36). The article has a lot of information about the current and next wave of mobile phones, which differ from their predecessors in being:
  • Location Aware, Sensor Supported and Internet Connected
  • "What is clear is that Apps are set to become an ever greater part of our lives.  As the technology of handsets improves, the next wave of apps will join up the real and the virtual worlds even more.  Many will be based on 'Augmented Reality', which involves overlaying computer graphics on a view of the real world captured through the phone's camera. In the Android marketplace, apps such as Wikitude and Layar already use the handset's video camera, directional sensors, location information and internet connection to allow users to look 'through' their phones to see a virtually augmented building or landscape.  Once developers tap into the full capabilities of the latest version of the iPhone, a flood of similar apps is likely to emerge in Apple's App Store, says Blair MacIntyre of Georgia Institute of Technology in Atlanta, an authority on Augmented Reality." New Scientist, 22 Aug 2009, 'Welcome to Appland'
  • One..."cannot deny that the iPhone has changed things for everybody.  Variously described as the Jesus phone, a concierge, a Swiss army knife or, somewhat disturbingly, a fingertip secretary, the iPhone is currently the centre of the App world."  "The truly revolutionary thing that Steve Jobs managed to do with the iPhone was to persuade cellphone network operators to loosen their grip on what phones could do.  One of the consequences of this coup was the birth of the App Store, which Apple alone controlled." "Apple made it easy for anybody with programming know-how to create an App... Lured by the promise of riches, developers from large software houses to bedroom enthusiasts have created a massive market for Apps, virtually overnight." ibid
A VET Geo-Locative unit (or units) would blend Surveying, Spatial Imaging and New Media with an introduction to AR and VR.

Surveying, spatial imaging and work with hand-held GPS devices and their laptop/workstation software will help nail down accurate modelling of topographical reality.

The next bit is attaching information to the real world.  Geo-spatial and socio-spatial metatagging ensures that media assets have the correct digital attributes to be locked onto reality [geo-located].  And at this point one can see how the duality of the AR-VR similarities comes into play.  AR is difficult to 'model' with existing 2D software. [How best do you 'show' someone how AR works in the real world?] We can dump associated content into a website, like a layer on Google Earth, or custom-build web sites or Flash animations from, say, aerial photos to give us a 2D rendered model.
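As a concrete illustration of that metatagging, here's a minimal sketch of the kind of geo-tagged media record we have in mind; the field names are our own working assumptions, not a published standard.

```python
# A minimal sketch of a geo-spatial/socio-spatial metadata record for locking
# a media asset onto reality. Field names are our own working assumptions.
from dataclasses import dataclass, field

@dataclass
class GeoMediaAsset:
    title: str
    media_url: str            # audio, video, image or text resource
    lat: float                # WGS84 decimal degrees
    lon: float
    radius_m: float = 50.0    # how close a visitor must be to trigger it
    tags: list = field(default_factory=list)  # socio-spatial tags: theme, audience, language

asts_audio = GeoMediaAsset(
    title="Telegraph Operations Room story",
    media_url="media/asts_ops_room.mp3",       # hypothetical path
    lat=-23.6705, lon=133.8880,
    tags=["history", "Overland Telegraph", "audio-tour"],
)
print(asts_audio)
```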
In our 07/08 AFLF 'Pancultural-e' project we produced a web-based model in order to demonstrate the nature of our Augmented Reality outback trial.  The model that really got people to sit up, however, was when we rendered our AR tour into Second Life. In 'Pancultural-e' we did a VR tour of the historic Overland Telegraph Station in Alice Springs.
It's a smallish reserve, so it wasn't all that difficult to build [render] the 6 or 8 buildings onto an island in 2L [deserts are hard to come by in 2L]; the buildings went up and it looked really cool.  [Here's a night-time screen grab from the partially completed Telegraph Station; buildings not yet finished were 'parked' in the air.]


When all buildings, landscapes and animals were finished, however, we hit something of a hurdle.  We had a couple of gig of tagged and massaged relevant multimedia that we'd used for the VR tour, and the idea was just to 'place' it in the virtual buildings, along the trails etc. and make it 'interactive'.  At least it sounded simple, I guess: as your Avatar cruises through, say, the Telegraph Operations room and sees a flat-panel display with a bunch of content menus, you select and savour the 'meme' of your choice.

But according to our guy from NZ, whom we had contracted to render the 2L Telegraph Station, attaching the information onto the virtual environment was 'a bit tricky'. We managed to get some interactive media up there, but not nearly as much as we would have liked. For demonstrating how our AR worked, though, VR was the most effective medium: a really good-looking virtual environment, but with little capacity for a user to interact with any of the memes [media].
We need to research whether it has gotten easier to place interactive media around structures and landscapes in 2L, as I'm sure the demand for this has to be on the up.  And it is precisely here where I see future work happening: mirroring interactive AR media into VR environments, getting your media production buck to bang twice.
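For the record, the proximity behaviour we were chasing is simple enough to sketch. Here's a minimal, hypothetical version of the trigger logic in Python, independent of 2L's own scripting language; the assets and coordinates are invented.

```python
# A minimal sketch of the proximity-trigger behaviour we wanted in the 2L
# build (and already use in the AR tour): when the visitor/avatar comes
# within an asset's trigger radius, offer to play the attached media.
# Distances use a flat-earth approximation, fine over a small reserve.
import math

def distance_m(p1, p2):
    """Approximate short-range distance in metres between two (lat, lon) points."""
    lat0 = math.radians((p1[0] + p2[0]) / 2)
    dy = (p2[0] - p1[0]) * 111_320.0
    dx = (p2[1] - p1[1]) * 111_320.0 * math.cos(lat0)
    return math.hypot(dx, dy)

def triggered_media(position, assets):
    """Return assets whose trigger radius contains the current position."""
    return [a for a in assets
            if distance_m(position, (a["lat"], a["lon"])) <= a["radius_m"]]

# Hypothetical assets around the Telegraph Station reserve:
assets = [
    {"title": "Ops Room story", "lat": -23.6705, "lon": 133.8880, "radius_m": 30},
    {"title": "Blacksmith shop", "lat": -23.6710, "lon": 133.8890, "radius_m": 20},
]
for a in triggered_media((-23.6706, 133.8881), assets):
    print("Offer to play:", a["title"])
```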

Now, just to put the generation of virtual landscapes in a technically real perspective, cop an optic on the following vodcast about Photosynth. As fantastic as this demo is, it is technology from 2007. After you've watched the movie, consider how many times, and from how many angles, seasons and weather conditions, Ayers Rock has been uploaded to share sites.
+++++++++++++++++++
[Embedded vodcast: Photosynth demo]
+++++++++++++++++++

Tuesday, September 8, 2009

Media corral

I don't know about other bloggers, but I get hassles inserting audio excerpts into a blog, going back in to edit text, then finding it won't publish as half the code supporting the media has dropped off the radar.  My attempted fix will be to put a bunch of media into this one spot and link to it from text in previous and future blogs.  See how it goes anyway.
===================
Serious Games. I don't know what 'reality' category to put this in; it isn't AR or VR, maybe Game Scenario Reality. For example, see how society can inform itself by enlisting 7,000 volunteers to game role-play: http://www.superstructgame.org/. Then consider that this game was written and run in 2007, listen, then imagine how much further we can take this with iClass phones. A great example of 'collective intelligence'.
'Serious Games' or 'Games for Good' have a greater purpose than just entertainment - they're being used to help us understand and solve current problems as well as to identify future threats.
Noah Falstein: Serious games are a new enough thing and a nebulous enough thing that people are still arguing about definitions, but the one that I find most accurate I think is - a game or something that uses game techniques and technology, for a purpose beyond entertainment. So it may well be entertaining. Many of the serious games are fun as well, but its main purpose is not the entertainment, but rather something else, often teaching or some sort of instruction, sometimes something like research or persuasion, there are quite a few different uses for them.
Audio Media Player

=====
Download mp3 here
http://www.archive.org/details/www.abc.net.au_rnGamesforGood/
Download PDF transcript with clickable links here [right click '83kb' hyperlink]
http://www.archive.org/details/www.abc.net.au_rnSeriousgames.pdf/
================================
New Media Horizons PDF 09 Report
http://www.archive.org/details/www.nmh.orgnewmediahorizons09report/
Grab this report, it's well worth a read; as well as saying what's wired now, it suggests adoption timelines for emerging tech, apps and wares.
From phones to smartbooks, mobile devices with access to the Internet now make it possible to do all kinds of activities for work, study, and socializing — wherever one happens to be. In recent years, mobile phones have evolved to include innovative interfaces, GPS and wifi capability, and support for third-party applications. Small mobile Internet devices including netbooks and smartbooks offer another way to stay connected and work on the go: smaller than laptops but larger than mobile phones, these devices are compact and powerful.
Placed on the far horizon for Australia and New Zealand last year because of slow adoption rate and low availability of bandwidth, mobiles are moving toward adoption more quickly thanks to reduced costs for bandwidth and new plans that offer alternatives to hefty overage charges. Bandwidth and coverage is still a concern for consumers, however, and outside of urban centers, finding a signal is often difficult. In many cases, while students may own mobile Internet devices, the cost and availability of bandwidth prevents them from taking advantage of the full range of applications available to them.
===============================
The Internet of things
http://www.archive.org/details.php?identifier=ABC_rnAustraliaTheInternetofThings
"Imagine your toaster has an inbuilt computer and it can speak to your fridge. Now imagine your fridge talking to the computer at your local shopping centre. All without your involvement. More and more everyday objects are becoming internet connected. So are we about to witness a new phase for the internet? An internet where objects, not people, communicate: "
Audio Media Player
=======

========================
Do you read me HAL?
"Robots are among us. They might be on their way in to childcare and aged care as silicon carers too. Will the 'digital natives' born today be more comfortable with that prospect? And, many thousands have now been deployed in Iraq and Afghanistan, with billions being invested in the development of fully autonomous killing agents. Will they fight fairly? Could they be more ethical and humane than humans? Over a series of shows, Natasha Mitchell speaks to leading roboticists and thinkers about the brave new now."
Link for audio download
http://www.archive.org/details/www.abc.net.au_rnAustraliaDoyoureadmeHAL_Robotwars_moralmachinesandsiliconthatcares/
Audio Media player
=====

=====
PDF transcript with clickable links download
http://www.archive.org/details/www.abc.net.au_rnDoyoureadmeHAL_Robotwars_moralmachinesandsilicon.pdf/
===========================
The coming 'Singularity'… or not?
"Next month,[Oct 09] thinkers worldwide will gather at the Singularity Summit [http://www.singularitysummit.com/] in New York, and you can see from the speakers that the scene is a curious mix of transhumanists (who want to use technological knowledge to live forever...and forever), futurists (who contemplate the possibilities of what might happen if we did), technologists and computer scientists (who could potentially make it happen), philosophers of mind (what will happen to our minds if we do?), and perhaps a few lifestyle evangelists thrown into the mix.
Here's how the conference describes the concept:
"The Singularity represents an "event horizon" in the predictability of human technological development past which present models of the future may cease to give reliable answers, following the creation of strong AI or the enhancement of human intelligence".
Futurist Ray Kurzweil writes:
"What, then, is the Singularity? It's a future period during which the pace of technological change will be so rapid, its impact so deep, that human life will be irreversibly transformed. Although neither utopian or dystopian, this epoch will transform the concepts that we rely on to give meaning to our lives, from our business models to the cycle of human life, including death itself. Understanding the Singularity will alter our perspective on the significance of our past and the ramifications for our future. To truly understand it inherently changes one's view of life in general and one's own particular life"." ABC Radio National's 'All in the Mind' presented by Natsha Mitchell [http://www.abc.net.au/rn/allinthemind/stories/2009/2686321.htm] with the programs blog on this subject at: {http://blogs.abc.net.au/allinthemind/2009/09/what-on-earth-is-the-singularity.html#comments}
Audio media player
===========

=================
PDF Audio transcript download with clickable links
http://www.archive.org/details/www.abc.net.au_rnThecomingofthesingularityaudiotranscript.pdf/
PDF Audio transcript download of ABC's Natasha Mitchell 'All in the mind' blog with clickable links
http://www.archive.org/details/www.abc.net.au_rnThecomingofthesingularityaudiotranscript.pdf/

Ray Kurzweil's vodcast on www.ted.com, where he explains the maths behind how he came to his conclusion.
Video Media Player

==============================
The Tribe has spoken…
Seth Godin describes himself as an agent of change. He's also an author who's curious about the way we live our lives. His latest book is called Tribes. And his theory is that the internet and modern communication technology have made it easier for people to make meaningful groupings. He describes a tribe as a group that shares a culture and a mission and he says the world has entered a new age of tribalism.
Video Media Player
============

===========
Audio Media Player

==============
ABC PDF podcast transcript with clickable links
http://www.archive.org/details/www.abc.net.au_rnTheTribehasspoken....pdf

Wednesday, September 2, 2009

Virtual Reality and Out of Body Experiences

Although our 'memestreme RCW' project is creating an Augmented Reality (AR) app running on an iPhone, research into the field necessarily encompasses exploration of what's going down in the Virtual Reality (VR) camp. AR and VR have much in common and many of the techniques, interfaces, user experiences and content styles overlap, and as we found out last year, both can serve to image each other's realities.
In last year's project [http://pancultural-e.blogspot.com], where we constructed a proof-of-concept AR tour of the Alice Springs Telegraph Station (ASTS), it was challenging to model our reports so that people could readily understand and visualise how our AR app was operating from the end user's perspective.

So we built a structural analogue of the ASTS in the Virtual Reality confines of Second Life (2L) and programmed it so that as folk entered the proximity of a space or object that had supporting media attached to it, they were given the option of playing that media. Apart from 'flying' in 2L, most of what you can do in 2L can be modelled across into AR applications. That being said, cop an optic below on some of the research that's going on in medical and neurological establishments exploring the vagaries of what we call 'Out of Body Experiences'. Due to the nature of the enquiries, and their stunning results, computer game designers are closely watching developments, as it seems there may well be profitable and engaging research spin-offs that can be picked up to enhance, big time, the user experience. Read on.

Hurt my Avatar and I feel pain.
[New Scientist 15 August 2009 page 15]
The dream of many computer game designers has come one step closer to reality with the demonstration of a technique that allows people to identify more fully with a virtual body or avatar. It builds on previous research in which neuroscientists gave subjects the sensation that they were having an "out of body experience", and tricked people into experiencing the sensation that their avatar was being touched.
In the latest experiment, a camera filming each subject's back produced an image that was projected through a head-mounted display to generate a virtual body 2 metres in front of them. Repeated stroking of the participant's back, combined with the sight of the doppelganger being stroked, created the sensation that their virtual body was being touched (PLoS ONE, DOI: 10.1371/journal.pone.0006488).
Vibrating pads with flashing lights were positioned on the subjects' backs, so that they saw flashes on their virtual bodies at the same time as the vibrating pads on the real bodies were activated. The two did not always coincide, and when participants were asked to indicate at which point on their virtual body they felt the vibrations, some reported that the vibrations were at the site where the flash appeared, rather than where the pad was activated. The system could potentially help people get a feel for prosthetic limbs.


Out of Body Experience: ABC Radio National's Science Show with Robyn Williams, 18/04/09
Here is the 9-minute segment broadcast earlier this year by ABC/rn. Naomi Fowler interviews BBC journalist Valentine Low, who was sent to the Karolinska Institute in Stockholm to meet a bunch of neuroscientists. So, as well as pushing the envelope to uncover whether these OBEs can be integrated into computer realities of the living, there is also the lingering unanswered scientific question of all those folks who are resuscitated in hospital after having technically died. You've probably heard the stories from those who were pronounced clinically dead on the table but were still able to witness their trauma surgeon rushing around, knocking over equipment, say, in the rush to breathe life back into the patient. Now here's the deal: no blood flow in the body, none in the brain, flatline on the EEG as well, so no neuronal activity. So how come such minutely accurate reports of operating room views and events emerge from some of these resuscitated patients when their brain is flatlined and they are dead? The implication is that some form of sensory absorption is occurring, in the 'mind', as the patient is, say, looking down from 2 metres above the operating table, and is obviously being remembered/recorded in unnerving detail. Dude, where and what is the storage medium when this is happening? Yeah, I know it's a little off track, but it's part of the interview. FYI the cybernetic OBE stuff is at the beginning of the show.
Go here to read the transcript and/or download the podcast:
http://www.archive.org/details.php?identifier=Outofbodyexperience_1
+++++++++++++++++

+++++++++++++++++
Seriously though, Augmented Reality and Virtual Reality are twins, or siblings at least. Both are examples of powerful cybernetic augmentations/amplifications of our human sentient and sensory capacity. AR superimposes 'just in time' information as a result of its sensing of our physical geospatial environment. VR does the same in a digital analogue of some real or totally fictitious reality [e.g. Second Life, or a full-immersion gaming simulation]. We are even extending this down to real nano levels; check out this stunning tour of the AlloSphere from www.ted.com, where scientists walk inside a huge sphere that is an extension of various high-end imaging technologies such as scanning tunnelling electron microscopes, and study such small-scale phenomena as protein folding, electron spins etc. from the perspective of a nanoscale human observer. Fantastic Voyage.
++++++++++++

Augmented Reality: a personal introspective retrospective

Our ‘Memestreme RCW’ project, funded by AFLF, is an iPhone 3GS based ‘Augmented Reality’ (AR) application. The following brief history of AR may assist your understanding of the following articles in this, our 2009 project blog.
Augmented Reality, AR, is as the name suggests reality that’s augmented. Mechanically, I guess the first killer app was the telescope: check this out Galileo, this little app doesn't change your reality but it certainly enhances it, hope you can find a use for it, and don’t mention heliocentricity in your regular chats with the Pope. These days AR is more or less exclusively computer based, and I’ll pull in a definition from Wikipedia at the end of this rant.
In 1990 Boeing Aerospace were experimenting with video-enabled goggles for their airframe electricians.  As electricians worked on sections of fuselage cabling, the correct wiring configuration was displayed on their heads-up display, this AR display thus aiding [augmenting] their personal technical know-how and paper-based wiring diagrams.  The project was a success and wiring confusion/mistakes were all but eliminated, but due to the overall clumsiness of the hardware and software, relatively poor computer processor speeds and the sheer cost of the equipment, the research results were archived and the equipment cannibalised for other projects.
Fast forward to 1992, when ‘The Lawnmower Man’ hit the movie screen with an AR techno-drama that turns a simple man into a genius; in true Hollywood style, the augmented brain power of this computer-assisted-simpleton scenario catapults our unwitting dude into a demonic nerd with attitude, bent on World Domination. I can’t spoil the ending cos he dies.  Notwithstanding, the high-end graphics, AR sensory suit and body harness used on set gave a leading-edge view of how much AR systems had improved and how much more scientifically mainstream they had become.
A pivotal personal moment in AR interest arrived for me in June 2002 in a Jumbo jet.  I was on my way to deliver a presentation at the Calgary 2002 WIPCE conference, ‘Towards a pedagogy of the distressed’, discussing the praxis and technologies I’d found useful delivering ICT competencies to Indigenous adults in remote regions of Central Australia. Anyhow, I’d bought the April 2002 edition of ‘Scientific American’ [Click here to download] to read in-flight.  I just remember excitedly reading the article on AR [Augmented Reality: A New Way of Seeing] 4 or 5 times, and having some form of epiphany. Brave New World and Big Brother be damned, this was hot [and I've spent much time since then following developments].
The necessary technological infrastructure was in place, evolving sure, but existent; the web was evolving potently and there was so much information out there that wanted to be free! It was just a fantastic article on the shape of AR to come. I feel it's still a groundbreaking article and well worth a read, given where we were up to technically in '02, i.e.:
No Web 2.0 apps [flickr, blogs, twitter etc.] in 2002;
    ✓    Google announced a major partnership with AOL to offer Google search to 34 million customers using CompuServe, Netscape and AOL.com,
    ✓    weak high-end mobile phones [“these days nearly everyone uses a mobile phone. The little devices have become far more than just a telephone. They can take photos, and even receive moving images.”],
    ✓    packet-switched radio apps such as home-based Wi-Fi routers were expensive yet available and maturing apace,
    ✓    GPS hand-helds were emerging into the marketplace after successful residencies within science, industry and of course the military [Garmin: “From the time our first GPS handhelds supported the Coalition forces in the Gulf War…”].
Things were so exciting it was just like an All Bran New Day.
The '02 SciAm article quite accurately envisaged the environment in metropolitan cities today, a really neat piece of informed scientific speculation; in case you missed it above, [Click here to download].

Back to the present though, and the three key determinant factors, I feel, for both Virtual and Augmented Realities are: Location-Aware, Sensor-Supported and Internet-Connected capability.  The ‘sensor-supported’ determinant is a little difficult to get your head around as, of the three, this is the one that is not mandatory. One may pull down a heap of interesting and relevant augmentations via geo-locative and internet-connected capacity alone, but if we take a closer look at what is becoming available, it's just a matter of time until we see 'bolt-on' sensory accoutrements for the well-heeled mobile aficionado.

Consider ‘Appland’, a name coined by New Scientist [22 Aug 09, pp 32 to 36]. In the online repositories of the mobile phone companies there is a bewildering plethora of Apps currently available to you, coded by large industrial houses and bedroom code-cutters alike [user generated], and you can have any of them for a couple of bucks each via Google Android or iPhone.
And what is it that you want today? You’ve played with the street maps, as you were really lost that time; audio-sampled songs from the radio for auto location in iTunes, just to see if it worked; wasted hours playing Jelly Car; and even shown off your virtual Zippo lighter [very lickable eye candy but not very useful].

The thing is, as a nurse you may want a way to use your iPhone as a thermometer; as a fossicker of semi-precious stones you would love a handy way to optically gauge the refractive index of this quartz [or is it a gem?] in your hand; or as a marine biologist, a handy way to sense and record the salinity of the pond you are on would be ideal right now. ‘Sensory bolt-ons’ are set to change the way we do a lot of things. For the record, I'm just running FM transmitter and stereo microphone bolt-ons.
To more fully mesh with our wildly divergent personal needs, and because thousands if not millions of folk are writing Apps as we speak, AR and VR applications are emerging, evolving really, with mechanisms that sense the environment around us in weird and wonderful ways. Running such sensors presents you with enhanced, informed choices based on the inputs of the sensory mechanisms and the nature of the user.  This is where fuzzy logic and indeed AI systems are starting to kick in, and it's going to be a helluva ride. I've already seen the iPhone referred to as 'the Jesus phone', and this user-generated wave is going to boldly go where no app has gone before, at $3 a pop.

Augmented Reality is an available fact of life now, then, for any citizen residing in a 'mature' telecommunications location with the cash to buy a relatively inexpensive, readily available 3G phone.

It's just a bit of a bummer, then, that I live in the Alice.

As you can't possibly see from the image, there is a tiny 12 km radius of 3G coverage around our fair town. While we are at it I may as well throw in a whinge to Telstra, as my broadband speed drops to dial-up speed as soon as the Alice kids get home from school; that was nice, but I digress. We have written the iPhone application primarily for remote areas where the signal strength of the 3G network is zilch. In this first iPhone iteration we have an augmented reality application running on the latest iPhone 3GS.  What makes our project internationally unique, however, is that it operates without that 3G 'cloak of confidence' supplied to the majority of our fair Nation's denizens living in places with a rich telecommunications infrastructure; we provide rich, customised, spatially relevant media to you and your iPhone when you are at work or play in the 3G-barren outback.
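In outline, and only in outline [the app itself is written for the iPhone; this is just a Python sketch of the concept], the offline trick is to carry the geo-tagged media on the device and match it against the GPS fix locally:

```python
# A minimal sketch of the offline idea behind memestreme RCW (our outline of
# the concept, not the shipped app code): all geo-tagged media is pre-loaded
# onto the phone, so out past the 3G umbrella the app only needs a GPS fix
# to serve up the nearest cached content. The POIs below are illustrative.
import math

def haversine_km(p1, p2):
    """Great-circle distance in km between two (lat, lon) points."""
    R = 6371.0  # mean Earth radius in km
    lat1, lon1, lat2, lon2 = map(math.radians, (*p1, *p2))
    a = (math.sin((lat2 - lat1) / 2) ** 2 +
         math.cos(lat1) * math.cos(lat2) * math.sin((lon2 - lon1) / 2) ** 2)
    return 2 * R * math.asin(math.sqrt(a))

def nearby_cached_media(fix, cache, max_km=5.0):
    """Return cached POIs within max_km of the current GPS fix, nearest first."""
    hits = [(haversine_km(fix, (p["lat"], p["lon"])), p) for p in cache]
    return [p for d, p in sorted(hits, key=lambda h: h[0]) if d <= max_km]

cache = [
    {"name": "Simpsons Gap", "lat": -23.680, "lon": 133.710, "media": "simpsons_gap.mp3"},
    {"name": "Standley Chasm", "lat": -23.727, "lon": 133.480, "media": "standley.mp3"},
]
for poi in nearby_cached_media((-23.684, 133.715), cache):
    print("Play:", poi["name"], "->", poi["media"])
```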
And there are a few screenshots of the memestreme RCW iPhone app a few posts down from this.

Confessions from a speculative fiction tragic

A flight of fancy.
Tuesday, July 16th, 2017. It had been a good day for Big Al, our 55-year-old protagonist from Wisconsin; all was well in his home town of Fennimore.  His company, ‘The Lawnmower Man’, had just landed a fat contract, his work team were reliable, and he was now well due for that trip to Central Australia’s Red Heart that he’d dreamt about for ages.
He’d ‘done’ the Great Barrier Reef “Taste” tour, but man, most of the coral was bleached to monochrome or smashed to pieces by the extraordinary wind and wave battering thrown Northern Australia’s way now that climate change was picking up speed. Ocean acidification wasn’t helping the coral either.  Climate refugees were also arriving in ever greater throngs on Australia's northern beaches, and he’d heard some rumours of frustrated, hungry mobs looting shops in Cairns.

No, he’d stick with the Alice and its unique geographical legacy; it wasn’t overly troubled by whacky coastal weather: tsunamis, cyclones and sea-rise situations hardly affected the Alice, and a town with only 12 sets of traffic lights was mighty appealing, in a quaint sort of way.

He donned his ‘sense-suit’, settled into his multi-purpose recliner ‘modded’ for Augmented Reality simulations, having taken his augmentation med 20 minutes earlier [a weak mix of Rohypnol, Pentothal and Psilocybin]. He felt relaxed and easy as he prepared to ‘let go’ his real self and transfer his cognitive identity into his Avatar.  Avatar transitions were becoming easier with every psy-ware trip he’d made, much like the experience with those old computer-generated “Magic Eye” prints that were around back then. His avatar body came up on his goggles and, as objects touched ‘his’ avatar, his chair made corresponding contact simulations; in a willing, semi-hypnotic, consensual transaction, Big Al's senses quiescently drifted into cyberspace. Once you got the hang of it, it became second nature… ‘Wetware’ jacks were emerging, but he couldn’t yet reconcile himself to undergoing 3 hours of neurogenic surgery to interface with the machine one on one. With similar laudable resolve he always ‘travelled’ sans cod-piece, preferring the gravitas-laden embrace of his wife and soul-mate of these last 35 years, Lois.

With his ‘auto-cue’ prepped to jolt him back to reality in 90 minutes he set sail, rocketed off actually, for the vistas of the NT’s Red Centre Way.  Morphing the rocket into a late-model Humvee from an altitude of about 2 km, he achieved a leisurely 200 km/h touchdown along Larapinta Drive.  He’d set his NT entry time for sunrise and the Western Macs looked brilliant in the soft glow.  A quick jet-pack detour to the summit of Mount Gillen left him geographically orientated, with the caterpillar-like procession of these ancient ranges heading westerly.  In a typically unreal time he was at the Simpsons Gap heritage precinct.  A new ‘scent of spinifex’ was available from Odourphonics, and the chipset in his recliner dutifully mixed the requisite chemicals and flash-burned them to give him a pungent yet stimulating synthetic approximation. A desert art painting took his fancy and he bought this also. Stroking the virtual Bearded Dragon lizard lounging on the rock didn't feel so real; the sense glove wasn't all it was cracked up to be... he should have shelled out that bit extra for the latest 'sense-suit'. When he got back, maybe.
Some time later, as he stood in awe on the western rim of Gosse’s Bluff, he heard the familiar ‘re-entry’ primer-phonics signalling the session's close; 3 minutes later he emerged into the late evening of Fennimore, Wisconsin. The enjoyable ‘Taste’ tour was so impressive he’d even booked the “Full Regular Reality Tour” from the rim of Gosse’s Bluff.  A short drive to Madison airport tomorrow and he and Lois were on their way, really.