the history of the future, circa 1994

October 21st, 2010 | Gene | Comments Off

[From the archives, a high-level prediction piece that I wrote 2^4 years ago. Some of this came up in a long chat I had with @anthropunk today, and it seemed appropriate to post (apologies to longtime readers who have seen this before). To give you some reference points, in 1994 Intel shipped the 75MHz Pentium processor, Apple shipped the Newton MessagePad, Marc Andreessen and Jim Clark founded Mosaic Communications Corp (soon to become Netscape), and David Filo and Jerry Yang founded Yahoo! How far we’ve come, and yet…]

Some things about the technological landscape of the future are fairly certain, mapped out by the trends we see today. While we cannot predict the precise manifestations of products or their impact on society, we can extrapolate along fairly straight lines to imagine the lay of the land.

Microprocessors, semiconductor memory and magnetic storage will continue to plunge headlong down the spiral of shrinking dimensions and expanding performance. The central processing element of the personal computers of 15 years ago is now the central processor of your coffeepot. Fifteen years hence, a device of that complexity may well be the central processor in your credit card while the RISC and CISC marvels of today’s desktop workstations power learning toys and portable entertainment products. We understand this trend, and we fully expect it to continue.

Networks for communication among digital devices and systems will continue to proliferate. The imperative to connect and communicate will drive organizations and individuals alike to go ‘on-line’. Islands of disconnected computers will evolve to isthmi, peninsulae, continents of computing. Home PCs will aggregate into community networks. Enterprises will resemble Internets; Internets will become Meganets. Developments occurring in research laboratories right now will lead to low cost, low power wireless components, enabling a fabric of invisible connections among people and between devices.

Information will continue to move toward a digital lingua franca. Images, sounds and words are well on their way; film and video, coming soon. Tactile, olfactory information next, perhaps? Even a semblance of virtual experience is already becoming available in digital form. The physical world literally radiates information, much of it beyond human sensory capabilities; physical, biological and chemical sensors will increasingly translate the world into binary representations. Paper, canvas, real life — these media are not dead, but their roles stand to be augmented and reexamined due to the rapid incursion of digital bits into their traditional domains. In an era of television, radio survives and thrives, movies are still shown in theaters, newspapers are still delivered, books are still read. In the coming era of digital media, we will experience even greater richness and diversity of form.

Many aspects of the future of technology are rather more uncertain, yet they carry vast potential for change. The ability to model, fabricate and manipulate structures at molecular scale leads to new conceptual approaches for the chemical and biological sciences, and indeed for electronics, optics and mechanics as well. The mathematics of nonlinear dynamic systems and complexity, still in its infancy, begins to describe a world view where the future is undetermined but the brushstrokes of the next few seconds might be predictable, and where systems behavior emerges from the undirected interaction of individual entities.

Where does all this lead? The future is uncertain if nothing else. We can merely speculate that this backdrop of pervasive digital technologies and media will weave a dense fabric of information through our lives. Much as electric power snakes invisibly through every wall in the developed world, an information utility may become an expected part of the backdrop of day to day life, with information appliances providing the interface to its users. Perhaps ‘gratuitous computing’ describes a world where ordinary objects sprout features like consumer appliances run amok and mumble to one another in vague digital whispers as we pass. Perhaps the loose associations of people and places, objects and ideas and experiences which make up our identities will coalesce into a tangible web facilitated by technology. Perhaps people’s lives will be markedly improved by technology. Perhaps not. The world will continue to change, and the outcome is far from settled. Our part is to advance the state of the art, to foment change and forward progress, and to maintain a clear perspective on the value of our work to society.


augmented reality 4 poets

October 21st, 2010 | Gene | 1 Comment

Earlier this month I attended THATCamp Bay Area, a 2-day head-on collision of scholars and practitioners in the humanities with a range of folks from the tech world. It was quite refreshing and challenging to (attempt to) wrap my mind around linguistics, environmental history, experimental poetics and art curation, just to name a few of the disciplines that were represented. Interestingly, I also discovered unexpected hidden connections that led back to the EVOKE Summit and forward to @ubistudio; more about these later perhaps.

My contribution to the fray was a session named “Augmented Reality 4 Poets”, a hands-on workshop on creating basic mobile AR using the Layar platform and Hoppala CMS service, no programming required. It worked out pretty well, and I wanted to share the materials here. I’ll likely reprise some of this in a session at ARDevCamp in December, and possibly at other future events. Anyway, here’s the tutorial. I’ve kept it simple on purpose — both Layar and Hoppala have additional capabilities you should take the time to explore. Also, you’ll see that for THATCamp I made the shared @ubistudio accounts available, but if you want to go through this on your own, you will need to sign up for a Layar developer account and a Hoppala login (it’s easy).

Mobile Augmented Reality for Non-Programmers
A Simple Tutorial for Layar and Hoppala

1. What you need to create your first mobile AR layer:

* A smartphone that supports the Layar AR browser. This means an iPhone 3GS or 4, or an equivalent Android device that has built-in GPS and compass.
* The Layar app, downloaded onto your device from the appropriate app store.
* A computer with web access.
* A developer account with Layar and a login at Hoppala. For this tutorial, you will use our shared ubistudio account. Later, you can request your own at http://site.layar.com/create/start-now/

2. Get connected:

The ubistudio credentials we will be using today are: [redacted]

You should use these credentials to sign in at 3 places:

* The Hoppala website: http://augmentation.hoppala.eu
* The Layar developer website: http://layar.com/publishing
* The Layar app on your device

Because these are shared credentials, you will see other people’s layers in these environments [only true for the shared tutorial account]. PLEASE DON’T TOUCH ;-) There is no undo or undelete!

3. Get started:

Log into Hoppala. You should see the Dashboard, a simple list of layers with Titles, Names and POI URLs.

Hoppala dashboard

At the bottom right of the page, click “Add layer service” to create a new layer. A new line will be added to the list, with “Untitled” and “noname”. On the far right of that line, click the pencil icon and give your layer a new title and name. The name needs to be lowercase alphanumeric. Click the Save button.
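If you want to double-check a name before typing it in, the lowercase-alphanumeric rule is simple to express. A throwaway sketch: the function name is mine, the rule is the one stated above.

```python
import re

def valid_layer_name(name):
    # Layer names must contain only lowercase letters and digits
    return re.fullmatch(r"[a-z0-9]+", name) is not None

print(valid_layer_name("poets4ar"))    # True
print(valid_layer_name("My Layer"))    # False: uppercase letter and a space
```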

Next, click on the name of your new layer. You should see a Google Map. Navigate to our location and zoom in.

Hoppala map view

To add a point of interest (POI), click “Add augment” at bottom right of the page. This will add a basic POI in the center of the map. You can drag it to the location you want.

To customize your new POI, click on it and a popup will launch. The popup has 5 tabs, and we’ll mostly care about the first 3. Each tab is a form we will use to enter data about the POI.

Hoppala POI menu (click for larger view)

GENERAL

* Title and description fields can be whatever you want. The Footnote field is not editable.
* Image is the picture that is displayed for the POI’s information panel in the mobile app view. You can use one of the images already loaded, or you can upload your own from your computer.
* POI Icons are what show up in the AR view for basic POIs. Choose ‘default’, and select a Custom Icon from the drop-down list. You can also upload your own.
* BE SURE TO CLICK THE SAVE BUTTON and wait for the confirmation.

MODEL

* For basic POIs, don’t worry about this.
* For images or 3D models, select the appropriate Type.
* Use the pulldown menus to select a preloaded image or model. You can also upload your own. 3D models need to be in Layar’s custom l3d format.
* BE SURE TO CLICK THE SAVE BUTTON and wait for the confirmation.

ACTIONS

* In the Layar browser, POIs can trigger actions. These include going to a website, playing audio or video, sending an email or text message, and placing a phone call.
* If you make changes, BE SURE TO CLICK THE SAVE BUTTON and wait for the confirmation.

You can add more POIs, or move on to testing the layer.

4. Testing your layer:

Log into the Layar developer site. You will see a table of existing layers.

Layar Developer Site (click for larger view)

To add your new layer, click the “Create a layer” button. You will see a popup form.

* Layer name must be exactly the same as the name you chose in Hoppala.
* Title can be a friendly name of your choosing.
* Layer type should match the kind of POIs you created in Hoppala.
* API endpoint URL is the URL for your layer copied from the Hoppala dashboard (the long ugly one).
* Short description is just some text.

Click Create layer and you should be done!

(There are more editing options, but you can safely ignore them for now).
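For the curious: that API endpoint URL points at a small web service Hoppala runs for your layer. When someone opens the layer, the Layar app calls that URL and gets back JSON describing nearby POIs. Here’s a rough sketch of what the service does; the field names follow my recollection of the circa-2010 Layar getPOIs format (coordinates in microdegrees), and the function itself is a made-up illustration, since Hoppala handles all of this for you.

```python
def get_pois_response(layer_name, pois):
    """Build a minimal getPOIs-style JSON response.

    pois: list of dicts with 'id', 'title', 'lat', 'lon' (decimal degrees)
    and optional 'actions' (see the ACTIONS tab above).
    """
    hotspots = []
    for p in pois:
        hotspots.append({
            "id": p["id"],
            "title": p["title"],
            "lat": round(p["lat"] * 1e6),   # microdegrees
            "lon": round(p["lon"] * 1e6),
            "actions": p.get("actions", []),
        })
    return {
        "layer": layer_name,   # must match the name registered with Layar
        "errorCode": 0,        # 0 signals success to the Layar app
        "errorString": "ok",
        "hotspots": hotspots,
    }

resp = get_pois_response("myfirstlayer", [
    {"id": "1", "title": "Street piano", "lat": 37.3318, "lon": -121.8863,
     "actions": [{"uri": "http://example.org/info", "label": "More info"}]},
])
```

This is also why the layer name in Hoppala and the name on the Layar developer site have to match exactly: the name is the key that ties the app’s request to your endpoint.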

Start up the Layar app on your mobile device. Be sure you are logged in to the developer account, or you will not see your unpublished test layer. Select “YOURS”, and then “TEST”. You should see several test layers, including your own [different versions of the Layar app may put the TEST listing in different places, so you may need to poke around a bit]. Select your layer and LAUNCH it. Now look for your POIs and see if they came out the way you expected.

Congratulations, you are now an AR author!


more signs of the new revolution in personal computing

October 19th, 2010 | Gene | Comments Off

Apple reported their Q4 2010 results last night, and the numbers offer an instructive view of the new revolution in personal computing. Quarterly revenue of $20.34B, and operating profit of $4.31B. From the AAPL 8-K filing, the revenue by product line looks like this (units are $M):

Apple Q4 2010 revenue by product line (chart)

At $4.87B, Apple’s traditional personal computer segment of desktops and notebooks (‘portables’) continues to grow, but today represents just 24% of total revenue. New personal computing, represented by the iPhone, iPod and iPad ecosystems, is now 70% of the business. Year over year, Apple’s traditional PC business grew 22%, and new personal computing grew 99.3% for a combined growth number of 67%.

Compare these results to the most recent quarter for HP’s Personal Systems Group, the world’s leading PC company:

HP Personal Systems Group Q3 2010 results (chart)

In HP’s Q3 (May-July), notebooks, desktops and workstations represented 98% of the $9.9B business segment, and total operating profit was $469M. Year over year growth was 17%. To be fair, HP’s Q3 is a seasonally weak quarter and the 4th quarter will likely show a jump in revenue; also, results from the Palm acquisition will show up in Q4. HP will report in mid-November and we’ll revisit the comparison at that point.

Until then, at double the revenue, 4x growth, and 9x profitability, the new revolution in personal computing appears to be going rather well.
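For the record, the back-of-the-envelope arithmetic behind that sentence, using only the figures quoted in these two posts:

```python
# Apple Q4 FY2010 vs. HP Personal Systems Group Q3 FY2010,
# using the numbers cited above (revenue and profit in dollars,
# year-over-year growth in percent)
apple_revenue, hp_revenue = 20.34e9, 9.9e9
apple_growth, hp_growth = 67, 17
apple_profit, hp_profit = 4.31e9, 469e6

print(round(apple_revenue / hp_revenue, 1))   # roughly double the revenue
print(round(apple_growth / hp_growth, 1))     # roughly 4x the growth
print(round(apple_profit / hp_profit, 1))     # roughly 9x the profitability
```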


@ubistudio project: mobile AR layer for 2010 01SJ biennial

September 15th, 2010 | Gene | Comments Off

One of the goals of the @ubistudio is to actually do projects with new media technologies, not just talk about them. In that spirit, we made a mobile augmented reality experience for the 2010 01SJ Biennial that takes place this weekend, Sept 16-19, 2010.

It’s a fairly simple layer, developed on the Layar AR browser and featuring basic points of interest (POIs) for many of the public artworks and venues of the 01SJ festival. Here’s a screen shot of our layer in action on an iPhone 3GS:

Our 01SJ layer in action (screenshot)

Among the many artworks featured are “Play Me I’m Yours”, ~20 street pianos created by artist Luke Jerram; “Poetics of Dis-communication” by Patrick Manning; and “ZOROP” by Ken Eklund and Annette Mees. You’ll also find the major venues and outdoor performances, to say nothing of the stops where you can catch the ZOROP Mexican Party Bus!

We submitted the layer to the Layar developer program, and it was approved earlier this week. If you’re at 01SJ and have a newer iPhone or Android phone, please check it out and let us know how you like it. You’ll need to download the free Layar app for your phone if you don’t have it already. Then just search for “01SJ” and you should be able to find it easily. All of the interesting points are in the downtown San Jose area, so if you’re not in that area you won’t see much ;-) If you have questions or feedback, ping us on Twitter: @ubistudio or just get involved by coming to our next Ubiquitous Media Studio meetup.


@ubistudio: Introducing the Ubiquitous Media Studio

July 13th, 2010 | Gene | Comments Off

As promised during my talk at ARE2010, I’m launching a new project called the Ubiquitous Media Studio, a.k.a. @ubistudio. The idea is to gather an open network of technologists, artists, experience designers, social scientists and other interested folks, to explore the question “If the world is our platform, then what is our creative medium?” I’m provisionally calling this notion “ubiquitous media”, building on initial research I did in this area several years back. The idea is also very much inspired and influenced by my friends at the most excellent Pervasive Media Studio in Bristol, England, whom you should know as well.
So what is ubiquitous media? I don’t know exactly, thus the exploration. But it seems to me that its outlines can be sensed in the choppy confluence of ubicomp, social networks, augmented reality, physical computing, personal sensing, transmedia and urban systems. It’s like that ancient parable of the blind monks trying to describe an elephant; the parts all feel very weird and different, and we’re trying to catch a glimpse of the future in its entirety. When you look through an AR magic lens, ubiquitous media is in there. When your kid went crazy over the Pokemon and Yu-Gi-Oh story-game universes, it was in there too. When you snap your Nike+ sensor into your running shoe, you’re soaking in it. When you go on a soundwalk or play a mediascape, there’s more than a bit of ubiquitous media in the experience.

Blind monks examining an elephant (illustration)

Anyway, we are going to investigate this, with the goals of learning new creative tools and applying them in creative projects. And “we” includes you. If you’re in the Bay Area and you think you might be interested, just jump right in! We’re having a little get-together in Palo Alto:

@ubistudio: Ubiquitous Media Studio #1
Thursday July 22, 2010 5:30-8:30PM
Venue: The Institute for the Future
Details & RSVP: http://meetup.com/ubistudio

I hope you’ll join us. You can also stay connected through @ubistudio on Twitter, and a soon-to-be-more-than-a-placeholder website at ubistudio.org.


Beyond Augmented Reality: Ubiquitous Media

June 19th, 2010 | Gene | 1 Comment

Here are the slides I presented during my talk at ARE2010, the first Augmented Reality Event on June 3, 2010 in Santa Clara. Many thanks to all who attended, asked questions and gave feedback. For interested Bay Area folks, I will be organizing some face to face gatherings of the Ubiquitous Media Studio to explore the ideas raised here. The first one will be in July; follow @ubistudio on Twitter for further details.


why a twitter overlay on your internet tv is a bad idea

May 27th, 2010 | Gene | Comments Off

A Twitter overlay spoiling an American Idol result (screenshot)


ARE2010: kicking off the augmented reality summer of love

May 18th, 2010 | Gene | Comments Off

ARE2010 – the Augmented Reality Event – is just around the corner on June 2-3. In case you missed the memo, this is going to be an outstanding conference! I’ll be giving a deep dive talk on Experience Design for AR, expanding on what I presented at Web2Expo earlier this month. More importantly, there will be over 80 great speakers from the AR world, including keynotes by los luminarios Bruce Sterling, Will Wright, Jesse Schell and Blaise Agüera y Arcas. Don’t miss this, seriously. And when you register, use the ARE2010 special discount code E195 to get the full 2 days for just $195. It’s a freakin’ bargain, folks. Be there.

ARE2010 conference banner


Experience Design for Mobile AR: my Web2Expo slides

May 5th, 2010 | Gene | 2 Comments

my talk on mobile AR experience design at Web2Expo

April 27th, 2010 | Gene | Comments Off

I’m presenting a session at Web2Expo in San Francisco on May 4th, titled “Challenge, Drama & Social Engagement: Designing Mobile Augmented Reality Experiences”. Here’s the blurb:

Mobile augmented reality adds digital overlays and interactivity to the physical world using the sensors and display of your smartphone. Design of mobile AR experiences is complex and takes us well beyond the browser-based web. This session will give you a mix of practical knowledge and new ideas for creating AR experiences, drawing from web design, 3D graphics, games, architecture and stagecraft.

The next generation of mobile augmented reality applications will go well beyond simply overlaying points of interest, floating post-its and 3D models on the video display of your phone. Mobile AR is becoming a sophisticated medium for immersive games, situated storytelling, large-scale visualization, and artistic expression. The combination of physical presence, visual and audio media, sensor datastreams and social environments blended together with web services offers tremendous new creative possibilities. However, the design challenges of creating engaging, exciting and compelling experiences are quite significant.

Research on the design of technology-mediated experiences has shown that compelling experiences often involve a mixture of physical and mental challenge or self-expression, a sense of drama, sensory stimulation, and social interaction. These elements can give us a physical “buzz” by activating the release of adrenaline, endorphins and related neurochemicals.

Mobile AR puts us “where the action is”—in motion through the physical world, surrounded by other people, in a stimulating environment. AR applications additionally provide challenges, stories, information and communication. Factors that AR experience designers need to consider include:

  • Goals of the AR experience
  • Users’ cognitive model of the system
  • Physical environment and context of the experience
  • Social context of the experience
  • Design of interaction models and experience mechanics
  • Story, goals and outcomes
  • Immersion and flow
  • Design of visual and audio assets
  • Non-player characters (“AIs”)
  • Tracking and analytics
  • Technical capabilities and limitations of the AR system
  • Managing the production process (designing an AR experience has much in common with producing a movie on location)

Should be fun, ping me if you’re going to be at the conference!
