Minority Report Interfaces: Coming to a Screen Near You

Dale Herigstad, Chief Creative Officer at design firm Schematic, spoke today at the XML Auckland conference. Herigstad worked with Steven Spielberg on the conceptual design for the film Minority Report, including designing the hologram screens on which Tom Cruise used his hands to navigate.

The subject of Herigstad’s presentation today was new forms of user interfaces for the Web, TV and other media. Among the interfaces he discussed were touch screens and “distance gestures” – the latter being what Cruise was doing in Minority Report. Herigstad showed some real-world examples of distance gestures, mostly from the TV/movie industry.

Note: Big thanks to Kaila Colbin’s excellent live-blogging of Herigstad’s talk at XML Auckland, on which much of this post is based.

Designing for Distance

Herigstad said that the audience nowadays is everywhere: watching TV, using Facebook on their iPhone or computer. Whatever screen, wherever – that’s where the audience is going. In much of its design work, Schematic considers the distance between a user and the screen, in various contexts. He noted some instances of this analysis:

  • Personal media is 1-2 foot navigation – computer, iPhone, etc. Personal devices are where the audience is really close and can actually touch the screen.
  • The traditional TV experience is 10 foot navigation; includes friends and family.
  • Public media is screens that the audience doesn’t own. People can walk up and interact with them. This type of screen can be anywhere from 2-200 foot navigation, and could also include layered navigation (somebody close, somebody far).
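The distance analysis above can be sketched as a simple mapping from viewer distance to interaction style. This is a toy illustration only – the function name, mode labels and exact thresholds are my own, not from the talk:

```python
# Toy sketch of Schematic's distance-based design analysis.
# Names and cutoffs are illustrative, not from Herigstad's talk.

def interaction_mode(distance_ft: float) -> str:
    """Map viewer distance (in feet) to a likely interaction style."""
    if distance_ft <= 2:
        return "touch"             # personal media: computer, iPhone
    elif distance_ft <= 10:
        return "remote"            # traditional living-room TV
    else:
        return "distance gesture"  # public screens, up to ~200 ft

print(interaction_mode(1))    # touch
print(interaction_mode(10))   # remote
print(interaction_mode(50))   # distance gesture
```

Note that public screens span the whole range, which is why Herigstad mentions layered navigation: a real design would handle someone close and someone far at the same time.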

Image: Example of DVD content augmented with added interactive media, which could be downloaded from the Web.

Cutting Edge Media Design Concepts

Herigstad drew a clear distinction between distance gestures (TV and public media) and touch gestures (personal media). He explored some of his firm’s current interface design concepts. With thanks again to Kaila Colbin’s real-time notes, here is a summary:

  • Using perspective. Think football game graphics that zoom in and out on your TV screen (see screenshot to the right).
  • Products as experiences. They’re less about a product (computer, phone, etc) and more about what the user is doing.
  • “Your interface is your brand”.
  • Time: now and next. Many of their projects involve designing two things concurrently: something for now (current reality) and something visionary (your brand in the future, when some of today’s limitations go away). He later discussed the following time chain: archive –> recent –> now –> next –> promo.
  • Utilize z-space – dive in, pull back out. Flash is being used a lot now on set-top boxes.
  • Hand gesture as input. In his work on Minority Report, his job was to figure out what it would be like to interface with computers in the future. He noted that one of the inspirations was sign language.
  • Pure gestural navigation for TV. They’re working with PrimeSense, based in Israel. He showed a brief video of controlling volume using a hand gesture. He loves the purity of not having a remote device. But it’s not just for the entertainment industry; Herigstad said it has other applications – for example, doctors in surgery. He summed up the overall concept as “training a machine to respond to the language of your hands.”
  • Brainwaves as input. They want to get data coming out of your head! (hopefully Marshall doesn’t read that bit)
  • Screens as wallpaper. Video will not be “furniture” anymore, but part of the background. For example watching a movie across a wall.
  • Dynamic Assemblage. Instead of watching online media piecemeal – e.g. YouTube videos found through Google or Digg – Schematic is exploring “advanced metadata” that will assemble your viewing experience automatically. He noted that when we watch television today, a careful production process happens behind the scenes to craft the branding, promotions, credits, the show itself, and so on. He predicts that future systems will understand these parts, the crafting process and user preferences, and so be able to assemble media for you. ‘Dynamic assemblage’ thus means that online experiences could look like television in the near future, but be assembled automatically.
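The dynamic assemblage idea – tagged media segments plus viewer preferences yielding a TV-like lineup – can be sketched in a few lines. The segment schema, role names and ordering rule below are invented for illustration; the talk described the concept, not an implementation:

```python
# Hypothetical illustration of "dynamic assemblage": given tagged media
# segments and a viewer's preferences, assemble an ordered, TV-like lineup.
# The schema and ordering rules are invented for this sketch.

SEQUENCE = ["branding", "promo", "show", "credits"]  # crafted broadcast order

def assemble(segments, preferences):
    """Pick shows matching the viewer's interests, keep the framing
    segments (branding, promos, credits), and order like a broadcast."""
    chosen = [s for s in segments
              if s["role"] != "show" or s["topic"] in preferences]
    return sorted(chosen, key=lambda s: SEQUENCE.index(s["role"]))

segments = [
    {"role": "show", "topic": "rugby", "title": "Match highlights"},
    {"role": "show", "topic": "cooking", "title": "Pasta night"},
    {"role": "branding", "topic": "-", "title": "Channel ident"},
    {"role": "credits", "topic": "-", "title": "Credits"},
]
lineup = assemble(segments, preferences={"rugby"})
print([s["title"] for s in lineup])
# ['Channel ident', 'Match highlights', 'Credits']
```

The point of the sketch is the separation of concerns Herigstad describes: the “crafting” lives in metadata and ordering rules, so the same pool of segments can be assembled differently for each viewer.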

Minority Report UIs – Not That Far Off?

Herigstad finished by talking about what it means to design for cross media. From a television standpoint, it’s very common to have a list of things on the left and more detail on the right. Some of the interaction concepts they’re working on can utilize this. For example, with touch gestures you could touch an item on the left and it would open on the right. With hand gestures, you could gesture at an item on the left and flick it over to the right.
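The list/detail pattern described above can be modelled as one piece of UI state driven by two input types – touch on personal media, flick on distance-gesture media. This is a minimal sketch; the class and event names are my own, not Schematic’s:

```python
# Minimal sketch of the cross-media list/detail pattern: a list on the
# left, a detail pane on the right, driven by either touch or a flick
# gesture. Event names are invented for illustration.

class ListDetailUI:
    def __init__(self, items):
        self.items = items
        self.detail = None  # item currently open in the right-hand pane

    def handle(self, event, index):
        # A touch (personal media) and a rightward flick (distance
        # gesture) both open the selected item in the detail pane.
        if event in ("touch", "flick_right"):
            self.detail = self.items[index]
        return self.detail

ui = ListDetailUI(["News", "Sport", "Movies"])
print(ui.handle("touch", 1))        # Sport
print(ui.handle("flick_right", 2))  # Movies
```

The design point is that the interaction model stays the same across media; only the input channel changes with viewing distance.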

Unfortunately Herigstad couldn’t show us some of the things he’s working on that use the above concepts, as he’s under NDA for many of the implementations. But the concepts he discussed today are very thought-provoking and give us a glimpse of what media (particularly television and movies) will look like in the future. Because almost all types of media will be on a Web platform in the future, it follows that these concepts will also be very important in the development of Web technology.

Originally published on ReadWriteWeb (archived copy)
