Tangible/Intangible

The Nukelele

Seen in Perry Cook's Principles for Designing Computer Music Controllers, the Nukelele is a nicely modified ukulele!

The Nukelele (thanks to Michael Brooke for the name) was constructed in Bob Adams’ Interval Research Expressions project. While collaborating on other Expressions projects such as “the Stick” and the “Porkophone,” the Nukelele was a personal experiment to design, implement, and test a new controller as rapidly as possible. The Nukelele was intended to match the expressiveness of a true stringed instrument, by using audio directly from a sensor to drive a plucked string physical model. Two sandwiched linear force sensing resistors under the right hand served to provide pluck/strike position information, along with the audio excitation for the string model.
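The description doesn't detail the string model itself, but the classic Karplus-Strong algorithm is the textbook plucked-string physical model that can be seeded with an arbitrary excitation buffer, which fits the "sensor audio drives the string" idea. A minimal sketch, assuming the excitation is simply a short audio buffer captured from the sensor:

```python
import numpy as np

def karplus_strong(excitation, pitch_hz, duration_s, sr=44100, damping=0.996):
    """Plucked-string model: a delay line seeded with the excitation
    signal, fed back through a two-point averaging (low-pass) filter."""
    n = int(sr / pitch_hz)              # delay-line length sets the pitch
    delay = np.resize(np.asarray(excitation, dtype=float), n)
    out = np.empty(int(sr * duration_s))
    for i in range(len(out)):
        out[i] = delay[i % n]
        # averaging adjacent samples low-passes the loop; damping sets decay
        delay[i % n] = damping * 0.5 * (delay[i % n] + delay[(i + 1) % n])
    return out

# In the Nukelele the excitation would come from the FSR audio; white
# noise gives the classic synthetic pluck for testing.
pluck = karplus_strong(np.random.uniform(-1, 1, 200), pitch_hz=220, duration_s=2.0)
```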

Gametrak: a theremin-like game controller

This Gametrak seems interesting: the controller comes as a pair of fingerless gloves you strap onto your hands, and it tracks only the movements of the player's arms and hands.

Gametrak™ is a revolutionary new videogames control system and games, which captures your movements and puts YOU in the game! Unlike cameras, infra-red, RF systems or tilt technologies, Gametrak™ allows movement forwards and backwards as well as up, down, left and right. With Gametrak you can punch your opponents with your hands; sports games let you pick up and play using real golf clubs or tennis racquets – you can even bounce virtual basketballs!
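Mechanically, each glove is tied to the base unit by a retractable string, and the base measures the string's two exit angles plus its extension, i.e. a spherical coordinate per hand. A small conversion sketch; the axis conventions and signs are my assumptions, not the device's actual calibration:

```python
import math

def tether_to_xyz(yaw_rad, pitch_rad, length_m):
    """Convert one tether's readings (two gimbal angles plus string
    extension) into a hand position relative to the base unit."""
    # Assumed convention: yaw turns around the vertical axis, pitch
    # lifts up from the horizontal plane; real hardware may differ.
    horiz = length_m * math.cos(pitch_rad)
    return (horiz * math.sin(yaw_rad),       # x: left/right
            horiz * math.cos(yaw_rad),       # y: forwards/backwards
            length_m * math.sin(pitch_rad))  # z: up/down

# A hand 1 m away, 30 degrees up and 20 degrees to the right:
print(tether_to_xyz(math.radians(20), math.radians(30), 1.0))
```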

Here is the Meta-Trumpet

The Meta-Trumpet was developed by Jonathan Impett: a traditional trumpet was instrumented to record performance gestures and convert them to MIDI. Described by CJ Bolland (.ppt) as:

Traditional instruments can be adapted into MIDI controllers for multimedia by incorporating additional sensors without compromising the usual interaction mode. Jonathan Impett’s Meta-trumpet is a good example: breath and pressure sensors, switches and tilt sensors have been incorporated into the instrument without adding significantly to the player’s cognitive demands.
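The general sensors-to-MIDI mapping is easy to sketch: scale each raw reading into the 0-127 controller range and send it as a MIDI control change. A minimal version using the mido library; the sensor values and their ranges below are placeholders, not Impett's actual hardware interface:

```python
import mido  # pip install mido python-rtmidi

def to_cc(value, lo, hi):
    """Clamp and scale a raw sensor reading into the 0-127 CC range."""
    value = max(lo, min(hi, value))
    return int(127 * (value - lo) / (hi - lo))

# Placeholder readings standing in for the instrument's real sensors.
breath_pressure = 0.62   # normalised 0..1 (assumed range)
tilt_degrees = 15.0      # -90..90 (assumed range)

port = mido.open_output()  # default system MIDI output
port.send(mido.Message('control_change', control=2,   # CC 2 = breath controller
                       value=to_cc(breath_pressure, 0.0, 1.0)))
port.send(mido.Message('control_change', control=16,  # a general-purpose CC
                       value=to_cc(tilt_degrees, -90.0, 90.0)))
```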

Some songs (mp3): 1, 2 or 3. Breathing and using the valves... another way to interact with a machine...

GhostRadar: a ghostbusting handheld device + collaborative cartography

Seen in The Times:

a Japanese gadget company has designed the very thing for you — the world’s first portable ghost radar. The device, which fits neatly into a pocket, promises to alert its owners to the presence of eight different types of spectre, from “lost souls” to “evil spirits”.

Using a variety of carefully calibrated sensors — one of which claims to detect human fear — the machine will then inform users whether the ghost is malevolent or benign.

This GhostRadar is designed by a Japanese company called Solid Alliance. The interface is pretty rough on the website screenshots; here is an example of how the radar displays the haunted areas. There seems to be a collaborative cartography project as well: the website depicts plenty of maps with what I expect to be haunted places. I like The Times' point:

Solid Alliance, the technology company that makes the gadget, has worked closely throughout the design process with GRX, a virtual study group that investigates “paranormal symptoms”. The same group is now collating a giant online database of what the first users of the radar have found.

It's cool to have paranormal user groups to run specific end-user studies!

A dentist simulator

I don't know if giving angry teenagers this kind of virtual dentist simulator would interest or calm them down, but it seems that medicine-related simulators are a thriving domain for VR/tangible computing specialists (as attested by the wonderful bovine palpation simulator). Look at this nice dentist simulator using "computerized dental dummies", developed by DenX. Now let's think about how this kind of interface could be hacked, tricked or modified: this wired driller could be a powerful controller for weird gaming purposes.

BattleBoard 3D: an augmented reality game with LEGO

BattleBoard 3D (BB3D) is a simple augmented reality game prototype developed by Sune Kristense, Bjørn W. Nielsen and Troels Lange at interactivespaces.

The original source of inspiration for the BB3D project was a sequence in the movie Star Wars Episode IV where two characters are playing a game of chess. The different pieces of the chess game are alive, moving and commenting on the game, and when a strike is executed a battle between the actual pieces is shown on the board. The vision of the BB3D prototype was to make this scenario come true outside the world of fiction film, in the actual setting of traditional board game playing. Physical pieces are dismantled and recombined to invoke a battle.

In BB3D the physical pieces are associated with animations which show the virtual representation of pieces and the outcome of occasional battles. This kind of augmentation provokes new ways to interact with computers, which enables the user/player to maintain the same kind of interaction as known from classical board games – the interaction is purely based on physical interaction. (...) Through VR goggles enhanced with a Web cam or with a Web cam and a display, the player can experience the battles in the virtual world. The game is played on a six times seven squared board and each player starts with seven pieces. (...) The construction theme in LEGO gives the possibility of combining pieces to create a new unique pattern which invokes the battles between pieces. This is done by breaking off the halves of the pieces carrying information and assembling new halves again. Future research may be to support players to build their own physical pieces in LEGO and thereby defining their own unique markers and to utilize a different kind of goggle to minimize the fumbling when moving pieces.
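The write-up doesn't say which marker library the prototype used; as a present-day equivalent of the camera-side logic, here is a sketch with OpenCV's ArUco markers (API per recent OpenCV versions) standing in for the LEGO-mounted fiducials. The marker-ID pairs and animation names are made up for illustration:

```python
import cv2

# ArUco markers stand in for whatever fiducials the prototype glued to
# the LEGO pieces.
dictionary = cv2.aruco.getPredefinedDictionary(cv2.aruco.DICT_4X4_50)
detector = cv2.aruco.ArucoDetector(dictionary, cv2.aruco.DetectorParameters())

# Hypothetical mapping from recombined half-pairs to a battle animation.
BATTLE_PAIRS = {(1, 2): "knight_vs_dragon", (3, 4): "archer_vs_troll"}

cap = cv2.VideoCapture(0)
while True:
    ok, frame = cap.read()
    if not ok:
        break
    corners, ids, _ = detector.detectMarkers(frame)
    if ids is not None:
        visible = {int(i) for i in ids.flatten()}
        for pair, animation in BATTLE_PAIRS.items():
            if set(pair) <= visible:
                print("play animation:", animation)  # render over the board here
    cv2.imshow("BB3D", frame)
    if cv2.waitKey(1) == 27:   # Esc quits
        break
```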

Why do I blog this? I like this idea of mixing augmented reality and physical pieces of LEGO!

Nouse: Use your nose as a joystick or mouse

"Nouse” use your nose as a joystick or mouse is an interesting project about alternative interaction carried out by D.O. Gorodnichy , S. Malik , G. Roth at the Computational Video Group, IIT, National Research Council, Ottawa.

Nouse stands for “Use your nose as a mouse” and is the name of the technology which allows one to operate the computer with the motion of his/her nose. (...) It is based on tracking the so-called convex-shape nose feature, which is the extremum of the convex surface of the nose tip. The nose feature thus defined is rotation and scale invariant and is seen at all times regardless of face orientation. It can also be tracked with guaranteed sub-pixel precision, which makes it possible to operate with your nose as with a mouse (or a pen) or a joystick (or a pointer).

A video (avi) here.
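Their convexity-based tracker is non-trivial; a much cruder stand-in that still conveys the interaction is to template-match a nose patch and turn its displacement into cursor deltas. A sketch along those lines, with the gain and patch size as arbitrary assumptions:

```python
import cv2

GAIN = 3.0   # cursor pixels per nose pixel, assumed
HALF = 15    # half-size of the tracked nose patch, assumed

cap = cv2.VideoCapture(0)
template, anchor = None, None
while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    key = cv2.waitKey(1) & 0xFF
    if template is None:
        # Centre your nose in the frame and press 't' to grab a template.
        if key == ord('t'):
            h, w = gray.shape
            anchor = (w // 2 - HALF, h // 2 - HALF)
            template = gray[h//2-HALF:h//2+HALF, w//2-HALF:w//2+HALF].copy()
    else:
        # Find the best match of the nose patch in the current frame.
        res = cv2.matchTemplate(gray, template, cv2.TM_CCOEFF_NORMED)
        _, _, _, loc = cv2.minMaxLoc(res)
        dx, dy = loc[0] - anchor[0], loc[1] - anchor[1]
        print("cursor delta:", GAIN * dx, GAIN * dy)  # feed into an OS mouse API
    cv2.imshow("nouse-sketch", frame)
    if key == 27:   # Esc quits
        break
```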

Finally a breath-based joystick!

I finally managed to find a breath-based joystick: Evreinov, G. and Evreinova, T. "Breath-Joystick" – Graphical Manipulator for Physically Disabled Users. In Proceedings of the 7th International Conference on Computers Helping People with Special Needs (ICCHP 2000), July 17-21, 2000, Karlsruhe, Germany, pp. 193-199. Unfortunately there is no picture of the device.

A joystick-type manipulator was implemented as a device highly sensitive to human respiration flow. A circuit diagram of the breath-joystick is shown in Figure 1.

Six thermo-transducers are located in front of the user's mouth on a surface with special construction elements that divide and select the necessary components of the directed air stream. The prototype thus implements a two-dimensional thermo-converter of coordinates, while two other thermo-transducers emulate the function of buttons. The thermo-transducers are kept at a temperature slightly above the environment (about 40°C), which removes the undesirable influence of water vapour in the air flow.
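From that description, the axis logic is essentially differential: opposing sensor pairs give the two coordinates and two extra sensors act as buttons. A toy sketch; the layout, normalisation and button threshold are my assumptions, not the paper's circuit:

```python
BUTTON_THRESHOLD = 0.3  # assumed cooling level that counts as a "press"

def breath_joystick(up, down, left, right, btn_a, btn_b):
    """Readings are normalised cooling amounts in [0, 1]; directed
    breath cools the heated transducers asymmetrically."""
    x = right - left          # differential pair -> horizontal axis
    y = up - down             # differential pair -> vertical axis
    a = btn_a > BUTTON_THRESHOLD
    b = btn_b > BUTTON_THRESHOLD
    return x, y, a, b

# Example: breathing mostly rightwards while triggering button A.
print(breath_joystick(0.1, 0.1, 0.05, 0.8, 0.4, 0.0))
# -> (0.75, 0.0, True, False)
```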

Retro phones of the future

I am late on this but I like the idea: Pokia proposes interesting retro phone handsets to be plugged into your bleeding-edge cell phone! I find this interesting since mobile tech (as a subset of technology in general) is sometimes frightening for lots of users; here they offer a funny way to get back to the past. I would like to know more about the current use of those old-school handsets!!!

A compelling fingerprint maze

A very nice and calm project by David Lu, Amy Franceschini and Michael Swaine: Fingerprint Maze. The point of this installation is to let one wander through a 3D labyrinth made from one's own scanned fingerprint, captured with a hand-crafted fingerprint scanner. This is how it works:

The scanned fingerprint is saved to another computer running the Fingerprint Maze game. An OS X application, written in C++ and OpenGL, picks up fingerprint files and renders them in 3D. For each dark pixel it finds in the image, it places a translucent cube in virtual space. The labyrinth can be navigated from above, or explored at ground level, as seen here. What we made is something between copy machine art and generative architecture. At left is what resulted when Amy kissed the surface of the scanner. I saw this project as an opportunity to encourage reflection about fingerprinting and identity, which are very interesting issues in the current political climate, in a very neutral, understated way—a non traditional, non-violent video game.
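The "one translucent cube per dark pixel" step is straightforward to sketch. The original is C++/OpenGL; this version just computes the cube positions from a greyscale scan, with the darkness threshold as an arbitrary assumption:

```python
from PIL import Image

def maze_from_fingerprint(path, threshold=100, cell=1.0):
    """Turn the dark pixels of a scanned fingerprint into cube
    positions: dark = ridge = labyrinth wall."""
    img = Image.open(path).convert("L")   # greyscale
    w, h = img.size
    cubes = []
    for y in range(h):
        for x in range(w):
            if img.getpixel((x, y)) < threshold:
                cubes.append((x * cell, 0.0, y * cell))
    return cubes

# cubes = maze_from_fingerprint("fingerprint.png")
# Each (x, y, z) would then be rendered as a translucent cube.
```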

Oroboro: a collaborative music controller

Via reg:exp, Oroboro by Björn Hartmann and Jennifer Carlile is described as:

OROBORO is a novel collaborative controller which focuses on musical performance as social experience by exploring synchronized actions of two musicians operating a single instrument. Each performer uses two paddle mechanisms – one for hand orientation sensing and one for servo-motor actuated feedback. We introduce a haptic mirror in which the movement of one performer’s sensed hand is used to induce movement of the partner’s actuated hand and vice versa.
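The haptic mirror reduces to a symmetric control loop: each performer's sensed paddle angle becomes the servo target for the partner's actuated paddle. A schematic sketch; the sensor and servo calls are hypothetical stand-ins, not Oroboro's actual hardware API:

```python
import time

def read_paddle_angle(performer):
    """Stand-in for reading one performer's sensed paddle orientation."""
    return 0.0

def set_servo_angle(performer, angle):
    """Stand-in for driving one performer's actuated paddle."""
    print(f"servo {performer} -> {angle:.1f} deg")

def haptic_mirror(steps=1000, rate_hz=100):
    for _ in range(steps):
        a = read_paddle_angle("A")
        b = read_paddle_angle("B")
        set_servo_angle("B", a)   # A's sensed motion moves B's paddle...
        set_servo_angle("A", b)   # ...and B's sensed motion moves A's
        time.sleep(1.0 / rate_hz)

haptic_mirror(steps=5)
```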

This is an amazing example of how tangible interaction techniques could support innovative joint activities. Applied to other contexts (e.g. video games), this would be nice.

IEEE on interactive sonification

IEEE Multimedia's latest issue is about interactive sonification. The point is that sonification "presents information by using sound (particularly non-speech), so that the user of an auditory display obtains a deeper understanding of the data or processes under investigation by listening". Interactive sonification is then defined as "the use of sound within a tightly closed human–computer interface where the auditory signal provides information about data under analysis, or about the interaction itself, which is useful for refining the activity". This issue provides the reader with a good review of cutting-edge projects (a small sonification sketch follows the list):

  • Zhao et al. report on “Interactive Sonification of Choropleth Maps.” The extension of visual maps is not only interesting for blind people, it also inspires us to consider the extension of other visual techniques into the auditory domain.
  • Fernström, Brazil, and Bannon present in their article, “HCI Design and Interactive Sonification for Fingers and Ears,” an investigation of an audio-haptic interface for ubiquitous computing. This highlights how human beings can use the synergies between data presented in different modalities (touch, sound, and visual displays).
  • In their article, “Sonification of User Feedback through Granular Synthesis,” Williamson and Murray-Smith report on the progress in the domain of high-dimensional data distributions, one of the most appropriate applications of sonification.
  • From a completely different angle, Effenberg discusses in his article, “Movement Sonification: Effects on Perception and Action,” the enhanced motor perception in sports by using an auditory display. Effects on perception and action are reported from a psychophysical study.
  • In “Continuous Sonic Feedback from a Rolling Ball,” Rath and Rocchesso demonstrate the use of an interface bar called the Ballancer. Although this interface is not yet used to explore independent data, it is an ideal platform for studying the interaction at the heart of an auditory interaction loop.
  • Hinterberger and Baier present the Poser system in “Parametric Orchestral Sonification of EEG in Real Time.” The electroencephalogram is an interesting type of signal for sonification because it involves temporal, spectral, and spatial organization of the data.
  • Finally, in “Navigation with Auditory Cues in a Virtual Environment,” Lokki and Gröhn show how sonification can enhance navigation and operation in spaces that so far have only been explored visually.
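To make the parameter-mapping idea concrete, here is a minimal sonification sketch: each data value is mapped linearly onto a pitch and the resulting tones are written out as audio. The pitch range, note length and output filename are arbitrary choices of mine:

```python
import numpy as np
import wave

def sonify(values, path="sonified.wav", sr=44100, note_s=0.2,
           f_lo=220.0, f_hi=880.0):
    """Map each data value onto a pitch between f_lo and f_hi and
    write the sequence as a mono 16-bit WAV file."""
    v = np.asarray(values, dtype=float)
    v = (v - v.min()) / (np.ptp(v) or 1.0)        # normalise to [0, 1]
    t = np.arange(int(sr * note_s)) / sr
    audio = np.concatenate(
        [np.sin(2 * np.pi * (f_lo + x * (f_hi - f_lo)) * t) for x in v])
    with wave.open(path, "wb") as f:
        f.setnchannels(1)
        f.setsampwidth(2)
        f.setframerate(sr)
        f.writeframes((audio * 32767).astype(np.int16).tobytes())

sonify([3, 1, 4, 1, 5, 9, 2, 6])   # the pitch contour traces the data
```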

My favorite is Fernström, Brazil, and Bannon's project.

Raytheon, Minority Report and gesture-based technology

An interesting story in the Wall Street Journal: 'Minority Report' Inspires Technology Aimed at Military.

In the futuristic movie "Minority Report," Tom Cruise gestures with his gloved hands to sift through crime-clue data that are displayed on giant screens. With the twist of a wrist he can move information from one column to another or delete items. (...) Raytheon then hunted down the scientist who was behind the movie technology, John Underkoffler. Raytheon decided to fund an effort to try to turn his film fantasy into reality and explore its potential for speeding up intelligence analysis (...) The fruits of that investment are housed in a darkened room in a converted Los Angeles factory. There, a man wearing reflective gloves uses hand gestures to manipulate pictures projected on a panoramic screen. He slides an index finger forward to zoom in on a street scene; swivels a horizontal hand to the right to scroll through a video; sweeps both hands to the left to clear the screen. Raytheon believes such "gesture technology" can help solve one of the military's biggest problems: information overload (...) Raytheon isn't alone in chasing the command post of the future. And it isn't the only company injecting Hollywood into this race. Silicon Graphics Inc., which is known for special effects in movies, is working with the Army to develop the computing firepower that command centers will need.

And on a different note, video games are also looking in that direction:

Raytheon is working on more immediate applications, such as a device called a Common Tactical Blackboard to offer a portable bird's-eye view of a battle zone and software that suggests combat responses. But Mr. Underkoffler retains the right to pursue commercial uses, such as command-and-control operations for railroads and ports, and virtual wind tunnels for industrial designers. Videogames are also in the mix. With similar but less advanced technology, Sony Corp. already markets the EyeToy, in which a camera captures a person's movements and incorporates them into the game on the TV.

Nabaztag: a WiFi rabbit as awareness tool

Nabaztag is a WiFi rabbit designed by Violet (a company focused on the design of products and services based on calm and emotional technologies). This rabbit can access the Internet. The most interesting part is that its colors change depending on various parameters: the weather, car traffic or the reception of emails. Different sounds and ear movements are also driven by those variables! It can also communicate with other rabbits located elsewhere thanks to a coded language you can create (like a specific position of the ears to show that you're busy...). 95 euros! It's designed by the same team who did the cool DAL, a wireless light that emits emotion via a WiFi connection to the Internet. I like this approach; calm technology is something of interest. I am wondering whether they ran user experience analyses to understand better how such a device gets used. This French company rocks. I strongly believe in this stuff. My concern is how to make users understand the different kinds of information: how do they get it? Are there already any hacks or tricks (well, the interaction is pretty limited since it's not intended to offer lots of interaction, but since it's WiFi-enabled there must be some tricks ;) )? I hope their next device will offer device-to-device interactions (with copresent rabbits). I imagine crazy scenarios with this.
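Speculating about how such a feed-to-ambient-output mapping might be wired up (all feed sources and device calls below are hypothetical stand-ins, not Violet's actual protocol):

```python
# Illustrative colour mapping for weather conditions, chosen arbitrarily.
WEATHER_COLOURS = {"sun": (255, 180, 0), "rain": (0, 80, 255), "snow": (255, 255, 255)}

def fetch_weather():        # stand-in for a real weather feed
    return "rain"

def unread_mail_count():    # stand-in for a mailbox check
    return 3

def update_rabbit(set_light, set_ears):
    """Map data feeds onto the rabbit's ambient outputs."""
    set_light(WEATHER_COLOURS.get(fetch_weather(), (0, 255, 0)))
    # More unread mail -> ears raised further (0 = down, 90 = straight up).
    set_ears(min(90, unread_mail_count() * 15))

update_rabbit(lambda rgb: print("light:", rgb),
              lambda deg: print("ears:", deg, "degrees"))
```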

Homeplay: a trackball to explore a town

While googling I ran across Homeplay, a lesser-known project by collectif fact. I like the concept a lot!

The spectator holds a trackball and stands in front of the model of a town. On top, on the roof of the building, images are projected; these can be images of apartment interiors seen from above, or web pages of furniture products (brands). The spectator can move from one apartment to another or go downstairs. Furthermore, the spectator can create their own apartment by dragging and dropping words standing for objects, furniture or actions. This artwork is a reflection on modes of representation connected to physical, mental and virtual architecture.

The installation is explained here.

Mobile Music Technology: 2nd international workshop

There is a smart workshop on Mobile Music Technology organized by the great Future Applications Lab in association with NIME 2005 in Vancouver, May 25. The organizers are Lalya Gaye (Hi Lalya!) and Lars Erik Holmquist + Atau Tanaka (great musician!).

In the late 1970's, the Walkman liberated recorded music - it allowed you to carry the listening room with you. Today, iPods and mobile phones allow new forms of private and social music experiences. What are the trends in mobile music technology? What kinds of new modes of musical interaction are becoming possible? Will peer-to-peer sharing and portable MP3 players destroy the music business - or will new technology let artists reach more people than ever before?

The programme will consist of presentations, interactive posters and hands-on break-out sessions. Accepted papers and interactive posters include:

Papers:
  • "From Calling a Cloud to Finding the Missing Track: Artistic Approaches to Mobile Music" by Frauke Behrendt
  • "Location 33: A Mobile Musical" by William Carter and Leslie S. Liu
  • "The New Cosmopolites: Activating the Role of Mobile Music Listeners" by Gideon D'Arcangelo

Interactive posters:
  • "Solarcoustics: CONNECT" by Morgan Barnard
  • "Experimental Design for the Musicology Mobile Music Player" by Trevor Pering
  • "Mobile User Interface for Music" by Takuya Yamauchi and Toru Iwatake

The number of participants is strictly limited. To register, please FIRST contact Lalya Gaye, lalya@viktoria.se to confirm there is space. After your participation has been confirmed use the main NIME registration page to register and pay: http://hct.ece.ubc.ca/nime/2005/registration.html

Audio Clouds: head, hand and device gestures for input on mobile devices

(via). Still the same lab in Glasgow: they also have a nice project investigating the use of 3D audio on wearable computers to increase display space, plus how head, hand and device gestures may be used for input on mobile devices. It's called "Audio Clouds". There is news on the BBC about it.

"The idea behind the whole thing is to look at new ways to present information," Professor Stephen Brewster told. (...) "We hope to develop interfaces that are truly mobile, allowing users to concentrate on the real world while interacting with their mobile device as naturally as if they were talking to a friend while walking." "Lots of times, you need to use your eyes to operate a gadget - even with an iPod, you need to take it out of pocket to look at screen to control it. "If you could do something with your hands, or other gestures you would not have to take it out of your pocket," explained Professor Brewster. The researchers have developed ways to control gadgets, such as personal digital assistants (PDAs) and music players, using 3D sound for output and gestures for input. (...) Professor Brewster and his Multimodal Interaction Group realised that they could get other information out of accelerometers too. The actual variations in a person's gait could be read and harnessed for different uses.

This kind of stuff is now closer to the market; phone companies are about to release similar projects. I am eager to see people waving in the streets just to zip files or shuffle songs on their iPods!
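The gait-sensing remark is easy to illustrate: steps show up as peaks in the magnitude of the accelerometer signal. A toy step counter; the sample rate and thresholds are my assumptions, not the Glasgow group's values:

```python
import numpy as np

def count_steps(accel_xyz, sr=50, threshold=11.5, refractory_s=0.3):
    """accel_xyz: (n, 3) array in m/s^2. Detects steps as local peaks
    in signal magnitude above a threshold, at most one per stride."""
    mag = np.linalg.norm(accel_xyz, axis=1)
    gap = int(refractory_s * sr)      # ignore peaks closer than one stride
    steps, last = 0, -gap
    for i in range(1, len(mag) - 1):
        is_peak = mag[i] >= mag[i - 1] and mag[i] >= mag[i + 1]
        if is_peak and mag[i] > threshold and i - last >= gap:
            steps, last = steps + 1, i
    return steps
```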

A Bovine Rectal Palpation Simulator for Training Veterinary Students

No, it ain't spam. This haptic cow simulator was developed by the Glasgow Interactive Systems Group and the Faculty of Veterinary Medicine (University of Glasgow, UK):

Bovine rectal palpation is a necessary skill for a veterinary student to learn. However, lack of resources and welfare issues currently restrict the amount of training available to students in this procedure. Here we present a virtual reality based teaching tool - the Bovine Rectal Palpation Simulator - that has been developed as a supplement to existing training methods. When using the simulator, the student palpates virtual objects representing the bovine reproductive tract, receiving feedback from a PHANToM haptic device (inside a fibreglass model of a cow), while the teacher follows the student's actions on the monitor and gives instruction. We present a validation experiment that compares the performance of a group of traditionally trained students with a group whose training was supplemented with a simulator training session. The subsequent performance in the real task, when examining cows for the first time, was assessed with the results showing a significantly better performance for the simulator group.
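For the curious, the simplest force-rendering scheme a PHANToM-class device supports is penalty-based: penetrating a virtual surface produces a spring force pushing the stylus back out. A sketch against a sphere standing in for a palpated structure; the stiffness and geometry are illustrative assumptions, not the simulator's actual model:

```python
import numpy as np

STIFFNESS = 500.0   # N/m, assumed

def contact_force(tip_pos, centre, radius):
    """Penalty force on the haptic stylus tip against a virtual sphere:
    proportional to penetration depth, directed along the surface normal."""
    offset = np.asarray(tip_pos, dtype=float) - np.asarray(centre, dtype=float)
    dist = np.linalg.norm(offset)
    depth = radius - dist                # > 0 means the tip is inside
    if depth <= 0 or dist == 0:
        return np.zeros(3)               # no contact, no force
    return STIFFNESS * depth * (offset / dist)

# 1 cm penetration along z -> 5 N pushing the stylus back out.
print(contact_force((0.0, 0.0, 0.09), (0, 0, 0), 0.10))
```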

Definitely a tangible-oriented design! The corresponding scientific paper: Baillie, S., Crossan, A., Brewster, S.A., Mellor, D. and Reid, S. Validation of a Bovine Rectal Palpation Simulator for Training Veterinary Students. In Proceedings of MMVR 2005 (Long Beach, CA, USA).