"Your room as your browser" is the motto of a project led by Philips Research:
Philips has developed a common language for describing experiences within an Ambient Intelligence environment: Physical Markup Language (PML). (...) An Ambient Intelligence system can interpret a description in PML in such a way that the devices in its network can jointly use their individual capabilities to render that experience at a given location. In effect, your whole room becomes a 'browser' that brings the experience to life. For example, PML-enabled lights add to the experience by getting brighter or dimmer, or changing colour. A PML-enabled hi-fi provides an appropriate soundscape. Almost any device can be PML-enabled: the possibilities are only limited by the imaginations of their manufacturers. Suppose a room is rendering an experience described as 'warm and sunny': the lights, the TV, the central heating, the electronically controlled blinds and (a little further into the future) even the ceiling, walls and floor coverings could all contribute to creating it.
The example they give is the following:
Ambient Intelligence, with its network of cooperating devices, offers the promise of providing us with exciting new experiences in the home. Suppose, for instance, that while you are reading a book or watching a movie, the whole room around you begins to reflect the imaginary scene.
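To make the "room as browser" idea a bit more concrete, here is a quick sketch of how a set of cooperating devices could each render their share of an experience description. Philips doesn't show what PML actually looks like in the quoted material, so the dictionary-based 'warm and sunny' description and the Light/HiFi/Heating classes below are purely illustrative assumptions, not real PML or a real device API:

```python
# A rough sketch of the "room as browser" idea, using a made-up,
# dictionary-based experience description in place of real PML
# (Philips does not show PML's actual syntax in the quoted material).

# Hypothetical stand-in for a PML document describing "warm and sunny".
EXPERIENCE = {
    "name": "warm and sunny",
    "light": {"brightness": 0.9, "colour": "amber"},
    "sound": {"soundscape": "summer garden"},
    "climate": {"target_celsius": 23},
}


class Light:
    """A PML-enabled lamp: renders only the 'light' part of an experience."""
    def render(self, experience):
        settings = experience.get("light")
        if settings:
            print(f"Light   -> brightness {settings['brightness']}, colour {settings['colour']}")


class HiFi:
    """A PML-enabled hi-fi: contributes an appropriate soundscape."""
    def render(self, experience):
        settings = experience.get("sound")
        if settings:
            print(f"Hi-fi   -> playing a '{settings['soundscape']}' soundscape")


class Heating:
    """Central heating: contributes warmth to the experience."""
    def render(self, experience):
        settings = experience.get("climate")
        if settings:
            print(f"Heating -> target {settings['target_celsius']} °C")


def render_room(experience, devices):
    """The room acts as the 'browser': every device renders what it can."""
    print(f"Rendering experience: {experience['name']}")
    for device in devices:
        device.render(experience)


if __name__ == "__main__":
    render_room(EXPERIENCE, [Light(), HiFi(), Heating()])
```

The point of the pattern is that no single device owns the experience: each one inspects the description and contributes whatever its capabilities allow, which is the "devices jointly use their individual capabilities" behaviour described above.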
Why do I blog this? It's very close to their amBX idea (amBX is actually a spin-off from Philips): amBX-enabled games will provide gamers with the ability to use light, colour, sound, heat and even airflow in the real world during gameplay.
Would it be possible to connect this to Wil's taxonomy of his room?