
Tangible user interface

A tangible user interface (TUI) is a user interface in which a person interacts with digital information through the physical environment. The initial name was Graspable User Interface, which is no longer used. The purpose of TUI development is to empower collaboration, learning, and design by giving physical forms to digital information, thus taking advantage of the human ability to grasp and manipulate physical objects and materials.[1]

Reactable, an electronic musical instrument example of tangible user interface
SandScape device installed in the Children's Creativity Museum in San Francisco

This was first conceived by Radia Perlman as a new programming language that would teach much younger children, similar to Logo, but using special "keyboards" and input devices. Another pioneer in tangible user interfaces is Hiroshi Ishii, a professor at MIT who heads the Tangible Media Group at the MIT Media Lab. His particular vision for tangible UIs, called Tangible Bits, is to give physical form to digital information, making bits directly manipulable and perceptible. Tangible Bits pursues a seamless coupling between physical objects and virtual data.

Characteristics

There are several frameworks describing the key characteristics of tangible user interfaces. Brygg Ullmer and Hiroshi Ishii describe four characteristics concerning representation and control:[2]

  1. Physical representations are computationally coupled to underlying digital information.
  2. Physical representations embody mechanisms for interactive control.
  3. Physical representations are perceptually coupled to actively mediated digital representations.
  4. Physical state of tangibles embodies key aspects of the digital state of a system.
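
These coupled roles of representation and control can be illustrated with a small sketch (all class, attribute, and method names here are illustrative, not taken from the cited framework): a tracked token whose sensed pose is the digital state, and which also serves as the control.

```python
class TangibleToken:
    """A physical object tracked by the system; its pose is digital state."""

    def __init__(self, object_id):
        self.object_id = object_id
        self.x, self.y, self.angle = 0.0, 0.0, 0.0  # sensed physical state

    def move(self, x, y, angle):
        # Sensing updates the digital representation directly (property 1);
        # the token's physical state embodies system state (property 4).
        self.x, self.y, self.angle = x, y, angle


class VolumeKnob(TangibleToken):
    """The same physical object also acts as the control (property 2)."""

    def volume(self):
        # Rotating the physical knob is the interaction itself; no
        # on-screen widget mediates it.
        return round((self.angle % 360.0) / 360.0, 2)


knob = VolumeKnob("knob-1")
knob.move(10.0, 20.0, 180.0)
print(knob.volume())  # 0.5
```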

Eva Hornecker and Jacob Buur describe a structured framework with four themes:[3]

  1. Tangible manipulation: material representations with distinct tactile qualities, which are typically physically manipulated. A typical example is haptic direct manipulation: can the user grab, feel, and move important elements in the interface?
  2. Spatial interaction: tangible interaction is embedded in real space; interaction occurs as movement in this space. An example is full body interaction: can the user make use of their whole body?
  3. Embodied facilitation: the configuration of material objects and space affects how multiple users interact jointly with the tangible user interface. Examples include multiple access points: can all users in the space see what is going on and interact with central elements of the interface?
  4. Expressive representation: expressiveness and legibility of material and digital representations employed by tangible interaction systems. An example is representational significance: do physical and digital representations have the same strength and salience?

According to Mi Jeong Kim and Mary Lou Maher, the five basic defining properties of tangible user interfaces are as follows:[4]

  1. Space-multiplex both input and output.
  2. Concurrent access and manipulation of interface components.
  3. Strong specific devices.
  4. Spatially aware computational devices.
  5. Spatial re-configurability of devices.

Comparison with graphical user interfaces

A tangible user interface must be differentiated from a graphical user interface (GUI). A GUI exists only in the digital world, whereas a TUI connects the digital with the physical world. For example, a screen displays the digital information, whereas a mouse allows us to directly interact with this digital information.[5] A tangible user interface represents the input directly in the physical world, and makes the digital information directly graspable.[6]

A tangible user interface is usually built for one specific target group, because of the narrow range of possible application areas. Therefore, the design of the interface must be developed together with the target group to ensure a good user experience.[7]

In comparison with a TUI, a GUI supports a wide range of uses within one interface, and because of that it targets a large group of possible users.[7]

One advantage of a TUI is the user experience, because physical interaction occurs between the user and the interface itself (e.g., SandScape: building your own landscape with sand). Another advantage is usability, because the user intuitively knows how to use the interface by knowing the function of the physical object. Thus, the user does not need to learn the functionality. That is why tangible user interfaces are often used to make technology more accessible for elderly people.[6]

| Interface type/attributes | Tangible user interface | Graphical user interface |
| --- | --- | --- |
| Range of possible application areas | Built for one specific application area | Built for many kinds of application areas |
| How the system is driven | Physical objects, such as a mouse or a keyboard | Graphical bits, such as pixels on the screen |
| Coupling between cognitive bits and the physical output | Unmediated connection | Indirect connection |
| How user experience is driven | The user already knows the function of the interface by knowing how the physical objects function | The user explores the functionality of the interface |
| User behavior when approaching the system | Intuition | Recognition |

[7]

Examples

A simple example of a tangible UI is the computer mouse: dragging the mouse over a flat surface moves a pointer on the screen accordingly. There is a clear mapping between the movements of the mouse and the behavior shown by the system. Other examples include:

  • Marble Answering Machine by Durrell Bishop (1992).[8] A marble represents a single message left on the answering machine. Dropping a marble into a dish plays back the associated message or calls back the caller.
  • The Topobo system. The blocks in Topobo are like LEGO blocks which can be snapped together, but can also move by themselves using motorized components. A person can push, pull, and twist these blocks, and the blocks can memorize these movements and replay them.[9]
  • Implementations which allow the user to sketch a picture on the system's table top with a real tangible pen. Using hand gestures, the user can clone the image and stretch it in the X and Y axes just as one would in a paint program. This system would integrate a video camera with a gesture recognition system.
  • jive. The implementation of a TUI helped make this product more accessible to elderly users of the product. The 'friend' passes can also be used to activate different interactions with the product.[10]
  • a projection augmented model.
  • SandScape: Designing landscape with TUI. This interface lets the user form a landscape out of sand on a table. The sand model represents the terrain, which is projected on the surface. In real-time the model projects the deformations of the sand.[6]
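
The coupling in the Marble Answering Machine example can be sketched as follows (a hypothetical model; class and method names are illustrative, not from Bishop's design): each marble is a physical handle for exactly one recorded message.

```python
class MarbleAnsweringMachine:
    """Each marble is a physical handle for one recorded message."""

    def __init__(self):
        self.messages = {}  # marble_id -> recorded message

    def record(self, marble_id, message):
        # A new call "fills" a marble, which rolls into the tray.
        self.messages[marble_id] = message

    def drop_in_dish(self, marble_id):
        # Dropping a marble into the dish plays back its message.
        return self.messages[marble_id]


machine = MarbleAnsweringMachine()
machine.record("m1", "Call me back after 5pm.")
print(machine.drop_in_dish("m1"))  # Call me back after 5pm.
```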

Several approaches have been made to establish a generic middleware for TUIs. They aim at independence from particular application domains as well as flexibility in terms of the deployed sensor technology. For example, Siftables provides an application platform in which small gesture-sensitive displays act together to form a human-computer interface.

For collaboration support, TUIs must allow spatial distribution, asynchronous activities, and dynamic modification of the TUI infrastructure, to name the most prominent requirements. One approach presents a framework based on the LINDA tuple-space concept to meet them. The implemented TUIpist framework deploys arbitrary sensor technology for any type of application and actuators in distributed environments.[11]
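
The tuple-space idea can be sketched in a few lines (an in-process simplification; the method names `out`/`inp` follow LINDA convention, but the matching rule here is illustrative, not taken from the TUIpist paper). Sensors write tuples, actuators take matching tuples, decoupling producers from consumers in space and time:

```python
class TupleSpace:
    """Toy LINDA-style tuple space: a shared bag of tuples."""

    def __init__(self):
        self._tuples = []

    def out(self, tup):
        """A sensor publishes an event tuple."""
        self._tuples.append(tup)

    def inp(self, pattern):
        """Take the first tuple matching the pattern (None = wildcard)."""
        for i, tup in enumerate(self._tuples):
            if len(tup) == len(pattern) and all(
                p is None or p == v for p, v in zip(pattern, tup)
            ):
                return self._tuples.pop(i)
        return None


space = TupleSpace()
space.out(("token", "knob-1", 120.5))         # tracker writes sensor data
event = space.inp(("token", "knob-1", None))  # actuator consumes it
print(event)  # ('token', 'knob-1', 120.5)
```

Because readers match on patterns rather than on a specific sender, sensors and actuators can join or leave the space independently, which is what makes the model attractive for dynamically changing TUI infrastructures.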

State of the art

Interest in tangible user interfaces (TUIs) has grown steadily since the 1990s, and more tangible systems appear every year. A 2017 white paper outlines the evolution of TUIs for touch-table experiences and raises new possibilities for experimentation and development.[12]

In 1999, Gary Zalewski patented a system of moveable children's blocks containing sensors and displays for teaching spelling and sentence composition.[13]

Tangible Engine is a proprietary authoring application used to build object-recognition interfaces for projected-capacitive touch tables. The Tangible Engine Media Creator allows users with little or no coding experience to quickly create TUI-based experiences.

The MIT Tangible Media Group, headed by Hiroshi Ishii, is continuously developing and experimenting with TUIs, including many tabletop applications.[14]

The Urp[15] system and the more advanced Augmented Urban Planning Workbench[16] allow digital simulations of air flow, shadows, reflections, and other data based on the positions and orientations of physical models of buildings, on the table surface.

Newer developments go even one step further and incorporate the third dimension by allowing a user to form landscapes with clay (Illuminating Clay[17]) or sand (SandScape[18]). Again, different simulations allow the analysis of shadows, height maps, slopes, and other characteristics of the interactively formable landmasses.
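
Such terrain analyses boil down to computations over a height map sampled from the clay or sand. A minimal sketch (assuming a simple grid of heights; the finite-difference rule below is illustrative, not these systems' actual algorithm):

```python
def slope_map(heights):
    """Per-cell slope estimate: max absolute height difference to the
    right and down neighbors of each cell in a height-map grid."""
    rows, cols = len(heights), len(heights[0])
    slopes = [[0.0] * cols for _ in range(rows)]
    for r in range(rows):
        for c in range(cols):
            diffs = []
            if c + 1 < cols:  # right neighbor
                diffs.append(abs(heights[r][c + 1] - heights[r][c]))
            if r + 1 < rows:  # down neighbor
                diffs.append(abs(heights[r + 1][c] - heights[r][c]))
            slopes[r][c] = max(diffs) if diffs else 0.0
    return slopes


terrain = [[0.0, 1.0],
           [0.5, 3.0]]  # heights sampled from the sand surface
print(slope_map(terrain))  # [[1.0, 2.0], [2.5, 0.0]]
```

In the real systems the height map is captured continuously (e.g. by a laser scanner or depth camera) and the derived map is projected back onto the material in real time.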

InfrActables is a back-projection collaborative table that allows interaction through TUIs that incorporate state recognition. Adding different buttons to the TUIs enables additional functions associated with them. Newer versions of the technology can even be integrated into LC displays[19] by using infrared sensors behind the LC matrix.

The Tangible Disaster[20] allows the user to analyze disaster measures and simulate different kinds of disasters (fire, flood, tsunami, etc.) and evacuation scenarios during collaborative planning sessions. Physical objects allow positioning disasters by placing them on the interactive map and additionally tuning parameters (e.g., scale) using dials attached to them.

The commercial potential of TUIs has been identified recently. The repeatedly awarded Reactable,[21] an interactive tangible tabletop instrument, is now distributed commercially by Reactable Systems, a spinoff company of the Pompeu Fabra University, where it was developed. With the Reactable, users can set up their own instrument interactively by physically placing different objects (representing oscillators, filters, modulators, etc.) and parametrising them by rotating them and using touch input.

Microsoft has distributed its Windows-based platform Microsoft Surface[22] (now Microsoft PixelSense) since 2009. Besides multi-touch tracking of fingers, the platform supports the recognition of physical objects by their footprints. Several applications, mainly for use in commercial spaces, have been presented. Examples range from designing an individual graphical layout for a snowboard or skateboard to studying the details of a wine in a restaurant by placing the bottle on the table and navigating through menus via touch input. Interactions such as collaborative browsing of photographs from a handycam or cell phone that connects seamlessly once placed on the table are also supported.

Another notable interactive installation is instant city,[23] which combines gaming, music, architecture, and collaborative aspects. It allows the user to build three-dimensional structures and set up a city with rectangular building blocks, which simultaneously results in the interactive assembly of musical fragments of different composers.

The development of the Reactable and the subsequent release of its tracking technology reacTIVision[24] under the GNU/GPL as well as the open specifications of the TUIO protocol have triggered an enormous amount of developments based on this technology.

In the last few years, many amateur and semi-professional projects outside of academia and commerce have been started. Thanks to open-source tracking technologies (reacTIVision[24]) and the ever-increasing computational power available to end consumers, the required infrastructure is now accessible to almost everyone. A standard PC, a webcam, and some handicraft work allow individuals to set up tangible systems with minimal programming and material effort. This opens doors to novel ways of perceiving human-computer interaction and allows for new forms of creativity for the public to experiment with.[citation needed]

It is difficult to keep track of the rapidly growing number of these systems and tools. While many of them seem only to utilize available technologies and are limited to initial experiments and tests of basic ideas, or merely reproduce existing systems, a few open out into novel interfaces and interactions and are deployed in public space or embedded in art installations.[25]

The Tangible Factory Planning[26] is a tangible table based on reacTIVision[24] that allows users to collaboratively plan and visualize production processes in combination with plans of new factory buildings; it was developed within a diploma thesis.

Another example of the many reacTIVision-based tabletops is the ImpulsBauhaus Interactive Table,[27] which was exhibited at the Bauhaus-University in Weimar marking the 90th anniversary of the founding of the Bauhaus. Visitors could browse and explore the biographies, complex relations, and social networks between members of the movement.

Using principles derived from embodied cognition, cognitive load theory, and embodied design, TUIs have been shown to increase learning performance by offering multimodal feedback.[28] However, these benefits for learning require forms of interaction design that leave as much cognitive capacity as possible for learning.

Physical icon

A physical icon, or phicon, is the tangible computing equivalent of an icon in a traditional graphical user interface, or GUI. Phicons hold a reference to some digital object and thereby convey meaning.[29][30][31]

History

Physical icons were first used as tangible interfaces in the metaDesk project built in 1997 by Professor Hiroshi Ishii's tangible bits research group at MIT.[32][33] The metaDesk consisted of a table whose surface showed a rear-projected video image. Placing a phicon on the table triggered sensors that altered the video projection.[34]
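
The sense-and-project loop described above can be sketched as follows (a hypothetical simplification; the class names and the map-layer behaviour are illustrative, not taken from the metaDesk implementation):

```python
class MetaDeskSurface:
    """Rear-projected tabletop whose image reacts to sensed phicons."""

    def __init__(self):
        self.projection = {}  # (x, y) -> layer drawn at that spot

    def place_phicon(self, phicon_name, x, y):
        # The sensed phicon anchors a digital layer at its location,
        # e.g. a campus map centred on the building model placed there.
        self.projection[(x, y)] = f"map layer anchored to {phicon_name}"


desk = MetaDeskSurface()
desk.place_phicon("great-dome", 40, 25)
print(desk.projection[(40, 25)])  # map layer anchored to great-dome
```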

See also

  • Hardware interface
  • Kinetic user interface
  • Natural user interface
  • Organic user interface
  • Unconventional computing

References

  1. ^ Ishii, Hiroshi (2008). "Tangible bits". Proceedings of the 2nd international conference on Tangible and embedded interaction - TEI '08. pp. xv. doi:10.1145/1347390.1347392. ISBN 978-1-60558-004-3. S2CID 18166868.
  2. ^ Ullmer, Brygg; Ishii, Hiroshi (2000). "Emerging Frameworks for Tangible User Interfaces" (PDF). IBM Systems Journal. 39 (3–4): 915–931. Retrieved 17 February 2024.
  3. ^ Hornecker, Eva; Buur, Jacob (2006). "Getting a Grip on Tangible Interaction: a Framework on Physical Space and Social Interaction" (PDF). Proceedings of the SIGCHI Conference on Human Factors in Computing Systems. CHI (06). doi:10.1145/1124772.1124838. Retrieved 17 February 2024.
  4. ^ Kim, Mi Jeong; Maher, Mary Lou (30 May 2008). "The Impact of Tangible User Interfaces on Designers' Spatial Cognition". Human–Computer Interaction. 23 (2): 101–137. doi:10.1080/07370020802016415. S2CID 1268154.
  5. ^ http://tmg-trackr.media.mit.edu:8020/SuperContainer/RawData/Papers/485-Radical%20Atoms%20Beyond%20Tangible/Published/PDF Archived 19 September 2012 at the Wayback Machine[full citation needed]
  6. ^ a b c Ishii, Hiroshi (2007). "Tangible User Interfaces". The Human-Computer Interaction Handbook. pp. 495–514. doi:10.1201/9781410615862-35. ISBN 978-0-429-16397-5.
  7. ^ a b c Campbell, John; Carandang, Xharmagne (29 July 2012). "Comparing Graphical and Tangible User Interfaces for a Tower Defense Game". AMCIS 2012 Proceedings. CiteSeerX 10.1.1.924.6112.
  8. ^ "Internet-of-Things answering machine from 1992, with marbles / Boing Boing". boingboing.net. 21 March 2013.
  9. ^ "Topobo construction kit with kinetic memory". www.topobo.com.
  10. ^ "jive - social networking for your gran". jive.benarent.co.uk.
  11. ^ http://www.cs.rit.edu/~pns6910/docs/Tuple%20Space/A%20Tuple-Space%20Based%20Middleware%20for%20Collaborative%20Tangible%20User%20Interfaces.pdf[dead link][full citation needed]
  12. ^ "The Evolution of Tangible User Interfaces on Touch Tables | Ideum". Ideum - exhibit design | touch tables | interactive exhibits. Retrieved 31 October 2017.
  13. ^ "Wireless I/O apparatus and method of computer-assisted instruction".
  14. ^ "Tangible Media". www.media.mit.edu. MIT Media Lab. Retrieved 10 December 2014.
  15. ^ Underkoffler, John; Ishii, Hiroshi (1999). "Urp". Proceedings of the SIGCHI conference on Human factors in computing systems the CHI is the limit - CHI '99. pp. 386–393. doi:10.1145/302979.303114. ISBN 978-0-201-48559-2. S2CID 52817952.
  16. ^ Ishii, H.; Underkoffler, J.; Chak, D.; Piper, B.; Ben-Joseph, E.; Yeung, L.; Kanji, Z. (2002). "Augmented urban planning workbench: Overlaying drawings, physical models and digital simulation". Proceedings. International Symposium on Mixed and Augmented Reality. pp. 203–211. CiteSeerX 10.1.1.19.4960. doi:10.1109/ISMAR.2002.1115090. ISBN 978-0-7695-1781-0. S2CID 2303022.
  17. ^ Piper, Ben; Ratti, Carlo; Ishii, Hiroshi (2002). "Illuminating clay". Proceedings of the SIGCHI conference on Human factors in computing systems Changing our world, changing ourselves - CHI '02. p. 355. doi:10.1145/503376.503439. ISBN 978-1-58113-453-7. S2CID 7146503.
  18. ^ Ishii, Hiroshi (June 2008). "The tangible user interface and its evolution". Communications of the ACM. 51 (6): 32–36. doi:10.1145/1349026.1349034. S2CID 29416502.
  19. ^ Hofer, Ramon; Kaplan, Patrick; Kunz, Andreas (2008). "Mighty Trace". Proceeding of the twenty-sixth annual CHI conference on Human factors in computing systems - CHI '08. p. 215. doi:10.1145/1357054.1357091. hdl:20.500.11850/9226. ISBN 978-1-60558-011-1. S2CID 12977345.
  20. ^ Alexa, Marc (5 August 2007). "Tangible user interface for supporting disaster education". ACM SIGGRAPH 2007 posters. Siggraph '07. pp. 144–es. doi:10.1145/1280720.1280877. ISBN 9781450318280. S2CID 1851821.
  21. ^ Jordà, Sergi; Geiger, Günter; Alonso, Marcos; Kaltenbrunner, Martin (2007). "The reac Table". Proceedings of the 1st international conference on Tangible and embedded interaction - TEI '07. p. 139. CiteSeerX 10.1.1.81.1645. doi:10.1145/1226969.1226998. ISBN 978-1-59593-619-6. S2CID 17384158.
  22. ^ Wall, Josh (2009). "Demo I Microsoft Surface and the Single View Platform". 2009 International Symposium on Collaborative Technologies and Systems. pp. xxxi–xxxii. doi:10.1109/CTS.2009.5067436. ISBN 978-1-4244-4584-4.
  23. ^ Hauert, Sibylle; Reichmuth, Daniel; Böhm, Volker (2007). "Instant city". Proceedings of the 7th international conference on New interfaces for musical expression - NIME '07. p. 422. doi:10.1145/1279740.1279846. S2CID 22458111.
  24. ^ a b c Kaltenbrunner, Martin; Bencina, Ross (2007). "ReacTIVision". Proceedings of the 1st international conference on Tangible and embedded interaction - TEI '07. p. 69. doi:10.1145/1226969.1226983. ISBN 978-1-59593-619-6. S2CID 459304.
  25. ^ "Sourceforge TUIO User Exhibition".
  26. ^ Tangible Factory Planning, Diploma Thesis, Daniel Guse, http://www.danielguse.de/tangibletable.php Archived 9 July 2010 at the Wayback Machine
  27. ^ "Interactive Table with reacTIVision : ImpulsBauhaus".
  28. ^ Skulmowski, Alexander; Pradel, Simon; Kühnert, Tom; Brunnett, Guido; Rey, Günter Daniel (January 2016). "Embodied learning using a tangible user interface: The effects of haptic perception and selective pointing on a spatial learning task". Computers & Education. 92–93: 64–75. doi:10.1016/j.compedu.2015.10.011. S2CID 10493691.
  29. ^ Fidalgo, F., Silva, P., Realinho, V.: "Ubiquitous Computing and Organizations", page 201. Current Developments in Technology-Assisted Education, 2006
  30. ^ Michitaka Hirose (2001). Human-computer Interaction: INTERACT '01 : IFIP TC.13 International Conference on Human-Computer Interaction, 9th-13th July 2001, Tokyo, Japan. IOS Press. pp. 337–. ISBN 978-1-58603-188-6.
  31. ^ Hamid Aghajan; Juan Carlos Augusto; Ramon Lopez-Cozar Delgado (25 September 2009). Human-Centric Interfaces for Ambient Intelligence. Academic Press. pp. 15–. ISBN 978-0-08-087850-8.
  32. ^ Howard Rheingold (21 March 2007). Smart Mobs: The Next Social Revolution. Basic Books. pp. 104–. ISBN 978-0-465-00439-3.
  33. ^ Paul Dourish (2004). Where the Action is: The Foundations of Embodied Interaction. MIT Press. pp. 45–. ISBN 978-0-262-54178-7.
  34. ^ Mary Beth Rosson; John Millar Carroll (2002). Usability Engineering: Scenario-based Development of Human-computer Interaction. Morgan Kaufmann. pp. 316–. ISBN 978-1-55860-712-5.

External links

  • TUIO open framework for tangible multitouch surfaces at TUIO.org
  • Encyclopedia entry on the history of Tangible Interaction and Tangible User Interfaces
  • White paper on The Evolution of Tangible User Interfaces on Touch Tables

tangible, user, interface, confused, with, text, based, user, interface, tangible, user, interface, user, interface, which, person, interacts, with, digital, information, through, physical, environment, initial, name, graspable, user, interface, which, longer,. Not to be confused with Text based user interface A tangible user interface TUI is a user interface in which a person interacts with digital information through the physical environment The initial name was Graspable User Interface which is no longer used The purpose of TUI development is to empower collaboration learning and design by giving physical forms to digital information thus taking advantage of the human ability to grasp and manipulate physical objects and materials 1 Reactable an electronic musical instrument example of tangible user interface SandScape device installed in the Children s Creativity Museum in San Francisco This was first conceived by Radia Perlman as a new programming language that would teach much younger children similar to Logo but using special keyboards and input devices Another pioneer in tangible user interfaces is Hiroshi Ishii a professor at the MIT who heads the Tangible Media Group at the MIT Media Lab His particular vision for tangible UIs called Tangible Bits is to give physical form to digital information making bits directly manipulable and perceptible Tangible bits pursues the seamless coupling between physical objects and virtual data Contents 1 Characteristics 2 Comparison with graphical user interfaces 3 Examples 4 State of the art 5 Physical icon 5 1 History 6 See also 7 References 8 External linksCharacteristics editThere are several frameworks describing the key characteristics of tangible user interfaces Brygg Ullmer and Hiroshi Ishii describe six characteristics concerning representation and control 2 Physical representations are computationally coupled to underlying digital information Physical representations embody mechanisms for interactive control 
Physical representations are perceptually coupled to actively mediated digital representations Physical state of tangibles embodies key aspects of the digital state of a system Eva Hornecker and Jacob Buur describe a structured framework with four themes 3 Tangible manipulation material representations with distinct tactile qualities which are typically physically manipulated A typical example is haptic direct manipulation can user grab feel and move important elements in the interface Spatial interaction tangible interaction is embedded in real space interaction occurs as movement in this space An example is full body interaction can the user make use of their whole body Embodied facilitation the configuration of material objects and space affects how multiple users interact jointly with the tangible user interface Examples include multiple access points can all users in the space see what is going on and interact with central elements of the interface Expressive representation expressiveness and legibility of material and digital representations employed by tangible interaction systems An example is representational significance do physical and digital representations have the same strength and salience According to Mi Jeong Kim and Mary Lou Maher the five basic defining properties of tangible user interfaces are as follows 4 Space multiplex both input and output Concurrent access and manipulation of interface components Strong specific devices Spatially aware computational devices Spatial re configurability of devices Comparison with graphical user interfaces editA tangible user interface must be differentiated from a graphical user interface GUI A GUI exists only in the digital world whereas a TUI connects the digital with the physical world For example a screen displays the digital information whereas a mouse allows us to directly interact with this digital information 5 A tangible user interface represents the input directly in the physical world and makes 
the digital information directly graspable 6 A tangible user interface is usually built for one specific target group because of the low range of possible application areas Therefore the design of the interface must be developed together with the target group to ensure a good user experience 7 In comparison with a TUI a GUI has a wide range of usages in one interface Because of that it targets a large group of possible users 7 One advantage of the TUI is the user experience because it occurs a physical interaction between the user and the interface itself E g SandScape Building your own landscape with sand Another advantage is usability because the user knows intuitively how to use the interface by knowing the function of the physical object So the user does not need to learn the functionality That is why the Tangible User interface is often used to make technology more accessible for elderly people 6 Interface type attributes Tangible user interface Graphical user interface Range of possible application areas Build for one specific application area Build for many kinds of application areas How the system is driven Physical objects such as a mouse or a keyboard Based on graphical bits such as pixels on the screen Coupling between cognitive bits and the physical output Unmediated connection Indirect connection How user experience is driven The user already knows the function of the interface by knowing how the physical objects function The user explores the functionality of the interface User behavior when approaching the system Intuition Recognition 7 Examples editA simple example of tangible UI is the computer mouse Dragging the mouse over a flat surface moves a pointer on the screen accordingly There is a very clear relationship about the behaviors shown by a system with the movements of a mouse Other examples include Marble Answering Machine by Durrell Bishop 1992 8 A marble represents a single message left on the answering machine Dropping a marble into a dish 
plays back the associated message or calls back the caller The Topobo system The blocks in Topobo are like LEGO blocks which can be snapped together but can also move by themselves using motorized components A person can push pull and twist these blocks and the blocks can memorize these movements and replay them 9 Implementations which allow the user to sketch a picture on the system s table top with a real tangible pen Using hand gestures the user can clone the image and stretch it in the X and Y axes just as one would in a paint program This system would integrate a video camera with a gesture recognition system jive The implementation of a TUI helped make this product more accessible to elderly users of the product The friend passes can also be used to activate different interactions with the product 10 a projection augmented model SandScape Designing landscape with TUI This interface lets the user form a landscape out of sand on a table The sand model represents the terrain which is projected on the surface In real time the model projects the deformations of the sand 6 Several approaches have been made to establish a generic middleware for TUIs They target toward the independence of application domains as well as flexibility in terms of the deployed sensor technology For example Siftables provides an application platform in which small gesture sensitive displays act together to form a human computer interface For collaboration support TUIs have to allow the spatial distribution asynchronous activities and the dynamic modification of the TUI infrastructure to name the most prominent ones This approach presents a framework based on the LINDA tuple space concept to meet these requirements The implemented TUIpist framework deploys arbitrary sensor technology for any type of application and actuators in distributed environments 11 State of the art editInterest in tangible user interfaces TUIs has grown constantly since the 1990s and with every year more tangible 
systems are showing up A 2017 white paper outlines the evolution of TUIs for touch table experiences and raises new possibilities for experimentation and development 12 In 1999 Gary Zalewski patented a system of moveable children s blocks containing sensors and displays for teaching spelling and sentence composition 13 Tangible Engine is a proprietary authoring application used to build object recognition interfaces for projected capacitive touch tables The Tangible Engine Media Creator allows users with little or no coding experience to quickly create TUI based experiences The MIT Tangible Media Group headed by Hiroshi Ishi is continuously developing and experimenting with TUIs including many tabletop applications 14 The Urp 15 system and the more advanced Augmented Urban Planning Workbench 16 allow digital simulations of air flow shadows reflections and other data based on the positions and orientations of physical models of buildings on the table surface Newer developments go even one step further and incorporate the third dimension by allowing a user to form landscapes with clay Illuminating Clay 17 or sand Sand Scape 18 Again different simulations allow the analysis of shadows height maps slopes and other characteristics of the interactively formable landmasses InfrActables is a back projection collaborative table that allows interaction by using TUIs that incorporate state recognition Adding different buttons to the TUIs enables additional functions associated to the TUIs Newer versions of the technology can even be integrated into LC displays 19 by using infrared sensors behind the LC matrix The Tangible Disaster 20 allows the user to analyze disaster measures and simulate different kinds of disasters fire flood tsunami and evacuation scenarios during collaborative planning sessions Physical objects allow positioning disasters by placing them on the interactive map and additionally tuning parameters i e scale using dials attached to them The commercial 
potential of TUIs has been identified recently. The repeatedly awarded Reactable,[21] an interactive tangible tabletop instrument, is now distributed commercially by Reactable Systems, a spinoff company of the Pompeu Fabra University, where it was developed. With the Reactable, users can set up their own instrument interactively by physically placing different objects (representing oscillators, filters and modulators) and parametrise them by rotating them and using touch input.

Microsoft has been distributing its novel Windows-based platform Microsoft Surface[22] (now Microsoft PixelSense) since 2009. Besides multi-touch tracking of fingers, the platform supports the recognition of physical objects by their footprints. Several applications, mainly for use in commercial space, have been presented. Examples range from designing an individual graphical layout for a snowboard or skateboard to studying the details of a wine in a restaurant by placing it on the table and navigating through menus via touch input. Interactions such as the collaborative browsing of photographs from a handycam or cell phone that connects seamlessly once placed on the table are also supported.

Another notable interactive installation is instant city,[23] which combines gaming, music, architecture and collaborative aspects. It allows the user to build three-dimensional structures and set up a city with rectangular building blocks, which simultaneously results in the interactive assembly of musical fragments of different composers.

The development of the Reactable and the subsequent release of its tracking technology reacTIVision[24] under the GNU GPL, as well as the open specifications of the TUIO protocol, have triggered an enormous number of developments based on this technology. In the last few years, many amateur and semi-professional projects outside of academia and commerce have been started. Due to open-source tracking technologies (reacTIVision[24]) and the ever-increasing computational power available to end consumers, the required infrastructure is now accessible to almost everyone. A standard PC, a webcam, and some handicraft work allow individuals to set up tangible systems with minimal programming and material effort. This opens doors to novel ways of perceiving human–computer interaction and allows for new forms of creativity for the public to experiment with.[citation needed]

It is difficult to keep track of the rapidly growing number of these systems and tools. While many of them seem only to utilize the available technologies and are limited to initial experiments and tests of some basic ideas, or merely reproduce existing systems, a few of them open out into novel interfaces and interactions and are deployed in public space or embedded in art installations.[25]

The Tangible Factory Planning[26] is a tangible table based on reacTIVision[24] that allows users to collaboratively plan and visualize production processes in combination with plans of new factory buildings; it was developed within a diploma thesis. Another example of the many reacTIVision-based tabletops is the ImpulsBauhaus Interactive Table,[27] which was on exhibition at the Bauhaus University in Weimar marking the 90th anniversary of the establishment of Bauhaus. Visitors could browse and explore the biographies, complex relations and social networks between members of the movement.

Using principles derived from embodied cognition, cognitive load theory, and embodied design, TUIs have been shown to increase learning performance by offering multimodal feedback.[28] However, these benefits for learning require forms of interaction design that leave as much cognitive capacity as possible for learning.

Physical icon

A physical icon, or phicon, is the tangible computing equivalent of an icon in a traditional graphical user interface, or GUI. Phicons hold a reference to some digital object and thereby convey meaning.[29][30][31]

History

Physical icons were first used as tangible interfaces in the metaDesk project, built in 1997 by Professor
Hiroshi Ishii's Tangible Bits research group at MIT.[32][33] The metaDesk consisted of a table whose surface showed a rear-projected video image. Placing a phicon on the table triggered sensors that altered the video projection.[34]

See also

Hardware interface
Kinetic user interface
Natural user interface
Organic user interface
Unconventional computing

References

  1. Ishii, Hiroshi (2008). "Tangible bits". Proceedings of the 2nd International Conference on Tangible and Embedded Interaction (TEI '08). pp. xv. doi:10.1145/1347390.1347392. ISBN 978-1-60558-004-3.
  2. Ullmer, Brygg; Ishii, Hiroshi (2000). "Emerging Frameworks for Tangible User Interfaces" (PDF). IBM Systems Journal. 39 (3–4): 915–931. Retrieved 17 February 2024.
  3. Hornecker, Eva; Buur, Jacob (2006). "Getting a Grip on Tangible Interaction: A Framework on Physical Space and Social Interaction" (PDF). Proceedings of the SIGCHI Conference on Human Factors in Computing Systems (CHI '06). doi:10.1145/1124772.1124838. Retrieved 17 February 2024.
  4. Kim, Mi Jeong; Maher, Mary Lou (30 May 2008). "The Impact of Tangible User Interfaces on Designers' Spatial Cognition". Human–Computer Interaction. 23 (2): 101–137. doi:10.1080/07370020802016415.
  5. "Radical Atoms: Beyond Tangible" (PDF). tmg-trackr.media.mit.edu. Archived 19 September 2012 at the Wayback Machine. [full citation needed]
  6. Ishii, Hiroshi (2007). "Tangible User Interfaces". The Human–Computer Interaction Handbook. pp. 495–514. doi:10.1201/9781410615862-35. ISBN 978-0-429-16397-5.
  7. Campbell, John; Carandang, Xharmagne (29 July 2012). "Comparing Graphical and Tangible User Interfaces for a Tower Defense Game". AMCIS 2012 Proceedings. CiteSeerX 10.1.1.924.6112.
  8. "Internet of Things answering machine from 1992, with marbles". Boing Boing. boingboing.net. 21 March 2013.
  9. "Topobo: construction kit with kinetic memory". www.topobo.com.
 10. "jive: social networking for your gran". jive.benarent.co.uk.
 11. "A Tuple-Space Based Middleware for Collaborative Tangible User Interfaces" (PDF). www.cs.rit.edu. [dead link] [full citation needed]
 12. "The Evolution of Tangible User Interfaces on Touch Tables". Ideum. Retrieved 31 October 2017.
 13. "Wireless I/O apparatus and method of computer-assisted instruction".
 14. "Tangible Media". www.media.mit.edu. MIT Media Lab. Retrieved 10 December 2014.
 15. Underkoffler, John; Ishii, Hiroshi (1999). "Urp". Proceedings of the SIGCHI Conference on Human Factors in Computing Systems (CHI '99). pp. 386–393. doi:10.1145/302979.303114. ISBN 978-0-201-48559-2.
 16. Ishii, H.; Underkoffler, J.; Chak, D.; Piper, B.; Ben-Joseph, E.; Yeung, L.; Kanji, Z. (2002). "Augmented urban planning workbench: Overlaying drawings, physical models and digital simulation". Proceedings of the International Symposium on Mixed and Augmented Reality. pp. 203–211. doi:10.1109/ISMAR.2002.1115090. ISBN 978-0-7695-1781-0.
 17. Piper, Ben; Ratti, Carlo; Ishii, Hiroshi (2002). "Illuminating clay". Proceedings of the SIGCHI Conference on Human Factors in Computing Systems (CHI '02). p. 355. doi:10.1145/503376.503439. ISBN 978-1-58113-453-7.
 18. Ishii, Hiroshi (June 2008). "The tangible user interface and its evolution". Communications of the ACM. 51 (6): 32–36. doi:10.1145/1349026.1349034.
 19. Hofer, Ramon; Kaplan, Patrick; Kunz, Andreas (2008). "Mighty Trace". Proceedings of the Twenty-Sixth Annual CHI Conference on Human Factors in Computing Systems (CHI '08). p. 215. doi:10.1145/1357054.1357091. hdl:20.500.11850/9226. ISBN 978-1-60558-011-1.
 20. Alexa, Marc (5 August 2007). "Tangible user interface for supporting disaster education". ACM SIGGRAPH 2007 Posters (SIGGRAPH '07). doi:10.1145/1280720.1280877. ISBN 978-1-4503-1828-0.
 21. Jordà, Sergi; Geiger, Günter; Alonso, Marcos; Kaltenbrunner, Martin (2007). "The reacTable". Proceedings of the 1st International Conference on Tangible and Embedded Interaction (TEI '07). p. 139. doi:10.1145/1226969.1226998. ISBN 978-1-59593-619-6.
 22. Wall, Josh (2009). "Demo I: Microsoft Surface and the Single View Platform". 2009 International Symposium on Collaborative Technologies and Systems. pp. xxxi–xxxii. doi:10.1109/CTS.2009.5067436. ISBN 978-1-4244-4584-4.
 23. Hauert, Sibylle; Reichmuth, Daniel; Böhm, Volker (2007). "Instant city". Proceedings of the 7th International Conference on New Interfaces for Musical Expression (NIME '07). p. 422. doi:10.1145/1279740.1279846.
 24. Kaltenbrunner, Martin; Bencina, Ross (2007). "reacTIVision". Proceedings of the 1st International Conference on Tangible and Embedded Interaction (TEI '07). p. 69. doi:10.1145/1226969.1226983. ISBN 978-1-59593-619-6.
 25. "Sourceforge TUIO User Exhibition".
 26. "Tangible Factory Planning" (diploma thesis by Daniel Guse). www.danielguse.de/tangibletable.php. Archived 9 July 2010 at the Wayback Machine.
 27. "Interactive Table with reacTIVision: ImpulsBauhaus".
 28. Skulmowski, Alexander; Pradel, Simon; Kühnert, Tom; Brunnett, Guido; Rey, Günter Daniel (January 2016). "Embodied learning using a tangible user interface: The effects of haptic perception and selective pointing on a spatial learning task". Computers & Education. 92–93: 64–75. doi:10.1016/j.compedu.2015.10.011.
 29. Fidalgo, F.; Silva, P.; Realinho, V. "Ubiquitous Computing and Organizations". Current Developments in Technology-Assisted Education (2006). p. 201.
 30. Michitaka Hirose (2001). Human–Computer Interaction: INTERACT '01 — IFIP TC 13 International Conference on Human–Computer Interaction, 9th–13th July 2001, Tokyo, Japan. IOS Press. p. 337. ISBN 978-1-58603-188-6.
 31. Hamid Aghajan; Juan Carlos Augusto; Ramón López-Cózar Delgado (25 September 2009). Human-Centric Interfaces for Ambient Intelligence. Academic Press. p. 15. ISBN 978-0-08-087850-8.
 32. Howard Rheingold (21 March 2007). Smart Mobs: The Next Social Revolution. Basic Books. p. 104. ISBN 978-0-465-00439-3.
 33. Paul Dourish (2004). Where the Action Is: The Foundations of Embodied Interaction. MIT Press. p. 45. ISBN 978-0-262-54178-7.
 34. Mary Beth Rosson; John Millar Carroll (2002). Usability Engineering: Scenario-Based Development of Human–Computer Interaction. Morgan Kaufmann. p. 316. ISBN 978-1-55860-712-5.

External links

TUIO: open framework for tangible multitouch surfaces, at TUIO.org
Encyclopedia entry on the history of Tangible Interaction and Tangible User Interfaces
White paper on The Evolution of Tangible User Interfaces on Touch Tables
