

Tangible User Interfaces: Past, Present, and Future Directions

Orit Shaer

Wellesley College
Wellesley, MA 02481
USA
[email protected]

Eva Hornecker

University of Strathclyde
Scotland, G1 1XH
UK
[email protected]

Boston – Delft

Foundations and Trends® in Human–Computer Interaction

Published, sold and distributed by:


now Publishers Inc.
PO Box 1024
Hanover, MA 02339
USA
Tel. +1-781-985-4510
www.nowpublishers.com
[email protected]

Outside North America:


now Publishers Inc.
PO Box 179
2600 AD Delft
The Netherlands
Tel. +31-6-51115274

The preferred citation for this publication is O. Shaer and E. Hornecker, Tangible
User Interfaces: Past, Present, and Future Directions, Foundations and Trends®
in Human–Computer Interaction, vol. 3, nos. 1–2, pp. 1–137, 2009.

ISBN: 978-1-60198-328-2
© 2010 O. Shaer and E. Hornecker

All rights reserved. No part of this publication may be reproduced, stored in a retrieval
system, or transmitted in any form or by any means, mechanical, photocopying, recording
or otherwise, without prior written permission of the publishers.
Photocopying. In the USA: This journal is registered at the Copyright Clearance Cen-
ter, Inc., 222 Rosewood Drive, Danvers, MA 01923. Authorization to photocopy items for
internal or personal use, or the internal or personal use of specific clients, is granted by
now Publishers Inc for users registered with the Copyright Clearance Center (CCC). The
‘services’ for users can be found on the internet at: www.copyright.com
For those organizations that have been granted a photocopy license, a separate system
of payment has been arranged. Authorization does not extend to other kinds of copy-
ing, such as that for general distribution, for advertising or promotional purposes, for
creating new collective works, or for resale. In the rest of the world: Permission to pho-
tocopy must be obtained from the copyright owner. Please apply to now Publishers Inc.,
PO Box 1024, Hanover, MA 02339, USA; Tel. +1-781-871-0245; www.nowpublishers.com;
[email protected]
now Publishers Inc. has an exclusive license to publish this material worldwide. Permission
to use this content must be obtained from the copyright license holder. Please apply to now
Publishers, PO Box 179, 2600 AD Delft, The Netherlands, www.nowpublishers.com; e-mail:
[email protected]

Foundations and Trends® in Human–Computer Interaction
Volume 3 Issues 1–2, 2009
Editorial Board

Editor-in-Chief:
Ben Bederson
Human–Computer Interaction Lab
University of Maryland
3171 A. V. Williams Bldg
College Park, MD 20742

Editors
Gregory Abowd (Georgia Institute of Technology)
Jonathan Grudin (Microsoft Research)
Clayton Lewis (University of Colorado)
Jakob Nielsen (Nielsen Norman Group)
Don Norman (Nielsen Norman Group and Northwestern University)
Dan Olsen (Brigham Young University)
Gary Olson (UC Irvine)

Editorial Scope

Foundations and Trends® in Human–Computer Interaction will publish survey
and tutorial articles in the following topics:
• History of the research community
• Design and Evaluation
• Ergonomics/Human Factors
• Cognitive engineering and performance models
• Predictive models of interaction
• User-centered design processes
• Participatory design
• Graphic design
• Discount evaluation techniques
• Design and interaction
• Ethnography
• Theory
• Models of cognition
• Empirical methods of evaluation
• Qualitative methods of design and evaluation
• Technology
• Programming the graphical user interface
• Input technologies
• Output technologies
• Computer supported cooperative work
• History of CSCW in HCI
• Organizational issues
• Online communities
• Games
• Communication technologies
• Interdisciplinary influence
• The role of the social sciences in HCI
• MIS and HCI
• Graphic design
• Artificial intelligence and the user interface
• Architecture and the role of the physical environment
• Advanced topics and trends
• Information visualization
• Web design
• Assistive technologies
• Multimodal interaction
• Perception and the user interface
• Specific user groups (children, elders, etc.)
• Sensor-based or tangible interaction
• Ubiquitous computing
• Virtual reality
• Augmented reality
• Wearable computing
• Design and fashion
• Privacy and social implications

Information for Librarians


Foundations and Trends® in Human–Computer Interaction, 2009, Volume 3,
4 issues. ISSN paper version 1551-3955. ISSN online version 1551-3963. Also avail-
able as a combined paper and online subscription.

Foundations and Trends® in Human–Computer Interaction
Vol. 3, Nos. 1–2 (2009) 1–137

© 2010 O. Shaer and E. Hornecker
DOI: 10.1561/1100000026

Tangible User Interfaces: Past, Present, and Future Directions

Orit Shaer¹ and Eva Hornecker²

¹ Wellesley College, 106 Central St., Wellesley, MA, 02481, USA, [email protected]
² University of Strathclyde, 26 Richmond Street, Glasgow, Scotland, G1 1XH, UK, [email protected]

Abstract
In the last two decades, Tangible User Interfaces (TUIs) have emerged
as a new interface type that interlinks the digital and physical worlds.
Drawing upon users’ knowledge and skills of interaction with the real
non-digital world, TUIs show a potential to enhance the way in which
people interact with and leverage digital information. However, TUI
research is still in its infancy and extensive research is required in
order to fully understand the implications of tangible user interfaces,
to develop technologies that further bridge the digital and the physical,
and to guide TUI design with empirical knowledge.
This monograph examines the existing body of work on Tangible
User Interfaces. We start by sketching the history of tangible user inter-
faces, examining the intellectual origins of this field. We then present
TUIs in a broader context, survey application domains, and review
frameworks and taxonomies. We also discuss conceptual foundations
of TUIs including perspectives from cognitive sciences, psychology,
and philosophy. Methods and technologies for designing, building, and
evaluating TUIs are also addressed. Finally, we discuss the strengths
and limitations of TUIs and chart directions for future research.

Contents

1 Introduction 1

2 Origins of Tangible User Interfaces 5


2.1 Graspable User Interface 6
2.2 Tangible Bits 7
2.3 Precursors of Tangible User Interfaces 9

3 Tangible Interfaces in a Broader Context 13


3.1 Related Research Areas 13
3.2 Unifying Perspectives 16
3.3 Reality-Based Interaction 18

4 Application Domains 21
4.1 TUIs for Learning 22
4.2 Problem Solving and Planning 26
4.3 Information Visualization 30
4.4 Tangible Programming 32
4.5 Entertainment, Play, and Edutainment 35
4.6 Music and Performance 38
4.7 Social Communication 42
4.8 Tangible Reminders and Tags 43

5 Frameworks and Taxonomies 45

5.1 Properties of Graspable User Interfaces 46

5.2 Conceptualization of TUIs and the MCRit Interaction Model 47
5.3 Classifications of TUIs 48
5.4 Frameworks on Mappings: Coupling the Physical
with the Digital 50
5.5 Tokens and Constraints 53
5.6 Frameworks for Tangible and Sensor-Based Interaction 55
5.7 Domain-Specific Frameworks 58

6 Conceptual Foundations 61

6.1 Cuing Interaction: Affordances, Constraints, Mappings and Image Schemas 61
6.2 Embodiment and Phenomenology 63
6.3 External Representation and Distributed Cognition 65
6.4 Two-Handed Interaction 68
6.5 Semiotics 69

7 Implementation Technologies 73
7.1 RFID 74
7.2 Computer Vision 75
7.3 Microcontrollers, Sensors, and Actuators 77
7.4 Comparison of Implementation Technologies 79
7.5 Tool Support for Tangible Interaction 81

8 Design and Evaluation Methods 89


8.1 Design and Implementation 89
8.2 Evaluation 94

9 Strengths and Limitations of Tangible User Interfaces 97
9.1 Strengths 98
9.2 Limitations 106

10 Research Directions 111


10.1 Actuation 111
10.2 From Tangible User Interfaces to Organic User Interfaces 113
10.3 From Tangible Representation to Tangible Resources for
Action 114
10.4 Whole-Body Interaction and Performative Tangible
Interaction 116
10.5 Aesthetics 117
10.6 Long-Term Interaction Studies 117

11 Summary 121

Acknowledgments 123

References 125

1 Introduction

“We live in a complex world, filled with myriad objects,
tools, toys, and people. Our lives are spent in diverse
interaction with this environment. Yet, for the most
part, our computing takes place sitting in front of, and
staring at, a single glowing screen attached to an array
of buttons and a mouse.” [253]

For a long time, it seemed as if the human–computer interface was to
be limited to working on a desktop computer, using a mouse and a key-
board to interact with windows, icons, menus, and pointers (WIMP).
While the detailed design was being refined with ever more polished
graphics, WIMP interfaces seemed undisputed and no alternative inter-
action styles existed. For any application domain, from productivity
tools to games, the same generic input devices were employed.
Over the past two decades, human–computer interaction (HCI)
researchers have developed a wide range of interaction styles and inter-
faces that diverge from the WIMP interface. Technological advance-
ments and a better understanding of the psychological and social
aspects of HCI have led to a recent explosion of new post-WIMP
interaction styles. Novel input devices that draw on users’ skills of
interaction with the real, non-digital world are gaining popularity
(e.g., the Wii Remote controller, multi-touch surfaces). Simultaneously,
an invisible revolution is taking place: computers are becoming embedded
in everyday objects and environments, and products increasingly integrate
computational and mechatronic components.
This monograph provides a survey of the research on Tangible
User Interfaces (TUIs), an emerging post-WIMP interface type that
is concerned with providing tangible representations to digital infor-
mation and controls, allowing users to quite literally grasp data with
their hands. Implemented using a variety of technologies and materi-
als, TUIs computationally augment physical objects by coupling them
to digital data. Serving as direct, tangible representations of digital
information, these augmented physical objects often function as both
input and output devices providing users with parallel feedback loops:
physical, passive haptic feedback that informs users that a certain phys-
ical manipulation is complete; and digital, visual or auditory feedback
that informs users of the computational interpretation of their action
[237]. Interaction with TUIs is therefore not limited to the visual and
aural senses, but also relies on the sense of touch. Furthermore, TUIs
are not limited to two-dimensional images on a screen; interaction
can become three-dimensional. Because TUIs are an emerging field of
research, the design space of TUIs is constantly evolving. Thus, the
goal of this monograph is not to bound what a TUI is or is not. Rather,
it describes common characteristics of TUIs and discusses a range of
perspectives so as to provide readers with means for thinking about
particular designs.
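
To make this coupling and the dual feedback loop concrete, the following
minimal Python sketch is our own illustration rather than code from any
system surveyed here; the class names, the tag identifier, and the mapping
from token position to a parameter are all hypothetical. It models a tagged
physical token bound to a digital object on a sensing surface: grasping and
placing the token closes the physical, passive haptic loop, while the event
handler closes the digital loop.

    # Hypothetical sketch: a physical token, identified by a tag (e.g., an
    # RFID or fiducial marker ID), is coupled to a digital object; placing it
    # on the surface triggers digital feedback.
    from dataclasses import dataclass

    @dataclass
    class DigitalObject:
        name: str
        value: float

    class TangibleToken:
        def __init__(self, tag_id: str, bound_object: DigitalObject):
            self.tag_id = tag_id              # identity read from the physical token
            self.bound_object = bound_object  # digital data coupled to this token

    class InteractiveSurface:
        """Couples sensed token events to digital (visual/auditory) feedback."""
        def __init__(self):
            self.tokens = {}

        def register(self, token: TangibleToken):
            self.tokens[token.tag_id] = token

        def on_token_placed(self, tag_id: str, x: float, y: float):
            # The physical loop is already closed: the user feels the token in
            # hand and resting on the surface. This handler closes the digital
            # loop by interpreting the placement and responding.
            token = self.tokens.get(tag_id)
            if token is None:
                print(f"unknown token {tag_id} at ({x:.2f}, {y:.2f})")
                return
            # Hypothetical mapping: the token's x-position sets the bound value.
            token.bound_object.value = round(x, 2)
            print(f"{token.bound_object.name} -> {token.bound_object.value} "
                  f"(token {tag_id} at {x:.2f}, {y:.2f})")

    # Usage: simulate the tracking layer reporting one placement event.
    surface = InteractiveSurface()
    surface.register(TangibleToken("tag-42", DigitalObject("volume", 0.0)))
    surface.on_token_placed("tag-42", x=0.75, y=0.10)

In a real system the placement event would come from a tracking layer such
as computer vision, RFID, or microcontroller sensing (Section 7), and the
digital feedback would be rendered by a projector or speaker rather than
printed.
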
Tangible Interfaces have an instant appeal to a broad range of users.
They draw upon the human urge to be active and creative with one’s
hands [257], and can provide a means to interact with computational
applications in ways that leverage users’ knowledge and skills of inter-
action with the everyday, non-digital world [119].
TUIs have become an established research area through the con-
tributions of Hiroshi Ishii and his Tangible Media Group as well as
through the efforts of other research groups worldwide. The word ‘tan-
gible’ now appears in many calls for papers or conference session titles.
Following diverse workshops related to tangible interfaces at different
conferences, the first conference fully devoted to tangible interfaces and,
more generally, tangible interaction, took place in 2007 in Baton Rouge,
Louisiana. Since then, the annual TEI Conference (Tangible, Embedded
and Embodied Interaction) serves as a focal point for a diverse commu-
nity that consists of HCI researchers, technologists, product designers,
artists, and others.
This monograph is the result of a systematic review of the body of
work on tangible user interfaces. Our aim has been to provide a useful
and unbiased overview of history, research trends, intellectual lineages,
background theories and technologies, and open research questions for
anyone who wants to start working in this area, be it in developing
systems or analyzing and evaluating them. We first surveyed seminal
work on tangible user interfaces to expose lines of intellectual influence.
Then, in order to clarify the scope of this monograph we examined
past TEI and CHI proceedings for emerging themes. We then identified
a set of questions to be answered by this monograph and conducted
a dedicated literature search for each of these questions.
We begin by sketching the history of tangible user interfaces, tak-
ing a look at the origins of this field. We then discuss the broader
research context surrounding TUIs, which includes a range of related
research areas. Section 4 is devoted to an overview of dominant appli-
cation areas of TUIs. Section 5 provides an overview of frameworks and
theoretical work in the field, discussing attempts to conceptualize, cat-
egorize, analyze, and describe TUIs, as well as analytical approaches to
understand issues of TUI interaction. We then present conceptual foun-
dations underlying the ideas of TUIs in Section 6. Section 7 provides
an overview of implementation technologies and toolkits for building
TUIs. We then move on to design and evaluation methods in Section 8.
We close with a discussion of the strengths and limitations of TUIs and
future research directions.

References

[1] R. Abrams, “Adventures in tangible computing: The work of interaction
designer ‘Durrell Bishop’ in context,” Master’s thesis, Royal College of Art,
London, 1999.
[2] D. Africano, S. Berg, K. Lindbergh, P. Lundholm, F. Nilbrink, and A. Persson,
“Designing tangible interfaces for children’s collaboration,” in Proceedings of
CHI04 Extended Abstracts, pp. 853–886, ACM, 2004.
[3] R. Aish, “3D input for CAAD systems,” Computer Aided Design, vol. 11,
no. 2, pp. 66–70, 1979.
[4] R. Aish and P. Noakes, “Architecture without numbers,” Computer Aided
Design, vol. 16, no. 6, pp. 321–328, 1984.
[5] M. W. Alibali, S. Kita, and A. Young, “Gesture and the process of speech
production: We think, therefore we gesture,” Language & Cognitive Processes,
vol. 15, pp. 593–613, 2000.
[6] A. N. Antle, “The CTI framework: Informing the design of tangible systems
for children,” in Proceedings of TEI ’07, pp. 195–202, NY: ACM, 2007.
[7] A. N. Antle, N. Motamedi, K. Tanenbaum, and Z. L. Xie, “The EventTable
technique: Distributed fiducial markers,” in Proceedings of TEI ’09, pp. 307–
313, NY: ACM, 2009.
[8] D. Avrahami and S. Hudson, “Forming interactivity: A tool for rapid prototyp-
ing of physical interactive products,” in Proceedings of DIS’02, pp. 141–146,
NY: ACM, 2002.
[9] R. Balakrishnan and K. Hinckley, “The roles of kinesthetic reference frames
in two-handed input performance,” UIST’99 Symposium on User Interface
Software and Technology, pp. 171–178, NY: ACM, 1999.

[10] R. Ballagas, F. Memon, R. Reiners, and J. Borchers, “iStuff mobile: Rapidly
prototyping new mobile phone interfaces for ubiquitous computing,” in Pro-
ceedings of CHI ’07, pp. 1107–1116, NY: ACM, 2007.
[11] R. Ballagas, M. Ringel, M. Stone, and J. Borchers, “iStuff: A physical user
interface toolkit for ubiquitous computing environments,” in Proceedings of
CHI ’03, pp. 537–544, NY: ACM, 2003.
[12] M. Banzi, Getting Started with Arduino. O’Reilly, 2009.
[13] T. Bartindale, J. Hook, and P. Olivier, “Media Crate: Tangible Live Media
Production Interface,” in Proceedings of TEI09, pp. 255–262, NY: ACM, 2009.
[14] M. Baskinger and M. Gross, “Tangible Interaction = Form + Computing,”
Interactions, vol. xvii.1, pp. 6–11, 2010.
[15] M. Beaudouin-Lafon, “Instrumental interaction: An interaction model for
designing post-WIMP user interfaces,” in Proceedings of CHI’00, pp. 446–453,
NY: ACM, 2000.
[16] V. Bellotti, M. Back, W. Edwards, R. Grinter, A. Henderson, and
C. Lopes, “Making sense of sensing systems: Five questions for designers and
researchers,” in Proceedings of CHI02, pp. 415–422, NY: ACM, 2002.
[17] S. Benford et al., “Expected, sensed and desired: A framework for designing
sensing-based interaction,” ACM Transactions on Computer-Human Interac-
tion, vol. 12, no. 1, pp. 3–30, 2005.
[18] M. Billinghurst, H. Kato, and I. Poupyrev, “The MagicBook — Moving seam-
lessly between reality and virtuality,” IEEE Computer Graphics and Applica-
tions, pp. 1–4, May/June 2001.
[19] N. Biloria, “Spatializing real time interactive environments,” in Proceedings
of TEI07, pp. 215–222, NY: ACM, 2007.
[20] A. Blackwell, D. Edge, L. M. Dubuc, J. A. Rode, M. Stringer, and E. F. Toye,
“Using solid diagrams for tangible interface prototyping,” IEEE Pervasive
Computing, pp. 18–21, October–December 2005.
[21] A. Blackwell and R. Hague, “AutoHAN: An architecture for programming the
home,” in Proceedings of the IEEE Symposia on Human-Centric Computing
Languages and Environments, pp. 150–157, 2001.
[22] S. Brave and A. Dahley, “inTouch: A Medium for Haptic Interpersonal Com-
munication,” in Extended Abstracts of CHI ’97, pp. 363–364, NY: ACM, 1997.
[23] J. Brewer, A. Williams, and P. Dourish, “A handle on what’s going on: Com-
bining tangible interfaces and ambient displays for collaborative groups,” in
Proceedings of TEI07, pp. 3–10, NY: ACM.
[24] F. W. Bruns, “Zur Rückgewinnung von Sinnlichkeit. Eine neue Form des
Umgangs mit Rechnern,” Technische Rundschau, vol. 29, no. 39, pp. 14–18,
1993.
[25] W. Bruns and V. Brauer, “Bridging the gap between real and virtual model-
ing — A new approach to human-computer interaction,” in Proceedings of the
IFIP 5.10 Workshop on Virtual Prototyping, Providence, September, 1994,
IFIP, 1996.
[26] L. Buechley, M. Eisenberg, J. Catchen, and A. Crockett, “The LilyPad
Arduino: Using computational textiles to investigate engagement, aesthetics,
and diversity in computer science education,” in Proceedings of CHI08,
pp. 423–432, NY: ACM, 2008.
[27] B. E. Bürdek, Design: History, Theory and Practice of Product Design.
Birkhäuser Basel, 2005.
[28] A. Butz, M. Schmitz, A. Krüger, and H. Hullmann, “Tangible UIs for media
control probes into the design space,” in CHI 05 Extended Abstracts, pp. 957–
971, NY: ACM, 2005.
[29] J. Buur, M. V. Jensen, and T. Djajadiningrat, “Hands-only scenarios and
video action walls: Novel methods for tangible user interaction design,” in
Proceedings of DIS04, pp. 185–192, NY: ACM, 2004.
[30] B. Buxton, Sketching User Experiences: Getting the Design Right and the
Right Design. Morgan Kaufmann Publishers Inc, 2007.
[31] W. Buxton and B. Myers, “A study in two-handed input,” in Proceedings of
CHI’86: ACM Conference on Human Factors in Computing Systems, pp. 321–
326, 1986.
[32] K. Camarata, E. Y. Do, B. R. Johnson, and M. D. Gross, “Navigational blocks:
Navigating information space with tangible media,” in Proceedings of the 7th
International Conference on Intelligent User Interfaces, pp. 31–38, NY: IUI
’02. ACM, 2002.
[33] K. Camarata, M. Gross, and E. Y. Do, “A physical computing studio: Explor-
ing computational artifacts and environments,” in International Journal of
Architectural Computing, vol. 1, no. 2, pp. 169–190, 2004.
[34] A. Chang, B. Resner, B. Koerner, X. Wang, and H. Ishii, “LumiTouch:
An emotional communication device,” in Proceedings of CHI’01 Extended
Abstracts, pp. 313–314, NY: ACM, 2001.
[35] H. Chung, C. J. Lee, and T. Selker, “Lover’s cups: Drinking interfaces as new
communication channels,” in Proceedings of CHI’06, NY: ACM, 2006.
[36] A. Clark, Being There: Putting Brain, Body and World Together Again. Cam-
bridge MA: MIT Press, 1997.
[37] M. Coelho and P. Maes, “Sprout I/O: A texturally rich interface,” in Proceed-
ings of TEI08, pp. 221–222, NY: ACM, 2008.
[38] M. Coelho and P. Maes, “Shutters: A permeable surface for environmental
control and communication,” in Proceedings of TEI ’09, pp. 13–18, NY: ACM,
2009.
[39] M. Coelho, I. Poupyrev, S. Sadi, R. Vertegaal, J. Berzowska, L. Buechley,
P. Maes, and N. Oxman, “Programming reality: From transitive materials to
organic user interfaces,” in Proceedings of CHI09 extended abstracts, pp. 4759–
4762, NY: ACM, 2009.
[40] J. Coffin, “Robotany and Lichtung: A contribution to phenomenological dia-
logue,” in Proceedings of TEI08, pp. 217–220, NY: ACM, 2008.
[41] J. Cohen, M. Withgott, and P. Piernot, “Logjam: A tangible multi-person
interface for video logging,” in Proceedings of CHI99, pp. 128–135, NY: ACM,
1999.
[42] N. Couture, G. Rivière, and P. Reuter, “GeoTUI: A tangible user interface for
geoscience,” in Proceedings of TEI08, pp. 89–96, NY: ACM, 2008.
[43] A. Cypher, ed., Watch What I Do: Programming by Demonstration. The MIT
Press, 1993.

[44] A. Damasio, The Feeling of What Happens: Body and Emotion in the Making
of Consciousness. 1999.
[45] C. S. de Souza, The Semiotic Engineering of Human-Computer Interaction.
Cambridge, MA: The MIT Press, 2004.
[46] P. Dietz and D. Leigh, “Diamondtouch: A multi-user touch technology,” in
UIST 01: Proceedings of the 14th Annual ACM Symposium on User Interface
Software and Technology, pp. 219–226, NY: ACM, 2001.
[47] J. P. Djajadiningrat, B. Matthews, and M. Stienstra, “Easy doesn’t do it: Skill
and expression in tangible aesthetics,” Personal and Ubiquitous Computing,
vol. 11, no. 8, pp. 657–676, 2007.
[48] T. Djajadiningrat, K. Overbeeke, and S. Wensveen, “Augmenting fun and
beauty: A pamphlet,” in Proceedings of DARE2000, pp. 131–134, NY: ACM,
2000.
[49] T. Djajadiningrat, K. Overbeeke, and S. Wensveen, “But how, Donald, tell
us how? On the creation of meaning in interaction design through feedfor-
ward and inherent feedback,” in Proceedings of Designing Interactive Systems
(DIS2002), pp. 285–291, NY: ACM, 2002.
[50] P. Dourish, Where the Action Is. The Foundations of Embodied Interaction.
MIT Press, 2001.
[51] D. Edge and A. Blackwell, “Peripheral tangible interaction by analytic
design,” in Proceedings of TEI09, pp. 69–76, NY: ACM.
[52] D. Edge and A. Blackwell, “Correlates of the cognitive dimensions for tangible
user interfaces,” Journal of Visual Languages and Computing, vol. 17, no. 4,
pp. 366–394, 2006.
[53] A. Ernevi, J. Redström, M. Redström, and L. Worbin, “The Interactive Pil-
lows,” in IT+Textiles, pp. 47–54, 2005.
[54] D. Fallman, “Wear, point and tilt: Designing support for mobile service and
maintenance in industrial settings,” in Proceedings of DIS2002, pp. 293–302,
NY: ACM, 2002.
[55] M. Familant and M. Detweiler, “Iconic reference: Evolving perspectives and
an organising framework,” in International Journal of Man-Machine Studies
vol. 39, pp. 705–728, 1993.
[56] L. Feijs, S. Kyffin, and B. Young, Proceedings of Design and Semantics of Form
and Movement — DesForM 2005. Foreword. Koninklijke Philips Electronics
N.V. Eindhoven. 3, 2005.
[57] Y. Fernaeus and J. Tholander, “Finding design qualities in a tangible pro-
gramming space,” in Proceedings of CHI06, pp. 447–456, NY: ACM, 2006.
[58] Y. Fernaeus, J. Tholander, and M. Jonsson, “Beyond representations: Towards
an action-centric perspective on tangible interaction,” International Journal
of Arts and Technology, vol. 1, no. 3/4, pp. 249–267, 2008.
[59] Y. Fernaeus, J. Tholander, and M. Jonsson, “Towards a new set of ideals:
Consequences of the practice turn in tangible interaction,” in Proceedings of
TEI’08, pp. 223–230, NY: ACM, 2008.
[60] J. Ferreira, P. Barr, and J. Noble, “The semiotics of user interface redesign,” in
Proceedings of the Sixth Australasian Conference on User interface — Volume
40, ACM International Conference Proceeding Series, vol. 104, pp. 47–53,
Australian Computer Society, 2005.
[61] K. Ferris and L. Bannon, “. . . A Load of ould Boxology!,” in Proceedings of
DIS 2002, pp. 41–49, N.Y.: ACM, 2002.
[62] A. Ferschau, S. Vogl, B. Emsenhuber, and B. Wally, “Physical shortcuts for
media remote controls,” in Proceedings of the Second International Confer-
ence on INtelligent TEchnologies for interaction enterTAINment, InteTain08,
2008.
[63] K. Fishkin, “A taxonomy for and analysis of tangible interfaces,” Personal
and Ubiquitous Computing, vol. 8, pp. 347–358, 2004.
[64] K. P. Fishkin, A. Gujar, B. L. Harrison, T. P. Moran, and R. Want, “Embodied
user interfaces for really direct manipulation,” Communications of the ACM,
vol. 43, no. 9, pp. 75–80, 2000.
[65] G. W. Fitzmaurice, Graspable User Interfaces. Dissertation, Computer Sci-
ence, University of Toronto, Canada, 1996.
[66] G. W. Fitzmaurice and W. Buxton, “An empirical evaluation of graspable
user interfaces: Towards specialized, space-multiplexed input,” in Proceedings
of CHI97, pp. 43–50, NY: ACM, 1997.
[67] G. W. Fitzmaurice, H. Ishii, and W. Buxton, “Bricks: Laying the foundations
for graspable user interfaces,” in Proceedings of CHI95, pp. 442–449, NY:
ACM, 1995.
[68] M. Fjeld, J. Fredriksson, M. Ejdestig, F. Duca, K. Bötschi, B. Voegtli, and
P. Juchli, “Tangible user interface for chemistry education: Comparative eval-
uation and re-design,” in Proceedings of CHI ’07, pp. 805–808, NY: ACM,
2007.
[69] Fjeld, Bichsel, Rauterberg, “BUILD-IT: An intuitive design tool based on
direct object manipulation,” in Proceedings of International Gesture Workshop
1997, pp. 287–308, Berlin, Heidelberg, New York: Springer, 1997.
[70] J. Frazer, An Evolutionary Architecture. Themes VII. London: Architectural
Association, 1995.
[71] J. Frazer and P. Frazer, “Intelligent physical three-dimensional modelling sys-
tems,” in Proceedings of Computer Graphics’80, pp. 359–370, Online Publica-
tions, 1980.
[72] J. Frazer and P. Frazer, “Three-dimensional data input devices,” in Proceed-
ings of Computer Graphics in the Building Process, Washington: National
Academy of Sciences, 1982.
[73] P. Frei, V. Su, B. Mikhak, and H. Ishii, “Curlybot: Designing a New Class of
Computational Toys,” in Proceedings of CHI 2000, pp. 129–136, NY: ACM,
2000.
[74] Furukawa, Fujihata, Muench, Small fish. http://hosting.zkm.de/wmuench/small fish, 2000.
[75] H. Gellersen, G. Kortuem, A. Schmidt, and M. Beigl, “Physical prototyp-
ing with Smart-Its,” IEEE Pervasive Computing, pp. 10–18, July–September
2004.
[76] J. J. Gibson, The Ecological Approach to Visual Perception. NY: Houghton
Mifflin, 1979.
[77] S. Gill, “Developing information appliance design tools for designers,” in Pro-
ceedings of the 1st Appliance Design Conference, Bristol, UK, 2003.
[78] A. Gillet, M. Sanner, D. Stoffler, and A. Olson, “Tangible augmented interfaces
for structural molecular biology,” IEEE Computer Graphics & Applications,
vol. 25, no. 2, pp. 13–17, 2005.
[79] A. Girouard, E. T. Solovey, L. M. Hirshfield, S. Ecott, O. Shaer, and R. J. K.
Jacob, “Smart blocks: A tangible mathematical manipulative,” in Proceedings
of TEI07, pp. 183–186, NY: ACM, 2007.
[80] S. Goldin-Meadow, Hearing Gesture: How Our Hands Help Us Think. Harvard
University Press, 2003.
[81] S. Greenberg, “Collaborative physical user interfaces,” in Communication
and Collaboration Support Systems, (T. H. K. Okada and T. Inoue, eds.),
pp. 24–42, Amsterdam, The Netherlands: IOS Press, 2005.
[82] S. Greenberg and C. Fitchett, “Phidgets: Easy development of physical inter-
faces through physical widgets,” in Proceedings of UIST’01, pp. 209–218, NY:
ACM, 2001.
[83] S. Greenberg and H. Kuzuoka, “Using digital but physical surrogates to medi-
ate awareness, communication and privacy in media spaces,” Personal Tech-
nologies, vol. 4, no. 1, Elsevier, January 2000.
[84] Y. Guiard, “Asymmetric division of labor in human skilled bimanual action:
The kinematic chain as a model,” The Journal of Motor Behavior, vol. 19,
no. 4, pp. 486–517, 1987.
[85] D. Harel, “On visual formalisms,” Communications of the ACM, vol. 31, no. 5,
pp. 514–530, 1988.
[86] B. L. Harrison, K. P. Fishkin, A. Gujar, D. Portnov, and R. Want, “Bridging
physical and virtual worlds with tagged documents, objects and locations,” in
Proceedings of CHI ’99 Extended Abstracts, pp. 29–30, NY: ACM, 1999.
[87] B. Hartmann, L. Abdulla, M. Mittal, and S. R. Klemmer, “Authoring sensor-
based interactions by demonstration with direct manipulation and pattern
recognition,” in Proceedings of CHI ’07, pp. 145–154, NY: ACM, 2007.
[88] B. Hartmann, S. R. Klemmer, M. Bernstein, and N. Mehta, “d.tools: Visually
prototyping physical UIs through statecharts,” in Conference Supplement to
UIST’2005.
[89] A. Hauptmann, “Speech and gestures for graphic image manipulation,” in
Proceedings of CHI89, pp. 241–245, NY: ACM, 1989.
[90] C. Heath and P. Luff, “Convergent activities — Line control and passenger
information on the London underground,” in Cognition and Communication
at Work, (Y. Engeström and D. Middleton, eds.), pp. 96–129, Cambridge
University Press, 1998.
[91] M. Heidegger, Being and Time. NY: Harper and Row, 1927. English transla-
tion 1962.
[92] M. Heijboer and E. van den Hoven, “Keeping up appearances: Interpretation
of tangible artifact design,” in Proceedings of NordiCHI 2008, pp. 162–171,
NY: ACM, 2008.
[93] B. Hengeveld, C. Hummels, and K. Overbeeke, “Tangibles for toddlers learn-
ing language,” in Proceedings of TEI09, pp. 161–168, NY: ACM, 2009.
[94] B. Hengeveld, C. Hummels, K. Overbeeke, R. Voort, H. van Balkom, and
J. de Moor, “Let me actuate you,” in Proceedings of TEI08, pp. 159–166, NY:
ACM, 2008.
[95] K. Hinckley, R. Pausch, J. Goble, and N. Kassel, “Passive real-world interface
props for neurosurgical visualization,” in Proceedings of CHI94, pp. 452–458,
NY: ACM, 1994.
[96] K. Hinckley, R. Pausch, D. Proffitt, J. Patten, and N. Kassell, “Cooperative
bimanual action,” in Proceedings of the SIGCHI conference on Human factors
in computing systems, pp. 27–34, NY: ACM, 1997.
[97] S. Hinske, M. Langheinrich, and M. Lampe, “Towards guidelines for designing
augmented toy environments,” in Proceedings of DIS 2008, pp. 78–87, NY:
ACM.
[98] J. D. Hollan, E. Hutchins, and D. Kirsh, “Distributed cognition: A new
foundation for human-computer interaction research,” ACM Transactions on
Computer-Human Interaction (TOCHI), vol. 7, no. 2, pp. 174–196, 2000.
[99] D. Holman and R. Vertegaal, “Organic user interfaces: Designing computers
in any way, shape, or form,” Communications of the ACM, vol. 51, no. 6,
pp. 48–55, 2008.
[100] D. Holman, R. Vertegaal, and N. Troje, “PaperWindows: Interaction tech-
niques for digital paper,” in Proceedings of ACM CHI’05, pp. 591–599, NY:
ACM, 2005.
[101] L. E. Holmquist, J. Redström, and P. Ljungstrand, “Token-based access to
digital information,” in Proceedings of the 1st International Symposium on
Handheld and Ubiquitous Computing, (H. Gellersen, ed.), pp. 234–245, Lecture
Notes In Computer Science, vol. 1707, London: Springer-Verlag, 1999.
[102] M. S. Horn, E. T. Solovey, R. J. Crouser, and R. J. K. Jacob, “Comparing
the use of tangible and graphical programming interfaces for informal science
education,” in Proceedings of CHI’09, pp. 975–984, NY: ACM, 2009.
[103] M. S. Horn, E. T. Solovey, and R. J. K. Jacob, “Tangible programming for
informal science learning: Making TUIs work for Museums,” in Proceedings
of 7th International Conference on Interaction Design and Children IDC’08,
pp. 194–201, NY: ACM, 2008.
[104] E. Hornecker, “Creative idea exploration within the structure of a guiding
framework: The card brainstorming game,” in Proceedings of TEI’10, pp. 101–
108, NY: ACM, 2010.
[105] E. Hornecker and J. Buur, “Getting a grip on tangible interaction: A frame-
work on physical space and social interaction,” in Proceedings of CHI06,
pp. 437–446, NY: ACM, 2006.
[106] E. Hornecker, R. Jacob, C. Hummels, B. Ullmer, A. Schmidt, E. van den
Hoven, and A. Mazalek, “TEI goes on: Tangible and embedded interaction,”
IEEE Pervasive Computing Magazine/Journal, vol. 7, no. 2, pp. 91–95, April–
June 2008.
[107] E. Hornecker and T. Psik, “Using ARToolKit markers to build tangible pro-
totypes and simulate other technologies,” in Proceedings of Interact 2005,
pp. 30–42, Springer.
[108] C. J. Huang, E. Yi-Luen Do, and M. D. Gross, “MouseHaus table,” in Pro-
ceedings of CAAD Futures, 2003.
[109] Y. Huang, M. D. Gross, E. Y. Do, and M. Eisenberg, “Easigami: A reconfig-
urable folded-sheet TUI,” in Proceedings of TEI ’09, pp. 107–112, NY: ACM,
2009.
[110] C. Hummels, K. C. Overbeeke, and S. Klooster, “Move to get moved: A search
for methods, tools and knowledge to design for expressive and rich movement-
based interaction,” Personal Ubiquitous Computing, vol. 11, no. 8, pp. 677–
690, 2007.
[111] J. Hurtienne and J. H. Israel, “Image schemas and their metaphorical exten-
sions: Intuitive patterns for tangible interaction,” in Proceedings of TEI07,
pp. 127–134, NY: ACM, 2007.
[112] J. Hurtienne, J. H. Israel, and K. Weber, “Cooking up real world business
applications combining physicality, digitality, and image schemas,” in Proceed-
ings of TEI’08, pp. 239–246, NY: ACM, 2008.
[113] E. Hutchins, Cognition in the Wild. Cambridge, London: MIT Press. 3d Press-
ing, 1999.
[114] E. Hutchins and L. Palen, “Constructing meaning from space, gesture, and
speech,” in Discourse, Tools, and Reasoning — Essays on Situated Cognition,
Series F: Computer and System Sciences, Vol. 160, (L. B. Resnick, R. Säljö,
C. Pontecorvo, and B. Burge, eds.), pp. 23–40, NATO ASI Series, 1993.
[115] H. Ishii, “The tangible user interface and its evolution,” Communications of
the ACM, vol. 51, no. 6, pp. 32–36, 2008.
[116] H. Ishii, C. Ratti, B. Piper, Y. Wang, A. Biderman, and E. Ben-Joseph,
“Bringing clay and sand into digital design — Continuous tangible user inter-
faces,” BT Technology Journal, vol. 22, no. 4, pp. 287–299, October 2004.
[117] H. Ishii and B. Ullmer, “Tangible bits: Towards seamless interfaces between
people, bits and atoms,” in Proceedings of CHI97, pp. 234–241, NY: ACM,
1997.
[118] S. Izadi, A. Butler, S. Hodges, D. West, M. Hall, B. Buxton, and M. Molloy,
“Experiences with building a thin form-factor touch and tangible tabletop,”
in Proceedings of IEEE Tabletop ’08, pp. 193–196, 2008.
[119] R. J. K. Jacob, A. Girouard, L. M. Hirshfield, M. S. Horn, O. Shaer, E. T.
Solovey, and J. Zigelbaum, “Reality-based interaction: A framework for post-
WIMP interfaces,” in Proceedings of CHI 2008, pp. 201–210, NY: ACM,
2008.
[120] R. J. K. Jacob, H. Ishii, G. Pangaro, and J. Patten, “A tangible interface for
organizing information using a grid,” in Proceedings of CHI ’02, pp. 339–346,
NY: ACM, 2002.
[121] M. Jacobsson, J. Bodin, and L. E. Holmquist, “The see-Puck: A platform for
exploring human–robot relationships,” in Proceedings of CHI08, pp. 141–144,
NY: ACM, 2008.
[122] M. V. Jensen and M. Stienstra, “Making sense: Interactive sculptures as tan-
gible design material,” in Proceedings of DPPI07, Designing Pleasurable Prod-
ucts and Interfaces, pp. 255–269, NY: ACM, 2007.
[123] M. Johnson, The Body in the Mind: The Bodily Basis of Meaning, Imagination,
and Reason. University of Chicago Press, 1987.
[124] S. Jordà, “On stage: The reactable and other musical tangibles go real,” Inter-
national Journal of Arts and Technology (IJART), vol. 1, no. 3/4, pp. 268–287,
Special Issue on Tangible and Embedded Interaction 2008.
[125] S. Jordà, G. Geiger, M. Alonso, and M. Kaltenbrunner, “The reacTable:
Exploring the synergy between live music performance and tabletop tangi-
ble interfaces,” in Proceedings of TEI ’07, pp. 139–146, NY: ACM, 2007.
[126] B. Jordan and A. Henderson, “Interaction analysis foundations and practice,”
Journal of the Learning Sciences, vol. 4, no. 1, pp. 39–103, 1995.
[127] E. Kabisch, A. Williams, and P. Dourish, “Symbolic objects in a net-
worked gestural sound interface,” in Proceedings of CHI05 extended abstracts,
pp. 1513–1516, NY: ACM, 2005.
[128] J. J. Kalanithi and V. M. Bove, “Connectibles: Tangible social networks,” in
Proceedings of TEI08, pp. 199–206, NY: ACM, 2008.
[129] M. Kaltenbrunner, Website on Tangible Music. Read April 2009, http://
modin.yuri.at/tangibles/.
[130] M. Kaltenbrunner and R. Bencina, “reacTIVision: A computer-vision frame-
work for table-based tangible interaction,” in Proceedings of TEI ’07, pp. 69–
74, NY: ACM, 2007.
[131] H. Kato and M. Billinghurst, “Marker tracking and HMD calibration for a
video-based augmented reality conferencing system,” in Proceedings of the
2nd International Workshop on Augmented Reality (IWAR 99), 1999.
[132] H. Kato, M. Billinghurst, I. Poupyrev, N. Tetsutani, and K. Tachibana, “Tan-
gible augmented reality for human computer interaction,” in Proceedings of
Nicograph 2001, Nagoya, Japan, 2001.
[133] H. Kato and M. Billinghurst et al., “Virtual object manipulation on a table-top
AR environment,” in Proceedings of International Symposium on Augmented
Reality ISAR 2000, pp. 111–119, 2000.
[134] M. J. Kim and M. L. Maher, “The impact of tangible user interfaces on design-
ers’ spatial cognition,” Human-Computer Interaction, vol. 23, no. 2, 2008.
[135] D. S. Kirk, A. Sellen, S. Taylor, N. Villar, and S. Izadi, “Putting the physical
into the digital: Issues in designing hybrid interactive surfaces,” in Proceedings
of HCI 2009, 2009.
[136] D. Kirsh, “The intelligent use of space,” Artificial Intelligence, vol. 73, no. 1–2,
pp. 31–68, British Computer Society, 1995.
[137] D. Kirsh and P. Maglio, “On distinguishing epistemic from pragmatic actions,”
Cognitive Science, vol. 18, no. 4, pp. 513–549, 1994.
[138] S. R. Klemmer, B. Hartmann, and L. Takayama, “How bodies matter: Five
themes for interaction design,” in Proceedings of DIS2006 Conference on
Designing Interactive Systems, pp. 140–149, NY: ACM, 2006.
[139] S. R. Klemmer, J. Li, J. Lin, and J. A. Landay, “Papier Mâché: Toolkit sup-
port for tangible input,” in Proceedings of CHI2004, pp. 399–406, NY: ACM,
2004.
[140] S. R. Klemmer, M. W. Newman, R. Farrell, M. Bilezikjian, and J. A. Landay,
“The designers outpost: A tangible interface for collaborative web site design,”
in Proceedings of UIST’2001: ACM Symposium on User Interface Software
and Technology, pp. 1–10, NY: ACM, 2001.
[141] K. Kobayashi, M. Hirano, A. Narita, and H. Ishii, “A tangible interface for IP
network simulation,” in Proceedings of CHI ’03 extended abstracts, pp. 800–
801, NY: ACM, 2003.
[142] B. Koleva, S. Benford, K. H. Ng, and T. Rodden, A Framework for Tangible
User Interfaces, Physical Interaction, (PI03) — Workshop on Real World User
Interfaces, Mobile HCI Conference 2003, Udine, Italy, 2003.
[143] G. Kurtenbach, G. Fitzmaurice, T. Baudel, and B. Buxton, “The design of a
GUI paradigm based on tablets, two-hands, and transparency,” in Proceedings
of CHI97, pp. 35–42, NY: ACM, 1997.
[144] J.-B. Labrune and W. Mackay, “Tangicam: Exploring observation tools for
children,” in Proceedings of IDC 2005, pp. 95–102, NY: ACM, 2005.
[145] G. Lakoff and M. Johnson, Metaphors We Live By. University of Chicago
Press, 1980.
[146] B. Laurel, Computers as Theatre. Addison-Wesley Professional, 1993.
[147] V. LeClerc, A. Parkes, and H. Ishii, “Senspectra: A computationally aug-
mented physical modeling toolkit for sensing and visualization of structural
strain,” in Proceedings of CHI ’07, pp. 801–804, NY: ACM, 2007.
[148] G. A. Lee, C. Nelles, M. Billinghurst, and G. J. Kim, “Immersive authoring
of tangible augmented reality applications,” in Proceedings of IEEE/ACM
International Symposium on Mixed and Augmented Reality, pp. 172–181, IEEE
Computer Society, 2004.
[149] J. Leitner, M. Haller, K. Yun, W. Woo, M. Sugimoto, and M. Inami, “Inc-
reTable, a mixed reality tabletop game experience,” in Proceedings of the 2008
International Conference on Advances in Computer Entertainment Technol-
ogy, pp. 9–16, NY: ACM, 2008.
[150] W. E. Mackay and A.-L. Fayard, “Designing interactive paper: Lessons from
three augmented reality projects,” in Proceedings of IWAR98, International
Workshop on Augmented Reality, Natick, MA, 1999.
[151] C. L. MacKenzie and T. Iberall, The Grasping Hand. North Holland, 1994.
[152] C. Magerkurth, M. Memisoglu, T. Engelke, and N. A. Streitz, “Towards the
next generation of tabletop gaming experiences,” in Graphics Interface 2004
(GI’04), pp. 73–80, AK Peters, 2004.
[153] A. Manches, C. O’Malley, and S. Benford, “Physical manipulation: Evaluating
the potential for tangible designs,” in Proceedings of TEI 09, NY: ACM, 2009.
[154] V. Maquil, T. Psik, and I. Wagner, “The ColorTable: A design story,” in
Proceedings of TEI ’08, pp. 97–104, NY: ACM, 2008.
[155] N. Marquardt and S. Greenberg, “Distributed physical interfaces with shared
Phidgets,” in Proceedings of TEI ’07, pp. 13–20, NY: ACM, 2007.
[156] P. Marshall, “Do tangible interfaces enhance learning?,” in Proceedings of
TEI’07, pp. 163–170, NY: ACM, 2007.
[157] P. Marshall, S. Price, and Y. Rogers, “Conceptualizing tangibles to support
learning,” in Proceedings of IDC 2003, pp. 101–109, NY: ACM, 2003.
[158] E. S. Martinussen and T. Arnall, “Designing with RFID,” in Proceedings of
TEI 2009, pp. 343–350, NY: ACM, 2009.
[159] N. Matsushita and J. Rekimoto, “HoloWall: Designing a finger, hand, body,
and object sensitive wall,” in Proceedings of UIST97, pp. 209–210, NY: ACM,
1997.
[160] A. Mazalek and E. van den Hoven, “Framing tangible interaction frameworks.
Tangible interaction for design special issue,” AIEDAM journal, Spring 2009,
vol. 23, pp. 225–235, 2009.
[161] T. S. McNerney, “From turtles to tangible programming bricks: Explorations
in physical language design,” Pers Ubiquit Comput, vol. 8, pp. 326–337,
2004.
[162] M. Merleau-Ponty, Phénoménologie de la perception. Paris: Gallimard, 1945.
[163] Microsoft Corp. Microsoft Surface, http://www.microsoft.com/surface/, 2009.
[164] J. Moen, “From hand-held to body-worn: Embodied experiences of the design
and use of a wearable movement-based interaction concept,” in Proceedings of
TEI’07, pp. 251–258, NY: ACM, 2007.
[165] E. Mugellini, E. Rubegni, S. Gerardi, and O. A. Khaled, “Using personal
objects as tangible interfaces for memory recollection and sharing,” in Pro-
ceedings of TEI’07, pp. 231–238, NY: ACM, 2007.
[166] B. A. Myers, “Why are human-computer interfaces difficult to design and
implement?,” Technical Report. UMI Order Number: CS-93-183., Carnegie
Mellon University, 1993.
[167] H. Newton-Dunn, H. Nakano, and J. Gibson, “Blockjam: A tangible inter-
face for interactive music,” in Proceedings of the 2003 Conference on New
Interfaces for Musical Expression (NIME-03), pp. 170–177, 2003.
[168] D. Norman, The Psychology of Everyday Things. New York: Basic Books,
1988.
[169] D. Norman, Things that Make Us Smart. Defending Human Attributes in the
Age of the Machine. Reading, Mass, Addison Wesley, 1994.
[170] D. Norman, “Affordance, Conventions, and Design,” Interactions, vol. 6, no. 3,
pp. 38–43, 1999.
[171] D. Norman, “The next UI breakthrough, Part 2: Physicality,” Interactions,
vol. 14, no. 4, pp. 46–47, 2007.
[172] D. R. Olsen, User Interface Management Systems: Models and Algorithms.
Morgan Kaufmann, San Francisco, 1992.
[173] C. O’Malley and D. Stanton, “Tangible technologies for collaborative story-
telling,” in Proceedings of the 1st European Conference on Mobile and Con-
textual Learning, pp. 3–7, 2002.
[174] C. O’Malley and D. Stanton Fraser, “Literature review in learning with tan-
gible technologies,” NESTA futurelab report 12, Bristol, 2004.
[175] D. O’Sullivan and T. Igoe, Physical Computing: Sensing and Controlling the
Physical World with Computers. Boston: Muska and Lipman, 2004.
[176] K. Overbeeke and S. Wensveen, “From perception to experience, from affor-
dances to irresistibles,” in Proceedings of DPPI03 (Designing Pleasurable
Products and Interfaces), pp. 92–97, NY: ACM, 2003.
[177] G. Pangaro, D. Maynes-Aminzade, and H. Ishii, “The actuated workbench:
Computer-controlled actuation in tabletop tangible interfaces,” in Proceedings
of UIST 2002 Symposium on User Interface Software and Technology, pp. 181–
190, NY: ACM, 2002.
[178] A. Parkes and H. Ishii, “Kinetic sketchup: Motion prototyping in the tangible
design process,” in Proceedings of TEI ’09, pp. 367–372, NY: ACM, 2009.
[179] V. Parmar, G. Groeneveld, A. Jalote-Parmar, and D. Keyson, “Tangible user
interface for increasing social interaction among rural women,” in Proceedings
of TEI 2009, pp. 139–145, NY: ACM, 2009.
[180] J. Patten and H. Ishii, “A comparison of spatial organization strategies in
graphical and tangible user interfaces,” in Proceedings of Designing Augmented
Reality Environments, DARE ’00, pp. 41–50, NY: ACM, 2000.
[181] J. Patten and H. Ishii, “Mechanical constraints as computational constraints
in tabletop tangible interfaces,” in Proceedings of CHI’07, pp. 809–818, NY:
ACM, 2007.
[182] J. Patten, B. Recht, and H. Ishii, “Audiopad: A tag-based interface for musical
performance,” in Proceedings of the International Conference on New Interface
for Musical Expression NIME02, pp. 24–26, 2002.
[183] E. W. Pedersen and K. Hornbæk, “mixiTUI: A tangible sequencer for elec-
tronic live performances,” in Proceedings of TEI ’09, pp. 223–230, NY: ACM,
2009.
[184] C. S. Peirce, “Collected Papers of Charles Sanders Peirce,” in 8 Volumes,
(C. Hartshorne, P. Weiss, and A. Burks, eds.), Cambridge MA: Harvard Uni-
versity Press, 1931–1958.
[185] R. Perlman, Using Computer Technology to Provide a Creative Learning Envi-
ronment for Preschool Children. MIT Logo Memo 24, 1976.
[186] M. G. Petersen, “Squeeze: Designing for playful experiences among co-located
people in homes,” in Proceedings of CHI 2007 Extended Abstracts, pp. 2609–
2614, NY: ACM, 2007.
[187] C. Petri, “Kommunikation mit Automaten,” Ph.D. thesis, University of Bonn,
1962.
[188] B. Piper, C. Ratti, and H. Ishii, “Illuminating clay: A 3-D tangible interface
for landscape analysis,” in Proceedings of CHI ’02, pp. 355–362, NY: ACM,
2002.
[189] I. Poupyrev, T. Nashida, and M. Okabe, “Actuation and tangible user inter-
faces: The Vaucanson duck, robots, and shape displays,” in Proceedings of
Tangible and Embedded interaction, TEI ’07, pp. 205–212, NY: ACM, 2007.
[190] R. Poynor, “The hand that rocks the cradle,” ID Magazine, pp. 60–65,
May/June 1995.
[191] S. Price, “A representation approach to conceptualizing tangible learning envi-
ronments,” in Proceedings of TEI 2008, pp. 151–158, NY: ACM, 2008.
[192] S. Price and Y. Rogers, “Let’s get physical: The learning benefits of interacting
in digitally-augmented physical spaces,” Journal of Computers and Education,
vol. 15, no. 2, pp. 169–185, 2004.
[193] S. Price, Y. Rogers, M. Scaife, D. Stanton, and H. Neale, “Using tangibles
to support new ways of playing and learning,” in Proceedings of Interaction
Design for Children 2002, 2002.
[194] J. Qi and L. Buechley, “Electronic popables: Exploring paper-based computing
through an interactive pop-up book,” in Proceedings of TEI ’10, pp. 121–128,
NY: ACM, 2010.
[195] H. Raffle, C. Vaucelle, R. Wang, and H. Ishii, “Jabberstamp: Embedding
sound and voice in traditional drawings,” in Proceedings of IDC, pp. 137–144,
NY: ACM, 2007.
[196] H. S. Raffle, A. J. Parkes, and H. Ishii, “Topobo: A constructive assembly
system with kinetic memory,” in Proceedings of the ACM CHI’04, pp. 647–
654, NY: ACM, 2004.
[197] J. Rekimoto and Y. Ayatsuka, “CyberCode: Designing augmented reality envi-
ronments with visual tags,” in Proceedings of DARE 2000, NY: ACM, 2000.
[198] M. Resnick, “Behavior construction kits,” Communications of the ACM,
vol. 36, no. 7, pp. 64–71, July 1993.
[199] M. Resnick, F. Martin, R. Berg, R. Borovoy, V. Colella, K. Kramer, and
B. Silverman, “Digital manipulatives: New toys to think with,” in Proceedings
of CHI’98, pp. 281–287, NY: ACM, 1998.
[200] Y. Rogers and H. Muller, “A framework for designing sensor-based interac-
tions to promote exploration and reflection in play,” International Journal of
Human Computer Studies, vol. 64, no. 1, pp. 1–14, 2006.
[201] Y. Rogers, M. Scaife, S. Gabrielli, I.-I. Smith, and E. A. Harris, “Conceptual
framework for mixed reality environments: Designing novel learning activi-
ties for young children,” Presence: Teleoperators and Virtual Environments,
vol. 11, no. 6, pp. 677–686, 2002.
[202] T. Rohrer, “The body in space: Embodiment, experientialism and linguistic
conceptualization,” in Body, Language and Mind, vol. 2, (J. Zlatev, T. Ziemke,
R. Frank, and R. Dirven, eds.), Berlin: Mouton de Gruyter, 2006.
[203] A. Rydarowski, O. Samanci, and A. Mazalek, “Murmur: Kinetic relief sculp-
ture, multi-sensory display, listening machine,” in Proceedings of TEI’08,
pp. 231–239, NY: ACM, 2008.
[204] K. Ryokai and J. Cassell, “StoryMat: A play space for collaborative story-
telling,” in Proceedings of CHI’99, pp. 272–273, NY: ACM, 1999.
[205] K. Ryokai, S. Marti, and H. Ishii, “I/O brush: Drawing with everyday objects
as ink,” in Proceedings of CHI 2004, pp. 303–310, NY: ACM, 2004.
[206] G. Saul and M. D. Gross, “Co-designed paper devices,” in Programming Real-
ity: From Transitive Materials to Organic User Interfaces, a CHI 2009 Work-
shop, 2009.
[207] M. Scaife and Y. Rogers, “External cognition: How do graphical represen-
tations work?,” International Journal of Human-Computer Studies, vol. 45,
no. 2, pp. 185–213, 1996.
[208] K. Schäfer, V. Brauer, and W. Bruns, “A new approach to human-computer
interaction — Synchronous modeling in real and virtual spaces,” in Proceed-
ings of DIS 1997, pp. 335–344, NY: ACM, 1997.
[209] B. Schiettecatte and J. Vanderdonckt, “AudioCubes: A distributed cube tan-
gible interface based on interaction range for sound design,” in Proceedings of
TEI ’08, pp. 3–10, NY: ACM, 2008.
[210] A. Schmidt and K. Van Laerhoven, “How to build smart appliances,” IEEE
Personal Communications, Special Issue on Pervasive Computing, vol. 8,
no. 4, IEEE Press, pp. 66–71, 2001.
[211] D. Schön, Educating the Reflective Practitioner. San Francisco, London:
Jossey-Bass Publications, 1989.
[212] E. Schweikardt, N. Elumeze, M. Eisenberg, and M. D. Gross, “A tangible con-
struction kit for exploring graph theory,” in Proceedings of TEI ’09, pp. 373–
376, NY: ACM, 2009.
[213] J. A. Seitz, “The bodily basis of thought,” in Proceedings of 29th Annual
Symposium of the Jean Piaget Society, Mexico City, Mexico, 1999.
[214] O. Shaer, M. S. Horn, and M. S. Jacob, Tangible user interface laboratory:
Teaching tangible interaction design in practice, AIEDAM Special Issue on
Tangible Interaction for Design, 2009.
[215] O. Shaer and R. J. K. Jacob, “A specification paradigm for the design and
implementation of tangible user interfaces,” ACM Transactions on Computer-
Human Interaction, vol. 16, no. 4, pp. 251–261, 2009.
[216] O. Shaer, N. Leland, E. H. Calvillo-Gamez, and R. J. K. Jacob, “The TAC
paradigm: Specifying tangible user interfaces,” Personal and Ubiquitous Com-
puting, vol. 8, pp. 359–369.
[217] O. Shaer, B. Ziraknejad, K. Camarata, E. Yi-Luen Do, and M. Gross, A Com-
putationally Enhanced Play Board for Group Interaction, a Hot Spot paper,
Pervasive, 2004.
[218] E. Sharlin, Y. Itoh, B. Watson, Y. Kitamura, S. Sutphen, and L. Liu, “Cogni-
tive cubes: A tangible user interface for cognitive assessment,” in Proceedings
of CHI02, pp. 347–354, NY: ACM, 2002.
[219] E. Sharlin, B. Watson, Y. Kitamura, F. Kishino, and Y. Itoh, “On tangible
user interfaces, humans and spatiality,” Personal and Ubiquitous Computing,
vol. 8, no. 5, pp. 338–346, 2004.
[220] H. Sharp, Y. Rogers, and J. Preece, Interaction Design. John Wiley, 2007.
[221] J. Sheridan and G. Kortuem, “Affordance-based design of physical interfaces
for ubiquitous environments,” in Proceedings of UCS06, pp. 183–199, Springer,
2006.
[222] J. G. Sheridan and N. Bryan-Kinns, “Designing for performative tangible
interaction,” International Journal of Arts and Technology (IJART), vol. 1,
no. 3/4, pp. 288–308, 2008.
[223] L. Sherman, A. Druin, J. Montemayor, A. Farber, M. Platner, S. Simms,
J. Porteous, H. Alborzi, J. Best, J. Hammer, A. Kruskal, J. Matthews,
E. Rhodes, C. Cosans, and L. Lal, “StoryKit: Tools for children to build
room-sized interactive experiences,” in Extended Abstracts, Interactive Video
Poster, CHI2001, NY: ACM, 2001.
[224] A. Singer, D. Hindus, L. Stifelman, and S. White, “Tangible progress: Less is
more in somewire audio spaces,” in Proceedings of CHI ’99, pp. 104–111, NY:
ACM, 1999.
[225] L. Sitorus, S. S. Cao, and J. Buur, “Tangible user interfaces for configuration
practices,” in Proceedings of TEI07, pp. 223–230, NY: ACM, 2007.
[226] W. Spiessl, N. Villar, H. Gellersen, and A. Schmidt, “VoodooFlash: Author-
ing across digital and physical form,” in Proceedings of TEI ’07, NY: ACM,
2007.
[227] M. Stringer, E. F. Toye, J. Rode, and A. Blackwell, “Teaching rhetorical skills
with a tangible user interface,” in Proceedings of IDC 2004, pp. 11–18, NY:
ACM, 2004.
[228] R. Strong and B. Gaver, “Feather, scent and shaker: Supporting simple inti-
macy,” in Proceedings of CSCW 1996, pp. 29–30, NY: ACM, 1996.
[229] J. Sturm, T. Bekker, B. Groenendaal, R. Wesselink, and B. Eggen, “Key
issues for the successful design of an intelligent, interactive playground,” in
Proceedings of IDC 2008, pp. 258–265, NY: ACM, 2008.
[230] H. Suzuki and H. Kato, “AlgoBlock: A tangible programming language, a tool
for collaborative learning,” in Proceedings of the 4th European Logo conference
(Eurologo93), pp. 297–303, Athens, Greece, 1993.
[231] H. Suzuki and H. Kato, “Interaction-level support for collaborative learning:
AlgoBlock — An open programming language,” in Proceedings of CSCL, 1995.
[232] L. Terrenghi, D. Kirk, H. Richter, S. Krämer, O. Hilliges, and A. Butz, “Phys-
ical handles at the interactive surface: Exploring tangibility and its benefits,”
in Proceedings of AVI 2008, pp. 138–145, NY: ACM, 2008.
[233] K. N. Truong, G. R. Hayes, and G. D. Abowd, “Storyboarding: An empirical
determination of best practices and effective guidelines,” in Proceedings of
Designing interactive Systems DIS ’06, pp. 12–2, NY: ACM, 2006.
[234] S. Turkle, Evocative Objects: Things We Think With. Cambridge, MA: MIT
Press, 2007.
[235] B. Ullmer, “Tangible interfaces for manipulating aggregates of digital infor-
mation,” PhD Dissertation, Massachusetts Institute of Technology, 2002.
[236] B. Ullmer and H. Ishii, “Mediablocks: Tangible interfaces for online media,”
in Proceedings of CHI ’99, pp. 31–32, NY: ACM, 1999.
[237] B. Ullmer and H. Ishii, “Emerging frameworks for tangible user interfaces,”
IBM Systems Journal, vol. 39, no. 3–4, pp. 915–931, July 2000.
[238] B. Ullmer and H. Ishii, “Emerging frameworks for tangible user interfaces,”
in Human-Computer Interaction in the New Millenium, (J. M. Carroll, ed.),
pp. 579–601, Addison-Wesley, 2001.
[239] B. Ullmer, H. Ishii, and R. Jacob, “Token+constraint systems for tangi-
ble interaction with digital information,” ACM Transactions on Computer-
Human Interaction, vol. 12, no. 1, pp. 81–118, 2005.
[240] J. Underkoffler and H. Ishii, “Illuminating light: An optical design tool with
a luminous-tangible interface,” in Proceedings of CHI98, pp. 542–549, NY:
ACM, 1998.
[241] J. Underkoffler and H. Ishii, “Urp: A luminous-tangible workbench for urban
planning and design,” in Proceedings of CHI ’99, pp. 386–393, NY: ACM,
1999.
[242] E. van den Hoven and B. Eggen, “Tangible computing in everyday life: Extend-
ing current frameworks for tangible user interfaces with personal objects,” in
Proceedings of EUSAI 2004, pp. 230–242, Springer, LNCS 3295, 2004.
[243] E. van den Hoven, J. Frens, D. Aliakseyeu, J. B. Martens, K. Overbeeke, and
P. Peters, “Design Research and Tangible Interaction,” Proceedings of TEI’07,
pp. 109–115, 2007.
[244] E. van Loenen, T. Bergman, V. Buil, K. van Gelder, M. Groten, G. Hollemans,
J. Hoonhout, T. Lashina, and S. van de Wijdeven, “EnterTaible: A solution for
social gaming experiences,” in Tangible Play: Research and Design for Tan-
gible and Tabletop Games, Workshop at the 2007 Intelligent User Interfaces
Conference, pp. 16–19, Honolulu, Hawaii, USA, 2007.
[245] C. L. Vaughan, “Understanding movement,” in Proceedings of CHI 97, NY:
ACM, 1997.
[246] R. Vertegaal and I. Poupyrev, “Introduction to the special issue on Organic
user interfaces,” Communications of the ACM, vol. 51, no. 6, pp. 26–30, 2008.

[247] N. Villar and H. Gellersen, “A malleable control structure for softwired user
interfaces,” in Proceedings of TEI ’07, pp. 49–56, NY: ACM, 2007.
[248] M. Virnes, E. Sutinen, and E. Kärnä-Lin, “How children’s individual needs
challenge the design of educational robotics,” in Proceedings of IDC 2008,
pp. 274–281, NY: ACM, 2008.
[249] R. Want, K. Fishkin, A. Gujar, and B. Harrison, “Bridging physical and virtual
worlds with electronic tags,” in Proceedings of CHI99, pp. 370–377, NY: ACM,
1999.
[250] G. Weinberg and S. Gan, “The squeezables: Toward an expressive and interde-
pendent multi-player musical instrument,” Computer Music Journal, vol. 25,
no. 2, pp. 37–45, 2001.
[251] M. Weiser, “Some computer science issues in ubiquitous computing,” Com-
munications of the ACM, vol. 36, no. 7, pp. 74–84, 1993.
[252] M. P. Weller, E. Y. Do, and M. D. Gross, “Posey: Instrumenting a poseable
hub and strut construction toy,” in Proceedings of TEI ’08, pp. 39–46, NY:
ACM, 2008.
[253] P. Wellner, W. Mackay, and R. Gold, “Computer-augmented environments.
Back to the real world,” Communications of the ACM, vol. 36, no. 7, pp. 24–26,
1993.
[254] S. A. Wensveen, J. P. Djajadiningrat, and C. J. Overbeeke, “Interaction frog-
ger: A design framework to couple action and function through feedback and
feedforward,” in Proceedings of DIS ’04, pp. 177–184, NY: ACM, 2004.
[255] J. Werner, R. Wettach, and E. Hornecker, “United-pulse: Feeling your partner’s
pulse,” in Proceedings of MobileHCI 2008, pp. 535–553, NY: ACM, 2008.
[256] T. Westeyn, J. Kientz, T. Starner, and G. Abowd, “Designing toys with
automatic play characterization for supporting the assessment of a child’s
development,” in Proceedings of IDC08, pp. 89–92, NY: ACM, 2008.
[257] F. R. Wilson, The Hand — How Its Use Shapes the Brain, Language, and
Human Culture. Vintagebooks/Random House, 1998.
[258] P. Wyeth and H. C. Purchase, “Tangible programming elements for young
children,” in Proceedings of CHI ’02 Extended Abstracts, pp. 774–775, NY:
ACM, 2002.
[259] R. Young, D. Pezzutti, S. Pill, and R. Sharp, “The development of tools to
assist the design of motion in system operated products,” in Proceedings of
Design and Semantics of Form and Movement — DesForM 2005, pp. 13–22,
Koninklijke Philips Electronics N.V, Eindhoven, 2005.
[260] R. Young, D. Pezzutti, S. Pill, and R. Sharp, “The language of motion in
industrial design,” in Proceedings of Design and Semantics of Form and Move-
ment — DesForM 2005, pp. 6–12, Koninklijke Philips Electronics N.V., Eind-
hoven, 2005.
[261] J. Zhang, “The nature of external representations in problem solving,” Cog-
nitive Science, vol. 21, pp. 179–217, 1997.
[262] J. Zhang and D. A. Norman, “Representations in distributed cognitive tasks,”
Cognitive Science, vol. 18, no. 1, pp. 87–122, 1994.
[263] Z. Zhou, A. D. Cheok, T. Chan, J. H. Pan, and Y. Li, “Interactive enter-
tainment systems using tangible cubes,” in Proceedings of IE2004, Australian
Workshop on Interactive Entertainment, 2004.

[264] J. Zigelbaum, M. Horn, O. Shaer, and R. Jacob, “The tangible video editor:
Collaborative video editing with active tokens,” in Proceedings of TEI ’07,
pp. 43–46, NY: ACM, 2007.
[265] J. Zigelbaum, A. Kumpf, A. Vazquez, and H. Ishii, Slurp: Tangibility, spatial-
ity, and an eyedropper. CHI Extended Abstracts pp. 2565–2574, NY: ACM,
2008.
[266] O. Zuckerman, S. Arida, and M. Resnick, “Extending tangible interfaces for
education: Digital montessori-inspired manipulatives,” in Proceedings of CHI
’05, pp. 859–868, NY: ACM, 2005.
[267] G. Zufferey, P. Jermann, A. Lucchi, and P. Dillenbourg, “TinkerSheets: Using
paper forms to control and visualize tangible simulations,” in Proceedings of
TEI09, pp. 377–384, NY: ACM, 2009.
