TY - GEN
T1 - MovemenTable
T2 - 15th IFIP TC 13 International Conference on Human–Computer Interaction, INTERACT 2015
AU - Takashima, Kazuki
AU - Asari, Yusuke
AU - Yokoyama, Hitomi
AU - Sharlin, Ehud
AU - Kitamura, Yoshifumi
N1 - Funding Information:
This work was supported in part by JSPS KAKENHI Grant Number 26730101, by an NSERC Discovery Grant, and by the Cooperative Research Project of the Research Institute of Electrical Communication, Tohoku University.
Publisher Copyright:
© IFIP International Federation for Information Processing 2015.
PY - 2015
Y1 - 2015
AB - MovemenTable (MT) is an exploration of moving interactive tabletops which can physically move, gather together or depart according to people’s dynamically varying interaction tasks and collaborative needs. We present the design and implementation of a set of MT prototypes and discuss a technique that allows MT to augment its visual content in order to provide motion cues to users. We outline a set of interaction scenarios using single and multiple MTs in public, social and collaborative settings and discuss four user studies based on these scenarios, assessing how people perceive MT movements, how these movements affect their interaction, and how synchronized movements of multiple MTs impact people’s collaborative interactions. Our findings confirm that MT’s augmentation of its visual content was helpful in providing motion cues to users, and that MT’s movement had significant effects on people’s spatial behaviors during interaction, effects that peaked in collaborative scenarios with multiple MTs.
KW - CSCW
KW - Human-robot interaction
KW - Social interfaces
UR - http://www.scopus.com/inward/record.url?scp=84946065420&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=84946065420&partnerID=8YFLogxK
U2 - 10.1007/978-3-319-22698-9_19
DO - 10.1007/978-3-319-22698-9_19
M3 - Conference contribution
AN - SCOPUS:84946065420
SN - 9783319226972
T3 - Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
SP - 296
EP - 314
BT - Human-Computer Interaction – INTERACT 2015 – 15th IFIP TC 13 International Conference, Proceedings
A2 - Palanque, Philippe
A2 - Gross, Tom
A2 - Fetter, Mirko
A2 - Barbosa, Simone
A2 - Winckler, Marco
A2 - Abascal, Julio
PB - Springer Verlag
Y2 - 14 September 2015 through 18 September 2015
ER -