TY - GEN
T1 - Towards object based manipulation in remote guidance
AU - Ranatunga, Dulitha
AU - Feng, David
AU - Adcock, Matt
AU - Thomas, Bruce
PY - 2013
Y1 - 2013
N2 - This paper presents a method for using object-based manipulation and spatial augmented reality for remote guidance. Previous remote guidance methods have typically not made use of semantic information about the physical properties of the environment, instead requiring the helper and worker to provide context. Our new prototype system introduces a level of abstraction for the remote expert, allowing them to directly specify the object movements required of a local worker. We use 3D tracking to create a hidden virtual reality scene, mirroring the real world, with which the remote expert interacts while viewing a camera feed of the physical workspace. The intended manipulations are then rendered to the local worker using Spatial Augmented Reality (SAR). We report on the implementation of a functional prototype that demonstrates an instance of this approach. We anticipate that techniques such as the one we present will enable more efficient collaborative remote guidance in a range of physical tasks.
AB - This paper presents a method for using object-based manipulation and spatial augmented reality for remote guidance. Previous remote guidance methods have typically not made use of semantic information about the physical properties of the environment, instead requiring the helper and worker to provide context. Our new prototype system introduces a level of abstraction for the remote expert, allowing them to directly specify the object movements required of a local worker. We use 3D tracking to create a hidden virtual reality scene, mirroring the real world, with which the remote expert interacts while viewing a camera feed of the physical workspace. The intended manipulations are then rendered to the local worker using Spatial Augmented Reality (SAR). We report on the implementation of a functional prototype that demonstrates an instance of this approach. We anticipate that techniques such as the one we present will enable more efficient collaborative remote guidance in a range of physical tasks.
KW - 3D CHI
KW - Multi-touch interaction
KW - Object Manipulation
KW - Remote Guidance
KW - Spatially Augmented Reality
UR - http://www.scopus.com/inward/record.url?scp=84893266241&partnerID=8YFLogxK
U2 - 10.1109/ISMAR.2013.6671839
DO - 10.1109/ISMAR.2013.6671839
M3 - Conference contribution
SN - 9781479928699
T3 - 2013 IEEE International Symposium on Mixed and Augmented Reality, ISMAR 2013
BT - 2013 IEEE International Symposium on Mixed and Augmented Reality, ISMAR 2013
T2 - 12th IEEE and ACM International Symposium on Mixed and Augmented Reality, ISMAR 2013
Y2 - 1 October 2013 through 4 October 2013
ER -