As domestic robots and smart appliances become
increasingly common, they require a simple, universal
interface to control their motion. Such an interface must
support simple selection of a connected device, highlight
its capabilities, and allow intuitive manipulation. We
propose "exTouch", an embodied spatially-aware approach
to touch and control devices through an augmented reality
mediated mobile interface. The "exTouch" system extends
the user's touchscreen interactions into the real world by
enabling spatial control of the actuated object. When
users touch a device shown in live video on the screen, they
can change its position and orientation through multi-touch
gestures or by physically moving the screen in relation to
the controlled object. We demonstrate that the system can
be used for applications such as an omnidirectional vehicle,
a drone, and moving furniture for a reconfigurable room.
7th International Conference on Tangible, Embedded and Embodied Interaction