Tool use as gesture: new challenges for training and rehabilitation

Christopher Baber, Manish Parekh

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Abstract

There are many ways to capture human gestures. In this paper, we consider an extension of the growing trend of using sensors to capture movements and interpret these as gestures. However, rather than placing sensors on people, the focus is on attaching sensors (i.e., strain gauges and accelerometers) to the tools that people use. By instrumenting a set of handles, which can be fitted with a variety of effectors (e.g., knives, forks, spoons, screwdrivers, spanners, and saws), it is possible to capture the variation in grip force applied to the handle as the tool is used, together with the movements made using the handle. These data can be sent wirelessly (using Zigbee) to a computer, where distinct patterns of movement can be classified. Different approaches to the classification of activity are considered. This provides a way of combining the use of real tools in physical space with the representation of actions on a computer. The approach could be used to capture actions during manual tasks, say in maintenance work, or to support the development of movements, say in rehabilitation.
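
The paper does not publish its classification code, so the following is only an illustrative sketch of the kind of pipeline the abstract describes: windowing synchronised grip-force and accelerometer streams from an instrumented handle, summarising each window with simple statistics, and classifying the windows. The classifier choice (k-nearest neighbours), window length, feature set, and activity labels here are assumptions, not the authors' method.

# Illustrative sketch only; sensor layout, window size, and labels are hypothetical.
import numpy as np
from sklearn.neighbors import KNeighborsClassifier

def window_features(grip, accel, window=50):
    """Split synchronised grip-force (1-D) and accelerometer (N x 3) streams
    into fixed-length windows and summarise each with mean/std features."""
    n = (min(len(grip), len(accel)) // window) * window
    feats = []
    for start in range(0, n, window):
        g = grip[start:start + window]         # grip-force samples in this window
        a = accel[start:start + window]        # (window, 3) acceleration samples
        feats.append(np.hstack([g.mean(), g.std(),
                                a.mean(axis=0), a.std(axis=0)]))
    return np.array(feats)

# Hypothetical training streams recorded while using two different effectors.
rng = np.random.default_rng(0)
grip_knife = rng.normal(5.0, 0.5, 500); accel_knife = rng.normal(0, 1, (500, 3))
grip_screw = rng.normal(8.0, 0.8, 500); accel_screw = rng.normal(0, 2, (500, 3))

X = np.vstack([window_features(grip_knife, accel_knife),
               window_features(grip_screw, accel_screw)])
y = ["cutting"] * 10 + ["screwdriving"] * 10   # one label per 50-sample window

clf = KNeighborsClassifier(n_neighbors=3).fit(X, y)
print(clf.predict(window_features(grip_knife[:100], accel_knife[:100])))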
Original language: English
Title of host publication: Proceedings of the 24th British Computer Society Conference on Human Computer Interaction
Publisher: British Computer Society
Publication status: Published - 2010

