feelabuzz

Touch can convey emotions on a very direct level. feelabuzz is a system implementing a remote touch connection using standard mobile phone hardware. Accelerometer data is mapped to vibration strength on two smartphones connected via the Internet. This is done using direct mapping techniques, without any abstraction of the acceleration signal. feelabuzz can therefore be used for implicit context communication, i.e. the background monitoring of the users' natural movements or their environments, as well as for direct communication, i.e. voluntary and symbolic signalling through this new channel. Two people can thereby feel each other's movements and tune into the activities of the other at any time, using nothing more than their mobile phones.

Touch is arguably the most immediate, the most affective, and — when it comes to media — one of the most neglected modalities used for human communication. It can convey emotions and feelings on a direct and primordial level. We developed feelabuzz, a system that directly transforms one user's motion into the vibrotactile output of another, typically remote, device. Unlike previous work on tactile communication, we do so using only mobile phones, without any additional gear. This is possible because mobile phones these days almost universally have accelerometers as well as vibration motors, which can be used for movement sensing and vibrotactile actuation respectively. Mobile phones have the key advantages of not only being widespread to the point of omnipresence but also of usually being worn on the user's body. Furthermore, not having to buy and, more importantly, not having to carry around an extra piece of hardware is a property whose importance cannot be overstated. Using phones also makes it easy to integrate the new haptic channel with existing auditory, visual and possibly textual channels, thereby extending the phone's capabilities as a communication device. As we have our phones with us or nearby most of the time, they are well suited not only for direct communication but also for implicit context communication: being able to assess a contact's current context — say, walking or riding the bus — could help determine a good time to call.

The choice of vibration as an output modality stems not merely from its prevalence on the chosen platform and its availability and unobtrusiveness when carrying the phone in a pocket, but also from the fact that movement such as impacts or strokes naturally translates into tangible vibration in the real world: footsteps on the floor, someone stirring on a sofa, or the feedback to one's own hand when stroking something. On each phone we gather the accelerometer data, which is then preprocessed, transmitted over a wireless network connection to the paired device, and mapped to the vibrotactile actuator of the other phone. On the device itself, our algorithms preprocess the sensor data, connect the devices over the network and eventually excite the vibration motor.
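The preprocessing and mapping step might be sketched as follows. This is a minimal illustration only, not the published feelabuzz algorithm: the smoothing factor `ALPHA`, the scaling constant `MAX_DELTA` and the baseline-tracking scheme are all assumptions introduced here to show how a direct, abstraction-free mapping from acceleration to vibration strength could look.

```python
import math

GRAVITY = 9.81   # m/s^2, assumed resting baseline for the magnitude
ALPHA = 0.1      # smoothing factor for the baseline (assumption)
MAX_DELTA = 5.0  # acceleration deviation mapped to full strength (assumption)

def make_mapper(alpha=ALPHA, max_delta=MAX_DELTA):
    """Return a function mapping raw (x, y, z) accelerometer samples
    to a vibration strength in [0, 1] for the remote actuator."""
    state = {"avg": GRAVITY}

    def map_sample(x, y, z):
        # Orientation-independent magnitude of the acceleration vector.
        mag = math.sqrt(x * x + y * y + z * z)
        # A slow moving average tracks gravity and posture drift.
        state["avg"] = (1 - alpha) * state["avg"] + alpha * mag
        # Only the deviation from the baseline is felt remotely.
        delta = abs(mag - state["avg"])
        return min(delta / max_delta, 1.0)

    return map_sample

mapper = make_mapper()
print(mapper(0.0, 0.0, 9.81))  # phone at rest: (near) zero strength
print(mapper(3.0, 0.0, 12.0))  # a jolt: noticeably higher strength
```

In a real deployment the returned strength would be sent over the network connection and used to drive the remote phone's vibration motor; subtracting a running baseline rather than a fixed constant keeps the mapping usable regardless of how the phone is oriented or carried.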

REFERENCES
  • Feelabuzz — Direct Tactile Communication with Mobile Phones, C. Leichsenring, R. Tünnermann, and T. Hermann, International Journal of Mobile Human Computer Interaction, 2011.
  • Direct Tactile Coupling of Mobile Phones with the feelabuzz System, R. Tünnermann, C. Leichsenring, and T. Hermann, in Mobile Social Signal Processing, LNCS, 2011.

This is joint work with C. Leichsenring and T. Hermann.