When Touch Goes Digital

What if I told you that a simple touch on the shoulder could change how much you tip, how quickly you recover from an illness, or even how well you hit a baseball?

There’s a whole field of science devoted to social touch. It sits at the intersection of psychology, neuroscience, and engineering, and one of its early studies, from the 1980s, found that a waitress could significantly increase the tip you leave just by lightly touching your shoulder. Similarly, a different study showed that a doctor can influence your recovery rate just by touching your forearm. In baseball, you’re more likely to hit the ball if your coach gives you a tap on the back. Waiters, doctors, and athletes all know this to be true. The effect is called the Midas Touch, and it’s remarkably powerful. Even in business, a deal is more likely to close if you open negotiations with a good handshake.

But with COVID and everything moving online, this physical social touch, the power of the Midas Touch, is lost. So in our project, the Touchless project, we set out to create tactile technologies that could emulate social touch in a virtual, online environment. The core ingredients of the project were hardware engineering, user experience and interaction design, neuroscience, and AI development. The main idea was to build interfaces that can deliver tactile sensations, and to teach machine learning algorithms how to use them and when to trigger them. And we did that, quite impressively actually. But for me, the most interesting and long-lasting contribution wasn’t the technology prototypes themselves, because those will be outdated in four to five years anyway.

The main contribution was the realisation that we humans have a right to sensory autonomy, the right to control our own sensory inputs, and that tactile technologies can invade that autonomy unless we do something about it.

Let me give you an example. If I don’t like a horror film, I can close my eyes or look away. If I don’t like your music, I can plug my ears or put on noise-cancelling headphones. If I don’t like the smell of your food, I can hold my nose or go to a different room. But I can’t do that for touch: there’s no off-switch for my skin. It’s always on. So digital touch, that is, tactile devices like smartwatches, wristbands, vests, and of course smartphones that vibrate in my hand or in my pocket, can challenge our sensory autonomy by sending vibrating stimuli, perceptible or subliminal, that influence my behaviour. They could make me like something, agree with someone, or buy an item I wasn’t sure about before. This is an interesting and not so well known pathway by which AI and emerging technologies might influence human behaviour. And the paradox is that while this project started as an attempt to create and build exactly such capabilities, and we did, we also ended up discovering the need to block them.

What can we do about this?

First of all, we should ensure that there’s an easy off-switch and that you can opt out if you want to. And second, there needs to be transparency about who is sending you tactile stimulation, when, and why. Basically, we need to approach digital touch interactions with the same level of diligence and caution that we apply to privacy and data security.

The Touchless project has received funding from the EU Horizon 2020 research and innovation programme under grant agreement No 101017746.

 