Did you know that your heart is one-of-a-kind? Not metaphorically speaking. Literally, your heartbeat (your cardiac signature) is as unique to you as your fingerprint. While cardiac signatures are most commonly used to detect fatal heart conditions (the Apple Watch’s EKG monitor is already saving lives), your heartbeat will soon be used to identify who you are.
And just like the research in all other areas of biometrics (fingerprints, irises, palm prints, faces, gait, DNA, typing rhythm, and voice), cardiac signatures will fall into the identification purgatory of security and surveillance:
A new device, developed for the Pentagon after US Special Forces requested it, can identify people without seeing their face: instead it detects their unique cardiac signature with an infrared laser. While it works at 200 meters (219 yards), longer distances could be possible with a better laser.
David Hambling, MIT Technology Review
The new device, called Jetson, uses a technique known as laser vibrometry to detect the surface movement caused by the heartbeat. This works through typical clothing like a shirt and a jacket (though not thicker clothing such as a winter coat).
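Conceptually, identifying someone by cardiac signature works like any other biometric match: compare a fresh measurement against an enrolled template and accept if they're similar enough. Here's a minimal Python sketch assuming idealized, equal-length, pre-aligned waveforms and a simple cosine-similarity comparison; the actual Jetson pipeline is not public, so every function name and threshold here is an assumption.

```python
import math

def normalize(signal):
    """Return a zero-mean, unit-norm version of a waveform."""
    mean = sum(signal) / len(signal)
    centered = [s - mean for s in signal]
    norm = math.sqrt(sum(c * c for c in centered))
    return [c / norm for c in centered]

def similarity(a, b):
    """Cosine similarity between two equal-length waveforms (1.0 = identical shape)."""
    na, nb = normalize(a), normalize(b)
    return sum(x * y for x, y in zip(na, nb))

def is_match(enrolled, measured, threshold=0.9):
    """Accept the identity claim if the waveforms are similar enough."""
    return similarity(enrolled, measured) >= threshold

# Toy example: the same pulse shape matches itself; a different shape does not.
beat = [math.sin(2 * math.pi * t / 50) ** 3 for t in range(50)]
other = [math.cos(2 * math.pi * t / 50) for t in range(50)]

print(is_match(beat, beat))    # True
print(is_match(beat, other))   # False
```

A real system would have to handle heart-rate variability, sensor noise, and alignment (for instance, by segmenting individual beats and normalizing their duration) before any comparison like this could work.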
Cardiac signatures are already used for security identification. The Canadian company Nymi has developed a wrist-worn pulse sensor as an alternative to fingerprint identification. The technology has been trialed by the Halifax building society in the UK.
Before long (if not already), cardiac signatures will be added to the FBI’s Next Generation Identification, the expansion of their fingerprint identification system to include more biometrics such as palm prints, irises, and facial ID.
How exactly the FBI will collect heartbeats, I’m not sure. However, I’d imagine that wearable device companies will play some role here, since some wearables are already “fingerprinting” our heartbeats with EKG functionality (Withings and the Apple Watch). While I doubt either of those two companies would hand that data over to the government, others likely would.
Biometrics have long piqued our security interests. Honestly, I’m still waiting for an iris scanner so I can build my high-tech, secure bunker like a James Bond villain. But it’s biometrics for the purposes of surveillance, both governmentally and corporately, where the money is being dumped.
Corporate Biometric Uses
Biometrics will take tracking to another extreme. In the movie Ocean’s Thirteen, The Bank casino deploys a system called the Greco Player Tracker, which analyzes a person’s heart rate, body temperature, facial reactions, and pupil dilation to determine whether a win is legitimate – basically, to catch cheaters in the act.
In 2007, that seemed futuristic and impossible. Twelve years later, all of those technologies exist, from cardiac signature detection to the AI models that compile the data.
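To make the idea of "AI models that compile the data" concrete, here's a toy Python sketch that fuses several biometric readings into a single suspicion score. The signal names, weights, and threshold are all invented for illustration; they aren't drawn from any real casino system (or from the movie).

```python
def suspicion_score(readings, weights=None):
    """Weighted average of per-signal anomaly readings, each in [0, 1].

    Higher scores mean the combined biometric signals look more unusual.
    The default weights are purely illustrative.
    """
    weights = weights or {
        "heart_rate": 0.4,
        "body_temp": 0.1,
        "facial_reaction": 0.3,
        "pupil_dilation": 0.2,
    }
    total = sum(weights.values())
    return sum(readings.get(k, 0.0) * w for k, w in weights.items()) / total

calm_player = {"heart_rate": 0.1, "body_temp": 0.1,
               "facial_reaction": 0.2, "pupil_dilation": 0.1}
nervous_player = {"heart_rate": 0.9, "body_temp": 0.6,
                  "facial_reaction": 0.8, "pupil_dilation": 0.7}

print(round(suspicion_score(calm_player), 2))     # 0.13
print(round(suspicion_score(nervous_player), 2))  # 0.8
```

A real deployment would replace this hand-tuned weighting with a trained model, but the shape of the problem – many noisy biometric streams reduced to one decision – is the same.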
This gives you an idea of how a casino would use this technology for security purposes. But the same technology could be used to track and identify emotions – feeding into the Future of Emotion-Based Advertising:
The next 20 years of advertising will be defined by emotional intelligence and the ability of advertisers to understand the feelings and emotions that customers have toward their interests at any given time.
Emotion-Based Ads
The New York Times rolled out a tool earlier this year called Project Feels that lets advertisers target ads based on the emotional response a piece of content is predicted to evoke. By understanding the mood someone will be in after reading an article, The New York Times can more accurately queue up subsequent articles to keep their attention.
Facebook and Google could employ biometric tracking to improve their advertising offerings, giving better feedback on the effectiveness of an advertiser’s messages. Physical retail stores could track a person’s movements and reactions – thus predicting preferences. These scenarios are a bit further out, but we aren’t taking steps today to prevent biometric tracking.
Ethically Deprived
The main issue I see is that our ethics and norms have yet to catch up with what surveillance makes possible, let alone how biometrics play into it. And our reactions aren’t consistent across technologies:
- With robotic automation, there’s a visceral uproar over the loss of jobs
- With invasive advertising tracking, we involuntarily get creeped out
- With predatory robo-calls, we get very, very annoyed
With surveillance and the talk of losing our privacy, we say, “I have nothing to hide” and move on with our day. And this just makes no sense to me.
When you’re at a restaurant talking with whomever you’re dining with, you’d be pretty irked if someone was eavesdropping on your conversation. You might talk more quietly or even tell them to piss off. Yet for some reason, when this happens digitally, our social values go right out the window. All of a sudden, it’s alright for eavesdroppers to tune into our conversation. Suddenly, we have “nothing to hide”.
First of all, if you’re in a profession where you need to protect information – lawyers, judges, doctors, politicians, government officials, police officers, members of the armed forces or security services, as well as journalists, activists, trade unionists, or anyone associated with people in these categories – then government surveillance, and its ability to access that information, should frighten you.
Additionally, biometric surveillance will eliminate any and all hope of walking around anonymously:
Hong Kong protesters have had to take extraordinary precautions to avoid leaving digital trails that would prove their attendance. The shifting framework for human rights in the age of surveillance is presenting us with some thorny questions: we do not allow the police to enter and search our property without a warrant, so why would we let them take our face, voice, fingerprint or DNA data when this can be far more revealing?
Azeem Azhar, Exponential View
Imagine if you couldn’t go to a protest without being fired from your job. Or if you couldn’t speak your mind on Twitter without being put on a no-fly, government blacklist.
Everyone always thinks “That’ll never happen to me”… right up until it happens to them.