Companies are learning to read your mind, figuring out if you want to become pregnant, commit a crime or turn the channel.
You’re watching your favorite show, and a commercial comes on. How do you really feel about that product, that show?
Now, facial recognition technology can register the emotion on your face and transmit it to the company, possibly revealing your true thoughts—what you used to think of as personal information.
The BBC is looking at using this new face-reading technology to see your “conscious or unconscious” reactions, reported the Daily Mail, after a pilot test monitored the faces of 5,000 people.
“The goal was to see if viewers reacted more positively or negatively if the content was clearly labelled as coming from an advertiser,” the article said. The system could register six emotions—sadness, puzzlement, happiness, fear, rejection and surprise.
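Systems like this typically run a classifier over each video frame and report the most probable emotion. As a rough, illustrative sketch (the six labels come from the article; the scores and function are invented for demonstration):

```python
# Illustrative sketch only: real face-reading systems use trained models.
# The six labels come from the Daily Mail article; scores are made up.

EMOTIONS = ["sadness", "puzzlement", "happiness", "fear", "rejection", "surprise"]

def dominant_emotion(probabilities):
    """Pick the label with the highest classifier score.

    probabilities: one score per label in EMOTIONS, in the same order.
    """
    best = max(range(len(EMOTIONS)), key=lambda i: probabilities[i])
    return EMOTIONS[best]

# Example: a frame where the classifier is most confident in "happiness"
frame_scores = [0.05, 0.10, 0.60, 0.05, 0.05, 0.15]
```

Aggregating those per-frame labels over thousands of viewers is what produces statistics like the ones below.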
The results?
“The study showed that clearly labelled content boosted ‘explicit’ feelings of positivity towards the brand by 77 per cent,” the Daily Mail reported. “But it also revealed that the effect was subconscious as well, with exposure leading to a 14 per cent boost in ‘subconscious positivity’ towards the brand.”
Your thoughts, your feelings, your subconscious, on their way to becoming data in someone else’s bank, and up for auction.
“From electronic commerce to social media, you have become the product,” said Dave Lewis, founder of Liquidmatrix Security Digest.
And you are not in control.
“It has become very difficult to manage your personal data, especially as organizations seek to assemble and correlate data on you as an individual,” said James Arlen of Leviathan Security.
Hot or not?
From free lunches to expensive ski trips, pharmaceutical sales reps used to spend a lot of money trying to influence doctors.
Now, they can use a new service that will tell them if the doc is even worth approaching, reported Bloomberg.
Zephyr promises to tell you the “value of a doctor’s prescription pad,” by creating a “digital dossier” of that physician, scouring data to see what prescriptions the doctor writes, and aggregating other information with it to create a ranking, according to Bloomberg.
“Gathering these strands, Zephyr generates profiles that score each doctor’s influence and ability to drive sales on a scale of 1 to 10,” the article said.
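Zephyr does not publish its model, but the general shape of this kind of scoring is simple: normalize several signals about a physician and combine them into a 1-to-10 scale. A minimal sketch, in which every signal name, cap and weight is an illustrative assumption:

```python
# Hypothetical sketch of a 1-10 "influence" score built from aggregated
# physician data. Signal names, caps and weights are assumptions for
# illustration, not Zephyr's actual model.

def influence_score(prescription_volume, referral_count, publication_count):
    """Combine normalized signals into a score on a 1-10 scale."""
    signals = [
        (prescription_volume / 5000, 0.6),   # prescribing activity, capped at 5,000/yr
        (referral_count / 200, 0.3),         # network influence, capped at 200
        (publication_count / 50, 0.1),       # visibility in the field, capped at 50
    ]
    # Cap each normalized signal at 1.0, then take the weighted sum (0..1)
    weighted = sum(min(value, 1.0) * weight for value, weight in signals)
    return round(1 + 9 * weighted, 1)  # map 0..1 onto the 1..10 scale
```

The point is not the particular formula: any dataset rich enough to feed signals like these is rich enough to rank people.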
“This sort of specialized data is valuable,” Lewis told Archer News. “The companies stand to make a significant amount of money as the doctors are now their product.”
Your dossier
It’s not just for doctors. Marketing companies already have a digital dossier on you, and they are finding new ways to add to it.
“This happens to the average person who uses basically any service online,” Lewis explained. “The lynchpin in many cases is the mobile phone number of an individual.”
“If a company can get access to this sort of data, usually under the guise of ‘better security,’ this ties data from other sources together. Your purchasing habits, likes, dislikes and so forth, are laid bare,” he said.
Behavior tracking
You may already be accustomed to some kinds of behavior tracking.
“Humans are creatures of habit and our preferences are welcomed when we feel it is to our benefit,” said David Foose with Emerson Process Management.
“We enjoy having that waitress at your favorite restaurant know your order, and Pandora or iTunes know your favorite music so it can suggest or play the next track,” he told Archer News. “It becomes intrusive when you realize this information can be used by strangers to manipulate your decisions.”
Knowing your next move
Employee wellness firms are profiling your personality type, where you shop, whether you vote, and whether or not you are taking your medications, reported the Wall Street Journal.
The theory: companies can predict your health choices and their consequences, and help you stay healthier.
“I bet I could better predict your risk of a heart attack by where you shop and where you eat than by your genome,” Harry Greenspan of Deloitte LLP’s Center for Health Solutions told WSJ.
He said that workers who shop at bike stores are likely to be healthier than workers who buy video games, and people with lower credit scores may be more likely to be no-shows for follow-up appointments and refilling prescriptions, according to the article.
Another firm told the WSJ that people who vote in mid-term elections are usually healthier than the non-voters.
Are you expecting to be expecting?
And then there is the pregnancy issue.
A wellness company called Castlight started a new service that checks whether women have stopped buying their birth control meds or have searched for information on fertility in the Castlight health app, reported WSJ.
Castlight’s chief research and development officer said they also check how old the woman is—and how old her kids are, if she has them—to see if she might be thinking of getting pregnant, according to the article.
“She would then start receiving emails or in-app messages with tips for choosing an obstetrician or other prenatal care. If the algorithm guessed wrong, she could opt out of receiving similar messages,” WSJ reported.
Helpful info?
For some, this kind of profiling and personal forecasting may be helpful.
You might change your health habits and start a new exercise routine, ultimately extending your life.
“Being predictable and having ‘relevant’ data presented to you is often a good thing,” said Foose. “We already get alerts on our credit cards when we make unusual purchases for items or places.”
But that data about you—that digital personality—is not yours, some cybersecurity experts say.
“You shed digital footprints every day,” said Arlen back in 2009, in a paper he co-wrote with computer law attorney Tiffany Rad.
For example, surveillance cameras across cities often document and store your image and travel patterns.
“When a lot of the video feeds are from private entities, who owns the data that is captured on camera? Know that you do not,” they wrote.
Reading your mind
At the time, the Department of Homeland Security was testing a system called FAST, or Future Attribute Screening Technologies.
The cameras measured your heart rate, body temp, breathing, and pheromone responses, according to Arlen, to see if you were nervous because you were thinking about breaking the law.
“Your thoughts are accessible and soon will be remotely readable,” Arlen and Rad said.
The paper described a security tool in use by casinos. Employees wear wrist bands that monitor their heart rates. If the rate shoots up suddenly, the wristband sends a signal to management.
“The premise is that if a casino employee’s heart starts suddenly beating rapidly, they are likely under stress,” the paper said. “This could be due to some emergency such as a robbery, or possibly because the employee is planning a theft.”
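The mechanism the paper describes is simple anomaly detection: compare each reading against a recent baseline and alert on a sudden jump. A minimal sketch, where the window size and spike threshold are illustrative assumptions, not the casino system's actual settings:

```python
# Minimal sketch of the alerting the casino wristbands are said to do:
# flag a heart-rate reading that jumps well above a recent baseline.
# Window size and spike ratio are illustrative assumptions.

from collections import deque

class HeartRateMonitor:
    def __init__(self, window=30, spike_ratio=1.5):
        self.readings = deque(maxlen=window)  # recent beats-per-minute samples
        self.spike_ratio = spike_ratio        # alert if rate jumps 50% above baseline

    def record(self, bpm):
        """Store a reading; return True if it should alert management."""
        alert = False
        if len(self.readings) == self.readings.maxlen:
            baseline = sum(self.readings) / len(self.readings)
            alert = bpm > baseline * self.spike_ratio
        self.readings.append(bpm)
        return alert
```

Note what the sketch makes plain: the wristband cannot tell a robbery from a planned theft from a sprint up the stairs. It only sees the spike.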
“What you commonly think of as your thoughts and your mind may no longer be entirely your own,” they said.
Reading your eyes
Law enforcement officers in Utah, Texas, Florida and elsewhere are taking eye-detection tests (as opposed to lie-detection tests) to get their jobs, according to the start-up Converus, as reported by Motherboard.
The company said infrared cameras scan your eye movement as you read statements like, “I am innocent of engaging in terrorist activity,” Motherboard reported.
Under stress, your eyes change, with things like slight dilation giving you away, the company said, claiming up to 85% accuracy.
“The answers are loaded into a secure server, and uploaded to a cloud, which generates a ‘credibility score’ in five minutes,” Motherboard said.
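Converus does not publish its algorithm, but the idea of reducing per-statement eye measurements to one number can be sketched. Everything here is an assumption for illustration: the feature names, weights, and squashing formula are invented, not the company's method.

```python
# Illustrative sketch only: turn per-statement eye measurements into a
# single 0-100 "credibility score". Features, weights, and the formula
# are assumptions, not Converus's actual algorithm.

import math

def credibility_score(measurements):
    """measurements: list of (pupil_dilation_change, reading_time_ratio),
    one tuple per statement the test subject read."""
    stress = 0.0
    for dilation, time_ratio in measurements:
        # More dilation and slower re-reading are treated as stress signs
        stress += 0.7 * dilation + 0.3 * max(time_ratio - 1.0, 0.0)
    avg = stress / len(measurements)
    # Squash average stress into 0..1, then invert: more stress, lower score
    return round(100 * (1 - 1 / (1 + math.exp(-4 * (avg - 0.5)))))
```

Whatever the real formula, the output is the same kind of artifact: a single number about your inner state, generated in minutes and stored on someone else's server.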
Encrypted and secure?
Your digital you is not only cavorting with marketers and government agencies, but could also end up in some very uncomfortable places.
“There are enormous potential risks” in the wellness firms’ data profiling efforts, like your personal health data being exposed to your bosses or others, law professor Frank Pasquale told WSJ.
Cybersecurity experts say some companies and government agencies do not protect your information enough.
“When you consider the sheer quantity of breaches and how the outcomes of that data as a ‘big data’ project become more and more useful, you’re in a scary situation that has no positive outcomes,” Arlen told Archer News.
“Consider the ‘big data’ combination of the OPM [Office of Personnel Management] breach when combined with the Ashley Madison breach. That’s an interesting dataset to mine!” he added.
What now?
In some ways, it may be too late, some cybersecurity experts say.
“The problem is that people are only waking up to this sort of activity long after the ship has left the dock—and sunk,” said Lewis.
It would be hard to live in the modern world without adding to your digital dossier every day.
“Ultimately, you’d like to control your identity and information being stored about you, but I highly doubt you truly can live ‘off the grid’ enough to make a difference,” said Foose.
Awareness and more
For Arlen and Rad, awareness is key, along with possibly new laws to control what companies can do with your digital you.
The digital rights group Electronic Frontier Foundation says on its site, “National and international laws have yet to catch up with the evolving need for privacy that comes with new technology,” and offers guides on “surveillance self-defense.”
In the UK, the Information Commissioner’s Office is working on guidelines for dealing with the new tracking technologies and their uses, reported The Guardian.
The ICO warned the public that some stores can and do track customers via their phones and facial recognition technology, offering deals based on your face and following where you go in the store.
“Even if the identification of individuals is not the intended purpose, the implications of intelligent video analytics for privacy, data protection, and other human rights are still significant,” said the ICO’s Simon Rice in the article.
Some of this information can be used to help you. Maybe you would like a deal based on your behavior the last time you shopped, or maybe the store can build a better layout based on your preferences.
“But the scope for invasive tracking and profiling is large,” Rice warned in The Guardian, “if the technology is not used with privacy in mind.”