Archer

App developers get a warning from the Federal Trade Commission over hidden audio-tracking code in their apps.

You saw the app on Google Play. You downloaded it. But you never got any sort of notification that you were actually turning your phone into a secret listening device.

The Federal Trade Commission sent a warning letter to a dozen app developers, telling them that their secret audio apps might be in violation of the law.

“These apps were capable of listening in the background and collecting information about consumers without notifying them,” said Jessica Rich, director of the FTC’s Bureau of Consumer Protection, in a press release.

“Companies should tell people what information is collected, how it is collected, and who it’s shared with,” she said.

SilverPush

The apps are using SilverPush code, the FTC said.

SilverPush is an India-based company that says it shows marketing companies what people are watching on TV, for how long, and when.

You download a SilverPush app, and it uses your phone’s microphone to listen for ultrasonic, or inaudible, tones sent by things like ads on television. As you watch an ad or a program that sends out a silent beacon, your phone will pick it up, allowing companies to monitor your habits, likes and dislikes.
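
At a technical level, detecting one of these beacons amounts to sampling the microphone and measuring the energy at a near-ultrasonic frequency. The sketch below shows the general idea in Kotlin for Android using the Goertzel algorithm; the 18 kHz target, block size and threshold are illustrative assumptions, not SilverPush’s actual parameters.

```kotlin
import android.media.AudioFormat
import android.media.AudioRecord
import android.media.MediaRecorder
import kotlin.math.PI
import kotlin.math.cos
import kotlin.math.roundToInt

// Illustrative detector: measures energy at one near-ultrasonic frequency in a
// block of microphone samples using the Goertzel algorithm. The 18 kHz target,
// block size and threshold are assumptions, not SilverPush's real parameters.
class UltrasonicToneDetector(
    private val sampleRate: Int = 44100,
    private val targetHz: Double = 18000.0,
    private val blockSize: Int = 4096
) {
    // Goertzel: a cheap single-bin DFT; returns the squared magnitude at targetHz.
    fun energyAt(samples: ShortArray): Double {
        val k = (blockSize * targetHz / sampleRate).roundToInt()
        val coeff = 2.0 * cos(2.0 * PI * k / blockSize)
        var s1 = 0.0
        var s2 = 0.0
        for (i in 0 until blockSize) {
            val s0 = samples[i] + coeff * s1 - s2
            s2 = s1
            s1 = s0
        }
        return s1 * s1 + s2 * s2 - coeff * s1 * s2
    }

    // Grabs one block from the microphone and reports whether the target tone is
    // present above an arbitrary threshold. Requires the RECORD_AUDIO permission.
    fun listenOnce(threshold: Double = 1e10): Boolean {
        val minBuf = AudioRecord.getMinBufferSize(
            sampleRate, AudioFormat.CHANNEL_IN_MONO, AudioFormat.ENCODING_PCM_16BIT
        )
        val recorder = AudioRecord(
            MediaRecorder.AudioSource.MIC, sampleRate,
            AudioFormat.CHANNEL_IN_MONO, AudioFormat.ENCODING_PCM_16BIT,
            maxOf(minBuf, blockSize * 2)
        )
        val buffer = ShortArray(blockSize)
        recorder.startRecording()
        recorder.read(buffer, 0, blockSize)
        recorder.stop()
        recorder.release()
        return energyAt(buffer) > threshold
    }
}
```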

You will not hear the audio beacon. But you know about it because you agreed to it, right?

SilverPush told Forbes in November that it notifies you when you download the app and agree to let the app use the microphone. 

“Allowing access to microphone would enable the app to hear for TV ads. By allowing access you will get exclusive offers and deals about those ads,” the SilverPush app screen says in small print.

Some companies will even pay you to let them listen in and track what you watch on TV, as Archer News reported last month.

No warning

But the apps now under scrutiny gave no alert, no notification, no warning, the FTC said. The apps may not be affiliated with SilverPush itself, but the agency’s warning letter said they appear to be using the SilverPush listening code.

“It’s important to give the straight story about what your app can do and be transparent about your privacy practices,” the FTC says on its web advisory page for companies developing apps. “Tell the truth about what your app can do.”

“False or misleading claims, as well as the omission of certain important information, can tick off users and land you in legal hot water,” it says.

Audio beacons on your phone & TV?

SilverPush has not yet responded to Archer News’ request for information, but Forbes reported in November that SilverPush said all 100 of its clients monitoring people with audio beacons are in India.

Do these apps with SilverPush code mean that another company is using the same kind of technology to track you in the U.S.?

Possibly. But the code may also have ended up in the apps unintentionally, said Aaron Lint, Chief Scientist at Arxan Technologies.

Developers may not be aware that the code they are using can do more than just the intended function, he said.

“Many times, the developer is only doing something that makes his job more convenient—but has wide-reaching security consequence,” he told Archer News.  

Also, a developer may be using “unaudited” code from a third party, Lint said.

“It may appear to function adequately, but it is very easy to embed code that could undermine the security of the app, even without the developer’s knowledge,” he explained.

In addition, someone else could take an app and reverse engineer it, modify it, and repackage it with malware embedded, said Lint.

“All of these attacks can be prevented by taking an end-to-end approach of building secure, verifiable code from the beginning, and taking steps to ensure that integrity of the app is maintained all the way to the user’s device,” he said.
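
Lint did not spell out specific techniques, but one common building block for that kind of integrity check is for an app to verify at runtime that it is still signed with the developer’s release certificate; a repackaged copy signed by someone else would fail the comparison. The Kotlin sketch below is illustrative only, not Arxan’s method, and the expected digest is a placeholder.

```kotlin
import android.content.Context
import android.content.pm.PackageManager
import java.security.MessageDigest

// Placeholder: a developer would hard-code the SHA-256 digest of their own
// release signing certificate here.
const val EXPECTED_CERT_SHA256 = "replace-with-your-release-certificate-digest"

// Compares the certificate the running app was signed with against the value the
// developer expects; a repackaged copy signed by someone else will not match.
@Suppress("DEPRECATION") // GET_SIGNATURES is the simpler, pre-API-28 form
fun signatureLooksIntact(context: Context): Boolean {
    val info = context.packageManager.getPackageInfo(
        context.packageName, PackageManager.GET_SIGNATURES
    )
    val cert = info.signatures?.firstOrNull()?.toByteArray() ?: return false
    val digest = MessageDigest.getInstance("SHA-256").digest(cert)
    val hex = digest.joinToString("") { "%02x".format(it) }
    return hex == EXPECTED_CERT_SHA256
}
```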

Arxan’s 2016 State of Application Security report showed that 98% of apps tested lacked binary code protection and could be reverse engineered or modified, and that 50% of organizations had no budget allocated to protecting mobile apps.

And a Ponemon Institute survey released last week found that many companies are failing to focus on security in app development.

Can you tell?

What can people do to check for audio beacon technology on their phones?

“To answer your question, from an end user perspective, there’s frankly very little that they could do,” said Stephen McCarney, VP of Marketing at Arxan.

“Many folks mistakenly believe that they would be able to know if something were ‘off,’ etc.,” he said. “There is really little a typical end user could do to check, since the app would likely function and operate as intended.”

A review of which apps use your microphone may give you some information. The FTC said the apps it is concerned about showed no real reason to access your microphone.
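
On Android, that kind of review can also be automated. The hypothetical Kotlin helper below lists every installed package that declares the RECORD_AUDIO permission, so you can ask whether each one has an obvious reason to use the microphone; on recent Android versions, package visibility rules may limit what it can see.

```kotlin
import android.Manifest
import android.content.Context
import android.content.pm.PackageManager

// Hypothetical helper: returns the package names of installed apps that declare
// the RECORD_AUDIO permission, so each one can be checked against an obvious
// reason to use the microphone. On Android 11 and later, package visibility
// rules may limit which apps are returned.
fun appsRequestingMicrophone(context: Context): List<String> {
    val pm = context.packageManager
    return pm.getInstalledPackages(PackageManager.GET_PERMISSIONS)
        .filter { info ->
            info.requestedPermissions?.contains(Manifest.permission.RECORD_AUDIO) == true
        }
        .map { it.packageName }
}
```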

A security review for apps would help, said McCarney. Then people could check the “risk” of an app before they download it. 

“Why not a security rating?” he asked. “After all, it would follow a similar premise as a supermarket, where food nutrition labels are provided so end users could know what they will be consuming before they purchase. More transparency would be helpful, but for end users to check themselves would not be effective.”

Tracking worries

Some who work in the area of privacy are concerned about what companies can do with this kind of audio beacon monitoring.

The Center for Democracy & Technology wrote a letter to the FTC last fall asking the agency to look into SilverPush technology and the concept of cross-device tracking, which uses monitoring across all of your devices to put together a picture of you.

The letter gave an example of a company tying your devices together with audio beacons, then watching as you search for symptoms of a sexually transmitted disease on your computer, look up directions to a Planned Parenthood clinic on your phone, visit a pharmacy and return home.

“While previously the various components of this journey would be scattered among several services, cross-device tracking allows companies to infer that the user received treatment for an STD,” the letter said.

“The combination of information across devices not only creates serious privacy concerns, but also allows for companies to make incorrect and possibly harmful assumptions about individuals,” it continued.

Creepy?

“Yes, it can be creepy,” said McCarney.

“The privacy laws and regulatory requirements, etc., simply aren’t able to keep pace, which is allowing companies to forge quickly ahead with features and capabilities that for all intents and purposes might have legitimate purposes for the good,” he said.

But it is important to understand how these things could have unintended consequences, added McCarney.

“And at what point do end users need to be made aware of such risks and possibilities? And where is that line drawn?” he said.

“For now, this is part of the conundrum and confusion—and malicious/devious actors know this and tend to go after easy, vulnerable soft targets,” said McCarney.