An explosive spyware report reveals the limits of iOS and Android security


A report this week shows that the problem of high-quality spyware is far more widespread than previously feared.

The shadowy world of private spyware has long been a concern in cybersecurity circles as authoritarian governments have been repeatedly caught attacking the smartphones of activists, journalists and political rivals with malware bought from unscrupulous brokers. The surveillance tools from these companies often target iOS and Android, which appear to have been unable to keep up with the threat. However, a new report suggests the scale of the problem is far greater than feared – and has put additional pressure on mobile technology makers, especially Apple, from security researchers seeking remedies.

This week, an international group of researchers and journalists from Amnesty International, Forbidden Stories and more than a dozen other organizations released forensic evidence that a number of governments around the world – including Hungary, India, Mexico, Morocco, Saudi Arabia and the United Arab Emirates – may be customers of the infamous Israeli spyware company NSO Group. The researchers examined a leaked list of 50,000 phone numbers linked to activists, journalists, executives and politicians, all of whom were potential surveillance targets. They also examined 37 devices that were infected with or attacked by NSO’s invasive Pegasus spyware. They even released a tool you can use to check whether your iPhone has been compromised.

NSO Group strongly denied the findings on Tuesday, dismissing the report as “false claims by a consortium of media outlets.” A spokesman for NSO Group said, “The list is not a list of Pegasus targets or potential targets. The numbers in the list are in no way related to NSO Group. Any claim that a name on the list is necessarily associated with a Pegasus target or potential target is false and fake.” On Wednesday, NSO Group said it would no longer respond to media inquiries.

NSO Group isn’t the only spyware provider, but it has the highest profile. WhatsApp sued the company in 2019 for allegedly attacking over a thousand of its users. And Apple’s BlastDoor feature, introduced in iOS 14 earlier this year, was an attempt to block “zero-click exploits” – attacks that don’t require any taps or downloads from victims. The protection doesn’t seem to have worked as well as intended; the company released a patch for iOS on Tuesday to address the latest round of alleged hacking by NSO Group.

In light of the report, many security researchers say that both Apple and Google can and should do more to protect their users from these sophisticated surveillance tools.

“It definitely shows challenges in general with mobile device security and investigative capabilities today,” says independent researcher Cedric Owens. “I also think that both Android and iOS zero-click infections from NSO show that motivated and resourceful attackers can still succeed despite the control Apple has over its products and its ecosystem.”

Tension has long simmered between Apple and the security community over the limits of researchers’ ability to conduct forensic investigations on iOS devices and deploy monitoring tools. More access to the operating system would potentially help catch more attacks in real time, allowing researchers to gain a deeper understanding of how those attacks were constructed in the first place. Right now, security researchers rely on a small number of indicators within iOS, plus the occasional jailbreak. And while Android is inherently more open, it also limits what’s called “observability.” To effectively combat high-caliber spyware like Pegasus, some researchers say it would take things like access to read a device’s file system, the ability to examine which processes are running, access to system logs, and other telemetry.

Much criticism has centered on Apple in this regard, as the company has historically provided stronger security protections to its users than the fragmented Android ecosystem.

“The truth is that we’re holding Apple to a higher standard precisely because they’re doing so much better,” said Juan Andres Guerrero-Saade, SentinelOne’s principal threat researcher. “Android is a free-for-all. I don’t think anyone expects the security of Android to improve to the point where all you have to worry about are targeted attacks with zero-day exploits.”

In fact, Amnesty International researchers say they actually found it easier to discover and investigate indicators of compromise on Apple devices infected with Pegasus malware than on devices running stock Android.

“In Amnesty International’s experience there are significantly more forensic traces accessible to investigators on Apple iOS devices than on stock Android devices, therefore our methodology is focused on the former,” the group wrote in a detailed technical analysis of its findings on Pegasus. “As a result, most recent cases of confirmed Pegasus infections have involved iPhones.”

Part of the focus on Apple also comes from its own emphasis on privacy and security in its product design and marketing.

“Apple tries, but the problem is they don’t try as hard as their reputation suggests,” said Matthew Green, a cryptographer at Johns Hopkins University.

However, despite its more open approach, Google faces similar criticism over the visibility security researchers can get into its mobile operating system.

“Android and iOS have different types of logs. It’s really hard to compare them,” said Zuk Avraham, CEO of the research group ZecOps and a long-time advocate of access to mobile system information. “Each has an advantage, but both are equally insufficient and allow threat actors to hide.”


However, Apple and Google both seem reluctant to reveal more of the digital forensic sausage making. And while most independent security researchers advocate for that shift, some also concede that improved access to system telemetry would help bad actors as well.

“Although we know that persistent logs would be more useful for forensic applications like those described by researchers at Amnesty International, they would also be useful for attackers,” a Google spokesman said in a statement to WIRED. “We continuously balance these different needs.”

Ivan Krstić, head of Apple Security Engineering and Architecture, said in a statement: “Apple unequivocally condemns cyberattacks against journalists, human rights activists, and others seeking to make the world a better place. For over a decade, Apple has led the industry in security innovation and, as a result, security researchers agree iPhone is the safest, most secure consumer mobile device on the market. Attacks like the ones described are highly sophisticated, cost millions of dollars to develop, often have a short shelf life, and are used to target specific individuals. While that means they are not a threat to the overwhelming majority of our users, we continue to work tirelessly to defend all our customers, and we are constantly adding new protections for their devices and data.”

The trick is to find the right balance between offering more system indicators without inadvertently making attackers’ jobs easier. “Apple could do a lot in a very safe way to allow observation and imaging of iOS devices in order to catch this type of bad behavior, but that does not seem to be treated as a priority,” said iOS security researcher Will Strafach. “I’m sure they have fair policy reasons for this, but I disagree and would love to see changes in that thinking.”

Thomas Reed, director of Mac and mobile platforms at the antivirus maker Malwarebytes, agrees that more insight into iOS would help keep users safe. But he adds that allowing special, trusted monitoring software would come with real risks. He points out that there are already suspicious and potentially unwanted programs on macOS that antivirus tools can’t fully remove, because the operating system endows them with this special kind of system trust, potentially in error. The same problem of rogue system-analysis tools would almost inevitably crop up on iOS as well.

“We also see nation-state malware on desktop systems all the time that is discovered only after several years of undetected deployment,” Reed added. “And that’s on systems where many different security solutions are already available. Many eyes looking for this malware are better than few. I just worry about what we’ll have to trade for that visibility.”

The Pegasus Project, as the research consortium calls the new findings, underscores the fact that Apple and Google are unlikely to be able to solve the threat posed by private spyware providers on their own. The extent and scope of the potential Pegasus targeting suggest that a global ban on private spyware may be necessary.

“A moratorium on the intrusion software trade is the bare minimum for a credible response – mere triage,” NSA surveillance whistleblower Edward Snowden tweeted on Tuesday in reaction to the Pegasus Project findings. “Anything less and the problem gets worse.”

On Monday, Amazon Web Services took a step of its own, shutting down cloud infrastructure linked to NSO Group.

Regardless of what happens to NSO Group in particular, or the private surveillance market in general, stealthy targeted attacks from any number of sources will still occur on user devices. Even if Google and Apple can’t be expected to solve the problem on their own, they must keep working toward a better way.

This story originally appeared on wired.com.