Changes to the AI feature dubbed Recall come as Microsoft navigates the fallout of a string of high-profile security breaches. 

A logo of US company Microsoft is displayed during the Vivatech technology startups and innovation fair, at the Porte de Versailles exhibition center in Paris, on May 22, 2024. (Photo by JULIEN DE ROSA / AFP / Getty Images)

Microsoft on Friday said that it would make major changes to a recently announced AI product that continuously captures screenshots of users' activity to build a searchable log, a move that comes after withering criticism from security researchers. 

When Microsoft announced the feature it dubbed Recall last month, CEO Satya Nadella referred to it as "photographic memory" that could "recreate moments from the past" of anything a user does, using the company's proprietary artificial intelligence models running on the upcoming Copilot+ PCs.

Security researchers quickly pointed out that such screenshots would include sensitive information, including usernames and passwords, but Microsoft said the data was secure. Prominent security experts, including Kevin Beaumont, then demonstrated that the data was, in fact, stored in plain text, and Beaumont called the decision to roll out the product, which was slated to be released June 18, "the dumbest cybersecurity move in a decade."

Exploiting the feature turned out to be fairly easy. Alex Hagenah, a researcher with SIX Group AG, for example, created a tool called “TotalRecall” that copied the database and parsed it for interesting details.
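The core weakness researchers highlighted is that a plain-text database on disk can be searched trivially by anyone who copies the file. The sketch below illustrates that idea only; the table name, columns, and sample data are hypothetical stand-ins, not the actual Recall schema or the TotalRecall tool's code.

```python
import sqlite3

# Illustrative only: "captures" and its columns are hypothetical, meant to
# mimic a plain-text store of screen-captured text. Once an attacker copies
# such a file, one LIKE query is enough to surface credentials.
conn = sqlite3.connect(":memory:")  # stand-in for a copied local database
conn.execute(
    "CREATE TABLE captures (ts INTEGER, window_title TEXT, extracted_text TEXT)"
)
conn.executemany(
    "INSERT INTO captures VALUES (?, ?, ?)",
    [
        (1717900000, "Bank login - Browser", "username: alice password: hunter2"),
        (1717900060, "Spreadsheet", "Q2 revenue figures"),
    ],
)

# A trivial keyword search pulls out anything that looks like a credential.
hits = conn.execute(
    "SELECT window_title, extracted_text FROM captures "
    "WHERE extracted_text LIKE '%password%'"
).fetchall()
for title, text in hits:
    print(title, "->", text)
```

Encrypting the database at rest and gating reads behind authentication, as Microsoft now says it will do, is what closes off this kind of offline search.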


On Friday, Microsoft announced a series of changes to the product, including making Recall opt-in and off by default. Additionally, the product will require enrollment in the company's Windows Hello biometric authentication to enable it, and proof of presence to view or search the timeline of screenshots in Recall. The company also said it would enhance the encryption of the Recall database. 

“Even before making Recall available to customers, we have heard a clear signal that we can make it easier for people to choose to enable Recall on their Copilot+ PC and improve privacy and security safeguards,” Pavan Davuluri, Microsoft’s corporate vice president of Windows and devices, said in a blog post Friday.

The changes to Recall come on the heels of Microsoft’s vow to prioritize security in its product development in the aftermath of a series of high-profile security breaches at the hands of Russian and Chinese state-aligned hackers. A report by the U.S. Cyber Safety Review Board concluded that Microsoft had created a corporate culture that devalued security. 

That report prompted Nadella to issue a dictum to Microsoft employees ordering them to prioritize the security of its products. “If you’re faced with the tradeoff between security and another priority, your answer is clear: Do security,” he wrote in a memo to the company. 

Security experts have pointed to the security issues with Recall as evidence that Microsoft is not yet living up to that pledge, and Beaumont said Friday that details of how the changes are implemented will matter, suggesting that security researchers pursue a “deep dive in the coming weeks” on the enhanced security claims made by Microsoft. 

Written by AJ Vicens

AJ covers nation-state threats and cybercrime. He was previously a reporter at Mother Jones. Get in touch via Signal/WhatsApp: (810-206-9411).