In a week where tech companies are trying hard to convince us that our data is perfectly safe nestled deep within their digital bosoms, Microsoft has come marching in with what appears to be both a groundbreaking innovation and possibly a fresh nightmare for privacy advocates. The culprit is Recall, a new AI-powered feature baked into Microsoft’s Copilot+ PCs, presumably named such because it recalls everything. Literally.
Recall’s elevator pitch is simple and boldly optimistic. It quietly takes periodic screenshots of a user’s activity, then uses AI magic to let them search through their past actions with what Microsoft describes as photographic memory. Sure, it sounds handy if you’ve ever lost track of that one Excel spreadsheet you opened thirty tabs ago, but critics are not exactly applauding just yet. In fact, they’re coughing pointedly into their cyber-handkerchiefs.
Among the first to cry foul was security researcher Kevin Beaumont, who spent the better part of this week sounding the alarm bells oh so politely. Beaumont noted that Recall stores its data locally in a fairly accessible plain-text database, or in the words of everyone’s least favorite heist movie trope, “just sitting there waiting to be stolen.” In fairness, Microsoft assures users that the drive is encrypted with BitLocker, which would be more reassuring if Recall didn’t also require you to log in with your Microsoft account to even use the PC, meaning your decryption key is effectively sitting right next to your data like a roommate who never moves out.
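To see why that makes researchers twitchy, here is a minimal sketch of how trivially any code running as the logged-in user could rummage through such a store. The path, table, and column names below are illustrative assumptions, not Microsoft’s documented schema; the point is simply that once the session is unlocked, BitLocker is no longer standing between your screenshots and whatever happens to be running as you.

```python
# Minimal sketch: reading a hypothetical local plain-text capture database.
# The path, table name, and columns are assumptions for illustration only.
import sqlite3
from pathlib import Path

# Hypothetical location of a Recall-style store inside the user profile.
db_path = Path.home() / "AppData" / "Local" / "ExampleRecallStore" / "captures.db"

# Any process running as the same user can simply open the file.
# BitLocker protects the disk at rest, not data on an unlocked, logged-in machine.
conn = sqlite3.connect(db_path)
for captured_at, window_title, extracted_text in conn.execute(
    "SELECT captured_at, window_title, extracted_text FROM captures LIMIT 5"
):
    print(captured_at, window_title, extracted_text)
conn.close()
```

No exploits, no privilege escalation, just a file open and a SQL query, which is roughly what Beaumont was getting at.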
Microsoft, for its part, has clarified that screenshots are stored locally and never leave the device unless you personally email them to someone or, say, accidentally upload your entire digital life to a cloud drive during a late-night productivity binge. Also, only apps with proper accessibility permissions can view the captured data, which sounds like a good measure until you remember that malware authors have never been too shy about asking for permissions. Very politely, of course.
Defenders argue that Recall is optional and can be turned off, which is true in the same way that parachutes are optional if you never plan to leave the plane. Meanwhile, Microsoft has announced that this is just the beginning and that future iterations may come with further safeguards and controls, which is corporate speak for “we’re hoping this blows over in a month.”
So as Microsoft moves boldly into the era of always-on memory and forever screenshots, users are left weighing the convenience of never forgetting anything against the possibility of someone else remembering it instead. Rest assured, you may finally track down your forgotten passwords from six months ago, but you may also be sharing them inadvertently with the neighborhood hacker.
Memory may now be infinite, but so is irony.