
Microsoft accidentally released 38TB of private data in a major leak

It’s just been revealed that Microsoft researchers accidentally leaked 38TB of confidential information onto the company’s GitHub page, where potentially anyone could see it. Among the data trove was a backup of two former employees’ workstations, which contained keys, passwords, secrets, and more than 30,000 private Teams messages.

According to cloud security firm Wiz, the data was exposed on Microsoft’s artificial intelligence (AI) GitHub repository, where it had been accidentally included alongside a tranche of open-source training data. Since visitors were actively encouraged to download that data, the leaked files could have been copied over and over by anyone who found the link.


Data breaches can come from all kinds of sources, but it will be particularly embarrassing for Microsoft that this one originated with its own AI researchers. The Wiz report states that Microsoft uploaded the data using Shared Access Signature (SAS) tokens, an Azure feature that lets users share data through Azure Storage accounts.
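SAS tokens work by embedding a cryptographic signature in the URL itself, so whoever holds the link holds the access. As a simplified sketch of the idea (this is not Azure's exact string-to-sign format, and the key and fields below are made up for illustration):

```python
import base64
import hashlib
import hmac

def sign_sas(account_key_b64: str, string_to_sign: str) -> str:
    """Simplified illustration: a SAS token carries an HMAC-SHA256
    signature, computed with the storage account key over fields such
    as the permissions, expiry time, and resource path."""
    key = base64.b64decode(account_key_b64)
    digest = hmac.new(key, string_to_sign.encode("utf-8"), hashlib.sha256).digest()
    return base64.b64encode(digest).decode("utf-8")

# Hypothetical values for illustration only (not real credentials).
account_key = base64.b64encode(b"example-storage-account-key").decode()
fields = "\n".join(["r", "2023-06-01T00:00:00Z", "/myaccount/training-data"])
signature = sign_sas(account_key, fields)
print(signature)
```

Because the signature is derived from the account key, anyone holding a valid token can exercise whatever permissions were baked into it, with no further login step.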


Visitors to the repository were told to download the training data from a provided URL. However, the web address granted access to much more than just the planned training data, and allowed users to browse files and folders that were not intended to be publicly accessible.


Full control


It gets worse. The access token that allowed all this was misconfigured to provide full control permissions, Wiz reported, rather than more restrictive read-only permissions. In practice, that meant that anyone who visited the URL could delete and overwrite the files they found, not merely view them.
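A SAS URL spells out its permissions in the `sp` query parameter (r = read, a = add, c = create, w = write, d = delete, l = list), so the difference between a read-only link and a full-control one is visible in the token string itself. A small sketch, using a hypothetical URL, of flagging over-permissive tokens:

```python
from urllib.parse import parse_qs, urlparse

def sas_permissions(url: str) -> set[str]:
    """Extract the permission letters from a SAS URL's `sp` parameter."""
    params = parse_qs(urlparse(url).query)
    return set(params.get("sp", [""])[0])

# Hypothetical URL shaped like a shared Azure Storage link.
url = "https://example.blob.core.windows.net/models?sp=racwdl&sig=abc123"
perms = sas_permissions(url)
extra = perms - {"r", "l"}
if extra:
    print("token grants more than read/list access:", sorted(extra))
```

A token meant only for distributing training data should carry nothing beyond read (and perhaps list); the `w` and `d` flags are what turned this leak from an exposure into a potential tampering vector.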

Wiz explains that this could have had dire consequences. As the repository was full of AI training data, the intention was for users to download it and feed it into a script, thereby improving their own AI models.

Yet because it was open to manipulation thanks to its wrongly configured permissions, “an attacker could have injected malicious code into all the AI models in this storage account, and every user who trusts Microsoft’s GitHub repository would’ve been infected by it,” Wiz explains.
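The Wiz report doesn't prescribe a fix for people on the consuming side, but the standard defence against this kind of tampering is to verify a downloaded artifact against a checksum published through a separate, trusted channel before feeding it into any script. A minimal sketch:

```python
import hashlib

def verify_download(data: bytes, expected_sha256: str) -> bool:
    """Refuse to use a downloaded artifact unless its SHA-256 digest
    matches a hash published through a separate, trusted channel."""
    return hashlib.sha256(data).hexdigest() == expected_sha256

payload = b"example training data"
published = hashlib.sha256(payload).hexdigest()  # normally shipped out-of-band
assert verify_download(payload, published)
assert not verify_download(b"tampered training data", published)
```

The check only helps if the published hash lives somewhere an attacker with write access to the storage account cannot also modify, which is exactly why it must travel out-of-band.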

Potential disaster


The report also noted that the creation of SAS tokens – which grant access to Azure Storage folders such as this one – does not create any kind of paper trail, meaning “there is no way for an administrator to know this token exists and where it circulates.” When a token has full-access permissions like this one did, the results can be potentially disastrous.
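With no central registry of issued tokens, finding leaked ones usually comes down to scanning code and documents for their telltale URL shape, the `sig=` signature parameter in particular. A rough, heuristic sketch of such a scan:

```python
import re

# Heuristic pattern: SAS URLs point at *.core.windows.net and carry a
# `sig=` query parameter holding the URL-encoded base64 signature.
SAS_PATTERN = re.compile(r"https://[\w.-]+\.core\.windows\.net/\S*[?&]sig=[\w%]+")

def find_sas_urls(text: str) -> list[str]:
    """Return every string in `text` that looks like a SAS URL."""
    return SAS_PATTERN.findall(text)

sample = "data at https://example.blob.core.windows.net/ds?sp=r&sig=Ab3%2Fx plus notes"
print(find_sas_urls(sample))
```

A pattern like this is exactly the kind of check secret scanners run over repositories; it flags candidates for review rather than proving a token is live.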

Fortunately, Wiz reported the issue to Microsoft in June 2023. The leaky SAS token was replaced in July, and Microsoft completed its internal investigation in August. Public disclosure was held back until now to give Microsoft time to fully fix the lapse.

It’s a reminder that even seemingly innocent actions can potentially lead to data breaches. Luckily the issue has been patched, but it’s unknown whether hackers gained access to any of the sensitive user data before it was removed.

Alex Blake
Alex Blake has been working with Digital Trends since 2019, where he spends most of his time writing about Mac computers…