Microsoft Corrects Glaring Security Gaffe Exposing 38 Terabytes of Private Data

by Time News

September 19, 2023

In a recent announcement, Microsoft acknowledged and took steps to rectify a major security lapse that exposed 38 terabytes of private data. The leak originated in the company's AI GitHub repository, where a storage URL shared to publish a bucket of open-source training data inadvertently exposed far more. The exposed data included a disk backup of two former employees' workstations, containing sensitive information such as secrets, keys, passwords, and over 30,000 internal Teams messages.

The repository in question, named “robust-models-transfer,” has since been taken down. Before its removal, it featured source code and machine learning models related to a 2020 research paper titled “Do Adversarially Robust ImageNet Models Transfer Better?”

Wiz, a security research firm, stated in a report that the exposure was caused by an overly permissive Shared Access Signature (SAS) token, an Azure feature for granting scoped, time-limited access to storage data. The token's misconfiguration allowed access to the entire storage account rather than the intended files, leaving additional private data vulnerable. Wiz reported the issue to Microsoft on June 22, 2023.

The README.md file of the repository instructed developers to download the models from an Azure Storage URL. However, the SAS token embedded in that URL granted access to the complete storage account, allowing unauthorized individuals to view, delete, and overwrite files.
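The risky settings described above are visible in a SAS URL's query string: `sp` lists the granted permissions, `srt` the resource types an account-level token covers, and `se` the expiry timestamp. Purely as an illustration (this is not Wiz's or Microsoft's tooling, and the URL below is hypothetical), a minimal audit of those parameters might look like:

```python
from datetime import datetime, timezone
from urllib.parse import parse_qs, urlparse

def audit_sas_url(url: str, max_days: int = 7) -> list[str]:
    """Flag risky settings in an Azure SAS URL's query string."""
    params = parse_qs(urlparse(url).query)
    findings = []

    # 'sp' lists granted permissions: r=read, w=write, d=delete, c=create, ...
    perms = params.get("sp", [""])[0]
    for flag, label in (("w", "write"), ("d", "delete"), ("c", "create")):
        if flag in perms:
            findings.append(f"token grants {label} access")

    # 'srt' appears only on account-level SAS tokens; 'c' (container)
    # plus 'o' (object) exposes every container and blob in the account.
    srt = params.get("srt", [""])[0]
    if "c" in srt and "o" in srt:
        findings.append("account SAS covers all containers and blobs")

    # 'se' is the expiry timestamp; a far-off expiry is a red flag.
    expiry = params.get("se", [""])[0]
    if expiry:
        exp = datetime.fromisoformat(expiry.replace("Z", "+00:00"))
        days_left = (exp - datetime.now(timezone.utc)).days
        if days_left > max_days:
            findings.append(f"expiry is {days_left} days away (limit {max_days})")

    return findings

# Hypothetical URL shaped like the one in the incident: full permissions,
# account-wide scope, and an expiry decades in the future.
url = ("https://example.blob.core.windows.net/models"
       "?sv=2020-08-04&ss=b&srt=sco&sp=rwdlac&se=2051-01-01T00:00:00Z&sig=...")
for finding in audit_sas_url(url):
    print(finding)
```

A read-only, short-lived, blob-scoped token would produce no findings, which is the shape Wiz recommends for any token that must be shared at all.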

Microsoft has responded to the incident, stating that it found no evidence of unauthorized exposure of customer data and that no other internal services were put at risk. The company promptly revoked the SAS token and blocked external access to the storage account. The issue was resolved within two days of responsible disclosure.

To prevent similar risks in the future, Microsoft has expanded its secret scanning service to detect SAS tokens with overly permissive expirations or privileges. The company also fixed a bug that had caused its scanning system to incorrectly dismiss the specific SAS URL in the repository as a false positive.
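Microsoft's secret-scanning internals are not public, so purely as an illustration of the idea, a minimal pattern-based detector could look for Azure Blob Storage URLs carrying the SAS signature parameter (`sig=`) in source files:

```python
import re

# Heuristic: an Azure Blob Storage URL whose query string carries the
# SAS signature parameter ('sig=') is a candidate leaked access token.
SAS_URL_PATTERN = re.compile(
    r"https://[a-z0-9]+\.blob\.core\.windows\.net/\S*?[?&]sig=[A-Za-z0-9%+/=]+"
)

def scan_for_sas_urls(text: str) -> list[str]:
    """Return every SAS-style storage URL found in the given text."""
    return [m.group(0) for m in SAS_URL_PATTERN.finditer(text)]

# Hypothetical README snippet embedding a SAS URL.
readme = 'Download: https://acct.blob.core.windows.net/models?sv=2020-08-04&sp=rwdl&sig=abc123'
print(scan_for_sas_urls(readme))
```

A production scanner would go further, e.g. validating the match and inspecting its permissions and expiry, but even this simple pass catches the telltale `sig=` parameter that distinguishes a SAS URL from a plain storage link.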

The researchers at Wiz have emphasized that Account SAS tokens should be handled with caution and treated as being as sensitive as the account key itself. They recommend avoiding Account SAS tokens for external sharing altogether, noting that Azure does not log or track their creation, which makes leaked tokens difficult to detect and revoke.

This incident is not the first time misconfigured Azure storage accounts have raised security concerns. In July 2022, JUMPSEC Labs highlighted a similar scenario where threat actors could exploit such accounts to gain unauthorized access to enterprise on-premise environments.

This security gaffe adds to a series of recent incidents at Microsoft, including a cyberattack where hackers from China infiltrated the company’s systems and stole a highly sensitive signing key. As AI technology gains momentum, data scientists and engineers must prioritize additional security measures to safeguard the vast amounts of data they handle.

Wiz CTO and co-founder, Ami Luttwak, emphasized the need for increased monitoring and precautions in dealing with large datasets, particularly when collaborating on public open-source projects.

Source: The Hacker News
