Cloud security company Wiz has announced that 38TB of confidential data was leaked when Microsoft's AI research department published an open-source AI learning model to a GitHub repository in July ...
Microsoft has learned an important lesson after having to clean up a major data leak resulting from an “overly permissive” shared access signature (SAS) token accidentally disclosed by one of its ...
Microsoft indicated on Monday that it had revoked an overly permissive Shared Access Signature (SAS) token, said to have exposed 38TB of internal Microsoft data. The action comes after ...
The Microsoft AI research division accidentally leaked dozens of terabytes of sensitive data starting in July 2020 while contributing open-source AI learning models to a public GitHub repository.
An overly permissive file-sharing link allowed public access to a massive 38TB Azure storage bucket containing private Microsoft data, exposing a variety of development secrets, including passwords, Teams ...
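A SAS token is a signed URL that delegates access to Azure Storage, and its blast radius depends entirely on the permissions, scope, and expiry baked into it; the link in this incident covered an entire storage account rather than the intended files. As a point of contrast, the snippet below is a minimal sketch, assuming the azure-storage-blob (v12) Python SDK, of generating a read-only SAS token scoped to a single blob with a one-hour lifetime; the account, container, and blob names are hypothetical placeholders, not details from the leak.

```python
# Minimal sketch: a narrowly scoped SAS token with azure-storage-blob (v12).
# Read-only, limited to one blob, expiring after an hour. All names below are
# hypothetical placeholders, not values from the incident.
from datetime import datetime, timedelta, timezone

from azure.storage.blob import BlobSasPermissions, generate_blob_sas

ACCOUNT = "examplestorageacct"      # hypothetical storage account
CONTAINER = "models"                # hypothetical container
BLOB = "robust-model.tar.gz"        # hypothetical blob

sas_token = generate_blob_sas(
    account_name=ACCOUNT,
    container_name=CONTAINER,
    blob_name=BLOB,
    account_key="<storage-account-key>",                    # keep out of source control
    permission=BlobSasPermissions(read=True),               # read-only, no write/list/delete
    expiry=datetime.now(timezone.utc) + timedelta(hours=1),  # short lifetime, not years
)

# The resulting URL grants access only to this one blob until the expiry time.
share_url = f"https://{ACCOUNT}.blob.core.windows.net/{CONTAINER}/{BLOB}?{sas_token}"
print(share_url)
```

Because a SAS URL is validated against the storage account key, anyone holding the full URL has whatever rights it encodes until it expires, which is why narrow permissions and short lifetimes matter.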
Microsoft is warning IT administrators about an increase in attacks aimed at Azure Blob Storage, saying threat actors are taking advantage of exposed credentials, weak access controls and ...
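The "weak access controls" part of that warning maps onto a concrete, checkable setting: containers configured for anonymous ("public") access can be read without any credentials at all. Below is a minimal sketch, assuming the azure-storage-blob (v12) Python SDK, that enumerates the containers in a storage account and flags any with anonymous access enabled; the connection string is a placeholder, and this is illustrative tooling rather than anything Microsoft ships.

```python
# Minimal sketch: audit a storage account for containers that allow anonymous
# access, using azure-storage-blob (v12). The connection string is a placeholder.
from azure.storage.blob import BlobServiceClient

service = BlobServiceClient.from_connection_string("<connection-string>")

for container in service.list_containers():
    # public_access is "blob" or "container" when anonymous access is enabled
    if container.public_access:
        print(f"Anonymous access ({container.public_access}): {container.name}")
    else:
        print(f"Private: {container.name}")
```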