Sunday, October 1, 2023

Microsoft: blunder leaks terabytes of sensitive information


The human factor has once again caused a mistake that could have been very costly to Microsoft: while sharing a vast collection of data for artificial intelligence research, the company's own team leaked a trove of confidential internal information.

The intentions were noble: distribute large collections of useful training data from a cloud storage bucket, linked in a GitHub repository, with the goal of speeding up the development of AI-based solutions. But the execution wasn't perfect.

A 38 TB collection of sensitive Microsoft data was left accessible

As reported by TechCrunch, the discovery was made by Wiz, a startup specialized in cloud storage security. The source points out that a GitHub repository belonging to Microsoft's AI research division contained a critical mistake that exposed sensitive information from the company itself.

Alongside the image recognition algorithms and AI models that Microsoft made freely available through that repository, something else was present. According to Wiz, the link in the repository opened the door to much more of Microsoft's cloud storage than intended.

In all, 38 terabytes of information stored in Microsoft's cloud were accessible to anyone with the link. The files in question contained far more than the shared AI models: in addition to what was expected, visitors also had access to what was not.

The files were accidentally exposed via GitHub


The improperly published files included backups of the personal computers of at least two Microsoft employees. They also contained sensitive information from many other employees, ranging from passwords for Microsoft platforms and services to more than 30,000 internal messages sent via Microsoft Teams.

All it took was a URL (hyperlink) to access the huge collection, which dates back to 2020. This happened because someone accidentally granted "Full Control" permissions on the link instead of "View Only", Wiz points out.
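In Azure Blob Storage, this kind of shareable link is typically a Shared Access Signature (SAS) URL, whose `sp` query parameter encodes the granted permissions (`r` for read, `l` for list, `w` for write, `d` for delete, and so on). As a minimal illustrative sketch (the URLs and tokens below are made-up examples, not the actual leaked link), one could check whether such a URL grants more than read-only access:

```python
from urllib.parse import urlparse, parse_qs

# Permission letters in an Azure SAS token's "sp" parameter that go
# beyond read/list access (write, delete, add, create, etc.).
WRITE_LIKE = set("wdxacu")

def sas_is_read_only(url: str) -> bool:
    """Return True if the SAS URL's 'sp' parameter grants only read/list access."""
    query = parse_qs(urlparse(url).query)
    permissions = query.get("sp", [""])[0]
    return bool(permissions) and not (set(permissions) & WRITE_LIKE)

# Hypothetical example URLs (placeholder host and signature, not real tokens):
read_only = "https://example.blob.core.windows.net/data?sp=rl&sv=2021-06-08&sig=abc"
full_ctrl = "https://example.blob.core.windows.net/data?sp=racwdl&sv=2021-06-08&sig=abc"

print(sas_is_read_only(read_only))   # → True
print(sas_is_read_only(full_ctrl))   # → False
```

A "Full Control"-style token carries the write-like letters, which is what would let an attacker modify files rather than merely download them.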

The risk was heightened because a malicious attacker could, for example, have replaced some of these files with malicious software, gaining an attack vector into Microsoft and its services.

Upon discovering the collection and realizing the magnitude of the error, Wiz immediately notified Microsoft, on June 22nd. Microsoft revoked access to the collection in question just two days later, on June 24th.

Meanwhile, the company completed its security investigation by August 16th. For now, we must wait to see the real impact this accidental breach may have had on the North American technology giant.

Microsoft’s AI research team accidentally exposed 38 terabytes of sensitive data, including private keys and passwords, while publishing an open source training data storage bucket on GitHub https://t.co/AxIjdlpEAz

— Carly Page (@CarlyPage_) September 18, 2023


Source: 4G News
