Unsecured AWS server exposed classified military intel

Sensitive military data belonging to the U.S. Army Intelligence and Security Command (INSCOM), a joint intelligence effort with the NSA, was found on an unsecured Amazon server accessible to the public. The exposed data included information on Red Disk, a failed Army cloud-based intelligence platform built as an auxiliary to the Distributed Common Ground System-Army (DCGS-A).
“Among the most compelling downloadable assets revealed from within the exposed bucket is a virtual hard drive used for communications within secure federal IT environments, which, when opened, reveals classified data labeled NOFORN – a restriction indicating a high level of sensitivity, prohibited from being disseminated even to foreign allies,” wrote researchers at UpGuard who revealed the Pentagon’s latest exposure. “The exposed data also reveals sensitive details concerning” the DCGS-A. 

UpGuard Director of Cyber Risk Research Chris Vickery came across the Amazon Web Services S3 cloud storage bucket, hosted on the AWS “inscom” subdomain and set to public access, on September 27. The main repository contained 47 viewable files and folders; three were downloadable, and those confirmed the contents’ “highly sensitive nature,” according to an UpGuard blog post.
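A bucket “set to public” in this way is visible in its access control list. As a minimal sketch of the kind of check that would have flagged the exposure, the function below inspects an ACL in the dictionary shape that AWS APIs such as boto3’s `get_bucket_acl` return; the group URIs are the real AWS global grantee identifiers, but the sample ACL itself is hypothetical.

```python
# Hedged sketch: detect world-readable grants in an S3-style ACL dict.
# The two group URIs are AWS's standard public grantees; sample_acl below
# is illustrative, not taken from the actual INSCOM bucket.

PUBLIC_GRANTEES = {
    "http://acs.amazonaws.com/groups/global/AllUsers",
    "http://acs.amazonaws.com/groups/global/AuthenticatedUsers",
}

def public_grants(acl: dict) -> list:
    """Return (grantee URI, permission) pairs that expose the bucket."""
    exposed = []
    for grant in acl.get("Grants", []):
        grantee = grant.get("Grantee", {})
        if grantee.get("Type") == "Group" and grantee.get("URI") in PUBLIC_GRANTEES:
            exposed.append((grantee["URI"], grant["Permission"]))
    return exposed

# Example: an ACL granting world-readable access, as the INSCOM bucket did.
sample_acl = {
    "Owner": {"DisplayName": "owner", "ID": "abc123"},
    "Grants": [
        {"Grantee": {"Type": "CanonicalUser", "ID": "abc123"},
         "Permission": "FULL_CONTROL"},
        {"Grantee": {"Type": "Group",
                     "URI": "http://acs.amazonaws.com/groups/global/AllUsers"},
         "Permission": "READ"},
    ],
}

print(public_grants(sample_acl))
# [('http://acs.amazonaws.com/groups/global/AllUsers', 'READ')]
```

Any non-empty result here means anonymous internet users can read the bucket, which is exactly the condition Vickery found.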

“The three downloadable files contained in the bucket confirm the highly sensitive nature of the contents, exposing national security data, some of it explicitly classified.

“The largest file is an Oracle Virtual Appliance (.ova) file titled ‘ssdev,’ which, when loaded into VirtualBox, is revealed to contain a virtual hard drive and Linux-based operating system likely used for receiving Defense Department data from a remote location,” the post said. “While the virtual OS and HD can be browsed in their functional states, most of the data cannot be accessed without connecting to Pentagon systems – an intrusion that malicious actors could have attempted, had they found this bucket.”

Some of the files contain top secret information and technical configurations, as well as the NOFORN classification. Metadata shows work by Invertix, a now-defunct third-party defense contractor and onetime INSCOM partner.
“Finally, also exposed within are private keys used for accessing distributed intelligence systems, belonging to Invertix administrators, as well as hashed passwords which, if still valid and cracked, could be used to further access internal systems,” according to the blog post. 
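Exposed private keys of the kind described here are usually easy to spot mechanically, because PEM-encoded key material carries a distinctive banner line. The sketch below, with hypothetical file contents, shows a simple scan for such banners that a pre-publication audit of a storage bucket could apply.

```python
import re

# Hedged sketch: flag PEM-style private key banners in file contents.
# The banner patterns are the standard PEM headers; the sample strings
# are illustrative, not data from the actual bucket.
KEY_BANNER = re.compile(
    r"-----BEGIN (?:RSA |EC |DSA |OPENSSH |ENCRYPTED )?PRIVATE KEY-----"
)

def contains_private_key(text: str) -> bool:
    """Return True if the text appears to hold PEM private key material."""
    return bool(KEY_BANNER.search(text))

leaked = "config = prod\n-----BEGIN RSA PRIVATE KEY-----\nMIIEpAIB...\n"
clean = "just some deployment notes\n"

print(contains_private_key(leaked), contains_private_key(clean))
# True False
```

A scan like this will not catch hashed passwords, which have no fixed banner, but it reliably catches the key files UpGuard describes.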

“Over the past month we have seen a number of enterprise organizations fail because they inadvertently did not configure existing security controls properly,” said Carl Wright, chief revenue officer (CRO) at AttackIQ. “This is called a protection failure and indicates that these organizations are doing little to no testing to validate that existing security controls are working properly.”

Organizations face an “infinitesimal cost to validate” security controls “compared to the cost of a data breach,” said Wright. “It is a disturbing state of IT and security management when the attackers are routinely able to find protection failures before corporate or government security teams.”

Technology advances have leapt ahead of security, creating gaps for organizations.

“The market’s investment in services and tools to automate business processes without incurring heavy maintenance costs has outpaced investment in the methods to secure them,” said Threat Stack CSO Sam Bisbee. “Sometimes it’s safer to bring commoditized systems that are likely to leak sensitive information, such as log aggregation, into your own environment since they have become too cheap to maintain.”

Bisbee said that the proliferation of services like GitHub and AWS S3 should drive organizations of all sizes to “understand whether the services they use to store data are in fact risk-appropriate for the type of data they put into them.”

He maintained that “security and operations teams have an opportunity to work together to help their enterprises manage the risk of data breach by auditing their current environments to understand what data is expected to be stored in them versus what is actually stored in them, the relative safety of the storage services, and then establishing appropriate controls and monitoring for when, how, and where data is accessed.”
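The audit Bisbee describes, comparing what an environment is expected to store against what it actually stores, can start as a simple inventory diff. The sketch below uses hypothetical prefixes and object keys; in practice the actual listing would come from a storage API such as S3’s `list_objects_v2`.

```python
# Hedged sketch of an expected-vs-actual storage audit. The approved
# prefixes and the object keys are hypothetical examples ("ssdev.ova"
# echoes the file name from the UpGuard findings).

EXPECTED_PREFIXES = {"logs/", "builds/"}  # data the bucket is approved to hold

def unexpected_objects(keys):
    """Return object keys that fall outside the approved prefixes."""
    return [k for k in keys
            if not any(k.startswith(p) for p in EXPECTED_PREFIXES)]

actual_keys = ["logs/app-2017-11.gz", "builds/ssdev.ova", "keys/admin_id_rsa"]
print(unexpected_objects(actual_keys))
# ['keys/admin_id_rsa']
```

Anything the diff flags, here a stray key file, is a candidate for removal or for tighter controls and access monitoring, which is the follow-on step Bisbee recommends.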
