Earlier this year, Sony Pictures released one hell of an internal IT assessment. The report showed that not only was the company ignoring basic security protocol, but its IT operation was plagued by unmonitored devices, miscommunication, and a lack of accountability. It's dated September 25, almost two months to the day before hackers exposed thousands of the company's most sensitive documents.
In an email to a group consisting largely of information security higher-ups and legal department heads, Sony's corporate audit department released a report detailing both its past protocol for dealing with "security incidents," and how management plans to prevent similar events in the future. What they found wasn't pretty.
Until September of 2013, Sony Pictures Entertainment (SPE) had outsourced its IT security to a third-party security service—nothing unusual there. For huge companies like Sony, hiring out generally makes sense both in terms of cost and in-house resources (or lack thereof). Then, when something does go wrong, an additional third party is usually called in to figure out where the first one went wrong. SPE did this, too, until late last year.
According to the report, in 2013, Sony—SPE's parent company—decided to put its Global Security Incident Response Team (GSIRT) in charge of overseeing core responsibilities and general monitoring for the company's various subsidiaries, including Sony Pictures. While GSIRT would monitor security overall, the third-party team that SPE had been using was still responsible for implementing various security measures (firewalls, intrusion prevention systems, etc.).
It appears that's when things started breaking down.
According to the report, after GSIRT took over monitoring duties, one of SPE's 42 firewalls and 148 non-security devices (e.g. routers and servers) went totally unmonitored because SPE's third-party security vendor never explicitly told the new overseer to watch them. Which means that if a breach took place on one of those devices, it's possible that no one would have any way of knowing.
It gets worse. From the report (emphasis added):
In addition, procedures have not been developed to reconcile the population of security devices that are being monitored by GSIRT to the actual SPE security devices that should be monitored to validate accuracy and completeness. As a result, additions, changes, and deletions not communicated by SPE to GSIRT may not be detected, and critical security devices may not be monitored.
To put it simply: Yes, parent company Sony's IT management is aware that it's leaving a significant number of devices unmonitored and, consequently, vulnerable. What's more, it has no process in place to prevent that problem from getting worse. As Jérôme Segura, a senior security researcher at Malwarebytes, explained to us over email, "Miscommunication, or lack of it can create serious issues that may go unnoticed but will come back to haunt."
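The reconciliation process the audit says was missing is simple enough to sketch in code: compare the full inventory of security devices against the roster of devices actually being monitored, and flag the gaps. The device names below are hypothetical illustrations, not anything from the report:

```python
def find_unmonitored(actual_devices, monitored_devices):
    """Return devices present in the real inventory but absent
    from the monitoring roster."""
    return sorted(set(actual_devices) - set(monitored_devices))

# Hypothetical inventories for illustration only.
actual = ["fw-01", "fw-02", "router-ny-1", "mail-server-3"]
monitored = ["fw-01", "mail-server-3"]

print(find_unmonitored(actual, monitored))  # ['fw-02', 'router-ny-1']
```

Run on a schedule, a check like this would surface additions, changes, and deletions that one party forgot to communicate to the other—exactly the failure mode the audit describes.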
While we don't know whether these particular vulnerabilities had anything to do with the devastating attacks that came to light last week—or if those attacks could have been prevented even with air-tight monitoring—that they existed and that Sony was so slow to react to them indicates a culture that failed to prioritize information security. Segura points out that "attackers only need to find one weakness, whether it may be an employee that they spear phish or an insecure network, to perpetrate their crime. Thinking that a company, because of its size and resources, would be hard to compromise is a fallacy."
To make matters worse, bringing in GSIRT—with its own set of priorities—appears to have led to some internal head-butting. Before the Sony parent company's security unit took over monitoring duties, SPE's own IT would receive regular security reports from whichever firm had previously been in charge of monitoring. Once GSIRT came on, though, those reports came to a halt. Segura notes that, "The reason given by GSIRT is that other things 'have taken priority over executive reporting.' One can imagine how SPE's IT must have reacted to this comment." In other words, in terms of vital security, SPE's own IT was being left in the dark, and GSIRT's best response was that turning on the lights "would come later."
And what exactly was in these reports that GSIRT no longer felt the need to send over? According to the audit (emphasis added):
The reports provided by the prior security monitoring providers included security threat trending (e.g., common threats across SPE), log monitoring statistics (e.g., total events for a given month and how they are addressed), top attack categories for a given month, top sources of attacks by country, security devices providing the most alerts, top devices contributing to event correlation, the number of events triggered by more than one source (correlated events) and a summary of what SPE could do to reduce specific attacks.
In other words, SPE's IT no longer had access to the same sort of data that could prove invaluable in thwarting a hacker.
Again, it's likely that this attack could still have taken place even if Sony had done everything right. According to Matthew Green, a cryptographer and research professor at Johns Hopkins University, "I'm not sure if you can draw a direct line from this to the fact that they were hacked. Even working and monitored intrusion detection systems are hardly a silver bullet when it comes to detecting sophisticated attacks like this one."
And as Green further points out, "It's not clear that nobody was monitoring them, just that some portions of the network weren't being monitored by the central security department." So while we do know for a fact that GSIRT wasn't actively monitoring a number of SPE's own devices, we can't say with any certainty that someone else wasn't filling that void.
Still, while it's important not to draw too many lines between this report and what increasingly appears to be the worst corporate hack in history, it does illustrate SPE's at times dysfunctional relationship with information technology. Even after SPE was made fully aware of the vulnerabilities it faced, it appears not to have been proactive in fixing them. That "would come later."
Art by Jim Cooke