How Capital One’s AWS Worker Breach Could Have Been Avoided

    Capital One Data Breach
    Source: Toronto Star

    Organizations are desperate to defend themselves from the many threats they face, from hacking and other cyber-security risks to system outages and compliance mistakes. But being desperate to avoid disasters and taking the right steps to do so are two different things. Whether it’s banks, healthcare companies, or governments, too many organizations are not doing enough to ensure they have the protection they crave. 

    Beyond lacking the right infrastructure and protocols, many organizations also direct their attention and resources in the wrong places. While malicious outside hacking is a real risk, organizations can spend too much time worrying about the spectre of some obscure hacking group while failing to recognize the dangers of insider threats from their own staff. Capital One’s AWS worker breach is a perfect example of this.

    The Background 

    The incident occurred earlier this year and involved a Seattle woman, Paige A. Thompson, who illegally accessed data at Capital One Financial Corp. Thompson was a former Amazon Web Services employee, and she accessed the data via the cloud data storage service that Capital One was using. Although AWS was storing the stolen data, Amazon pointed out that the theft was not the result of any vulnerability in its systems.

    The stolen data affected more than 100 million customers and mainly consisted of details belonging to consumers and small businesses that had applied for credit cards over the previous 10 to 15 years. Personal information such as addresses, phone numbers, income, and credit scores was among the data, and roughly 140,000 Social Security numbers and about 80,000 linked bank account numbers were also accessed.

    One of Thompson’s goals, according to the indictment, was to use her access to facilitate cryptojacking, in which the perpetrator hijacks victims’ cloud computing power to mine cryptocurrency. It is a quicker and easier route to financial gain than most ways of monetizing stolen data. She was accused of hacking 30 organizations that used the same cloud provider, including a telecommunications conglomerate and a public research university.

    Capital One Data Breach
    Source: Bankrate.com

    Thompson, a software engineer, accessed the data by creating a tool that allowed her “to identify servers for which web application firewall misconfigurations permitted commands sent from outside the servers to reach and be executed by the servers,” according to court documents. She used the Tor network along with a virtual private networking service known as iPredator to conceal her identity.
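
    The court’s description points to the server-side request forgery class of weakness that outside analysts widely attributed to the breach: a misconfigured firewall or relay that forwards whatever URL it is handed can be pointed at AWS’s internal instance metadata endpoint, which serves temporary credentials. The sketch below is purely illustrative, not Thompson’s actual tool, and the endpoint names and checks are assumptions; it simply contrasts a naive relay with one that refuses internal destinations.

```python
# Illustrative sketch only, not Thompson's actual tool. A relay endpoint that
# blindly fetches whatever URL the caller supplies can be pointed at the EC2
# instance metadata service, leaking the temporary credentials it serves.
import ipaddress
from urllib.parse import urlparse

import requests  # third-party HTTP client

METADATA_HOSTS = {"169.254.169.254", "fd00:ec2::254"}  # EC2 metadata endpoints


def fetch_vulnerable(url: str) -> str:
    """Naive relay: forwards any URL, including internal addresses."""
    return requests.get(url, timeout=5).text


def fetch_hardened(url: str) -> str:
    """Refuses requests aimed at the metadata service or private addresses."""
    host = urlparse(url).hostname or ""
    if host in METADATA_HOSTS:
        raise ValueError("request to instance metadata blocked")
    try:
        addr = ipaddress.ip_address(host)
    except ValueError:
        addr = None  # a DNS name; a real check would resolve and re-check it
    if addr is not None and not addr.is_global:
        raise ValueError("request to a non-public address blocked")
    return requests.get(url, timeout=5).text
```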

    Why the Incident Should Never Have Happened

    Thompson doesn’t appear to have been any kind of tech genius (she was even stupid enough to reveal her identity by bragging on GitHub about what she had done), she wasn’t working with any group for some bigger political or social cause, and she didn’t have high-level access to sensitive data. That should have made it impossible for her to do what she did, but somehow she succeeded.

    James Hadley, CEO at the startup Immersive Labs, told Threat Post: “From reading their description of the breach, you would be forgiven for thinking it was an elite hacker exploiting a vulnerability. In reality, as stated by the FBI, it was simply a poorly configured firewall that allowed the hacker in.”

    Thompson seems to have been a low-level admin user who will now cost the companies involved millions of dollars in legal fees, PR, and damage limitation. That’s a figure much higher than she probably ever planned to gain from her crime. It’s particularly embarrassing that she was only caught because her own pride led her to brag publicly about what she had done. The incident has also cost Capital One CISO Michael Johnson his job as security chief, but all of this could have been avoided. 

    As mentioned, the problem is not with using cloud services like AWS. Far from it. But the incident does highlight the dire need for more to be done to prevent insider threats at companies whose business is not primarily in the tech sphere, particularly financial organizations, which bear a large responsibility towards their clients and their finances. And yet, for every indicted criminal who attacked their own organization, there are likely many more who get away with their crime.

    What Should Be Done

    The solution to the problem of insider attacks is two-pronged. The first course of action is to improve the technical side of cyber-security and to assess what went wrong. The second is to have the right employment protocols in place to discourage employees from exploiting any weaknesses that remain. In short, HR departments and IT departments both need to be involved in the solution.

    A lack of strong encryption has been highlighted as a possible failing at Capital One. Many companies are guilty of not using strong enough encryption because it is expensive and requires complex, specialized software; Capital One is not alone in this error. Plenty of big organizations fail to use end-to-end encryption, which turns readable text into ciphertext that can only be read with the right digital key, or tokenization, which cloaks data behind a randomly generated value that requires a separate system to map it back to the original.
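
    As a rough illustration of the tokenization idea (a minimal sketch, not any particular vendor’s product), the snippet below swaps a sensitive value for a random token and keeps the token-to-value mapping in a separate vault, so a stolen copy of the main dataset reveals nothing on its own.

```python
# Minimal tokenization sketch: sensitive values are replaced by random tokens,
# and the token-to-value mapping lives in a separately secured "vault".
import secrets


class TokenVault:
    """Stands in for a separately controlled mapping service."""

    def __init__(self) -> None:
        self._store: dict[str, str] = {}

    def tokenize(self, value: str) -> str:
        token = "tok_" + secrets.token_hex(16)  # random, carries no information
        self._store[token] = value
        return token

    def detokenize(self, token: str) -> str:
        return self._store[token]  # only reachable through the vault's own API


vault = TokenVault()
record = {"name": "Jane Doe", "ssn": vault.tokenize("123-45-6789")}
print(record["ssn"])                    # e.g. tok_9f2c...  (useless by itself)
print(vault.detokenize(record["ssn"]))  # 123-45-6789
```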

    Capital One could even be accused of being irresponsible simply because of how long it held onto the data. Storing application data for more than ten years seems largely unnecessary. Europe’s General Data Protection Regulation says that no company should hold onto personal data for longer than is necessary, and this should soon be the prevailing wisdom around the world.
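
    A retention limit is also straightforward to enforce in practice. The sketch below deletes application records older than a configured cut-off as part of a scheduled job; the table, column names, and the seven-year figure are assumptions made purely for illustration.

```python
# Hypothetical retention job: remove credit-card application records older than
# the documented retention limit. Schema names are illustrative assumptions.
import sqlite3
from datetime import datetime, timedelta, timezone

RETENTION_YEARS = 7  # whatever the documented business or legal need actually is


def purge_expired(conn: sqlite3.Connection) -> int:
    cutoff = datetime.now(timezone.utc) - timedelta(days=365 * RETENTION_YEARS)
    cur = conn.execute(
        "DELETE FROM credit_applications WHERE submitted_at < ?",
        (cutoff.isoformat(),),
    )
    conn.commit()
    return cur.rowcount  # number of records removed on this run
```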

    As well as improving encryption and being more selective about what data they store, companies need to put procedures in place to prevent anyone from even trying to access the data. 

    One of the first things an organization should do is to conduct a comprehensive risk assessment, taking into account all possible threats both physical and virtual. Risk assessments should be repeated often.

    Employee activity on systems should be regularly monitored and audited, with a log correlation engine and a security information and event management (SIEM) system in place. Extensive historical logs should be kept in case an investigation is needed. 
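
    The correlation need not be elaborate to be useful. The toy rule below, with invented event names and thresholds, flags any account that reads far more records in an hour than a sensible baseline, which is exactly the bulk-access pattern a breach like this produces.

```python
# Toy log-correlation rule, not a SIEM replacement: flag any user who reads far
# more records in a one-hour window than the agreed baseline.
from collections import Counter


def flag_bulk_readers(events, baseline_per_hour=200):
    """events: iterable of (user, action) pairs from a one-hour log window."""
    reads = Counter(user for user, action in events if action == "read_record")
    return [user for user, count in reads.items() if count > baseline_per_hour]


window = [("svc-batch", "read_record")] * 150 + [("j.doe", "read_record")] * 5000
print(flag_bulk_readers(window))  # ['j.doe']
```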

    Permissions should be audited to make sure access is granted only where it is absolutely needed. Keeping a close eye on who should be doing what makes it easier to spot unusual and suspicious activity, and automated tools do this best: AI-based anomaly detection can flag behaviour that deviates from a user’s normal pattern almost immediately and send alerts. Given how lax organizations have been in other areas of cyber-security, we shouldn’t expect them all to have sophisticated AI working on their side any time soon, but the sooner the better.
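
    As one concrete, entirely illustrative way to automate that detection, the sketch below trains an isolation forest on a user’s normal activity profile and flags days that look nothing like it. The features, numbers, and library choice are assumptions, not anything Capital One is known to have used.

```python
# Sketch of automated anomaly detection over per-user activity features using
# scikit-learn's IsolationForest; features and thresholds are illustrative.
import numpy as np
from sklearn.ensemble import IsolationForest

# Each row: [records_read, distinct_data_stores_touched, out_of_hours_logins]
rng = np.random.default_rng(0)
normal_activity = rng.poisson(lam=[50, 3, 1], size=(500, 3))
model = IsolationForest(contamination=0.01, random_state=0).fit(normal_activity)

today = np.array([[40_000, 60, 5]])  # bulk download from many stores, off hours
print(model.predict(today))          # [-1] means anomalous: raise an alert
```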

    Other important steps include requiring authorization from multiple employees (at least two) for any data to be copied or transferred to new locations, including removable devices. Remote access should be closely monitored, and all employees should have unique passwords.
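
    A two-person rule like that is simple to encode. The sketch below, with placeholder names and roles, only allows a bulk copy or transfer once two distinct, authorized colleagues other than the requester have signed off.

```python
# Minimal two-person-rule check: a bulk copy or transfer proceeds only with at
# least two distinct, authorized approvers who are not the requester.
AUTHORIZED_APPROVERS = {"alice", "bob", "carol"}  # placeholder identities


def export_allowed(requester: str, approvers: set[str]) -> bool:
    valid = (approvers & AUTHORIZED_APPROVERS) - {requester}  # no self-approval
    return len(valid) >= 2


print(export_allowed("dave", {"alice", "bob"}))   # True
print(export_allowed("alice", {"alice", "bob"}))  # False: only one other approver
```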

    Changing the HR Approach

    There are plenty of things that need to be done from a non-technical point of view too. Companies need to make their employees aware that if they have access to sensitive data, they will be subject to monitoring and must be accountable. Concerns about hurting someone’s feelings or not treating employees “nicely” shouldn’t come into it. Red tape that blocks scrutiny should be kept to a minimum. Organizations could even go as far as taking on more security guards for the premises and installing more CCTV.

    Capital One Data Breach
    Source: New York Post

    Training for all employees should include cyber-security learning. For staff who are hired as tech experts, that training should be highly sophisticated. Companies could also stage mock phishing attacks on employees’ inboxes to make sure they respond properly. Having taken a training course is not the same as passing it, and anyone who doesn’t show competency in spotting threats should receive additional training. 

    Guidelines about the expectations of employee conduct, including reporting suspicious behavior among their co-workers and people they interact with externally, should be clearly stated and given to each and every employee as soon as they begin their role.

    At Sanvada, we take data protection very seriously. We place emphasis on using Security Technical Implementation Guides (STIGs), the hardening framework used by the Department of Defense (DoD). We also follow the Department of Commerce’s National Institute of Standards and Technology (NIST) Zero Trust Architecture, which allows many conditions to be taken into account before someone is given access to data. This includes using penetration testing and vulnerability scans, as well as Monte Carlo simulations for risk analysis.
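
    For readers unfamiliar with that last item, a Monte Carlo risk analysis simply simulates many possible years of incidents from assumed frequency and severity distributions and reads risk figures off the results. The sketch below uses made-up parameters purely to show the shape of the calculation.

```python
# Monte Carlo risk-analysis sketch: simulate annual breach losses from assumed
# frequency and severity distributions. All parameters are illustrative.
import numpy as np

rng = np.random.default_rng(42)
TRIALS = 100_000

incidents_per_year = rng.poisson(lam=0.3, size=TRIALS)      # how often we get hit
annual_loss = np.array([
    rng.lognormal(mean=15.0, sigma=1.0, size=n).sum() if n else 0.0
    for n in incidents_per_year
])

print(f"mean annual loss:            ${annual_loss.mean():,.0f}")
print(f"95th-percentile annual loss: ${np.percentile(annual_loss, 95):,.0f}")
```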