One of the hot new security trends of 2021 is managing your attack surface. But how did this become a thing? It's simple: security trends are born out of hindsight. Through forensic investigation and incident response, we learn the methodologies threat actors use. Armed with this information, we identify how to prevent and detect threats with the tools we have, and when we find gaps in our coverage, we incorporate something new to close them.
In this blog, I will highlight cybersecurity industry trends that resulted from some of the most impactful breaches of our time. We will explore these trends and security practices, most of which we still rely on today.
2010 – 2012: Zero-Days Gone WILD!
We kicked off the decade with a breach disclosure at Google, later known as Operation Aurora. The impact reached far beyond Google and included many other large organizations. The breach also brought with it a trend that would remain within the security community for many years to come: abandoning Internet Explorer (IE), the default browser in Windows, for alternatives such as Firefox or Chrome. It is a trend still seen today.
Shortly thereafter came the discovery of the Stuxnet worm. Its sophistication, chaining four Windows zero-days with a payload that targeted PLCs (programmable logic controllers) to evade standard security controls, had far-reaching implications. Sandboxing soon emerged as a way to detonate a file or replay browser content in an isolated environment and watch for abnormal behavior, such as a zero-day exploit.
As the decade progressed, it brought some very public cyberattacks that reached beyond traditional commercial security measures and into our homes. From large commercial banks to the Sony PlayStation Network, breaches highlighted the gaps in personal password hygiene. Maintaining a different password for every account became a serious challenge, and the challenge only grew as consumer data breaches increased.
As the decade continued, we saw a mix of breaches that extended into macOS and mobile devices, and cybersecurity solutions rose up to address them. One of these was Mobile Device Management (MDM), a security tool that aimed to regain some visibility and control over corporate data as it left the traditional network and landed on handheld devices.
Large social media platforms and SaaS organizations also began to feel the burn of compromise, finding their users' passwords leaked and the data they hosted stolen. Most notable was the hack against journalist Mat Honan, which resulted in the erasure of his digital assets (his Gmail account, his iPhone and Mac backups, all of it) and his Twitter account being hijacked to spew hate. He specifically recognized that if he had enabled two-factor authentication, much of this could have been prevented. The incident sparked the 2FA trend and marked the beginning of MFA as a standard practice.
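For context on what that prevention looks like under the hood, most authenticator apps implement 2FA with time-based one-time passwords (TOTP, RFC 6238). Below is a minimal sketch in Python using only the standard library; the base32 shared secret is a made-up example, not a real credential.

```python
import base64
import hmac
import struct
import time

def totp(secret_b32: str, digits: int = 6, step: int = 30) -> str:
    """Generate an RFC 6238 time-based one-time password (HMAC-SHA1, 30s step)."""
    key = base64.b32decode(secret_b32.upper())
    counter = int(time.time()) // step          # number of 30-second steps since epoch
    msg = struct.pack(">Q", counter)            # 8-byte big-endian counter
    digest = hmac.new(key, msg, "sha1").digest()
    offset = digest[-1] & 0x0F                  # dynamic truncation (RFC 4226)
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % (10 ** digits)).zfill(digits)

# Hypothetical shared secret, as an authenticator app would store it.
print(totp("JBSWY3DPEHPK3PXP"))
```

The server performs the same computation and compares; because the code changes every 30 seconds, a leaked password alone is no longer enough to log in.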
2013 – 2017: All the Data Belongs to Them
Consumer businesses were also learning hard lessons that resulted in major innovations and redesigns of network architecture. The most notable breach of its time was Target. The Target breach drove a better understanding of the importance of network segmentation and brought with it the adoption of new tools. Security Information and Event Management (SIEM) platforms, historically used for compliance, became fundamental to responding to a breach. Enrichment tools and intelligence also started trending as feeds into the SIEM, giving responders the visibility and context they needed. Specifically, Endpoint Detection and Response (EDR) software and Cyber Threat Intelligence (CTI), when merged with the data within a SIEM, made responding to alerts less time-consuming for Security Operations Centers. As more forensic details emerged from this breach, 3rd party risk also started to become a focus.
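To make the enrichment idea concrete, here is a minimal sketch, with a hypothetical intel feed and alert format, of how a CTI lookup can attach context and priority to a SIEM alert before an analyst ever sees it:

```python
import json

# Hypothetical CTI feed mapping indicators to context. In practice this
# would be pulled from a commercial or open threat-intelligence source.
THREAT_INTEL = {
    "203.0.113.7": {"label": "known C2 server", "confidence": "high"},
}

def enrich_alert(alert: dict) -> dict:
    """Attach threat-intel context to a SIEM alert, if any is available."""
    intel = THREAT_INTEL.get(alert.get("src_ip", ""))
    if intel:
        alert["intel"] = intel
        alert["priority"] = "high"  # escalate alerts that match known indicators
    return alert

# Example alert with illustrative field names.
alert = {"rule": "outbound-connection", "src_ip": "203.0.113.7", "host": "pos-21"}
print(json.dumps(enrich_alert(alert), indent=2))
```

The payoff is triage speed: alerts that match known-bad indicators surface first, instead of waiting in the same queue as everything else.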
As we approached the latter years of the decade, we saw what I like to refer to as "getting back to the basics." Breaches like the one at Equifax reminded the security community of the importance of having a patch and vulnerability management function in place. Fundamentally, this strategy depends on knowing where your assets are across the Internet and keeping a constant inventory.
As businesses shifted and migrated to the cloud, network and software ecosystem complexity increased dramatically, and with it the increasingly difficult task of knowing what and where your Internet-facing assets are. As we sprawl beyond the traditional network, we open ports and protocols to connect these disparate components through API integrations or direct hooks. Fundamental questions like "What does my company's attack surface look like?" are hard to answer once you include home networks, IaaS cloud platforms, IoT devices, 3rd party SaaS applications, and legacy systems, all of which can increase an organization's risk.
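As a simple illustration of the problem, here is a minimal sketch, assuming a hypothetical asset inventory, that compares the TCP ports actually reachable on each host against the ports we expect to be open. Real attack surface management goes much further, starting with discovering assets you don't yet know you own.

```python
import socket

# Hypothetical inventory: hosts and the ports we believe should be open.
# In practice this would come from an asset inventory or cloud provider API.
EXPECTED = {
    "app.example.com": {443},
    "legacy.example.com": {443},
}

COMMON_PORTS = [21, 22, 23, 80, 443, 3389, 8080]

def open_ports(host: str, ports, timeout: float = 1.0) -> set:
    """Return the subset of `ports` that accept a TCP connection."""
    found = set()
    for port in ports:
        try:
            with socket.create_connection((host, port), timeout=timeout):
                found.add(port)
        except OSError:
            pass  # closed, filtered, or unreachable
    return found

for host, expected in EXPECTED.items():
    unexpected = open_ports(host, COMMON_PORTS) - expected
    if unexpected:
        print(f"{host}: unexpected open ports {sorted(unexpected)}")
```

Even this toy version exposes the core difficulty: it only checks hosts already on the list, and the list is exactly what sprawl makes unreliable.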
2018 – 2020: Technology, Intel, and Expertise OH MY!
By the end of the decade, we found ourselves facing an estimated 3.5 million unfilled cybersecurity jobs by 2021. This immense shortage, combined with increasing complexity, has driven several trends in the industry. From a pure capacity standpoint, we see more organizations offloading work to 3rd parties such as Managed Security Service Providers (MSSPs), Managed Detection and Response (MDR) providers, and crowdsourced testing and security assessments. From a technology perspective, we see software vendors adopting machine learning and artificial intelligence algorithms, Robotic Process Automation (RPA), and Security Orchestration, Automation, and Response (SOAR) functionality.
As we adapt to these changes, systematic, repeatable, and regular risk management has become more important than ever. In less than ten years, we went from a simple, controlled set of traditional security components (firewalls, anti-virus, web gateways, spam filters, and VPNs) to a very complex ecosystem that requires a new approach to protect. One such approach, adopted more widely every day, is viewing your attack surface from a malicious actor's point of view.
As we closed the decade, we became aware of the largest breach to date: SolarWinds. The SolarWinds breach, which is still being investigated as I write this, reminds us of what is currently the most important trend of all: building an accurate and ongoing attack surface management function in your organization. Such a function ensures the proper security measures are in place for every exposed, Internet-facing asset and, as we have seen over the years, is critical to preventing the next breach. We must keep a vigilant eye on the network (traditional and cloud), mobile devices, endpoints, and 3rd party supply chain risk.
Where to go from here?
What are some of the key takeaways from these breaches and industry trends? For one, they all came at a cost. From fines to brand and reputation damage to loss of intellectual property, the consequences of poor information security can have lasting impacts. Another important pattern to recognize is that all of these breaches were related to attack surface management in one way or another. The devices your employees use at home or in transit are as much a part of your attack surface as your traditional Internet presence, which only highlights the urgent need to manage these risks effectively.
As security practitioners, we will continue trying to answer these questions: Are the networks our employees use to connect to the organization secure? Do we have ports or protocols exposed that shouldn't be? Are we running web software with known vulnerabilities?
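That last question can be roughly approximated by checking what version information your web servers advertise. Below is a minimal sketch; the URL and the watchlist entry are hypothetical, and a real check would map banners against live vulnerability data rather than a hard-coded set.

```python
import urllib.request

# Hypothetical watchlist of server banners with known vulnerabilities.
# Illustrative only; real tooling would consult CVE data, not a literal set.
KNOWN_VULNERABLE = {"Apache/2.2.34"}

def server_banner(url: str, timeout: float = 3.0) -> str:
    """Fetch the HTTP Server header, if the site exposes one."""
    req = urllib.request.Request(url, method="HEAD")
    with urllib.request.urlopen(req, timeout=timeout) as resp:
        return resp.headers.get("Server", "")

for url in ["https://legacy.example.com"]:
    banner = server_banner(url)
    if banner in KNOWN_VULNERABLE:
        print(f"{url}: advertised banner '{banner}' is on the watchlist")
```

Banners can be suppressed or spoofed, so this is a heuristic at best, which is part of why dedicated attack surface scanning exists.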
At Censys, we can answer those questions and more. Interested in speaking with Alexis about your attack surface? Learn more and schedule your demo here!