From a certain vantage point, the cybersecurity industry has never been healthier. Businesses and other organizations are spending record amounts on security tools, solutions and hardware, while investors have spent the past few years showering startups with billions of dollars to develop new and emerging defense technologies. All of this activity has been underscored by a now-daily deluge of reporting about the latest big breach, ransomware attack or supply chain compromise.
Underneath those rosy numbers, the industry is increasingly struggling to meet the demands of the modern digital threat landscape and to match the pace of innovation from criminal and state-aligned hacking groups.
While the world unfortunately does not want for cybersecurity problems, two recent trends seem to be eclipsing all the others: speed and software insecurity. If necessity is the mother of invention, then these twin calamities are pushing the cybersecurity industry to develop new strategies, processes and technologies that can measure up.
As SC Media honors the people, products and companies shaping how organizations protect their most critical assets, we examine the nature of the threat today, and how it’s shaping developments tomorrow.
From weeks to hours, attackers are ‘priming the pump’
Talk to anyone involved in network defense and the view is nearly unanimous: the speed and cadence of cyber attacks have increased exponentially in just the last few years. Whereas many intrusions traditionally involved weeks of sometimes-clumsy reconnaissance, dwell time and lateral movement, some hacking groups, particularly in the ransomware space, have shaved that process down to days or hours.
“Five hours in and out from a spearphishing compromise to dropping ransomware and heading out of the network really speaks to the level of tools and access that the adversaries have,” said Jeremy Brown, vice president of threat analysis at Trinity Cyber, and winner of the SC Award for Innovator of the Year.
The reasons behind this increased pace on the attacker side vary. The growing professionalization of cybercriminal groups has led to more efficient operations, while the availability of a vast ecosystem of open source, commercial and underground hacking tools has flattened the difficulty curve for “skids” – short for “script kiddie,” a derisive nickname given by the hacker community to amateurs with low technical skills. That lowers the bar for carrying out lucrative, professional-grade cyber attacks on the cheap.
“I think there are a couple of factors at play for why this [increased speed] is happening,” said Allie Mellen, an analyst who researches cybersecurity and automation at technology research firm Forrester. “The first is definitely just ease of use for attackers and criminals looking to get started with malware. There continues to be malware-as-a-service, ransomware-as-a-service cropping up that gives non-technical individuals the ability to use malware that they wouldn’t otherwise have access to.”
Also, nations have increasingly viewed cyber operations as a quieter, more attractive option for pursuing their geopolitical goals compared to military action or other clandestine activities. As advanced persistent threat groups have pummeled businesses and governments with zero-day exploits, their work is often picked up and reported by threat intelligence companies and the media to raise awareness and spur mitigation activities across industry. A secondary effect is that such research and reporting, done to serve the larger public, can also lead to the spread of technical details and proof of concept exploits that are quickly seized on by criminal groups who are lower on the hacking food chain.
That has made proactive capabilities and speed an essential part of network defense whenever a new vulnerability and its patch come out.
“One of the things that we’re beginning to see from an attacker pattern perspective is that it’s not opportunistic anymore,” said Vincent Liu, CEO of Bishop Fox, a security consulting firm that provides continuous security testing services and the winner of the 2021 SC Award for Best Emerging Technology. “A lot of the work that goes into the defense of a network is all about getting ahead of the bad guys, because a lot of the work they’re doing is premeditated, it’s pre-planned. They are priming the pump for the moment they have an opportunity to take advantage of a new exploit…and move to their market, which is breaking in.”
At the same time, the vast majority of network defense tools in use today are largely reactive, ingesting signatures and indicators from previous cyber attacks and malware samples in an effort to detect them in a network today. In a world where breaches routinely leverage older, unpatched vulnerabilities, such a model certainly still has value, but it is also, in a sense, akin to reading yesterday’s newspaper in an effort to predict tomorrow’s news.
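To make that reactive, indicator-driven model concrete, here is a minimal Python sketch – not any vendor's implementation – that hashes files and compares them against a feed of previously observed malware hashes. The hash value and directory below are placeholders; real products layer far more on top of this, but the basic logic of matching today's files against yesterday's indicators is the same.

```python
# Minimal sketch of indicator-based (signature) matching: hash local files
# and compare them against hashes seen in previous incidents.
import hashlib
from pathlib import Path

# Hypothetical indicator feed: SHA-256 hashes from past malware samples.
KNOWN_BAD_SHA256 = {
    "e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855",  # placeholder
}


def sha256_of(path: Path) -> str:
    """Compute the SHA-256 hash of a file in streaming chunks."""
    digest = hashlib.sha256()
    with path.open("rb") as handle:
        for chunk in iter(lambda: handle.read(65536), b""):
            digest.update(chunk)
    return digest.hexdigest()


def scan(directory: str) -> list[Path]:
    """Return files whose hashes match a known-bad indicator."""
    hits = []
    for path in Path(directory).rglob("*"):
        try:
            if path.is_file() and sha256_of(path) in KNOWN_BAD_SHA256:
                hits.append(path)
        except OSError:
            continue  # skip unreadable files rather than aborting the scan
    return hits


if __name__ == "__main__":
    for match in scan("/tmp"):  # placeholder directory
        print(f"Known-bad hash found: {match}")
```

The obvious limitation, and the point of the newspaper analogy, is that the feed only ever contains what has already been seen somewhere else.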
“We would build our architectures with that perimeter defense model where we’re going to have a firewall and we’re going to deny everything except for those things that we want to let through,” said Greg Touhill, former U.S. chief information security officer and currently director of the CERT team at the Software Engineering Institute at Carnegie Mellon University. “And that’s been overcome. That model has been overcome by things like [smartphones] and mobility and the firewalls are very difficult to configure and maintain. We’ve drilled holes in with VPNs, which are…25-year-old technology. So we’ve got to rethink things.”
Harder, better, faster, stronger
The industry has responded to these concerns in part by putting a greater emphasis on building automation into existing technologies and processes while developing newer tools, like security orchestration, automation and response (SOAR), security information and event management (SIEM), endpoint detection and response (EDR) and extended detection and response (XDR). These tools are designed to detect and respond in real time, or at least at a speed that is more in line with the network-based threats organizations regularly face.
According to an annual survey from the SANS Institute, cybersecurity and threat intelligence professionals have found the most success incorporating automation into SIEM and other security analytics platforms, intrusion monitoring platforms and network traffic analysis tools, and into activities like data standardization and deduplication. One of the areas where security personnel want to see more built-in automation is in the mature stages of the threat intelligence process, specifically “the process of making technical [cyber threat intelligence] data relevant to organizations’ decision makers.”
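As an illustration of the standardization and deduplication work respondents say is already being automated, the sketch below assumes a simple type/value record format for incoming threat intelligence feeds (not the SANS survey's tooling or any particular vendor's schema) and merges them into a single, deduplicated indicator set.

```python
# Illustrative sketch: normalize and deduplicate indicators from several
# threat intelligence feeds into one consistent record set.
from dataclasses import dataclass


@dataclass(frozen=True)
class Indicator:
    kind: str   # e.g. "ip", "domain", "sha256"
    value: str  # normalized value


def normalize(raw_feeds: list[list[dict]]) -> set[Indicator]:
    """Lower-case, trim and deduplicate indicators across feeds."""
    merged: set[Indicator] = set()
    for feed in raw_feeds:
        for entry in feed:
            kind = entry.get("type", "").strip().lower()
            value = entry.get("value", "").strip().lower()
            if kind and value:
                merged.add(Indicator(kind, value))
    return merged


# Two feeds describing the same (placeholder) domain in different formats.
feeds = [
    [{"type": "Domain", "value": "Evil.example.COM "}],
    [{"type": "domain", "value": "evil.example.com"}],
]
print(normalize(feeds))  # collapses to a single Indicator
```

Work like this is tedious for analysts and easy for software, which is why it is among the first steps organizations hand over to automation.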
But there are limitations to today’s automated defense tools. They tend to be costly for the average small or medium-sized business, with many automated detection platform licenses running between $50 and $100 per endpoint – something that can quickly add up to tens or hundreds of thousands of dollars. Depending on the size of the company, that might eclipse its entire annual cybersecurity budget.
That’s if their IT environment is even capable of leveraging these tools in the first place. Mellen said there are two kinds of organizations that struggle to get the most out of automated defense technologies: ones that have failed to put in place the basic cybersecurity fundamentals that underpin them and ones that operate in exceedingly complex IT environments.
On the lower end of the maturity scale, many organizations still struggle to implement protections and processes around patching, multifactor authentication, misconfigurations and other baseline best practices. Without those in place, even the most advanced platforms or tools will fail. Ironically, despite their marketing, automation platforms often require robust staffing – something that is again out of reach for thousands of businesses and organizations that may have just one security employee.
On the other end, overly complex information technology environments present a “huge challenge” for effective automation, especially for enterprise businesses. Although automation relies on standardized processes to be effective, most large organizations have highly specific IT operations that require substantial customization.
“The challenge that comes into play, particularly for larger organizations, is when you’re trying to actually automate the stuff that is so specific to that organization because of these ad hoc additions to the actual infrastructure of the environment,” said Mellen. “And that’s where the key struggle lies: how do we automate these really nice things that just apply to our environment, that just apply to our organization?”
Automation also tends to require large amounts of high-quality data – from endpoints, logs, third-party and supply chain partners and a host of other sources – while also integrating with internal software systems, databases and infrastructure.
“As Scotty [from Star Trek] said, the more complex you make it, the easier it is to break it,” said Touhill.
Software is eating the cybersecurity industry
There’s a saying in tech circles that software is eating the world, but the current state of software insecurity represents an existential crisis for the cybersecurity industry. Vendors spend billions of dollars designing products and tools to protect organizations from criminal and nation state hackers, but if their customers happened to be running SolarWinds Orion software, or operating a Microsoft Exchange server, or hosting their code on Codecov, virtually none of those investments mattered.
The same is true for the federal government, which has spent billions of dollars on two programs – EINSTEIN and Continuous Diagnostics and Mitigation – that are ostensibly designed to help departments and agencies defend against network-based threats. And yet none of the officials overseeing these programs claim either would be even remotely capable of detecting hacks like SolarWinds, where the company itself signed the update certificate for a broadly used piece of software. Nor have they been relevant when government agencies were victimized by vulnerabilities in Microsoft Exchange, Codecov and other upstream, software-based attacks. In many of these cases, the government might still be in the dark if not for private threat intelligence companies like FireEye discovering these campaigns and notifying the public.
Experts who have been quietly pushing for industry and government to coalesce around a framework for software bills of materials are having their moment, with the Biden administration reportedly set to issue a new executive order mandating their use for government contractors, and similar initiatives popping up across the energy sector and other industries.
But even proponents of the idea acknowledge that while it would present a much-needed breakthrough, it is only one step in the chain.
It’s “a necessary but not sufficient part” of tackling the kind of damaging software-based attacks that have wrought havoc up and down the industrial and government supply chains, said Allan Friedman, director of Cybersecurity Initiatives at NTIA.
“You cannot build a defense against [a SolarWinds] kind of attack without a software bill of materials, but of course you need more,” he said.
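As a rough illustration of what a software bill of materials makes possible – and of its limits – the sketch below checks a component inventory against a list of hypothetical advisories. Every name, version and advisory identifier here is a placeholder, not real data. A lookup like this can flag known-vulnerable dependencies the moment an advisory lands, but, as Friedman notes, it says nothing by itself about maliciously inserted code.

```python
# Toy sketch: match an SBOM-style component inventory against a
# (hypothetical) advisory list keyed by package name and version.
SBOM = [
    {"name": "example-logging-lib", "version": "2.14.0"},
    {"name": "example-http-client", "version": "4.5.13"},
]

ADVISORIES = {
    ("example-logging-lib", "2.14.0"): "HYPOTHETICAL-2021-0001",  # placeholder ID
}


def affected(sbom, advisories):
    """Return (component, advisory id) pairs found in the inventory."""
    return [
        (component, advisories[(component["name"], component["version"])])
        for component in sbom
        if (component["name"], component["version"]) in advisories
    ]


for component, advisory in affected(SBOM, ADVISORIES):
    print(f"{component['name']} {component['version']} matches {advisory}")
```

The value is visibility: without the inventory, the question “are we running the affected version anywhere?” cannot even be asked quickly.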
There is simply too much software in the world today to fix all the bugs, vulnerabilities and misconfigurations that malicious hackers exploit. This is true for the code and software programs themselves, as well as the vast ecosystems of application programming interfaces they plug into in order to interact with other systems.
Companies like Bishop Fox have invested their research and development dollars in technologies that provide continuous testing of externally facing software. Liu argued that many of the security processes and tools in place today are designed to help developers catch inadvertent security flaws introduced during development, not intentional and malicious poisoning or exploitation of the code by bad actors.
“Malicious insertion of code like what we saw in SolarWinds is a different problem, it’s actually much more akin to the type of things you need to look for on a system when you’re trying to identify malware or malicious behavior,” he said.
Others have argued for a more targeted approach. Edna Conway, who spent decades studying supply chain and third-party security, has argued for industries to identify and prioritize fixing high value software assets, modules and code for improvements, rather than trying to eat the elephant of solving software insecurity more generally.
“We need to say in a risk-based approach, what is the type of software that we care about at this level and what is the type of software that we care about at a different level,” said Conway, vice president and chief security risk officer for Microsoft Azure, at an event in April hosted by the McCrary Institute. “We certainly care about all of it, but we care about each element of it in a different way, and [the question is] what is the scope of information we need to share with regard to the elements of our software?”
"industry" - Google News
May 03, 2021 at 07:20PM
https://ift.tt/3uf9ydz
Where do we go from here? The cyber industry's struggle for speed and superiority - SC Magazine
"industry" - Google News
https://ift.tt/2RrQtUH
https://ift.tt/2zJ3SAW
Bagikan Berita Ini
0 Response to "Where do we go from here? The cyber industry's struggle for speed and superiority - SC Magazine"
Post a Comment