The average cost of a data breach is $3.86 million across all sectors, according to IBM’s Cost of a Data Breach Report 2020. The report also states that it takes around 280 days or nine months to identify and contain a breach.
Before COVID-19, such breaches mostly targeted financial firms. However, as the pandemic has reshaped global socioeconomic systems and accelerated the digital application economy, the risk of a breach has become universal.
What makes breaches so prevalent? The sheer volume of code and the number of interconnected devices are primary reasons.
HBR reports that the average car runs on roughly 100 million lines of code, while Microsoft Office contains up to 30 million. Add a mesh of interconnected, vulnerable devices to the mix, and cyber attackers get thousands of potential entry points.
Think back to the notorious WannaCry ransomware or the NotPetya malware. All it took was a single vulnerability to cripple giant organizations like Maersk ($250-$300 million in losses), FedEx ($400 million), Merck ($870 million) and the NHS ($120 million).
With COVID-19, digitization has received an adrenaline boost. The pandemic has forced the rapid adoption of work-from-home arrangements, digital collaboration tools, and online services, multiplying the number of vulnerabilities in any company's digital landscape.
What makes a dire situation worse is the rush to adopt digital applications and services. In an economy that favors speed to market, some companies have failed to understand the technology or to thoroughly evaluate the risks. It's no wonder that cyberattacks have at least tripled as a consequence.
Addressing the security challenges of the current digital landscape requires more than a technological fix. Above all, it demands two fundamental shifts in mindset.
The first is to stop viewing cybersecurity as merely an IT responsibility. Instead, companies should treat cybersecurity as a business risk and identify its impact on business activities.
For instance, consider the implications of an attack on the supply chain or the manufacturing process. Start by mapping core business activities, imagining what would happen if each were disrupted, and projecting the resulting revenue losses.
This approach helps businesses map what needs protection and make a business case for cybersecurity investments. Eventually, such a mindset leads to companies integrating cybersecurity into their core processes.
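To illustrate this exercise, here is a minimal sketch in Python. The activities, per-day revenue figures, and outage estimates are hypothetical placeholders; a real assessment would draw on the company's own operational and financial data.

```python
# Minimal sketch of the exposure-mapping exercise described above.
# Activities, revenue figures, and outage estimates are hypothetical
# placeholders; a real assessment would use the company's own data.

DAILY_REVENUE = {              # revenue attributable to each core activity, per day
    "e-commerce checkout": 120_000,
    "manufacturing line": 80_000,
    "supply-chain logistics": 60_000,
}

ESTIMATED_OUTAGE_DAYS = {      # plausible downtime if the activity is disrupted
    "e-commerce checkout": 3,
    "manufacturing line": 7,
    "supply-chain logistics": 5,
}


def projected_loss(activity: str) -> int:
    """Rough revenue lost if this activity is knocked out by an attack."""
    return DAILY_REVENUE[activity] * ESTIMATED_OUTAGE_DAYS[activity]


if __name__ == "__main__":
    # Rank activities by exposure to help prioritize cybersecurity investment.
    for activity in sorted(DAILY_REVENUE, key=projected_loss, reverse=True):
        print(f"{activity}: ${projected_loss(activity):,} at risk")
```

Even a rough ranking like this turns cybersecurity from an abstract IT concern into a quantified business risk that can be weighed against the cost of protecting each activity.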
The second shift is to assume that an attack is inevitable and to prepare for the worst-case scenario.
Data breaches disclosed in 2018 compromised 52 million Google+ accounts and 50 million Facebook accounts. If such powerful tech giants cannot keep their systems safe, it's naïve to assume others can. That's why planning for the worst, and considering how an attacker might exploit the system, is key to building resilience into digital applications.
Besides these shifts in mindset, businesses must also adopt four fundamental security best practices:
- Stay updated on the most recent cyberattacks and keep your entire workforce well informed.
- Transition from traditional castle-and-moat security models (firewalls and VPNs) to granular, identity-based approaches such as Zero Trust architecture.
- Incorporate multi-factor authentication (MFA) across all applications and devices, especially as personal devices proliferate in the age of social distancing, cloud computing, remote work, and IoT (see the sketch after this list).
- Educate your employees on cybersecurity measures. All it takes for cybercriminals to gain access and plant malware on a system is one employee clicking a bogus link.
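To make the Zero Trust and MFA recommendations concrete, the sketch below shows a per-request access check in Python. It assumes the open-source `pyotp` library for time-based one-time passwords; the `AccessRequest` structure, the in-memory secret store, and the policy rules are hypothetical placeholders, since production systems delegate these checks to an identity provider rather than hand-rolled code.

```python
# Minimal sketch: a zero-trust-style access check that verifies identity,
# device posture, entitlement, and a TOTP second factor on every request.
# Assumes the pyotp library (pip install pyotp); AccessRequest and the
# policy data below are hypothetical placeholders, not a real product API.
from dataclasses import dataclass

import pyotp


@dataclass
class AccessRequest:
    user_id: str
    device_trusted: bool   # e.g. managed, patched, disk-encrypted
    resource: str          # the application or API being accessed
    totp_code: str         # the 6-digit code from the user's authenticator app


# In practice these live in an identity provider, never hard-coded.
USER_TOTP_SECRETS = {"alice": pyotp.random_base32()}
USER_PERMISSIONS = {"alice": {"payroll-app"}}


def evaluate_request(req: AccessRequest) -> bool:
    """Grant access only if identity, device, entitlement, and MFA all check out."""
    secret = USER_TOTP_SECRETS.get(req.user_id)
    if secret is None:
        return False       # unknown identity
    if not req.device_trusted:
        return False       # fails device posture check
    if req.resource not in USER_PERMISSIONS.get(req.user_id, set()):
        return False       # least privilege: no entitlement for this resource
    # valid_window=1 tolerates small clock drift between client and server.
    return pyotp.TOTP(secret).verify(req.totp_code, valid_window=1)


if __name__ == "__main__":
    code = pyotp.TOTP(USER_TOTP_SECRETS["alice"]).now()  # what the user's app shows
    request = AccessRequest("alice", device_trusted=True,
                            resource="payroll-app", totp_code=code)
    print("access granted" if evaluate_request(request) else "access denied")
```

The point of the sketch is the shape of the decision: identity, device posture, entitlement, and a second factor are all evaluated on every request, rather than trusting anything simply because it sits inside the network perimeter.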
In an increasingly digitized, hyper-connected world, cyberattacks will only become more probable. The key to minimizing the fallout from such attacks is understanding your vulnerabilities, adopting granular access and authentication policies, and keeping your employees informed on the latest scams.