Software security: your biggest GDPR oversight

Executive summary

Security flaws in custom web and mobile applications bypass the technical safeguards prescribed by GDPR. Because GDPR contains no clauses that speak specifically about software security, companies overlook it and leave themselves vulnerable to the second level of GDPR fines – up to EUR 10,000,000 or 2% of global turnover, whichever is greater. Software security engineering practices are essential to lower exposure to GDPR fines, satisfy its system design criteria, and manage non-compliance risk.

Did you know there are two levels of fines?

The infamous GDPR fine of 4% of global turnover, or EUR 20 million – whichever is greater – seems to have grabbed all the attention. This left the lesser, but still scary, fines of up to 2% of global turnover, or EUR 10 million – whichever is greater – much less publicized. GDPR Articles 83.5 and 83.4 respectively describe the consequences of non-compliance.

Basically, these two levels of fines can be understood as:

  • 1st level, EUR 20 million – when “major” requirements are violated: data subjects’ (persons’) rights and the basic principles of processing.
  • 2nd level, EUR 10 million – when requirements on the tactics of processing are violated, which include technical safeguards like “state of the art … technical and organisational measures to ensure a level of security appropriate to the risk” (Article 32.1).

This demonstrates that GDPR is not just about consent forms, privacy notices, and internal policies on data processing. It’s also about organizations keeping personal information safe from hackers – or risking a EUR 10 million+ fine.

The managerial issue

These days, people mostly interact and share their personal data with organizations through custom web and mobile apps, which has turned almost every organization into a software company. Everyone has an app.

Still, the widespread misunderstanding among organizations of how custom software is hacked, and of what really lowers the risk of it getting hacked, is dangerous. The fact that the cool-sounding “DevSecOps” has gained so much traction – in the absence of any software security engineering – is a perfect illustration of this.

Even companies with large security budgets are exposed to application security breaches. Just remember the Equifax hack, or last week’s British Airways hack (an analysis of which we published recently). Both incidents were caused by a lack of security management in custom software development processes.

The technical issue

GDPR’s technical controls don’t account for software security. Pseudonymisation and encryption are the two controls described in GDPR that at least somehow relate to software security (Articles 25.1 and 32.1(a), Recital 78). While they are mentioned 15 and 4 times in GDPR respectively, OWASP, application security, and software security don’t get a single mention.

Both of the recommended techniques relate to protecting data at rest, which would be very useful if a DB server holding private data existed in a vacuum and was hacked or stolen. In real life, DBs serve applications that query and process their data.

Thus, by design, in order to function, applications must hold the code and decryption keys needed to read whatever is encrypted in the DB, and to restore the attribution of whatever data was pseudonymised. This means that when an application is hacked, its intended level of access to the protected data is more than enough for hackers to siphon off all of it unobscured.
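To make this concrete, below is a minimal sketch of the typical pattern (assumptions: Python, the widely used cryptography library, an encryption key in an environment variable, and a hypothetical db helper with a fetch_encrypted_email method – none of these names come from GDPR or any particular product). Encryption at rest protects the stored bytes, but the application has to decrypt them in the course of normal operation:

    # Minimal sketch; `db`, `fetch_encrypted_email` and FIELD_ENCRYPTION_KEY are
    # hypothetical names, and the `cryptography` package is assumed to be installed.
    import os

    from cryptography.fernet import Fernet

    # To serve readable data, the application must hold the decryption key at runtime.
    # Anything that compromises the application can read the key the same way.
    fernet = Fernet(os.environ["FIELD_ENCRYPTION_KEY"].encode())

    def get_customer_email(db, customer_id: int) -> str:
        # Ciphertext at rest protects against a stolen disk or a leaked DB dump...
        ciphertext = db.fetch_encrypted_email(customer_id)
        # ...but the app decrypts it as part of normal operation, so an attacker who
        # can drive this code path gets the plaintext, not the ciphertext.
        return fernet.decrypt(ciphertext).decode("utf-8")

An attacker who gains the application’s level of access – through injection, a vulnerable dependency, or remote code execution – inherits exactly this decryption capability, which is why at-rest controls alone don’t stop application-level breaches.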

While both controls would decrease the risks from insiders, that doesn’t make much of a difference. Access to production systems is usually limited and tightly controlled, and if it isn’t, no amount of pseudonymisation and encryption can save the organization. Also, most data breaches are due to external actors, while very few are due to insiders. See breach statistics when filtered by the tags “hacked” vs “inside job” (circles, ignore dots):

[Chart: breach statistics filtered by “hacked” vs “inside job”]

Given this comparison of external and internal attack statistics, it’s not obvious why GDPR includes data security controls aimed mostly at internal threats.

Avoiding GDPR fines

GDPR is about organizations protecting confidentiality, integrity, and availability of personal data. It’s not only about consent, transparency, and procedures. It’s about preventing hacks.

Not only should you align your organization with GDPR, you also should not get your data breached – or your app hacked. Regardless of how much paperwork you generate or how much you spend on privacy consultants, if you get hacked you’re in far more trouble than you would have been before GDPR.

Secure software engineering

The only way to harden custom software against a targeted attack by a skilled adversary is to engineer it for security. This means making security an integral part of software development, led by skilled, dedicated engineers focused narrowly on making the product hacker-proof.

Without a purposeful effort inside the R&D team to assure security – like what QA means for quality, or DevOps for agile development – engineering will inevitably prioritize functional requirements under heavy deadlines, and security will be forgotten.
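As a tiny illustration of the kind of detail that slips under deadline pressure, compare a query built by string concatenation with a parameterized one – a sketch using Python’s standard-library sqlite3 module, with made-up table and column names:

    # Sketch using Python's standard-library sqlite3; the schema is made up.
    import sqlite3

    conn = sqlite3.connect("app.db")
    conn.execute("CREATE TABLE IF NOT EXISTS users (email TEXT, name TEXT)")

    def find_user_fast_but_wrong(email: str):
        # The "just ship the feature" version: attacker-controlled input becomes
        # part of the SQL statement – the classic starting point of injection breaches.
        return conn.execute(f"SELECT * FROM users WHERE email = '{email}'").fetchall()

    def find_user_parameterized(email: str):
        # The engineered-for-security version: the driver keeps data and SQL separate.
        return conn.execute("SELECT * FROM users WHERE email = ?", (email,)).fetchall()

Both functions satisfy the functional requirement equally well, which is exactly why the difference never surfaces unless someone on the team is explicitly responsible for security.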

GDPR Article 32 – “Security of Processing”

Even though it’s weak on technical requirements, GDPR lays out solid strategic directions for systems development that can be applied to the software-centric world of 2018, and that SoftSeq – as technical software security professionals – can stand behind.

Specifically, Article 32.2 states that

In assessing the appropriate level of security account shall be taken in particular of the risks that are presented by processing, in particular from accidental or unlawful destruction, loss, alteration, unauthorised disclosure of, or access to personal data transmitted, stored or otherwise processed.

For software companies, or those using custom software for processing data, the largest risks come from software hacks. Sadly, even with the tide slowly turning in 2018, software security remains either ignored during risk assessments or spectacularly mismanaged.

Also, Article 32.1(b) states that

… the controller and the processor shall implement appropriate technical and organizational measures … including … the ability to ensure the ongoing confidentiality, integrity, availability and resilience of processing systems and services;

For software that grows and changes continuously, this is a clear reference to Secure Software engineering, since no other process meets the requirement of “ongoing” security assurance – in contrast to a yearly pentest, for example, an archaic approach lacking engineering depth, breadth, and transparency.
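As an illustration of what “ongoing” assurance can look like in practice – as opposed to a point-in-time pentest – here is a small sketch of a check that runs on every commit. It assumes a Python project with dependencies pinned in requirements.txt and the open-source pip-audit tool installed in the CI image; the file name and CI wiring are placeholders:

    # ci_dependency_audit.py – sketch of a per-commit dependency vulnerability check.
    # Assumes `pip-audit` is installed (pip install pip-audit) and dependencies
    # are pinned in requirements.txt.
    import subprocess
    import sys

    def audit_dependencies(requirements: str = "requirements.txt") -> int:
        # pip-audit checks pinned packages against public vulnerability databases
        # and exits non-zero when known vulnerable versions are found.
        result = subprocess.run(["pip-audit", "-r", requirements])
        return result.returncode

    if __name__ == "__main__":
        # Run from the CI pipeline so every change is checked, not once a year.
        sys.exit(audit_dependencies())

The Equifax breach mentioned above started with a known, unpatched vulnerability in a third-party web framework – precisely the class of risk that stays visible when a check like this runs on every build.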

Finally, Article 32.1(d) prescribes that

… the controller and the processor shall implement appropriate technical and organizational measures … including … a process for regularly testing, assessing and evaluating the effectiveness of technical and organisational measures for ensuring the security of the processing.

With Secure Software engineering established, “regularly testing, assessing and evaluating” is already part of your development process, in a fully transparent and auditable way!
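To show what that can look like in practice, here is a sketch of security regression tests that live in the ordinary test suite and run on every commit. The client fixture and the endpoint paths are hypothetical placeholders; the example assumes pytest and a typical HTTP test client:

    # test_security_regression.py – security checks inside the normal test suite.
    # The `client` fixture and the paths are placeholders for your own app's
    # HTTP test client and routes.
    import pytest

    @pytest.mark.parametrize("path", ["/api/profile", "/api/orders"])
    def test_personal_data_requires_authentication(client, path):
        # Endpoints that serve personal data must reject unauthenticated requests.
        response = client.get(path)
        assert response.status_code in (401, 403)

    def test_security_headers_do_not_regress(client):
        # Hardening headers should not silently disappear during a refactor.
        response = client.get("/")
        assert "Content-Security-Policy" in response.headers
        assert response.headers.get("X-Content-Type-Options") == "nosniff"

Because these checks run with every build, the evidence of regular testing accumulates automatically in the CI history – transparent and auditable, exactly as Article 32.1(d) asks.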

Conclusion

A lack of understanding of software security specifics among security managers makes it the least protected cyber-security domain. Hackers – the technical guys – know this, and they are actively breaking custom software to breach networks and data.

A lot of companies tried to stay ahead of the curve by preparing for GDPR before it came into force. Now you can stay ahead of the curve by tackling the widest security gap you have: custom software.

If you like and agree with the article – please click the share buttons below. Thank you!

Have questions?

We have answers. Write us at security@softseq.com
