Thursday 9 August 2018

The Problem of Sharing Secrets under GDPR Article 28 Mandatory Contracts

Automating the GDPR - The Article 28 Conundrum

This article shows how, in particular circumstances, the GDPR sets up a conflict of trust between companies which can only be resolved by an automated cryptographic audit trail.

Under the EU's GDPR law virtually every company is a Controller, and virtually all Controllers use at least one Processor. When a Processor is engaged, the GDPR requires that a contract with very specific contents is signed. The GDPR requires that Controllers and Processors cooperate in order to deliver data protection, and this cooperation needs to be very carefully managed to maintain the security and other guarantees that the GDPR also requires.

In other words, the GDPR simultaneously requires strong cooperation and strong security between companies who can't be expected to have common goals or to trust each other. This is difficult to resolve.

About Controllers and Processors

If you are familiar with the European Union's GDPR, and the roles of the Controller and Processor, then you will be aware of the need for a formal agreement (usually a contract) between a Controller and every Processor it uses.

Effectively, every company is at least a Controller of personal data, certainly if there are employees or customers. Most companies use at least one Processor for data under their control, from an IT support company, to online storage providers, to companies providing consulting and outsourcing in many forms. A contract between a Processor and a Controller is very carefully specified in legal terms in the GDPR, but the technology implications are not mentioned. This is all in GDPR Article 28.

About Sharing Access to Data

Not sharing data, but access to the data - for example, does an employee of the Controller log on to the computer system in the Processor? And if so, how? This is the kind of scenario that sends shivers down the spine of security professionals, yet here it is in the GDPR.

Controllers and Processors could be almost any company using almost any system, so sharing access to the personal data across organisations just wouldn't work. Personal data is stored in a different way in every organisation - at a minimum, in different choices among the 200-odd major databases on the market, besides all the non-database systems in use, and the policies and habits unique to every company.

But the same is not true for the secret keys used to get access to this personal data. No matter how diverse the storage mechanism for the personal data, the secret keys are going to be one of a few types. Most often passwords, but also multifactor authentication codes, or cryptographic key files, or one of a small list of other means of authentication.

Article 28 says that these passwords or other keys need to be available for sharing between Controllers and Processors at any time. And yet no company is happy handing out passwords to their internal systems to unknown people, and anyway this could easily become a breach of the GDPR and the forthcoming ePrivacy legislation.

Where Computer Science Comes In

When a Controller engages a Processor, there are many circumstances under the GDPR when these secret keys need to be shared between these parties, parties who should not trust each other. Therefore, regardless of what may happen to the personal data itself, the handling of the keys to the personal data is of crucial importance. The law requires that you give some degree of access, perhaps a lot of access, to a company whom you have never met and have no reason to trust. Computer Science has given us many ways to think about interacting with people we do not trust, so this is a problem that can be solved.

Taken together with the post-trilogue texts of upcoming laws (the ePrivacy Regulation, the EU Cybersecurity Act and the European Electronic Communications Code), Article 28 strongly implies that a particular kind of cryptographically guaranteed auditing process is needed for the keys required to access data. The Cybersecurity Act and the EU NIS Directive are urgently pressing for standards in these areas, as are the EU-level security and privacy bodies ENISA and the European Data Protection Board. With all this human rights-based legal pressure, what is needed is a computer science view of how to implement what the law calls for. Article 28(3)(c) says the processor "takes all measures required pursuant to Article 32", so Article 32 is a part of the mandatory contract, and Article 32 is about security, which again implies computer science.

To discover exactly what kind of cryptographic solution will work, we need to look at the information flows the GDPR requires.

GDPR Article 28 Information Flow

A close reading of the mandatory instruments (normally contracts, but not necessarily) in GDPR Article 28 shows that the required flow of information between Controllers and Processors is entirely one way, from the Processor to the Controller. The Processor has to make numerous undertakings and promises to the Controller, stated in a legally binding manner.

In addition there is a great deal of mandated potential communication from the Processor to the Controller, meaning that in various circumstances the Processor must communicate if the Controller so wishes. At any time the Controller can demand that the Processor produce information to prove that processing is compliant, or require the Processor to assist the Controller in certain activities. The Controller is bound by the GDPR to be able to prove at all times that processing is compliant, whether or not a Processor has been engaged.

Relationship of the Parties to Article 28 Contracts

Basic security practice is that the parties to such information flows should not trust each other; they are independent entities who in many cases will not have any other dealings. In addition, each is under the very strict legal requirements of the GDPR, the (imminent) ePrivacy Regulation, and the (imminent) European Electronic Communications Code.

Article 28(1) says "the controller shall use only processors providing sufficient guarantees". According to the computer science referred to in this article, it is possible to define a minimum value of "sufficient guarantee" under the GDPR, but even without that analysis, the Controller must seek some guarantees from the Processor and they need to be not only good guarantees but sufficient to back up the rest of Article 28.

This means that parties to Article 28 contracts are required to meet a particular standard, but also that the parties should not trust each other to meet this standard or any other good behaviour.

Article 28 is All About Processors

Article 28 is all about the Processor being bound to the Controller, with the Controller saying and doing nothing outside what is already said in the rest of the GDPR text. The only reference to a Controller in Article 28 is that the contract must "set out the obligations and rights of the controller" (Art 28(3)) which appears to mean effectively stating "Yes I acknowledge I am a Controller and I am acting according to the GDPR".

There are just two references in the entire GDPR requiring the Controller to take action with respect to using a Processor. The first is ensuring that there is a contract in place that complies with the GDPR. The second is in Article 32(4), which says "the controller and processor shall take steps to ensure that any natural person acting under the authority of the controller or the processor who has access to personal data does not process them except on instructions from the controller".

Technical Comments

Article 32 emphasises the phrase "state of the art", an English expression that has caused much confusion. The phrase is only ambiguous within the confines of English, and since the GDPR is authoritative in multiple languages we can easily compare the German and French versions and see that they agree with one of the English meanings. "State of the art" therefore means the art as it is practised today in general, by peers and as defined by standards bodies and the like. It does not mean the best, most efficient or most advanced technology in existence, and it does not mean the most recent. This article considers technologies mostly developed decades ago and very widely recommended and deployed today, which definitely are "state of the art".

Technical Analysis About Audit Records. A log file (Unix) or an EventLog (Windows) is not a quality audit record. It has often been held to be a sufficient audit record in courts worldwide, but in that context the standard is the balance of probabilities, taking into account other log entries created on other systems at the same time - basically a detective hunt by an expert witness. That sort of thing is an audit process, but not a good one, and it typically involves only one logging party. The GDPR Article 28 contract requires that there be at least two parties to the audit trail whose actions are logged, which has not been the case in any law previously. The new EU security and privacy laws use the words "appropriate", "should" and "state of the art" so much that I think it is uncontroversial that the audit standard required is much higher. There needs to be a cryptographically guaranteed, non-repudiable audit trail for activities, in which none of the actors involved (including auditors) need to trust each other, and no special expertise or context is required to interpret the record.
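The core mechanism can be illustrated with a minimal sketch of a tamper-evident audit trail: each entry carries the hash of the previous entry, so no single party can alter or delete a record without breaking every later hash. This is a hash chain only; full non-repudiation would additionally require each party to digitally sign its entries. All names here are illustrative, not taken from any particular standard.

```python
import hashlib
import json
import time

def append_entry(log, actor, action):
    """Append an entry whose hash covers the previous entry's hash."""
    prev_hash = log[-1]["hash"] if log else "0" * 64
    body = {"actor": actor, "action": action, "time": time.time(), "prev": prev_hash}
    digest = hashlib.sha256(json.dumps(body, sort_keys=True).encode()).hexdigest()
    log.append({**body, "hash": digest})
    return log

def verify_chain(log):
    """Recompute every hash; any edit or deletion breaks the chain."""
    prev_hash = "0" * 64
    for entry in log:
        body = {k: entry[k] for k in ("actor", "action", "time", "prev")}
        digest = hashlib.sha256(json.dumps(body, sort_keys=True).encode()).hexdigest()
        if entry["prev"] != prev_hash or entry["hash"] != digest:
            return False
        prev_hash = entry["hash"]
    return True

log = []
append_entry(log, "controller:alice", "granted key access to processor:bob")
append_entry(log, "processor:bob", "retrieved database password")
assert verify_chain(log)
log[0]["action"] = "tampered"   # any later alteration...
assert not verify_chain(log)    # ...is detected on re-verification
```

No special expertise is needed to interpret such a record: a verification failure is a yes/no fact, not a matter for an expert witness.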

Technical Analysis About Keys. A key of some sort is always required to get access to personal data, be it a password, a passphrase, a physical door pinpad code, a two-factor authentication code or whatever else guards the access to the systems with personal data on them. The Article 28 mandated contract specifies that under many circumstances a Controller and a Processor release keys to each other, and therefore to natural persons in the employ of each other. By auditing the use of the keys, we are auditing the access to personal data. In order to remain in compliance with Article 32, we can change passwords/keys at any time and reset the list of authorised persons, thereby also resetting the audit trail. A cryptographically secured audit facility can detect the first time that someone accesses a key.
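A minimal sketch of this idea, assuming a simple registry (the class and method names are my own for illustration, not a production design): keys are held per authorised person, the first retrieval by each person is recorded, and rotating the keys resets both the authorised list and the access record.

```python
import time

class KeyRegistry:
    """Tracks who *can* fetch a key and who actually *did* (Art. 32(4))."""

    def __init__(self):
        self.authorised = {}      # person -> key material they may fetch
        self.first_access = {}    # person -> timestamp of first retrieval

    def authorise(self, person, key):
        self.authorised[person] = key

    def fetch(self, person):
        key = self.authorised[person]   # raises KeyError if not authorised
        # Record only the *first* access; later fetches leave it unchanged.
        self.first_access.setdefault(person, time.time())
        return key

    def rotate(self):
        # Changing the keys resets the authorised list, and with it the
        # record of who has accessed the current keys.
        self.authorised.clear()
        self.first_access.clear()

reg = KeyRegistry()
reg.authorise("processor:bob", "s3cret-db-password")
assert reg.fetch("processor:bob") == "s3cret-db-password"
assert "processor:bob" in reg.first_access   # first access is on record
reg.rotate()
assert reg.authorised == {} and reg.first_access == {}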

Technical Analysis About the ePrivacy Regulation. I have tracked down the different versions presented for Trilogue, which has now finished. ePrivacy following Trilogue appears to include the EU Parliament LIBE Committee amendments from October 2017, including Article 26(a): "In order to safeguard the security and integrity of networks and services, the use of end-to-end encryption should be promoted and, where necessary, be mandatory. Member States should not impose... backdoors". If we are to have an audit facility for keys to personal data then it should be end-to-end. Like all end-to-end solutions it will upset government spy agencies and any other party that might want to falsify the record through government-imposed backdoors, because such backdoors cannot work according to mathematics.

Technical Analysis About the European Electronic Communications Code. The Code is broader than ePrivacy (which, it can be argued, is limited by its lex specialis relationship to the GDPR). The Code says: "In order to safeguard security of networks and services, and without prejudice to the Member States' powers to ensure the protection of their essential security interests and public security, and to permit the investigation, detection and prosecution of criminal offences, the use of encryption for example, end-to-end where appropriate should be promoted and, where necessary, encryption should be mandatory in accordance with the principles of security and privacy by default and design." We know from Snowden and others that the "without prejudice" phrase is just being polite, because there is no technical means to implement no-backdoors end-to-end crypto that does not also make government spy agencies upset.

Minimum Audit Records Required by Article 28

Detail of Required Audit Records, with their basis in law:

  1. Audit records listing all natural persons who have access to keys to the personal data, and the changes to that list over time:
    • Article 28(2) "shall not engage another processor", so everyone can see whether or not an unexpected person was authorised for access to keys
    • Article 32(4) "any natural person acting under the authority of the controller or the processor who has access to personal data", so we need an audit log of who *can* have access to keys
    • Article 32(4) "any natural person acting under the authority of the controller or the processor ...  does not process them except on instructions", so we need an audit log of who actually *did* access the keys at least once.

  2. Audit records for who has accessed the audit records above:

    • Article 28(3) "obligations and rights of the controller", shows the controller is watching the processor
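The record types above could be modelled as simple immutable schemas. This is a hypothetical sketch; the class and field names are my own, not taken from the GDPR text.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class AuthorisationChange:
    # Record type 1 (Art. 28(2), 32(4)): changes to who *can* access keys.
    person: str
    granted: bool        # True = added to the authorised list, False = removed
    authorised_by: str
    timestamp: float

@dataclass(frozen=True)
class KeyAccess:
    # Record type 1, final bullet (Art. 32(4)): who actually *did* access a key.
    person: str
    key_id: str
    timestamp: float

@dataclass(frozen=True)
class AuditRead:
    # Record type 2 (Art. 28(3)): who has read the audit records themselves.
    reader: str
    timestamp: float

change = AuthorisationChange("processor:bob", True, "controller:alice", 0.0)
access = KeyAccess("processor:bob", "db-password", 1.0)
read = AuditRead("controller:alice", 2.0)
```

Being frozen, the records cannot be mutated after creation, matching the append-only character of an audit trail.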

These audit records can be technically implemented regardless of what IT systems the Controller and the Processor have, because they are only about dealing with the keys. Whoever has the keys has the personal data, and the keys themselves are protected by the GDPR in any case. These audit records are about storing passwords (or other access means).

Computer Science doesn't seem to allow any way of meeting Article 28 "sufficient guarantee" without a zero-trust encrypted audit model, which these types of audit records enable.
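The zero-trust property can be illustrated very simply, under the assumption that each party independently retains the latest chain hash of the shared audit log: neither side needs to trust the other, because any unilateral rewrite of history changes the head hash and is exposed at the next comparison.

```python
import hashlib

def head_hash(entries):
    """Fold all log entries into a single running SHA-256 head hash."""
    h = "0" * 64
    for e in entries:
        h = hashlib.sha256((h + e).encode()).hexdigest()
    return h

shared_log = ["bob authorised", "bob fetched key"]
controller_head = head_hash(shared_log)   # Controller keeps its own copy
processor_head = head_hash(shared_log)    # Processor keeps its own copy
assert controller_head == processor_head  # both parties agree on history

shared_log[0] = "mallory authorised"      # one party rewrites history...
assert head_hash(shared_log) != controller_head  # ...and the saved head exposes it
```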


Conclusion 1:
the above minimum audit records are required to fulfil an Article 28 contract between Processor and Controller
Conclusion 2:
if implemented, these records rise to an Article 28(1) "sufficient guarantee" of a Processor being acceptable and therefore the contract being acceptable
Conclusion 3:
there does not seem to be any alternative way of achieving a "sufficient guarantee".
Conclusion 4:
The GDPR requires cryptographic audit facilities to exist and therefore, there is a market for companies to provide these facilities.
