Sharing Secrets under GDPR Article 28 Mandatory Contracts

From Granizada



Summary

The GDPR requires a controller (which is every company) to have a contract in place when using a processor (which nearly every company does, from the local IT support company to online storage providers to consultants and outsourcers of all kinds). The contract is very carefully specified, but its technology implications seem not to have been previously analysed. This is all covered in GDPR Article 28.

The point is that it is impractical to find a standardised way to access personal data across organisations, because the data is stored differently in every organisation. The secret keys used to gain access to this personal data are a different matter, however: they will be passwords, multifactor authentication codes, or one of a small list of other means used to get access to things.

When a Controller engages a Processor, there are many circumstances under the GDPR in which these secret keys need to be shared between the parties, parties who should not trust each other. Therefore, irrespective of what may happen with the personal data itself, the handling of the *keys* to the personal data is of crucial importance.

Taken with the post-trilogue texts for laws including the ePrivacy Regulation, the EU Cybersecurity Act and the European Communications Code, Article 28 strongly implies that a particular kind of cryptographically guaranteed auditing process be used for the keys required to access data. If this is accepted and implemented, the state of the art would be significantly advanced in a standardised manner. The Cybersecurity Act, the EU NIS Directive and other instruments are urgently pressing for standardised improvements, as are two EU-level security and privacy bodies (ENISA and the European Data Protection Board). With all this human rights-based legal pressure, what is needed is a computer scientist's view of how to implement what the law calls for.

GDPR Article 28 Information Flow

A close reading of the mandatory instruments (normally contracts, but not necessarily) in GDPR Article 28 shows that the required flow of information between Controllers and Processors is entirely one-way, from the Processor to the Controller. The Processor has to make numerous undertakings and promises to the Controller, stated in a legally binding manner.

In addition there is much mandated *potential* communication from the Processor to the Controller, clearly implying there will be communication back the other way if this communication is activated. At any time the Controller can demand that the Processor produce various pieces of information to prove that processing is compliant, or to assist in various activities. The Controller is bound by the GDPR to be able to prove at all times that processing is compliant, whether or not a Processor has been engaged.

Nature of the Parties to Article 28 Contracts

Basic security practice is that the parties to these information flows should not trust each other; they are independent entities who in many cases will have no other dealings. In addition, both are under very strict legal requirements of the GDPR, the (imminent) ePrivacy Regulation, and the (imminent) EU Electronic Communications Code.

Article 28(1) says "the controller shall use only processors providing sufficient guarantees". This analysis defines a minimum value of "sufficient guarantee" under the GDPR.

Article 28 is All About Processors

The only reference to a Controller in Article 28 is that the contract must "set out the obligations and rights of the controller" (Art 28(3)), which presumably means "Yes, I acknowledge I am a Controller and I am acting according to the GDPR". Otherwise this is all about the Processor being bound to the Controller, with the Controller saying and doing nothing in addition to the GDPR text.

There are two references requiring the Controller to take action with respect to using a Processor. The first is ensuring that there is a contract in place that complies with the GDPR. The second is in Article 32(4), which says "the controller and processor shall take steps to ensure that any natural person acting under the authority of the controller or the processor who has access to personal data does not process them except on instructions from the controller". We also know that Article 32 is a part of the mandatory contract, because Article 28(3)(c) says "takes all measures required pursuant to Article 32".

Technical Comments

Article 32 emphasises the phrase "state of the art", meaning technology as it is practised today, not the most advanced technology in existence. I confine myself to technologies developed at least 25 years ago and very widely deployed today.

Technical Analysis About Audit Records

A log file (Unix) or an EventLog (Windows) is not a quality audit record. It has often been held to be a sufficient audit record in courts worldwide, but in that context the standard is the balance of probabilities, taking into account other log entries created on other systems at the same time - basically a detective hunt by an expert witness. That sort of thing is an audit process, but not a good one, and it typically involves only one logging party.

The GDPR Article 28 contract requires that there be at least two parties to the audit trail whose actions will be logged, which has not been the case in any law previously. The new EU security and privacy laws use the words "appropriate", "should" and "state of the art" so often that I think it is non-controversial that the audit standard required is much higher. There needs to be a cryptographically guaranteed, non-repudiable audit trail for activities, where none of the actors involved (including auditors) need to trust each other, and no special expertise or context is required to interpret the record.
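The tamper-evidence part of such a trail can be sketched with hash chaining, where each entry commits to the one before it, so any retroactive edit invalidates every later entry. This is a minimal illustration with names of my own choosing, not a full implementation: true non-repudiation between mutually distrusting parties would additionally require digital signatures and countersigning, which are omitted here.

```python
import hashlib
import json
import time

GENESIS = "0" * 64  # placeholder "previous hash" for the first entry

def append_entry(chain, actor, action):
    """Append an entry that commits to the previous entry's hash."""
    prev_hash = chain[-1]["hash"] if chain else GENESIS
    entry = {
        "actor": actor,
        "action": action,
        "timestamp": time.time(),
        "prev": prev_hash,
    }
    # Hash a canonical serialisation of the entry (excluding its own hash).
    payload = json.dumps(entry, sort_keys=True).encode()
    entry["hash"] = hashlib.sha256(payload).hexdigest()
    chain.append(entry)
    return entry

def verify_chain(chain):
    """Recompute every hash and link; any edit anywhere breaks the chain."""
    prev_hash = GENESIS
    for entry in chain:
        if entry["prev"] != prev_hash:
            return False
        body = {k: v for k, v in entry.items() if k != "hash"}
        payload = json.dumps(body, sort_keys=True).encode()
        if hashlib.sha256(payload).hexdigest() != entry["hash"]:
            return False
        prev_hash = entry["hash"]
    return True

chain = []
append_entry(chain, "controller", "granted key K1 to alice@processor")
append_entry(chain, "processor", "alice@processor accessed key K1")
assert verify_chain(chain)

# Retroactively editing an earlier entry is detected:
chain[0]["action"] = "granted key K1 to mallory"
assert not verify_chain(chain)
```

No special expertise is needed to interpret the record: verification is a mechanical recomputation that either succeeds or fails.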

Technical Analysis About Keys

A key of some sort is always required to get access to personal data, be it a password, a passphrase, a physical door pinpad code, a two-factor authentication token, or whatever else guards access to the systems holding personal data. The Article 28 mandated contract specifies that under many circumstances a Controller and a Processor release keys to each other, and therefore to natural persons in the employ of each other. By auditing the use of the keys, we are auditing access to the personal data. To remain in compliance with Article 32, we can change passwords/keys at any time and reset the list of authorised persons, thereby also resetting the audit trail. A cryptographically secured audit facility can detect the first time that someone accesses a key.
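The key-access behaviour described above can be sketched as follows: an authorised-persons list, first-access detection, and key rotation that starts a fresh trail. All class and method names here are illustrative assumptions, not from any standard, and a real facility would cryptographically secure these events rather than keep them in plain memory.

```python
class KeyAuditLog:
    """Minimal sketch: tracks who may access a key and who actually did."""

    def __init__(self, key_id):
        self.key_id = key_id
        self.authorised = set()   # who *can* access the key (Art 32(4))
        self.events = []          # what actually happened

    def authorise(self, person):
        self.authorised.add(person)
        self.events.append(("authorised", person))

    def access(self, person):
        """Record a key access; return True on this person's first access."""
        if person not in self.authorised:
            self.events.append(("denied", person))
            raise PermissionError(f"{person} not authorised for {self.key_id}")
        first = ("accessed", person) not in self.events
        self.events.append(("accessed", person))
        return first

    def rotate(self, new_key_id):
        # Keys may be changed at any time; rotation starts a fresh
        # authorised list and a fresh audit trail for the new key.
        return KeyAuditLog(new_key_id)

log = KeyAuditLog("K1")
log.authorise("alice@processor")
assert log.access("alice@processor") is True    # first access detected
assert log.access("alice@processor") is False   # subsequent access
```

Note that the log is entirely about key handling, never about the personal data itself, which is what makes it workable across dissimilar IT systems.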

Technical Analysis About the ePrivacy Regulation

I have tracked down the different versions presented for Trilogue, which has now finished. ePrivacy following Trilogue appears to include the EU Parliament LIBE Committee amendments from October 2017, including Article 26(a): "In order to safeguard the security and integrity of networks and services, the use of end-to-end encryption should be promoted and, where necessary, be mandatory. Member States should not impose... backdoors". If we are to have an audit facility for keys to personal data, then it should be end-to-end. Like all end-to-end solutions it will upset government spy agencies, or any other party that might want to falsify the record through government-imposed backdoors, because such backdoors cannot work according to mathematics.

Technical Analysis About the EU Code of Communications

The Code is broader than ePrivacy (which, it can be argued, is limited by its lex specialis relationship to the GDPR). The Code says: "In order to safeguard security of networks and services, and without prejudice to the Member States' powers to ensure the protection of their essential security interests and public security, and to permit the investigation, detection and prosecution of criminal offences, the use of encryption for example, end-to-end where appropriate should be promoted and, where necessary, encryption should be mandatory in accordance with the principles of security and privacy by default and design." We know from Snowden and others that the "without prejudice" phrase is just being polite, because there is no technical means to implement no-backdoors end-to-end crypto that does not also upset government spy agencies.

Conclusions

Conclusion 1: the following minimum audit records are required to fulfil an Article 28 contract between Processor and Controller.

Conclusion 2: this rises to an Article 28(1) "sufficient guarantee", where little or nothing else does.

Detail of Required Audit Records, with their basis in law:

1. Audit records listing all natural persons who have access to keys to the personal data, and the changes to that list over time:

  • Article 28(2) "shall not engage another processor", so everyone can see whether or not an unexpected person was authorised for access to keys
  • Article 32(4) "any natural person acting under the authority of the controller or the processor who has access to personal data", so we need an audit log of who *can* have access to keys
  • Article 32(4) "any natural person acting under the authority of the controller or the processor ... does not process them except on instructions", so we need an audit log of who actually *did* access the keys at least once

2. Audit records for who has accessed the audit records above:

  • Article 28(3) "obligations and rights of the controller", shows the controller is watching the processor
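The second-level record above can be sketched as logging every read of the primary audit records, so the Controller's inspections of the Processor's records are themselves evidenced. Names here are illustrative; a real facility would chain and sign these read events in the same cryptographic manner as the primary trail.

```python
# Second-level audit: a record of who has read the audit records themselves.
audit_reads = []

def read_audit(reader, records):
    """Return a snapshot of the audit records, logging the read itself."""
    audit_reads.append({"reader": reader, "entries_seen": len(records)})
    return list(records)

key_events = ["authorised alice@processor", "alice@processor accessed K1"]
snapshot = read_audit("controller", key_events)
assert snapshot == key_events
assert audit_reads[-1] == {"reader": "controller", "entries_seen": 2}
```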


These audit records will work regardless of what IT systems the Controller and the Processor have, because they are only about dealing with the keys. Whoever has the keys has the personal data, and the keys themselves are protected by the GDPR in any case.

It would be difficult, but not impossible, to construct a demonstration manual procedure with paper that meets these requirements for exchanging Article 28 secrets; for all practical purposes, however, it must be implemented with computers. Computer science does not seem to allow any way of meeting the Article 28 "sufficient guarantee" without a zero-trust encrypted audit model.
