GDPR and health data – the questions you need to ask as a doctor.

As a doctor, I have always been very aware of the importance of patient confidentiality, not only for ethical or legal reasons but also for purely practical ones. If you don’t have all the information, you can’t make the right decisions, and you will only get the embarrassing information if patients are confident it won’t go any further.

However, from a legal perspective, it is not always that clear, especially when we are talking about health data which now comes from sources other than just the patient. Fitness trackers, for example, give useful information, but how should I store that data?

And if you are looking to buy into some new digital technology, what are the questions you need to ask?

If you are still using paper records or practise outside the EU, this still affects you: the material and territorial scope of the GDPR, set out in Articles 2 and 3, covers structured paper filing systems as well as many organisations outside the EU that process the data of people in the EU.

The concern is not new: private life has long been protected under Article 8 of the European Convention on Human Rights, and data protection was addressed explicitly in 1981 in Convention 108 for the Protection of Individuals with regard to Automatic Processing of Personal Data. The right to data protection is therefore a fundamental right. Most people will have heard of the General Data Protection Regulation, or GDPR, which came into effect in May 2018, if only because of the pop-ups requesting permissions or, in the case of certain non-EU websites, the refusal of access altogether.

For doctors, the essential concepts to understand about data processing – that is, any operation performed on information that can identify a patient – are:

  1. Data controller: the person who decides what data is collected, how it is collected and for what purpose. As a doctor, you or your institution can be a data controller.
  2. Data processor: the person or service that processes data on the instructions of the controller. For a doctor buying #digitaltech, this is often the software vendor storing the data, and the relationship needs to be formalised with a contract.
  3. Data subject: Patient or identifiable person.

Article 5 of the GDPR covers data processing, and as a doctor/data controller, you need to be aware that the data you collect should be:

  1. Lawful, fair and transparent.
  2. Limited to purpose – you need to be recording data with a specific, limited and explicit purpose.
  3. Minimised – irrelevant data should not be recorded.
  4. Accurate – doctors are used to recording treatment changes, for example, and we are all aware of the legal consequences of illegible notes.
  5. Storage limitation – this refers to not keeping the data for longer than required. Health is probably one of the few exceptions where you can argue that the data should be stored for the entire life of a person to give the best care.
  6. Integrity and confidentiality. This refers to the fact that the data must be protected appropriately through technical and organisational means. You need to consider not only loss and damage (accidental or other) but also that it is not accessed inappropriately by different members of staff. This is a core question when being presented with a new medical application or technology for your practice. Larger institutions such as hospitals will have an information security officer, but if you practise in a smaller setting, this responsibility will be yours.

Finally, to process any data, you need to be sure that there are legal grounds for processing the data you have collected. For doctors, the concepts are familiar:

  1. Consent has been given.
  2. It is necessary for a contract to be carried out; in the health care setting, this specifically includes an agreement to medical treatment, whether implicit or explicit.
  3. You are complying with a legal obligation.
  4. You are protecting the vital interests of a patient.
  5. You are carrying out a task in the public interest or in your capacity as an official authority.
  6. There exists a legitimate interest for processing.

Sensitive data, which includes health data, gets extra privacy protection, and Article 9 covers this specifically. Safeguards used include:

  1. Pseudonymisation: removing identifying fields such as name, date of birth and address. In health care this needs to go even further: a diagnosis of a specific disease, plus the treating hospital and the patient’s gender, may be enough to identify them. With big data and large numbers of patients it becomes harder to single out individuals, but even there it is important to think about unusual characteristics that may make a patient stand out. Some doctors have fallen foul of this on Twitter when making what they thought were generic comments about a type of patient they may have seen during a specific shift. At the same time, you still need the correct data to treat your patient, which means you need additional information, held separately, in order to access everything about your specific patient.
  2. Anonymisation: stripping away all the identifying aspects from the data so that the patient can no longer be identified, even if you have the additional information. This is a valid technique for research. As mentioned previously, it is very hard to anonymise medical data, and there is a chilling report here, for all those with any level of data protection responsibility, about how supposedly anonymised health data sets were not so anonymous once compared with local newspaper reports: 43% of the individuals were identified.
  3. Encryption: encoding the data is much more of a technical matter, and most doctors would find it hard to know what questions to ask, let alone interpret the answers. However, talking through specific clinical contexts may prompt the technical team to think about uses and deviations they had not come across.
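The pseudonymisation idea above can be sketched in a few lines of Python. This is an illustrative sketch, not a compliance-ready implementation: the record fields, the secret key and where it is stored are all assumptions. The point it shows is that direct identifiers are replaced by a keyed hash, so re-identifying the patient requires additional information (the key) held separately from the pseudonymised records.

```python
import hmac
import hashlib

# ASSUMPTION: in practice this key would live in a separate,
# access-controlled store, never alongside the pseudonymised records.
SECRET_KEY = b"keep-me-somewhere-else"

def pseudonym(patient_id: str) -> str:
    """Replace a direct identifier with a keyed hash (HMAC-SHA256).

    Unlike a plain hash, someone without the key cannot simply hash
    every known patient number and match the results.
    """
    return hmac.new(SECRET_KEY, patient_id.encode(), hashlib.sha256).hexdigest()

def pseudonymise(record: dict) -> dict:
    """Drop direct identifiers, keep clinical fields, add a stable pseudonym."""
    direct_identifiers = {"patient_id", "name", "date_of_birth", "address"}
    cleaned = {k: v for k, v in record.items() if k not in direct_identifiers}
    cleaned["pseudonym"] = pseudonym(record["patient_id"])
    return cleaned

record = {
    "patient_id": "NHS-1234567",       # hypothetical identifier
    "name": "Jane Doe",
    "date_of_birth": "1980-01-01",
    "address": "1 High Street",
    "diagnosis": "type 2 diabetes",
}
safe = pseudonymise(record)
# 'safe' keeps the diagnosis and a stable pseudonym, but no direct identifiers.
```

Note that, as the article stresses, this is not anonymisation: diagnosis plus hospital plus gender may still single a patient out, and whoever holds the key can always re-identify the record.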

In general, observing good medical practice will set you on the right road, but the questions come when you want to contract new software.

  1. What / who is the data processor you use? Are they compliant with GDPR and what sort of guarantees do they offer?
  2. As this is sensitive data, how is it:
    1. Pseudonymised?
    2. Encrypted?
  3. How are you complying with data protection by design and default?

Most clinicians without any programming or technical knowledge would find it hard to ask specific questions and then understand the answers. Technicians, for their part, don’t have the situation-specific understanding of how this data will be used, so going through a typical consultation together, step by step, can help uncover moments where there may be data compliance issues. This is data protection by default: only the sensitive data needed for the specific process may be processed. For example:

  • How do you lock the screen temporarily while examining a patient when family members may be present?
  • How do you deal with multiple doctors using the same computer?
  • How are blood results transferred between the laboratory and your EHR?
  • Are emails encrypted if you have to do a referral to a colleague?
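Data protection by default can be made concrete with a small sketch. The roles and the fields each one needs below are invented for illustration; a real EHR would drive this from its access-control configuration. The idea is simply that each process receives only the fields its purpose requires, and everything else is dropped before the data leaves your system.

```python
# ASSUMPTION: these roles and field lists are illustrative only.
VISIBLE_FIELDS = {
    "lab": {"pseudonym", "test_codes"},               # the lab needs no diagnosis
    "referral": {"pseudonym", "diagnosis", "notes"},  # a colleague needs clinical context
    "billing": {"pseudonym", "insurance_number"},     # billing needs no clinical data
}

def view_for(role: str, record: dict) -> dict:
    """Return only the fields this process is allowed to see.

    Anything not explicitly needed for the stated purpose is dropped,
    which is data protection by default in miniature.
    """
    allowed = VISIBLE_FIELDS[role]
    return {k: v for k, v in record.items() if k in allowed}

record = {
    "pseudonym": "a1b2c3",
    "diagnosis": "type 2 diabetes",
    "notes": "HbA1c trending up",
    "test_codes": ["HBA1C"],
    "insurance_number": "INS-42",
}
lab_view = view_for("lab", record)
# The laboratory request carries the pseudonym and test codes, nothing else.
```

The design choice worth noticing is that the default is exclusion: a field reaches a process only if it appears on an explicit allow-list, rather than being removed by an after-the-fact block-list.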

The company selling you any software should be able to give you clear answers and explanations as to how they are helping you comply with your obligations as a data controller in the clinical setting. Your obligations when contracting a data processor are set out in Article 28, and even if you don’t know the article in detail (!), the people selling you the EHR should.

Consent

Consent….

is the current buzzword in healthcare tech, especially in the context of the looming GDPR deadline, the European regulation of data protection. Those in charge of controlling the data are getting nervous about what they do with the information they hold, and the rest of us are slowly realising how much we are giving over. Whichever side of the fence you are on, the word consent seems to lead to a rise in heart rate and a wish that we could go back to the simple life, before we were aware of how much Facebook was playing with our data.

For practising physicians, consent is the basis of all clinical interventions, drummed into us at all levels and for all types of intervention. From the implied consent of holding out your arm for a blood draw to a three-page pre-operation document, consent is part of our daily bread. Doubts come (usually retrospectively, when faced with a complaint) as to whether written consent, using up a desperately needed extra five minutes in the emergency room, is necessary, or whether we can rely on oral consent, which still has to be documented in the notes. Whichever it is, there is no doubt that it has to be part of the process. Revised guidance from the Royal College of Emergency Medicine sets out practically, in a short document, how to deal with consent issues in the Emergency Department in the UK.

The bigger, more complicated question is whether the person has the capacity to take the decision to give consent. For physicians this question arises most obviously when faced with an intoxicated or unconscious person, although that does not make the answer easy. Some of the principles, such as best interests and the least restrictive intervention, may seem specific to a medical setting, but others raise interesting questions for the tech industry when it comes to future legislation. The Mental Capacity Act 2005 in the UK allows an individual to make what might be seen as unwise or eccentric decisions, and establishes a presumption of capacity. How does this translate when consent is being given online, and where do the responsibilities of data controllers lie when it comes to assessing the capacity to give that consent?

Age is another complicating factor. Article 8 of the GDPR sets 16 as the default age at which a child can consent in relation to online services (member states may set a lower threshold), yet consent to medical treatment in England can be given without a guardian if a patient younger than 16 is determined to be Gillick competent, a test based on full comprehension of the medical treatment being proposed. Taken into the data protection arena, many over 16 may not be able to fully comprehend what will happen to their data. Should there be something similar for data protection? Or is a blanket ban on all minors (under 16 in this case) a valid, if easy, way to think about it?