Lost USB? Hacked? What to do in the case of a data protection breach?

Despite all the best will in the world and processes in place, data breaches can happen. A breach can be as simple as a lost USB stick containing patient information, or a more sustained hacking attempt that affects only your clinic, or hits you as part of a wider organisation that has been maliciously attacked.

Informing the supervisory body.

The most important point is that, under Article 33, you have 72 hours from becoming aware of the breach to inform the supervisory authority. If you do not notify within 72 hours, you must give reasons for the delay. The information you will need to provide is as follows (a simple way of recording it is sketched after the list):

  • Nature of the breach:
    • Categories of data subjects affected.
    • Approximate number of data subjects affected.
    • Categories and approximate number of data records affected.
  • Contact details of your data protection officer, as well as of anyone else who can give relevant information.
  • The likely consequences of the breach.
  • The measures you have taken, or plan to take, to mitigate the effects of the breach.
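
If you want to keep this information organised from the moment a breach is detected, a simple structured record can help. The sketch below is purely illustrative: the field names and the 72-hour check are our own, not an official Article 33 template.

```python
from dataclasses import dataclass
from datetime import datetime

# Illustrative only: field names are our own, not an official Article 33 template.
@dataclass
class BreachNotification:
    detected_at: datetime               # when you became aware (starts the 72-hour clock)
    nature_of_breach: str               # e.g. "lost USB stick with clinic letters"
    data_subject_categories: list       # e.g. ["outpatients", "staff"]
    approx_subjects_affected: int
    record_categories: list             # e.g. ["contact details", "lab results"]
    approx_records_affected: int
    dpo_contact: str                    # DPO and anyone else who can give relevant information
    likely_consequences: str
    measures_taken_or_planned: str
    late_notification_reason: str = ""  # required only if you miss the 72-hour deadline

    def overdue(self, now: datetime) -> bool:
        """True if more than 72 hours have passed since becoming aware of the breach."""
        return (now - self.detected_at).total_seconds() > 72 * 3600
```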

Informing the patient.

Once you have informed the supervisory authority, you need to notify the person whose data has been breached (data subject) in clear and plain language. As per Article 34, you do not need to inform the patient if:

  1. The data was encrypted, or other methods were used to render it unintelligible to anyone not authorised to access it.
  2. The data controller has taken further measures to ensure that the risks arising from the breach are no longer likely to materialise.
  3. Informing each individual would involve a disproportionate effort; a public communication is the alternative in this case.

If the supervisory authority considers that the breach poses a high risk and you have not informed your patient/data subject, it may require you to inform them of the data breach and its potential consequences.

When can you (temporarily) skip medical data protection requirements?

Health data is by definition and function sensitive data, but as anyone seeing patients knows, it is not always practical to get consent when treating a sick patient.

It is not necessary to encrypt or anonymise patient data if:

  1. The patient has given express consent.
  2. It is in the vital interest of the patient, and the patient is unable to give consent, e.g. an unconscious patient arriving in the ER, or a patient who is a minor.
  3. The professional processing the data to provide health care is already under a professional obligation to treat patients according to a code of confidentiality – the Hippocratic oath and all the versions that have followed it.

When you do find out more information about, for example, an unconscious patient, you are obliged to update the records immediately. Again, this was standard practice for medical professionals long before the GDPR was introduced.

It’s a short article because it’s a short message.

Don’t let the fear of data protection legislation stop you saving lives!

Data protection for app developers & large organisations.

You may think that ensuring compliance with data protection in a large organisation is even harder than in a smaller clinic. However, it can be the complete opposite, as you may find yourself having to appoint a Data Protection Officer (DPO) who takes on this role. Whether you need to do so will depend on the conclusions of a Data Protection Impact Assessment (DPIA), as per Article 35.

The use of new technologies such as EHRs or health apps, combined with large quantities of sensitive data, as in the case of a hospital, means a DPIA must be carried out following the advice of a DPO. It is the data controller (the doctor or whoever else is in charge of the data) who has to instigate this.

Data processors also have to think about a DPIA, so if you are developing a health app, this responsibility extends to you too.

When appointing a DPO, whether in the context of a larger clinical setting or app development, you can share a DPO with other establishments as long as you have easy access to that person. They can be part of your staff (and potentially fulfil other functions). You must communicate who your DPO is to the supervisory authority.

Even if a DPO is appointed, the data controller is still required to keep a record of all processing activities.

GDPR and fitness apps.

Do you own a fitness tracker? Or do you just activate the step counter on your phone?

Most of us have used some sort of health or fitness app, whether to go running or to record more intimate details. Most of us have also ticked all the terms and conditions boxes automatically. To comply with the GDPR, the information should be clear, and the data collection limited to what the app actually needs. Is geolocation and access to your contacts always necessary? How do you feel about your age and gender, combined with your fitness level, being shared with undisclosed third parties? While medical data for clinical trials usually have to be anonymised, this is not necessarily the case for your data, which may then be shared with your insurer or your mortgage broker…without you even knowing it. This is when the targeted ads for new running shoes pale into insignificance. Higher health insurance premiums or rejected mortgage applications have a real impact on our lives.

As a doctor, you will be the controller of the fitness data of the data subject, who is your patient. In the context of fitness trackers, you need to be sure that you comply with Article 5, being especially mindful that the data you collect is limited to the specific healthcare purpose. Apps can often collect a lot more data than you would imagine, so as a doctor and controller you need to be sure that you don’t end up collecting everything indiscriminately. This same data can make it unexpectedly easy to identify patients even if you remove the distinct identifiers such as name, age and gender.
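
One practical way to respect the minimisation principle when pulling fitness-app data into your records is to whitelist only the fields needed for the stated purpose and discard everything else before storage. The following is a minimal sketch; the field names (steps, resting_heart_rate, gps_trace) are assumptions for illustration, not the export format of any real app.

```python
# Hypothetical data minimisation step: keep only the fields needed for the
# stated healthcare purpose and drop the rest before storing anything.
ALLOWED_FIELDS = {"date", "steps", "resting_heart_rate"}  # assumed field names

def minimise(app_export: dict) -> dict:
    """Return a copy of the app export containing only the allowed fields."""
    return {key: value for key, value in app_export.items() if key in ALLOWED_FIELDS}

raw_export = {
    "date": "2024-03-01",
    "steps": 7421,
    "resting_heart_rate": 62,
    "gps_trace": [(51.5, -0.12), (51.6, -0.13)],  # not needed for the clinical purpose
    "contacts_synced": True,                      # definitely not needed
}
print(minimise(raw_export))
# {'date': '2024-03-01', 'steps': 7421, 'resting_heart_rate': 62}
```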

Personal data is any data that can identify you as an individual; health data, more specifically, is anything that refers to your health status. Health data is classed as sensitive data because, as previously mentioned, the consequences of it becoming more widely known can be more serious.

If you are integrating information from an app into an EHR program you have contracted, this is one of the questions to ask the EHR vendor: how do they ensure that only relevant information is brought across? This is something they may not even have thought about.

If you are incorporating information from a report generated by the app, which the patient has sent you by email for example, then just make sure you have a copy of the patient’s consent, preferably in writing. The consent should cover the data being incorporated into their EHR and, therefore, being visible to everyone else who has access to that EHR.

Although fitness trackers can be a good way of getting people, or your patients, to a better state of health, you may want to have a chat about “free” trackers. Some health insurance companies are offering almost free fitness trackers. However, they then access your data, and premiums may be affected by how the insurer evaluates your fitness and, therefore, your risk of future illness. They might not turn out to be so cheap after all.

There are many less expensive, if less prestigious, fitness trackers on the market. In reality, most people only need an activity monitor and a heart rate monitor. The ECG monitoring option has been controversial and may not be relevant to your patient. It is a fast-changing industry, and clinical need rather than opportunity should dictate whether you incorporate a fitness tracker or other wearable into your practice. It is important to think about whether the information provided is useful or will potentially lead to more testing, as with the incidentalomas (incidental imaging findings) that appeared when full-body CTs became available. Just because you can, doesn’t mean you should!

GDPR and health data – the questions you need to ask as a doctor.

As a doctor, I have always been very aware of the importance of patient confidentiality, not only for ethical or legal reasons but also for purely practical ones. If you don’t have all the information, you can’t make the right decisions, and you will only get the embarrassing details if patients are confident they won’t go any further.

However, from a legal perspective, it is not always that clear, especially when we are talking about health data which now comes from sources other than just the patient. Fitness trackers, for example, give useful information, but how should I store that data?

And if you are looking to buy into some new digital technology, what are the questions you need to ask?

If you are still using paper records or are based outside the EU, this too affects you, as such data falls within the scope set out in Articles 2 and 3 of the GDPR.

Historically, privacy has been recognised as a concern for decades and is covered in the European Convention on Human Rights. Data protection was addressed in 1981 in Convention 108 for the Protection of Individuals with regard to Automatic Processing of Personal Data. The right to data protection is therefore a fundamental right. Now, most people will have heard of the General Data Protection Regulation, or GDPR, which came into effect in May 2018, if only because of the pop-ups requesting permissions or, in the case of certain non-EU websites, refusing access altogether.

For doctors, the essential concepts to understand about data processing (that is, any actions on information that can identify their patient) are:

  1. Data controller: Person who decides what data is collected, how this data is collected and for which purpose. As a doctor, you or your institution can be a data controller.
  2. Data processor: The person or service that processes the data under the instructions of the controller. For a doctor using #digitaltech, this can be the software company storing the data, and the relationship needs to be formalised with a contract.
  3. Data subject: Patient or identifiable person.

Article 5 of the GDPR covers data processing, and as a doctor/data controller, you need to be aware that the data you collect should be:

  1. Lawful, fair and transparent.
  2. Limited to purpose – you need to be recording data with a specific, limited and explicit purpose.
  3. Minimised – irrelevant data should not be recorded.
  4. Accurate – doctors are used to recording treatment changes, for example, and we are all aware of the legal consequences of not keeping legible notes.
  5. Subject to storage limitation – this refers to not keeping the data for longer than required. Health is probably one of the few areas where you can argue that the data should be stored for the entire life of a person in order to give the best care.
  6. Processed with integrity and confidentiality – the data must be protected appropriately through technical and organisational means. You need to consider not only loss and damage (accidental or otherwise) but also inappropriate access by different members of staff (a minimal sketch of this idea follows below). This is a core question when being presented with a new medical application or technology for your practice. Larger institutions such as hospitals will have an information security officer, but if you practise in a smaller setting, this responsibility will be yours.
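
To illustrate point 6, here is a minimal sketch of restricting which staff roles can open which categories of record. The roles and categories are invented for illustration; a real EHR will have its own, far more granular, access controls.

```python
# Hypothetical role-based access check: roles and record categories are
# invented for illustration, not taken from any particular EHR.
ACCESS_RULES = {
    "clinical_notes": {"doctor", "nurse"},
    "lab_results":    {"doctor", "nurse", "lab_technician"},
    "billing":        {"doctor", "receptionist"},
}

def can_access(role: str, record_category: str) -> bool:
    """Return True if this staff role is allowed to open this record category."""
    return role in ACCESS_RULES.get(record_category, set())

assert can_access("nurse", "lab_results")
assert not can_access("receptionist", "clinical_notes")
```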

Finally, to process any data, you need to be sure that there are legal grounds for processing the data you have collected. For doctors, the concepts are familiar:

  1. Consent has been given.
  2. It is necessary for the performance of a contract; in the healthcare setting, this specifically includes an implicit or explicit agreement to medical treatment.
  3. You are complying with a legal obligation.
  4. You are protecting the vital interests of a patient.
  5. You are carrying out a task in the public interest or in your capacity as an official authority.
  6. There exists a legitimate interest for processing.

Sensitive data, which includes health data, gets extra privacy protection, and Article 9 covers this specifically. Safeguards include:

  1. Pseudonymisation: This means removing identifying fields such as name, date of birth and address, but in health it needs to go even further. A diagnosis of a specific disease plus the treating hospital and gender may be enough to identify the patient. With big data and large numbers of patients it becomes harder to identify individuals, but even there it is important to think about unusual characteristics which may make a patient stand out. Some doctors have fallen foul of this on Twitter when making what they thought were generic comments about a type of patient they may have seen during a specific shift. At the same time, you still have to have the correct data to treat your patient: this means you need additional information in order to access everything about your specific patient (a minimal code sketch of pseudonymisation and encryption follows after this list).
  2. Anonymisation: This means stripping away all the identifying aspects of the data so that the patient can no longer be identified, even with the additional information. This is a valid technique for research. As mentioned previously, it is very hard to anonymise medical data, and there is a chilling report here, for all those with any level of data protection responsibility, about how supposedly anonymised health data sets were not so anonymous once compared with local newspaper reports: 43% of the individuals were identified.
  3. Encryption: Encoding the data is a much more technical matter. Most doctors would find it hard to know what questions to ask and then to interpret the answers. However, thinking through specific clinical contexts may prompt the technical team to consider uses and edge cases they had not come across.
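
To make pseudonymisation and encryption a little more concrete, here is a minimal Python sketch. It is illustrative only: the key handling is deliberately simplified, the identifiers are made up, and the cryptography library (Fernet) is just one common choice; in practice your EHR vendor or data processor will provide and document the actual mechanisms.

```python
import hmac
import hashlib
from cryptography.fernet import Fernet  # assumes the 'cryptography' package is installed

# --- Pseudonymisation: replace the direct identifier with a keyed hash. ---
# The secret key is the "additional information": without it, nobody can
# link the pseudonym back to the patient.
PSEUDO_KEY = b"keep-this-secret-and-stored-separately"  # placeholder, not a real key policy

def pseudonymise(patient_id: str) -> str:
    return hmac.new(PSEUDO_KEY, patient_id.encode(), hashlib.sha256).hexdigest()[:16]

record = {"patient": pseudonymise("hospital-id-1234567"), "diagnosis": "type 2 diabetes"}

# --- Encryption: make the stored record unintelligible without the key. ---
encryption_key = Fernet.generate_key()     # in practice, managed by your IT team or processor
fernet = Fernet(encryption_key)
ciphertext = fernet.encrypt(str(record).encode())  # safe to store or transmit
print(fernet.decrypt(ciphertext).decode())         # readable again only with the key
```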

In general, observing good medical practice will set you on the right road, but the questions arise when you want to contract new software.

  1. What / who is the data processor you use? Are they compliant with GDPR and what sort of guarantees do they offer?
  2. As this is sensitive data, how is it:
    1. Pseudonymised?
    2. Encrypted?
  3. How are you complying with data protection by design and default?

Most clinicians without any programming or technical knowledge would find it hard to ask specific questions and then understand the answers. However, technicians don’t have the situation-specific understanding of how this data will be used, and going through a typical consultation together, step by step, can help uncover moments where there may be data compliance issues. This is data protection by default – only the sensitive data needed for the specific process should be processed. For example:

  • How do you lock the screen temporarily while examining a patient when family members may be present?
  • How do you deal with multiple doctors using the same computer?
  • How are blood results transferred between the laboratory and your EHR?
  • Are emails encrypted if you have to do a referral to a colleague?

The company selling you any software should be able to give you clear answers and explanations as to how they are helping you comply with your obligations as a data controller in the clinical setting. Your obligations when contracting a data processor are set out in Article 28, and even if you don’t know the article in detail (!), the people selling you the EHR should.