Health data – How long can / should I keep it?

Whether you are a data controller deciding which data should be used or a data processor in charge of keeping health data in the cloud, for example, how long you should keep data is something you should be proactively thinking about. The general principle is that you only keep it as long as is necessary, which of course can be open to debate and to regional variation.

The purpose for which the data has been collected will help you decide how long to store it, so that you are not exposed to a data breach for longer than needed. If you are developing an app, then that time should be specified clearly in the terms and conditions. For health data used in individual patient treatment and diagnostics, the concept of “as long as is needed” could be thought, from a clinician’s point of view, to be the duration of the individual’s life. For research, it can be and is argued that the data should be kept beyond an individual’s life. These decisions are often taken by the organisation’s data protection officer or DPO.

With health data, as long as you still have some responsibility for that patient, and the patient has acknowledged it, you can and should keep their health data. As ever, it is up to you to make sure that it is accurate and up to date, including checking that contact details are current. Once you have decided which data you are collecting, deciding how long to keep it is the easy bit.
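As a back-of-the-envelope illustration of the storage-limitation principle, a retention check might look like the sketch below. All the purposes and retention periods here are hypothetical, chosen only to show the idea; the real values must come from your DPO and local regulation.

```python
from datetime import date, timedelta

# Hypothetical retention periods per purpose, in years. Real values must
# come from your DPO and local regulation, not from this sketch.
RETENTION_YEARS = {
    "appointment_reminders": 1,
    "clinical_record": 100,   # "for the life of the patient", in practice
    "research_consented": 25,
}

def due_for_review(collected_on: date, purpose: str, today: date) -> bool:
    """Return True when the record has outlived its stated purpose
    and should be reviewed for deletion or anonymisation."""
    years = RETENTION_YEARS[purpose]
    # timedelta has no 'years' argument, so approximate with days.
    return today - collected_on > timedelta(days=365 * years)

# A reminder collected two years ago is overdue; a clinical record is not.
print(due_for_review(date(2016, 5, 1), "appointment_reminders", date(2018, 5, 25)))  # True
print(due_for_review(date(2016, 5, 1), "clinical_record", date(2018, 5, 25)))        # False
```

The point of the sketch is that retention is a property of the purpose, not of the data itself: the same blood pressure reading kept for a reminder and kept in a clinical record can have very different lifetimes.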

When can you (temporarily) skip the medical data protection?

Health data is by definition and function sensitive data, but as anyone seeing patients knows, it is not always practical to get consent when treating a sick patient.

It is not necessary to encrypt or anonymise patient data if:

  1. The patient has given express consent.
  2. It is in the vital interest of the patient, and the patient is unable to give consent, e.g. an unconscious patient arriving in the ER, or a patient who is a minor.
  3. The professional processing the data to provide health care is already under a professional obligation to treat patients according to a code of confidentiality. This is the Hippocratic oath and all the other versions which have followed.

When you do find out more information about, for example, an unconscious patient, you are under an obligation to update the records immediately. Again, this was standard practice for medical professionals before the GDPR was brought in.

It’s a short article because it’s a short message.

Don’t let the fear of data protection legislation stop you saving lives!

Sharing & transferring health data.

When you share patient data as a doctor, for example when referring your patient to a cardiologist colleague, you are ‘disclosing personal data’. You do not have to inform the patient or data subject of the transfer if you are still respecting professional confidentiality. The recipient of this data then becomes a data controller, with the obligations that entails.

Patients, too, have the right to take their data with them wherever they go: this is the right to data portability.

Apps are not covered by professional confidentiality, so any change in who has access to or is processing the data has to be communicated in full to the app user, including the identity of the new app data controller, the categories of data which will be used and the recipients of the data, among other details. It is a long list, but how many people just click on the “updated terms and conditions” without reading them? Most of us…

Being based outside the EU does not exempt an app from complying with GDPR if the data subject or app downloader is based in the EU. So unless you are 100% certain that you are complying with GDPR you should limit your app store access to countries not covered by the GDPR.

If the data is being shared outside the EU (of particular interest in the context of Brexit), then similar levels of protection should be requested. Chapter V covers the transfer of data outside of the EU and clearly states that once the EU has decided that the minimum requirements are met, this decision has to be reviewed every 4 years. It is the European Commission who decides if the standards are being met.

GDPR and fitness apps.

Do you own a fitness tracker? Or even just activate the steps counter on your phone?

Most of us have used some sort of health or fitness app, whether to go running or to record more intimate details. Most of us have also ticked all the terms and conditions automatically. To comply with GDPR, the information should be clear, and the data collection limited to what is needed by the app. Is geolocation and access to your contacts always necessary? How do you feel about your age and gender combined with your fitness level being shared with undisclosed third parties? While medical data for clinical trials usually has to be anonymised, this is not necessarily the case for your app data, which can then be shared with your insurer or your mortgage broker…without you even knowing it. This is when the targeted ads for new running shoes pale into insignificance. Higher health insurance premiums or rejected mortgage applications have a real impact on our lives.

As a doctor, you will be the controller of the fitness data of the data subject, who is your patient. In the context of fitness trackers, you need to be sure that you comply with Article 5, being especially mindful that the data you collect is limited to the specific healthcare purpose. As apps can often collect a lot more data than you would imagine, as a doctor and controller, you need to be sure that you don’t end up collecting everything indiscriminately. This same data can make it unexpectedly easy to identify patients even if you remove the distinct identifiers such as name, age and gender.
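One concrete way to keep to Article 5's minimisation principle when pulling data from a tracker is to whitelist the fields you actually need and discard everything else before it is stored. A minimal sketch, in which the field names and the payload are invented for illustration:

```python
# Fields the (hypothetical) care plan actually needs; everything else the
# app sends (location, contacts, device IDs...) is dropped before storage.
ALLOWED_FIELDS = {"daily_steps", "resting_heart_rate", "sleep_hours"}

def minimise(app_payload: dict) -> dict:
    """Keep only the whitelisted fields from an incoming tracker payload."""
    return {k: v for k, v in app_payload.items() if k in ALLOWED_FIELDS}

payload = {
    "daily_steps": 8500,
    "resting_heart_rate": 62,
    "gps_trace": [(51.5, -0.1)],   # never needed for this clinical purpose
    "contacts_synced": True,
}
print(minimise(payload))  # {'daily_steps': 8500, 'resting_heart_rate': 62}
```

The design choice matters: an allow-list fails safe, because a new field the app starts sending tomorrow is dropped by default, whereas a block-list would silently let it through into the record.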

Personal data is any data that can identify you as an individual; health data, more specifically, is anything that refers to your specific health status. It is classed as sensitive data because the consequences of it becoming more widely known can be more serious, as previously mentioned.

If you are integrating the information from an app as part of an EHR program you have contracted, this is one of the questions to ask the EHR seller. How do you ensure that only relevant information is brought across? This is something they may not even have thought about.

If you are incorporating information from a report generated by the app, which the patient has sent you by email for example, then make sure you have a record of consent, preferably written. It should cover the data being incorporated into their EHR and, therefore, being accessible to everyone else who has access to the EHR.

Although fitness trackers can be a good way of getting people, or your patients, to a better state of health, you may want to have a chat about “free” trackers. Some health insurance companies are offering almost free fitness trackers. However, they then access your data, and premiums may be affected by how the insurer evaluates your fitness and, therefore, your risk of future illness. The trackers might not turn out to be so cheap after all. There are many less expensive, if less prestigious, fitness trackers on the market. In reality, most people only need an activity monitor and a heart rate monitor. The ECG monitoring option has been controversial and may not be relevant to your patient.

It is a fast-changing industry, and clinical need rather than opportunity should dictate whether you incorporate a fitness tracker or other wearable into your practice. It is important to think about whether the information provided is useful or will potentially lead to more testing, as with the incidentalomas (incidental imaging findings) which appeared when full-body CTs became available. Just because you can, doesn’t mean you should!

GDPR and health data – the questions you need to ask as a doctor.

As a doctor, I have always been very aware of the importance of patient confidentiality. Not only for ethical or legal reasons but also for purely practical purposes. If you don’t have all the information you can’t make the right decisions, and you will only get all the embarrassing information if patients are confident it won’t go any further.

However, from a legal perspective, it is not always that clear, especially when we are talking about health data which now comes from sources other than just the patient. Fitness trackers, for example, give useful information, but how should I store that data?

And if you are looking to buy into some new digital technology, what are the questions you need to ask?

If you are still using paper records, or are outside of the EU, this too affects you, as the scope of the GDPR set out in Articles 2 and 3 covers both cases.

Historically, this has long been recognised as a concern, with privacy covered in the European Convention on Human Rights and data protection addressed in 1981 in Convention 108 for the Protection of Individuals with regard to Automatic Processing of Personal Data. The right to data protection is therefore a fundamental right. Now, most people will have heard of the General Data Protection Regulation or GDPR, which came into effect in May 2018, if only because of the pop-ups requesting permissions or, in the case of certain non-EU websites, refusing access altogether.

For doctors, the essential concepts to understand about data processing or actions on information that can identify their patient are:

  1. Data controller: Person who decides what data is collected, how this data is collected and for which purpose. As a doctor, you or your institution can be a data controller.
  2. Data processor: Person or service who processes the data under the instructions of the controller. For a doctor using #digitaltech, this can be the company whose software stores the data, and the relationship needs to be formalised with a contract.
  3. Data subject: Patient or identifiable person.

Article 5 of the GDPR covers data processing, and as a doctor/data controller, you need to be aware that the data you collect should be:

  1. Lawful, fair and transparent.
  2. Limited to purpose – you need to be recording data with a specific, limited and explicit purpose.
  3. Minimised – irrelevant data should not be recorded.
  4. Accurate – doctors are used to keeping records of treatment changes, for example, and we are all aware of the legal consequences of not keeping legible notes.
  5. Storage limitation – this refers to not keeping the data for longer than required. Health is probably one of the few exceptions where you can argue that the data should be stored for the entire life of a person to give the best care.
  6. Integrity and confidentiality. This refers to the fact that the data must be protected appropriately through technical and organisational means. You need to consider not only loss and damage (accidental or other) but also that it is not accessed inappropriately by different members of staff. This is a core question when being presented with a new medical application or technology for your practice. Larger institutions such as hospitals will have an information security officer, but if you practise in a smaller setting, this responsibility will be yours.

Finally, to process any data, you need to be sure that there are legal grounds for processing the data you have collected. For doctors, the concepts are familiar:

  1. Consent has been given.
  2. It is necessary for a contract to be carried out and specifically, in the health care setting, this includes an agreement to medical treatment either implicitly or explicitly.
  3. You are complying with a legal obligation.
  4. You are protecting the vital interests of a patient.
  5. You are carrying out a task in the public interest or in your capacity as an official authority.
  6. There exists a legitimate interest for processing.

Sensitive data, as health data is, gets extra privacy protection, and Article 9 covers this specifically. Safeguards used include:

  1. Pseudonymisation: This is removing identifying fields such as name, date of birth and address, but in health it needs to go even further. A diagnosis of a specific disease plus the treating hospital and gender may be enough to identify the patient. With big data and large numbers of patients it becomes harder to identify individuals, but even there it is important to think about unusual characteristics which may make a patient stand out. Some doctors have fallen foul of this on Twitter when making what they thought were generic comments about a type of patient they may have seen during a specific shift. At the same time, you still have to have the correct data to treat your patient, which means you need additional information in order to access all the information about your specific patient.
  2. Anonymisation: This means that you strip away all the identifying aspects from the data and can no longer identify the patient, even with the additional information. This is a valid technique for research. As mentioned previously, it is very hard to anonymise medical data, and there is a chilling report for all those with any level of data protection responsibility about how supposedly anonymised health data sets were not so anonymous once compared with local newspaper reports: 43% of the individuals were identified.
  3. Encryption: This encoding of the data is much more a technical matter. Most doctors would find it hard to know what questions to ask and then interpret the answers. However, thinking through specific clinical contexts may make the technical team consider uses and deviations which they had not come across.
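To make the pseudonymisation idea concrete: one common technique (a sketch, not a compliance recipe) is to replace the direct identifier with a keyed hash. The key is the “additional information” held separately: without it the pseudonym cannot be linked back to the patient, while with it the same patient always maps to the same pseudonym, so records can still be joined longitudinally. The identifier and key below are invented for illustration.

```python
import hmac
import hashlib

# The secret key is the "additional information" that must be stored
# separately from the pseudonymised data set (illustrative value only).
SECRET_KEY = b"keep-me-somewhere-else"

def pseudonymise(patient_id: str, key: bytes = SECRET_KEY) -> str:
    """Replace a direct identifier with a stable keyed hash (HMAC-SHA256)."""
    return hmac.new(key, patient_id.encode(), hashlib.sha256).hexdigest()[:16]

record = {"patient_id": "NHS-123-456", "diagnosis": "hypertension"}
pseudonymised = {**record, "patient_id": pseudonymise(record["patient_id"])}

# Same patient, same key -> same pseudonym, so longitudinal records still link.
assert pseudonymise("NHS-123-456") == pseudonymised["patient_id"]
```

Note that, as the article stresses, data treated this way is still pseudonymised rather than anonymised: anyone holding the key can re-link it, and rare diagnoses may identify the patient regardless of the hashed identifier.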

In general, observing good medical practice will set you on the right road, but the questions come when you want to contract new software.

  1. What / who is the data processor you use? Are they compliant with GDPR and what sort of guarantees do they offer?
  2. As this is sensitive data, how is it:
    1. Pseudonymised?
    2. Encrypted?
  3. How are you complying with data protection by design and default?

Most clinicians without any programming or technical knowledge would find it hard to ask specific questions and then understand the answers. However, technicians don’t have the situation-specific understanding of how this data will be used, and going through a typical consultation together, step by step, can help uncover moments when there may be data compliance issues. This is data protection by default – only the sensitive data needed for the specific process can be processed. For example:

  • How do you lock the screen temporarily while examining a patient when family members may be present?
  • How do you deal with multiple doctors using the same computer?
  • How are blood results transferred between the laboratory and your EHR?
  • Are emails encrypted if you have to do a referral to a colleague?

The company selling you any software should be able to give you clear answers and explanations as to how they are helping you comply with your obligations as a data controller in the clinical setting. Your obligations when contracting a data processor are set out in Article 28, and even if you don’t know the article in detail (!), the people selling you the EHR should.

Consent

Consent….

is the current buzzword in healthcare tech, especially in the context of the looming deadline for the GDPR, the European regulation on data protection. Those in charge of controlling the data are getting nervous about what they do with the information they have, and the rest of us are slowly realising how much we are giving over. Whichever side of the fence you are on, the word consent seems to lead to a rise in heart rate and a wish that we could go back to the simple life, before we were aware of how much Facebook was playing with our data.

For practising physicians, consent is the basis of all clinical interventions, drummed in at all levels and for all types of intervention. From the implied consent to draw blood when you put your arm out for the needle, to a 3-page pre-operation document, consent is part of our daily bread. Doubts come (usually retrospectively, when faced with a complaint) as to whether written consent, using up a desperately needed extra 5 minutes in the emergency room, is necessary, or whether we can rely on oral consent, which still has to be documented in the notes. Whichever it is, there is no doubt that it has to be part of the process. Revised guidance from the Royal College of Emergency Medicine details practically, in a short document, how to deal with consent issues in the Emergency Department in the UK.

The bigger and more complicated question is whether the person has the capacity to make the decision to give consent. For physicians, this question comes up obviously when faced with an intoxicated or unconscious person, although that does not make the answer easy. Some of the principles, such as best interests and the least restrictive intervention, may seem specific to a medical setting, but others raise interesting questions for the tech industry when it comes to future legislation. The Mental Capacity Act 2005 in the UK allows an individual to make what might be seen as unwise or eccentric decisions, and establishes a presumption of capacity. How does this translate when consent is being given online, and where do the responsibilities of data controllers lie when it comes to assessing the capacity to give that consent?

Age is another complicating factor. Although Article 8 of the GDPR states that consent can only be given by an individual over the age of 16, a patient younger than 16 can consent to medical treatment in England without a guardian if they are determined to be Gillick competent. Gillick competence is based on full comprehension of the medical treatment being proposed. Taking that into the data protection arena, many over-16s may not be able to fully comprehend what will happen to their data. Should there be something similar for data protection? Or is a blanket ban on all minors (under 16 in this case) a valid, if easy, way to think about it?