Code of practice for wellbeing and mental health analytics
Suggesting how education providers can ensure that their use of data to support wellbeing does not create risks for students or staff.
Summary
Whereas learning analytics uses data to inform decisions – from individual to curriculum level – on how to support students’ learning, data may also be used to inform decisions on how to support their wellbeing and mental health. Possible applications cover a very wide range: from screen-break reminders to alerts when a student appears to be at risk of suicide. Clearly such uses of data can involve both significant benefits and high risks.
This code of practice suggests how universities, colleges and other tertiary education providers can ensure that their use of data to support wellbeing does not create risks for students or staff, taking responsibility and demonstrating accountability for their actions in selecting, developing, implementing, operating and reviewing data-informed wellbeing processes.
Doing so will involve applying seven principles across the institution:
- Responsibility
- Transparency and consent
- Privacy
- Validity
- Access
- Positive interventions
- Stewardship
These need to be developed with students, staff, data owners, IT services and university governance, as well as student support services and data protection officers. Universities UK refers to this as a “whole-university approach”; Student Minds’ University Mental Health Charter calls it a “cohesive ethos”.
To support these discussions, this code also includes practical tools – for Data Protection Impact Assessments (DPIA) and purpose compatibility assessment for data sources – that should help to ensure the institution’s activities are, and can be shown to be, both safe for individuals and compliant with the law.
Introduction
The approach taken by Jisc’s code of practice for learning analytics provides a good starting point for wellbeing and mental health applications.
This wellbeing and mental health code provides a detailed discussion of additional issues raised by the use of data for wellbeing purposes. Here we concentrate on the use of data in delivering wellbeing and mental health support: broader issues such as duty of care, healthcare treatment, human rights, equality and discrimination are not covered, though we have referenced relevant guidance on those issues where we are aware of it.
When delivering wellbeing and mental health support, institutions are likely to be processing personal data concerning health; some forms of analytics may aim to infer such data from other, behavioural, indicators, such as the student’s engagement with learning systems and processes. As well as meeting the legal standards that apply to all processing of personal data, wellbeing and mental health applications must satisfy the additional conditions and safeguards that apply to special category data.
This code of practice therefore includes safeguards from several areas of the General Data Protection Regulation (GDPR) and the UK Data Protection Act 2018 that may be relevant when addressing wellbeing and mental health. In particular:
- Voluntary wellbeing apps – where each individual makes a positive choice to report or be monitored – could be provided on the basis of “consent”, though this requires both that clear and detailed information be given to users and that their consent be freely given, informed, unambiguous, specific, explicit and recorded
- If, however, an institution wishes to provide support across all students, or all of a group – for example by increasing the information available to appropriately-trained tutors and support staff when they have conversations with students or by flagging students who may need to be contacted proactively – then consent cannot be used as a basis. To help institutions fulfil their responsibilities in these circumstances, this code includes safeguards applicable to processing in the substantial public interests of preventive medicine and “protecting the physical, mental or emotional well-being” of individuals who are at risk of one or more of those types of harm
- Where using existing, historic, data to develop and test statistical models, provisions and safeguards on research use of personal data may be most appropriate
A detailed discussion of these, and other, lawful bases for processing health data can be found in step four of Annex A: data protection impact assessment template for wellbeing and mental health analytics.
Increased use of data may help to ensure that additional support is offered consistently and effectively, where there is greatest need. However, the overall level of such provision – in effect, the threshold at which support is offered and the kinds and depth of support that are provided – is likely to remain an institutional choice. AMOSSHE’s discussion of universities’ duty of care suggests the level of provision likely to be required by law.
Where wellbeing or mental health information is derived from existing learning analytics processes, the stronger controls in this code should be used from the point where the wellbeing/health purpose separates from the learning analytics one. In other words, where the aim becomes to identify potential health issues rather than academic ones. For example:
- If the organisation decides to collect additional data for the wellbeing/health purpose, this code, rather than the learning analytics one, should apply to the decision to collect that data and thereafter
- If the organisation decides to use different algorithms for the wellbeing/health purpose, use this code from the decision to use those algorithms
- If tutors, or other support staff, are told “this pattern of learning problems may benefit from a wellbeing discussion”, use this code from the decision to create that instruction
- If, during normal tutorial conversations, an individual tutor suggests to a student that they might seek other kinds of help, that would be covered by normal tutorial processes, not this code of practice
Considering the principles of validity and of enabling positive interventions and minimising adverse impacts may indicate that the purposes of learning analytics and of wellbeing and mental health should separate earlier. For example, those principles may reveal that learning analytics algorithms are not, in fact, the best predictors of wellbeing issues, or that some interventions should take place in a health, rather than tutorial, context.
Key differences from learning analytics
Since wellbeing and mental health analytics is intended to improve students’ health, it should be overseen by health professionals, in the same way as analytics to improve students’ learning should be overseen by learning professionals. Provided it remains under this authority and is subject to appropriate confidentiality rules, the day-to-day operation of wellbeing and mental health analytics may be conducted by appropriately trained and supported tutors and other staff (see the responsibility section for more).
A wider range of data sources may be relevant to wellbeing and mental health than for learning analytics: both environmental indicators of when a student may be in a stressful situation (for example a change of course) and behavioural ones that suggest they may not be coping (for example a sudden change in study pattern). Some of these sources may have been collected for very different purposes, and students may not expect them to be re-used in this way. Institutions will therefore need processes to determine whether it is appropriate to include a particular data source and, if so, what additional measures may be needed (see transparency and consent, and Annex B: purpose and transparency for wellbeing and mental health analytics).
Testing and validation of algorithms and processes are even more important for wellbeing and mental health analytics because of the serious consequences if they go wrong. However, such testing must be conducted separately, using data pseudonymisation and anonymisation wherever possible, to ensure that information does not leak between the test and production processes: testers must not see individual identities, and counsellors must not be able to see data that was provided only for testing (see validity for more).
Since the likely legal justification for proactive wellbeing and mental health analytics is to provide support to individuals, institutions must ensure that adequate services to provide such support will actually be available to individuals when data, algorithms or other signals indicate that they may be needed (see under enabling positive interventions and minimising adverse impacts).
Wellbeing and health applications will require a formal Data Protection Impact Assessment (DPIA), involving stakeholders and the organisation’s data protection team (see responsibility and Annex A: data protection impact assessment template for wellbeing and mental health analytics).
Code of practice for wellbeing and mental health analytics
Jisc’s code of practice for learning analytics provides a baseline for supplementary uses of student data.
This code uses the same headings: for each it highlights key common areas (for which detail can be found in the learning analytics code) before a detailed discussion of additional issues raised by the use of data for wellbeing and mental health purposes.
Supporting documents
Annex A: data protection impact assessment template for wellbeing and mental health analytics (pdf)
This template is an example of how you can record your DPIA process and outcome. Begin filling it out at the start of any major project involving the use of personal data, or when making a significant change to an existing process.
Annex B: purpose and transparency for wellbeing and mental health analytics (pdf)
Useful questions designed to help institutions gather and record the information needed to assess purpose compatibility and transparency requirements. The annex also shows how to create a heatmap giving a visual indication of how challenging it may be to incorporate particular data sources.
Responsibility
From the code of practice for learning analytics:
- "Institutions must decide who has overall responsibility for the legal, ethical and effective use of analytics"
- "Student representatives and key staff groups at institutions should be consulted about the objectives, design, development, roll-out and monitoring of analytics"
Confidence and trust among students, staff and wider stakeholders are essential if wellbeing activities are to be beneficial, rather than harmful.
To achieve this, institutions will need to show that they are taking responsibility:
- Consulting and planning carefully before implementing any policies, processes, systems or data gathering
- Checking to ensure they deliver the expected results
The GDPR’s principle of accountability addresses many of these issues – designing processes and systems to ensure they protect personal data and the rights of individuals, monitoring those processes to ensure they are followed, and reviewing them to see where they can be improved. This code suggests various documents and records – assessments of data protection impact and purpose compatibility; records of processing activity; mappings of data flows; and policies on use of special category data – that the institution can use to demonstrate accountability and reassure students, staff and stakeholders.
Applications that aim to derive information about an individual’s health are likely to represent a high risk to privacy, and thus require a formal Data Protection Impact Assessment (DPIA). This includes identifying the relevant legal basis or bases for processing and ensuring that their specific requirements are satisfied.
Several organisations have published processes for conducting DPIAs, including ucisa and the Information Commissioner’s Office. Annex A: data protection impact assessment template for wellbeing and mental health analytics (pdf) offers specific guidance on using these processes to assess proposed wellbeing activities.
Where a high risk cannot be mitigated – though a successful DPIA process should normally do this – the institution should consider whether to continue with the proposal. If it decides to do so, the law requires prior consultation with the national data protection regulator: in the UK, the Information Commissioner’s Office.
The law requires that processing for preventive medicine must be done “under the responsibility of a professional subject to the obligation of professional secrecy” (Data Protection Act 2018 s.11(1)(a)). For wellbeing and mental health applications, UUK suggests that such regulated professionals should be found in student support directorates; both Jisc and UUK recommend “extensive consultation with mental health and student counselling specialists”.
Provided policies and processes remain “under the responsibility” of such professionals, day-to-day operations can be assigned to appropriately trained and resourced tutors and other staff in accordance with appropriate confidentiality rules.
Student Minds’ University Mental Health Charter stresses that “it is vital that staff in these roles are properly equipped, qualified, registered and supervised. This need for quality assurance extends to other interventions, such as the provision of digitally based services”.
Transparency and consent
From the code of practice for learning analytics:
- “The data sources, the purposes of the analytics, the metrics used, who has access to the analytics, the boundaries around usage and how to interpret the data must be explained clearly to staff and students”
- “Collection and use of data for [new purposes] may require further measures, such as data protection impact assessments and obtaining additional consent”
Because health-related applications involve special category data, the legal standards for transparency and consent (if that is the chosen legal basis) are likely to be stronger than for learning analytics.
Privacy notices
Individuals must be informed which data will be used for wellbeing and mental health purposes. This may be done through a privacy notice at the time of collection and/or through additional communications before data are used; where information is received from third parties, individuals must be informed before it is used and, at the latest, one month after it is received. Such notices and communications also provide an opportunity to explain that institutions have responsibilities beyond just teaching.
The Information Commissioner’s Office has guidance on the content of privacy notices. All notices and communications must be written so as to enable individuals to make informed choices. Special care is needed to ensure clarity and fairness when addressing those under 18: in particular, when providing information about the processing and its consequences, offering choices to individuals, and explaining the rights they have and how to exercise them.
As well as transparency to individuals, institutions can also build trust and confidence more widely by being transparent about how they design and review their processes and systems.
Publishing data protection impact assessments, purpose compatibility assessments and records of processing activity can demonstrate both that the institution is thinking very carefully about what it does, and that it is providing important support services while minimising the risk to individuals.
Purpose compatibility
Whereas learning analytics will generally be based on data about the learning process, for wellbeing and mental health a wider range of data sources may contain relevant information.
This is likely to include both environmental indicators of when a student may be in a stressful situation and behavioural ones that suggest they may not be coping. Particular care must be taken to inform individuals if unexpected data (eg finance) are incorporated into wellbeing models or processes, and to enable them to check and correct this information.
Such data should always have a plausible, and explained, connection to wellbeing and mental health, not just a statistical correlation. For example, financial difficulties might well be a factor in reducing a student’s wellbeing. In addition, the original reason for collecting/obtaining the data must be compatible with the new purpose of offering wellbeing/mental health interventions.
This requirement is likely to be met where information is already used to provide individual academic or health support, but less so where information was originally collected for statistical, or other, purposes. For example, if you already provide additional support to students with no family experience of higher education then wellbeing support is more likely to be a compatible purpose than if you only collect that information for statistical reporting.
Where the original purpose is not compatible with wellbeing, the privacy notice must first be changed; only data collected after the notice is changed may then be used for the new purpose.
Transparency is likely to be a particular challenge where institutions receive information from third parties since these may offer limited, or no, control over privacy notices. Regular sharing of data with third parties should be covered by a data sharing agreement.
Annex B: purpose and transparency for wellbeing and mental health analytics (pdf) has a more detailed discussion of how to assess purpose compatibility and the need for notification.
Before including a particular data source in a wellbeing/health analytics model, institutions must therefore consider the following (an illustrative scoring sketch follows this list):
- How privacy notices will be provided
- (For existing data) whether wellbeing/health is compatible with the purpose(s) for which the data are currently collected
- How they will ensure the data are accurate (see validity)
- How students can exercise their legal rights over their data (see access)
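The sketch below illustrates, in Python, one way such an assessment might be recorded and visualised, in the spirit of the Annex B heatmap. All source names, dimensions and scores here are hypothetical examples, not recommendations; real assessments need the judgement of the data protection officer and the responsible health professionals.

```python
# Illustrative only: scoring candidate data sources against the four
# considerations above. Sources, dimensions and scores are hypothetical.

DIMENSIONS = [
    "privacy notice provided",
    "purpose compatibility",
    "accuracy assured",
    "individual rights supported",
]

# 1 = straightforward, 2 = needs extra measures, 3 = very challenging
candidate_sources = {
    "VLE engagement": [1, 1, 2, 1],
    "attendance records": [1, 2, 2, 1],
    "library usage": [2, 2, 2, 2],
    "finance/debt data": [3, 3, 2, 2],
}

SHADES = {1: ".", 2: "o", 3: "#"}  # darker symbol = more challenging

def render_heatmap(sources: dict) -> str:
    """Render a crude text heatmap; '#' cells flag sources needing more
    work (or exclusion) before use in wellbeing analytics."""
    width = max(len(name) for name in sources)
    header = " " * width + "".join(f"{'D' + str(i + 1):>4}" for i in range(len(DIMENSIONS)))
    rows = [
        f"{name:<{width}}" + "".join(f"{SHADES[s]:>4}" for s in scores)
        for name, scores in sources.items()
    ]
    return "\n".join([header] + rows)

if __name__ == "__main__":
    print(render_heatmap(candidate_sources))
    for i, dim in enumerate(DIMENSIONS, 1):
        print(f"D{i}: {dim}")
```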
Where a basis other than consent is used, institutions should have a policy document that sets out the legal basis/bases for the processing and describes how the processing satisfies the data protection principles. In particular, this document must state how long wellbeing/health data will be retained for, and how it will be erased. The institution must be able to demonstrate that it is complying with this retention and erasure policy, and that the policy document is being reviewed regularly and updated as necessary. The Information Commissioner’s guide to special category data has more information on when this “appropriate policy document” is required and what it should contain.
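As a minimal sketch of how compliance with a retention and erasure policy might be made demonstrable, the following routine erases records past their stated retention period and logs each run for audit. The SQLite schema, table and field names are illustrative assumptions, not taken from the code of practice.

```python
# Minimal sketch: enforce a stated retention period and log each erasure
# run so compliance can be demonstrated. Names are illustrative.
import sqlite3
from datetime import datetime, timedelta, timezone

RETENTION = timedelta(days=2 * 365)  # e.g. a policy of "retained for two years"

def erase_expired(conn: sqlite3.Connection) -> int:
    """Delete wellbeing records older than the retention period and
    record the run in an audit log; returns the number of rows erased."""
    cutoff = (datetime.now(timezone.utc) - RETENTION).isoformat()
    cur = conn.execute(
        "DELETE FROM wellbeing_records WHERE collected_at < ?", (cutoff,)
    )
    conn.execute(
        "INSERT INTO erasure_log (run_at, cutoff, rows_erased) VALUES (?, ?, ?)",
        (datetime.now(timezone.utc).isoformat(), cutoff, cur.rowcount),
    )
    conn.commit()
    return cur.rowcount

if __name__ == "__main__":
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE wellbeing_records (id INTEGER, collected_at TEXT)")
    conn.execute("CREATE TABLE erasure_log (run_at TEXT, cutoff TEXT, rows_erased INTEGER)")
    old = (datetime.now(timezone.utc) - timedelta(days=800)).isoformat()
    conn.execute("INSERT INTO wellbeing_records VALUES (1, ?)", (old,))
    print(erase_expired(conn), "expired record(s) erased")
```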
Consent
If consent is used as a basis for processing (eg for installing a wellbeing app, providing additional data, or informing a tutor of contact with a counselling service) there must be a separate “express statement of consent” to the use of health data for each purpose. So a student who volunteers health information in requesting special examination or lecture arrangements, for example, must have a separate choice whether or not that information is also used in wellbeing assessments.
Step four of Annex A: data protection impact assessment template for wellbeing and mental health analytics (pdf) discusses when consent will and will not be an appropriate basis for processing and the alternatives that exist.
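To illustrate the per-purpose principle, here is a minimal, hypothetical sketch of a consent register that records a separate express statement for each purpose and supports withdrawal. The names and structure are illustrative, not a prescribed design.

```python
# Hypothetical sketch: one express consent record per purpose, never
# bundled, with withdrawal as easy as giving consent.
from dataclasses import dataclass
from datetime import datetime, timezone
from typing import Optional

@dataclass
class ConsentRecord:
    purpose: str                      # a single purpose per record
    statement_shown: str              # the express statement agreed to
    given_at: datetime
    withdrawn_at: Optional[datetime] = None

class ConsentRegister:
    def __init__(self) -> None:
        self._records: dict = {}      # (student_id, purpose) -> ConsentRecord

    def give(self, student_id: str, purpose: str, statement: str) -> None:
        self._records[(student_id, purpose)] = ConsentRecord(
            purpose, statement, datetime.now(timezone.utc)
        )

    def withdraw(self, student_id: str, purpose: str) -> None:
        record = self._records.get((student_id, purpose))
        if record:
            record.withdrawn_at = datetime.now(timezone.utc)

    def may_process(self, student_id: str, purpose: str) -> bool:
        record = self._records.get((student_id, purpose))
        return record is not None and record.withdrawn_at is None
```

On this model, a student who consents to “exam adjustments” has given no consent at all for “wellbeing analytics”: the register simply holds no record for that purpose.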
Withdrawal or objection
Where the legal basis for wellbeing/health processing is consent, individuals always have the right to withdraw their consent at any time. Note, however, that so long as a statistical model does not contain personal data, such a withdrawal should not extend to requiring a model to be recalculated.
In other cases – except where institutions have a legal obligation to process health information, or when there is a threat to life and the individual is incapable of giving consent – individuals are likely to have a right to object.
Formally, this only requires the institution to consider whether the individual’s personal circumstances mean the processing places them at higher risk. Where the processing is intended to support the individual’s wellbeing and mental health, it may be better to treat such objections as a simple opt-out, and record that the individual’s data should not be used either for developing systems and processes or for providing personalised treatment. There is unlikely to be any benefit to the institution or to others that justifies continuing to process for wellbeing against an individual’s wishes. Since wellbeing support is designed to benefit the individual, institutions may wish to reflect on why such support was refused.
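A simple opt-out of this kind might be honoured at both stages, as in the following hypothetical sketch (identifiers and field names are illustrative): excluded individuals are removed both from data used to develop or review models and from personalised alerting.

```python
# Hypothetical sketch: an objection recorded as an opt-out honoured at
# both stages - model development/review and personalised alerting.
OPTED_OUT: set = set()

def record_objection(student_id: str) -> None:
    OPTED_OUT.add(student_id)

def development_extract(rows: list) -> list:
    """Exclude opted-out students from data used to build or review models."""
    return [row for row in rows if row["student_id"] not in OPTED_OUT]

def should_alert(student_id: str, score: float, threshold: float) -> bool:
    """Never raise a personalised wellbeing alert for an opted-out student."""
    return student_id not in OPTED_OUT and score >= threshold
```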
Privacy
From the code of practice for learning analytics:
- “Access to student data and analytics should be restricted to those identified by the institution as having legitimate need to view them”
- “Institutions should ensure that student data is protected when third parties are contracted to store or carry out analytics on it”
As for learning analytics, systems must be designed to protect individuals’ privacy. Health-related processing and data are likely to require tighter restrictions (both technical and organisational) than those relating to learning. Medical standards for confidentiality, and for granting and controlling access, should be the norm.
Systems and processes must be designed to use no more data than is necessary (see validity); data obtained for one purpose must not be used for others without the individual’s agreement (see consent); and data should have a defined retention period or event, and be deleted or anonymised once that passes.
Health or wellbeing information can only be shared with third parties if there is an appropriate legal basis for this. For example (a simplified decision sketch follows this list):
- If processing is based on consent then sharing must be covered by that prior consent
- If sharing is part of physical, mental or emotional wellbeing services then information may only be shared – under an appropriate data sharing agreement – with those providing those services
- If there is a legal duty to share, this must be limited to information covered by that duty, and under the safeguards prescribed
- If none of these applies then information may only be shared in life and death situations where the subject of the information is incapable of giving their consent
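The following much-simplified Python sketch illustrates the shape of this decision logic. The enumeration and flags are assumptions for illustration only; real sharing decisions need the data protection officer and, for health data, the responsible health professional.

```python
# Much-simplified, illustrative sketch of the sharing rules listed above.
from enum import Enum, auto

class Basis(Enum):
    CONSENT = auto()
    WELLBEING_SERVICES = auto()
    LEGAL_DUTY = auto()
    NONE = auto()

def may_share(basis: Basis, *,
              consent_covers_sharing: bool = False,
              recipient_provides_services: bool = False,
              has_sharing_agreement: bool = False,
              within_scope_of_duty: bool = False,
              life_at_risk: bool = False,
              subject_can_consent: bool = True) -> bool:
    if basis is Basis.CONSENT:
        return consent_covers_sharing            # prior consent must cover it
    if basis is Basis.WELLBEING_SERVICES:
        # only to those providing the services, under a sharing agreement
        return recipient_provides_services and has_sharing_agreement
    if basis is Basis.LEGAL_DUTY:
        return within_scope_of_duty              # and under prescribed safeguards
    # otherwise: only life-and-death situations where the individual
    # is incapable of giving consent
    return life_at_risk and not subject_can_consent
```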
Validity
From the code of practice for learning analytics:
- "It is vital that institutions monitor the quality, robustness and validity of their data and analytics processes in order to develop and maintain confidence in analytics and ensure it is used to the benefit of students”
Given the high risks of adverse consequences, it is essential to ensure that data and predictions derived from them are relevant and accurate.
Systems and processes for wellbeing support may use personal data in three different ways:
- First, when developing models that suggest indicators of need
- Second, “production”, when using the models to identify which individuals may benefit from intervention
- Third, when reviewing whether the intervention processes were beneficial
At each stage accurate data is essential to reduce the risk of inappropriate interventions. Students and staff should therefore be enabled and encouraged to exercise their right to correct errors and omissions in their data, but institutions should not rely on this as the only way to ensure accuracy. Processes for obtaining and handling data should also be designed with safeguards to avoid introducing errors, and to detect those that may nonetheless arise.
Processing to develop and review models, systems and processes is vital, but must be kept separate from processing leading to interventions with individuals to ensure that, for example, validation data does not leak into the intervention process and testers are not able to identify individuals.
At each of the three stages, the processing of personal data must be minimised (ie no more than is necessary to achieve the purpose), while delivering effective results.
Development and review are likely to require a wider range of personal data than production systems. To determine effectiveness, they need historic data on the outcomes of past interventions and non-interventions. To identify the most informative data sources, they will consider sources and fields that are subsequently excluded from production models: for example, because tests conclude that they do not make a significant contribution to alerts, because the risk of including them is not justified by the benefit, because their accuracy cannot be ensured, or because the required privacy notices and individual rights cannot be supported.
The greater range of data used in development and review requires particular care to be taken to minimise the risk of data processed for these purposes being linked to individuals. Synthetic, anonymous or pseudonymous data should be used wherever possible: the GDPR recognises pseudonymisation as a safeguard, but still classes pseudonyms as personal data; processes for generating anonymous or synthetic data must be reviewed periodically to ensure they remain safe.
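As one illustration, keyed hashing can pseudonymise identifiers consistently, so the same student can be followed across development datasets without revealing who they are. The sketch below is a minimal example assuming a secret key managed outside the development environment; because the key-holder can re-identify individuals, the output remains personal data.

```python
# Minimal pseudonymisation sketch, assuming a secret key held outside the
# development environment. The GDPR still treats the output as personal
# data, because the key-holder could re-identify individuals.
import hashlib
import hmac
import os

# Illustrative key management: in practice the key stays on the production
# side and is rotated between development exercises.
SECRET_KEY = os.environ.get("PSEUDONYM_KEY", "example-key-do-not-use").encode()

def pseudonymise(student_id: str) -> str:
    """Derive a stable pseudonym with a keyed hash (HMAC-SHA256)."""
    return hmac.new(SECRET_KEY, student_id.encode(), hashlib.sha256).hexdigest()[:16]

def development_view(record: dict) -> dict:
    """Strip direct identifiers before a record leaves production."""
    redacted = {k: v for k, v in record.items()
                if k not in {"student_id", "name", "email"}}
    redacted["pseudonym"] = pseudonymise(record["student_id"])
    return redacted
```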
Those developing models should be aware of, and manage, the risks that they may inadvertently reveal personal data. The ICO’s AI framework (pdf) contains more detail on privacy attacks.
Development and periodic review must ensure that models are, and remain, proportionate. They should also be checked for signs of bias or discrimination. Models must provide useful information to guide the provision of support while involving the least possible risk to individuals: both those who are identified as needing support and those who are not. Any predictive system or process will make mistakes: organisations should consider, and balance, the risk of alerting someone who did not need support, as well as failing to alert someone who did (see also Enabling Positive Interventions, below). The ICO’s AI framework contains more detail on the use of algorithmic techniques with personal data.
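One such check might compare false positive and false negative alert rates across student subgroups, as in this illustrative sketch (the field names and record structure are assumptions):

```python
# Illustrative sketch of one bias check: comparing false positive and
# false negative alert rates across student subgroups.
from collections import defaultdict

def error_rates_by_group(records: list) -> dict:
    """Each record: {'group': str, 'alerted': bool, 'needed_support': bool}.
    Returns {group: (false_positive_rate, false_negative_rate)}."""
    counts = defaultdict(lambda: {"fp": 0, "fn": 0, "neg": 0, "pos": 0})
    for record in records:
        c = counts[record["group"]]
        if record["needed_support"]:
            c["pos"] += 1
            c["fn"] += not record["alerted"]  # missed someone needing support
        else:
            c["neg"] += 1
            c["fp"] += record["alerted"]      # alerted someone who did not
    return {group: (c["fp"] / max(c["neg"], 1), c["fn"] / max(c["pos"], 1))
            for group, c in counts.items()}
```

Marked differences between groups should prompt review of the model and its data sources before, not after, deployment.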
Access
From the code of practice for learning analytics:
- “Students should be able to access all analytics performed on their data in meaningful, accessible formats”
- “They should normally also be able to view the metrics and labels attached to them”
As for learning analytics, individuals have a right of access to their personal data.
For data concerning health, however, institutions must first consult with the “relevant health professional” to ensure that disclosing the information is not likely to cause serious harm to the physical or mental health of the data subject or another individual (Data Protection Act 2018, schedule 3, part 2).
Enabling positive interventions and minimising adverse impacts
From the code of practice for learning analytics:
- “Institutions should specify under which circumstances they believe they should intervene”
- “The type and nature of interventions, and who is responsible for carrying them out, should be clearly specified”
- “The impact of interventions on staff roles, training requirements and workload should be considered”
- “Analytics systems and interventions should be carefully designed and regularly reviewed to ensure that: students maintain appropriate levels of autonomy in decision-making; knowledge that their activity is being monitored does not lead to negative impacts; adverse impacts are minimised; staff have a working understanding of legal, ethical and unethical practice”
As with access, some interventions carry a risk of making a wellbeing or mental health problem worse, rather than better. Talking to someone about stress, depression or suicide requires both training and readily available support.
Data and algorithms will flag individuals with widely differing needs, so personalised support is likely to be required. Note that this may also apply where a concern has been raised but appears to be a false alarm: as well as reviewing the model and process that led to the concern being raised, institutions should consider whether such individuals now need support, to avoid the alert becoming a self-fulfilling prophecy.
Institutions should therefore consider which interventions should be provided in a medical context, in case of a negative reaction or consequences, and should ensure that they can provide appropriate support before implementing any wellbeing/health application.
Stewardship
From the code of practice for learning analytics:
- “Data for analytics must comply with existing institutional data policies and [relevant legislation]”
The involvement of institutional data protection officers (DPOs) will be essential to maintaining accountability and compliance in this complex and developing area. Regular reviews of the institution’s policies, practices and risk assessments should include both DPOs and appropriate health professionals.
These reviews should cover, as a minimum, the data protection impact assessment, purpose compatibility assessment, and the policy document and processes for special category data.
However, responsible and respectful use of data is only likely to be ensured by an appropriate cross-institutional culture – in Universities UK’s terms a “whole-university approach”, or Student Minds’ “cohesive ethos”.