Responsible use of Generative AI in social care: Guidance

Background and context

Generative AI is a type of artificial intelligence (AI) that can learn and create new content. Generative AI models are trained on very large datasets to identify patterns and structures within the data. Once trained, Generative AI models can generate new content that is similar, but not identical, to the data they were trained on – this can include text, images, music and more.

There has been growing interest in using Generative AI in social care, both to support “back office” functions, including writing reports and generating materials, and to interact directly with people who use care and support, for example through chatbots (software programs that can hold a conversation similar to one with a human, often with a human-sounding voice). We are in the early stages of using Generative AI technologies, and we do not yet fully understand how, when and by whom they will be used in future.

Generative AI has enormous potential to contribute to improving care and support for millions of people. However, there are also risks, particularly given that these technologies are relatively new and still developing. We don’t know how these technologies will develop in future, and there is little evidence available about the medium- or long-term impacts of their use. It is therefore vital that, when Generative AI is used in care and support, it is used carefully and responsibly.

We also need to be aware of the context into which these technologies are being introduced. Across the system of care and support we see growing levels of unmet and under-met need; a workforce impacted by low pay, poor retention and problems recruiting; significant cost and other pressures on providers; and statutory bodies being asked to meet growing demand with limited resources. These challenges mean that we need to be especially cautious about how, when and why we choose to use new technologies.

 

About this guidance

This guidance is designed to support the responsible use of Generative AI in social care. The ‘responsible use of (generative) AI in social care’ means that, when AI systems are used in people’s care or in relation to it, they support, and do not undermine, harm or unfairly breach, the fundamental values of care, including human rights, independence, choice and control, dignity, equality and wellbeing.

This guidance, which has been co-produced with people who draw on care and support, care workers, care providers, technology providers and others, sets out a set of key principles that need to be considered when implementing Generative AI tools in social care. Working in co-production is best practice in the social care field and a way of working we hope to see embedded in approaches to Generative AI in social care.

It is expressed as a series of “I” and “we” statements in different domains, mirroring the format of Making It Real, a framework for understanding what good social care looks and feels like, which has been incorporated into the Care Quality Commission’s framework for assessing social care. In the same way, these “I” and “we” statements describe what good looks like when Generative AI is used in social care. They do not necessarily reflect people’s current experience, but we hope that they can inform a better future. The statements draw upon, and complement, a range of products developed by and for different stakeholders as part of this project.

We hope that this document can provide a practical guide for all those considering how Generative AI might be used in care and support. They can use it to think through the key issues they need to consider and the policies and procedures they need to put in place to ensure that their use of Generative AI is responsible and contributes to better outcomes and experiences for people.

Those involved in designing and delivering care and support, including those planning and commissioning care and support and those providing care services (“providers of care” below), can use the statements to reflect on their own practice: considering whether the “we” statements are a good reflection of the work they are doing, and whether they are confident their work will enable more people who draw on care and support, and care workers, to make the “I” statements set out here.

Those designing and developing new technologies (“technology providers” below) can use the statements to ensure that their practice is helping to enhance people’s lives and to realise the rights of individuals expressed in the ‘I’ statements.

People who draw on care and support and people who work in the provision of social care – including care workers, social workers, registered nurses, occupational therapists and others – can use these statements to identify where the care and support services they interact with may be able to support them better or where they need to change.

The document uses the term “technologies” to refer to the wide range of products, services and tools which may bring Generative AI (and other AI tools) into use in the provision of care and support.

The statements below are organised by domain. For each domain, statements are set out for four groups: people who draw on care and support, people who work in social care, providers of care, and technology providers.

Improving care and support

As people who draw on care and support:

“I am confident that the technologies used in my care and support are designed to enhance and improve my care, and to prioritise my well-being.”

As people who work in social care:

“I am confident that the technologies I am asked to use in my work help to improve outcomes for the people I support.”

“I have access to technologies that support me to do my work and which improve my experience at work.”

As providers of care:

“We plan the use of technologies with the well-being of people who draw on care and support as our priority.”

“We use technologies to enhance the care and support we offer. Our goal is better care and support.”

“We ensure the technologies we use are suitable, safe and effective for the purposes for which we use them.”

“We consider the impact of the technologies we use on the people who work in our services.”

“We judge the effectiveness of the technologies we use in terms of outcomes for people who draw on care and support.”

As technology providers:

“We develop technologies with a focus on the well-being of our users. We are motivated by delivering better care and support for people who draw on it.”

“We ensure our technologies are effective and safe for the purposes for which they were created.”

Choice and control

As people who draw on care and support:

“I am able to choose how and when technologies are used in the provision of my care and support.”

“I am able to access the support I need to make an informed choice and to change my mind if I wish.”

“I am offered alternatives to using technologies, where I choose not to use them.”

As people who work in social care:

“I am empowered to support people in making informed choices around their use of technologies, including enabling people to change their minds about their use.”

As providers of care:

“We support people to make informed choices around their use of technologies, offering alternatives where people choose not to use them.”

As technology providers:

“We develop technologies that can be personalised in line with our users’ choices.”

Accessibility

As people who draw on care and support:

“I am able to use a range of appropriate technologies which are accessible to me.”

“I am able to use appropriate technologies to improve my access to the things that matter to me.”

As people who work in social care:

“I am able to access a range of technologies which meet people’s accessibility requirements.”

“The technologies in use in my work are accessible to me.”

“I am not expected to cover the costs of the equipment and connectivity required to use technologies in my work.”

As providers of care:

“We are committed to ensuring that the technologies we use are as accessible as possible.”

“We ensure that no one is disadvantaged by a lack of access to technologies.”

As technology providers:

“We take proactive steps to ensure that our products meet accessibility standards and are tailored appropriately to our users.”

“We offer clear information about who our technologies are appropriate for, so that people can make informed choices.”

Training

As people who draw on care and support:

“I am able to access appropriate support and training to use technologies effectively.”

As people who work in social care:

“I am trained in the use of the technologies which I am asked to use in my work.”

“I am able to access training to support me in developing my skills in relation to the use of technologies in care and support.”

As providers of care:

“We support our staff to develop skills and capabilities in relation to the effective and responsible use of technologies in care and support.”

As technology providers:

“We offer support and training to those who use our products.”

Data privacy

As people who draw on care and support:

“I have clear information about whether and how my data will be collected, stored, accessed, shared and used by the technologies I use, so that I am able to make an informed choice about whether and how to use them.”

“I have the right support to give informed consent around the use of my personal data.”

"I am confident that my personal data is being handled in accordance with all existing legal frameworks."

As people who work in social care:

“I am supported to communicate clearly with the people I work with about how their data will be collected, stored, accessed, shared and used within the technologies being used in the provision of care and support. I am able to support people to make informed choices.”

“I am equipped and empowered to support people to give ongoing and active consent around their data.”

“I have clear information about how my personal data is being collected, stored, shared and used by the technologies in use in the services in which I work and whether, and how, I am able to opt out of using these technologies.”

As providers of care:

“We have clear policies in place around data collection, storage, access, sharing and use in care and support, and we ensure that the technologies we use comply with these.”

“We understand how the technologies we use collect, store, share and use data, and are able to communicate this clearly internally and externally.”

“We provide clear and accessible information about how data is being collected, stored, shared and used, providing additional support as required, to enable informed consent by end users.”

“We recognise that consent is an ongoing process and we offer additional support where needed to secure meaningful consent to the use of technologies in care and support.”

“We ensure that the technologies we use comply with all existing legal frameworks for data protection.”

As technology providers:

“We communicate clearly how our technologies collect, store, access, share and use data.”

“We provide clear and accessible information about how data is being collected, stored, shared and used within our technologies, providing additional support as required, to enable informed consent.”

“We ensure that the technologies we offer comply with all existing legal frameworks for data protection.”
Transparency"I have access to information and advice that is clear, timely, and tailored to me so that I can make informed decisions about my use of technologies."

“I understand when and how technologies are being used in my care and support.”

“I have the right support to give informed consent to the use of technologies in my care and support.”

As people who work in social care:

“I understand when and how technologies are being used in the care and support I provide and am able to communicate about this clearly with the people I work with.”

“I am equipped and empowered to support people to give ongoing and active consent to the use of technologies in their care and support.”

“I have access to clear information about how technologies are being used by my employer in relation to my work.”

As providers of care:

“We provide clear information about how technologies may be used in our services.”

“We make it clear when people are interacting with technologies in their dealings with us.”

“We provide as much information as possible about the technologies we use and are open about the things we don’t know.”

“We recognise that consent is an ongoing process and we offer additional support where needed to secure meaningful consent to the use of technologies in care and support.”

As technology providers:

“We offer clear and accessible information in a range of formats to support people in understanding how and when our products are in use.”

“We take steps to ensure that our users know when they are interacting with technologies rather than people.”

Accountability

As people who draw on care and support:

“I know who takes responsibility for any decisions made about my care and support, where these decisions are supported by technologies.”

“I know who can be held responsible if things go wrong with the technologies I am using in my care and support.”

“I am able to provide feedback on the technologies I use, and to seek redress when things go wrong.”

"I am supported to understand and manage any risks of using technologies in my care and support."

As people who work in social care:

“I know about my responsibilities in relation to any technologies I am asked to use in my work.”

“I have the skills and capabilities to overrule technologies, where needed, wherever I am expected to take responsibility for the work done by these technologies.”

“I am clear about who is responsible if things go wrong in relation to the technologies being used to monitor or assess my work.”

As providers of care:

“We have policies that set out clearly who is accountable for the work done by technologies that are in use in the care and support we provide.”

“We make clear to our staff when they are expected to take responsibility for checking the work done by the technologies in use in our services. We empower them with skills and capabilities to overrule technologies where needed.”

“We have clear policies around who is responsible for assuring the quality and safety of technologies in use in the care we provide.”

“We have accessible and effective mechanisms for people to tell us when things have gone wrong, and to ensure that complaints are acted upon.”

“We communicate clearly how individuals can seek redress if things go wrong when using technologies we provide.”

As technology providers:

“We provide clear information about appropriate use of the products we provide and the levels of oversight needed to use them safely.”

“We offer accessible routes for feedback on our products and take responsibility for making changes when things go wrong.”

Human contact and connection

As people who draw on care and support:

“I have access to technologies which enhance my sense of connection and support me in forming and maintaining relationships that matter to me.”

“I am supported by people who see me as a unique person with strengths, abilities and aspirations. Technologies enhance my experience of support from these people.”

“I am able to exercise choice around how my needs for social and emotional support are met.”

As people who work in social care:

“I am able to use technologies which free up more of my time to provide human connection and relational support to the people with whom I work.”

“My work is enhanced, rather than replaced, by technologies.”

As providers of care:

“We use technologies to enhance the work that is done by people in care and support.”

“We do not make assumptions about whether people’s social or emotional needs can be met by technologies.”

As technology providers:

“We communicate clearly how our technologies can complement and enhance the work people do in providing care and support.”

Addressing bias and avoiding discrimination

As people who draw on care and support:

“I can access technologies that treat me as a unique individual, recognising and respecting my unique identity, needs and circumstances.”

"I am confident that any technologies being used in my care and support are compliant with all relevant equality and human rights laws and frameworks .”

“I know who to contact if I feel I am being impacted by bias or discrimination as a result of the use of technologies in my care and support.”

As people who work in social care:

“I am confident that the technologies used by my employer include appropriate safeguards against bias and discrimination against individuals or groups, including people with protected characteristics.”

“I am empowered to make decisions against the recommendations of technologies in use in my work where I feel discrimination may result. I am given appropriate training to do this.”

“I know who to contact if I feel I am being impacted by bias or discrimination as a result of the use of technologies in my work.”

As providers of care:

“We ask the suppliers of the technologies we use how they safeguard against bias against individuals or groups, including those with protected characteristics, in their systems. We only use tools with adequate safeguards in place.”

“We train our staff to spot when technologies may be biased against individuals or groups, including those with protected characteristics. We empower them to take action to ensure people are treated fairly.”

“We engage in proactive ongoing monitoring for discrimination and bias against individuals or groups, including those with protected characteristics, in our systems.”

As technology providers:

“We recognise the risks of bias and discrimination against individuals or groups, including those with protected characteristics, which are inherent in some data sets. We take proactive steps to minimise these risks.”

“We monitor for discrimination and bias against individuals or groups, including those with protected characteristics, in our systems. We respond to feedback on these matters.”

Continuous improvement

As people who draw on care and support:

“I know that the technologies I use are continuously improving.”

“There are clear mechanisms for providing feedback on the technologies in use in my care and support. I know that my feedback is listened to and acted upon."

As people who work in social care:

“I am able to provide feedback on the technologies in use in my work.”

“I am confident that the feedback I provide is listened to and acted upon.”

As providers of care:

“We ask our technology suppliers to provide mechanisms for feedback.”

“We favour providers who have clear processes in place for continuous improvement.”

“We regularly check the technologies we use are functioning well and are safe for the purposes for which we use them.”

“We share information on how to provide feedback on the technologies we use, with the people who work in our services and with people who draw on care and support.”

As technology providers:

“We take a proactive approach to continuous improvement, and welcome feedback on our products.”

“We regularly check our technologies are functioning well and are safe for the purposes for which they are intended, and we alert end users if this changes.”

“We are committed to acting on feedback to improve our products.”

“We are motivated by improving outcomes for people who draw on care and support.”

Co-production

As people who draw on care and support:

“I am offered opportunities and support to get involved in the development of specialist technologies which may be used in my care and support, and the plans and policies which determine how technologies are used.”

“I am able to contribute on an equal basis. My contributions are valued and I am compensated for the time I spend contributing to this work.”

As people who work in social care:

“I am offered opportunities and support to get involved in the development of specialist technologies which may be used in my work, and to contribute to the plans and policies which determine how technologies are used.”

“I am able to contribute on an equal basis. My contributions are valued and I am paid for the time I spend contributing.”

As providers of care:

“We develop our strategies for the use of technologies in care and support with people who draw on care and support, and the other users of these technologies, including people who work in our care services.”

“We follow best practice in relation to our approach to co-production of technology plans, strategies and procedures.”

“We encourage and support our staff, and the people to whom we provide care and support, to get involved in developing and improving technologies that are relevant to them.”

“When purchasing specialist technologies for use in care and support, we favour providers who can demonstrate their products have been co-produced with end users.”

“We take steps to involve people who draw on care and support and people who work in social care at every stage: setting out strategies for the use of Generative AI, choosing which products to purchase, deciding how to implement them, and providing feedback to suppliers.”

As technology providers:

“We actively involve people who draw on care and support and other users of our products, including people who work in social care, in the development and ongoing improvement of our products.”

“We follow best practice in relation to our approach to co-production of our products.”

Sustainable technologies

As people who draw on care and support:

“I have clear information about who supplies the technologies, the costs, the terms and conditions of supply, and my rights if things change.”

As people who work in social care:

“I am recognised for the value of my work in my local community and national economy, and technology is used to support me in this.”

As providers of care:

“We understand the terms and conditions of supply of any technologies we use or provide, and make contingency plans in case of changes.”

“We understand the implications of our choices around the use of technologies in care and support for the wider local and national economy.”

As technology providers:

“We offer clear information to end users about the terms and conditions of the products we supply, and are transparent about whether and how these might change in future.”

Read more

  • We recognise that some terms may be unfamiliar; please see our AI Mythbuster for help with technical terms.
  • For more detailed guidance on co-production, please visit the TLAP website.