AI, Social Work, and How Own My Story Uses It for Life Story Work
- Louis Moore

- Aug 25
- 6 min read
You may have heard the term “AI” (artificial intelligence) more often recently. To say it’s a trending topic is an understatement. AI is rapidly evolving, and social work is one of the many professions beginning to consider its opportunities and challenges. This article explores what AI is, how it’s currently shaping practice, and how Own My Story is using it to improve life story work.
What is AI and why does it matter to Social Work?
AI describes technologies that process information, learn from data, and generate outcomes such as predictions, recommendations, or creative content. In simple terms, it understands information, finds patterns, and produces responses in text, images, video or audio.
If you’re currently practising as a social worker, you may be encountering AI in your workplace. Tools such as Magic Notes and Copilot are being used to record meetings, write reports and produce assessments; some even offer recommendations for care planning. The attraction is clear: less time spent on admin, more time with children and families.
However, AI remains a contentious issue because of the risks it carries. On one side of the debate, social work academics are asking the profession to “apply the brakes on AI” until clearer regulation is in place. On the other, the UK government is encouraging the use of AI to improve public services by reducing bureaucracy. The debate is ongoing and unresolved.
Policy context for social work
Amid this debate, the British Association of Social Workers (BASW) published the first guidance on AI in March 2025. It takes a balanced view, outlining both risks and benefits, and offers practical advice on how practitioners should engage with AI. BASW has also urged the government to issue statutory guidance to shape future practice.
Whilst we wait for the government’s response, Social Work England are holding discussions with social workers and organisations to understand how AI is shaping practice. They have commissioned two pieces of research investigating the implications for professional standards and ethical practice.
It’s fair to say the profession is currently in limbo. AI is here, but regulation and standards are yet to be implemented.
Benefits and risks of AI for social work practice
Research so far highlights more risks than benefits, perhaps unsurprising given how new AI is to social work. Luke Geoghegan (BASW Head of Policy and Research) uses the phrase “the jury is still out” to describe the profession’s uncertainty about AI. Even so, this has not stopped 70% of UK government services from trialling AI systems, according to The Guardian. Whilst there is certainly hesitation, AI is clearly offering value to curious public bodies.
The common risks associated with AI are clearly identified across the current research and guidance. These include:
Bias – AI is trained on information from the internet and from select groups of people. This information can contain biases relating to protected characteristics, such as race and gender, which may affect the outputs AI gives to social workers.
Privacy and consent – Some AI systems, such as online language models like ChatGPT, use your information to continue training the model. Users are automatically opted into this setting, raising questions around consent.
Contextual inaccuracy – Social workers have reported that AI does not always interpret meetings or visits accurately, for example missing physical or emotional cues. When it encounters knowledge gaps, AI has been found to misrepresent judgements or invent fictional content.
Overlooking ethical implications – There is a concern that big tech companies producing AI systems could minimise ethical issues to sell their products to social work organisations.
Reducing relationship-based practice – As we replace certain human functions with AI, there is a risk that face-to-face practice will be reduced to save money and resources.
The potential benefits of AI are also recognised alongside the risks. Research by Tambe and Rice (2018) highlights how both computer scientists and social workers aim to solve complex social problems. This suggests AI could provide solutions to service challenges which otherwise go unresolved.
Importantly, researchers stress that AI should be developed and monitored in partnership with social workers. This approach is already reflected in tools like Magic Notes, which was created with practitioner input. Its outputs are designed to be reviewed and edited, ensuring that decisions retain human input.
To harness the benefits of AI, social work needs to stay involved in the development of emerging technologies. An overly cautious approach could leave the profession falling behind, with little influence over how these tools affect the people we support.
How Own My Story uses AI
At Own My Story, we’ve chosen to work with a secure, reputable AI platform that integrates with tools already familiar to social workers. This ensures strong data protection and allows us to shape AI outputs around our values.
We use two main types of AI:
Data processing – Quickly turns large amounts of case information into clear timelines and themes.
Language generation – Helps draft child-centred, strengths-based narratives for life story books.
Most importantly, all AI outputs are reviewed and refined by a qualified social worker. This combination of technology and professional judgement ensures every book is accurate, age-appropriate, and empowering.
Managing the risks
The risks of AI are well documented in social work literature, and these have directly shaped how we’ve designed Own My Story. As a service founded by an experienced social worker, our development process is rooted in professional standards and legislation, with the safety and wellbeing of young people, families, and practitioners always at the centre.
One of the clearest risks is bias and contextual inaccuracy. To mitigate this, every stage of our process involves a qualified social worker. AI may be a powerful tool, but it can never replace professional judgement. No matter how much information AI has absorbed, it does not have the training or real-world experience of a social worker.
Data protection is another major concern given the sensitivity of the information involved. We have chosen our AI platform specifically for its secure, trusted systems, and we configure its settings to meet UK GDPR standards. Information is processed and stored in a single encrypted, contained environment rather than being spread across multiple platforms, so everything stays secure in one place.
We also prioritise the “informed” aspect of consent. Young people deserve to understand how AI is being used in their life story work. As part of our referral process, social workers will complete forms alongside the child to gain their consent. We will provide an accessible, visual guide to explain AI in simple terms and outline how their rights will be safeguarded.
Finally, we’ve designed our service to address concerns about AI reducing relationship-based practice. We believe AI should enhance, not replace, direct work. By streamlining the time-consuming tasks of gathering and structuring information, Own My Story gives social workers more space for the face-to-face, therapeutic side of life story work. Our sessions are deliberately structured around existing visiting patterns, encouraging collaboration between young people and their social workers. We aim to promote relationship-based practice, while reducing the administrative burden that often gets in the way.
Why AI matters for life story work
Life story work is often neglected, delayed, or rushed due to lack of time and resources (NICE, 2021). Yet every child in care deserves a clear, compassionate record of their journey.
AI offers us a way to transform this. By unlocking stories hidden in complex files, reducing admin, and supporting practitioners, we can ensure more children receive high-quality life story books, created faster, but with the same care and empathy.
As AI continues to be introduced into social work, its regulation and long-term impact remain uncertain. When used responsibly, AI has the potential to transform previously constrained areas of practice. Own My Story is just one application of AI being used to improve outcomes for young people whilst remaining conscious of the emerging risks.
We welcome your thoughts and perspectives on this topic as we continue to test and develop our AI systems. You can contact us through our blog, website or LinkedIn.
Thanks for reading! 📖 ✨

