Monday, 27 April 2026

AI is Coming

For the 28th Annual McWilliams Probation Lecture on 14th July 2026, Professor Melissa Hamilton will speak on 'Artificial Intelligence in Probation: Opportunities, Risks, and Responsible Use'.

Melissa Hamilton is a Professor of Law & Criminal Justice at the University of Surrey and a Surrey AI Fellow with the Surrey Institute for People-Centred Artificial Intelligence. She holds a Juris Doctorate (law) and a PhD in Criminology and is a member of the Royal Statistical Society, International Corrections and Prisons Association, American Psychological Association, and the Association of Threat Assessment Professionals.

Her research is interdisciplinary and focuses on the use of AI and related technologies in criminal justice, sentencing practices, interpersonal violence, and trauma-informed approaches to legal and correctional decision-making. Before entering academia, Melissa worked as both a police officer and a prisons officer, experience that continues to inform her research and teaching.

Melissa’s work has been published across law, social science, and criminal justice journals. She also regularly contributes to public and professional discussions through print media, radio and television broadcasts, and online platforms including blogs and podcasts.

The respondent is David A Raho. David is a PhD researcher in Law and Criminology at the Institute of Law and Social Sciences at Sheffield Hallam University, investigating AI Maturity Models, AI Cultural Readiness, and the comparative adoption of artificial intelligence in probation and rehabilitation services across England & Wales, Brazil, and Japan. He has extensive frontline experience as a Probation practitioner spanning nearly four decades and now works as a member of the AI Team at HMPPS HQ. He has contributed to publications on the use of technology in Probation for both CEP and the UN, and is a member of the UNESCO expert network on AI and a Tutor at the University of Oxford on AI Governance. He is both a Fellow and a Trustee of the Probation Institute and a proud Napo member, having previously served as a Branch Chair in London and also as a National Vice Chair.

This event will be held at the Institute of Criminology in the lower ground floor seminar rooms.

Lunch will be served at 1pm. The lecture will begin at 2pm. Tea will be served at 3.45pm.

Please register for in-person attendance here.

Please register for online attendance here.

2 comments:

  1. AI isn’t a future prospect in probation; it’s already here. It’s not hard to see where this leads: keyboards and IT hardware replaced by transcription recordings and voice control, intake points using biometric scans, individuals tracked from entry to exit through linked systems. Electronic tagging evolves into continuous, remote supervision, potentially layered with facial recognition. Risk assessments, licence conditions, even sentencing recommendations: much of this can already be generated at the press of a button.

    And the justification is familiar. We’re told the same story each time: workloads are too high, resources too thin, and this technology will make things more efficient, freeing up time for meaningful work. But in practice, that time rarely materialises as more time in the room with people under supervision. Instead, it’s absorbed elsewhere, redirected into managing the system itself, and the probation officer risks being sidelined and made obsolete.

    The trajectory is clear, but the real question isn’t what AI can do, it’s where human judgment remains non-negotiable. At what point does a practitioner step in, rather than defer? When does a relationship, built through presence, trust, and discretion, stop being central to supervision? Because if decisions are increasingly shaped upstream by automated systems, there’s a risk that practitioners become implementers of outputs rather than authors of them, asking AI what to do before deciding what should be done.

    And underpinning all of this is a quieter, murkier issue: who is actually designing these systems, and whose assumptions and bias are being coded into them? If the logic behind risk, compliance, and intervention is embedded in software, then those choices don’t disappear, they’re just harder to see, and harder to challenge.

    See you at the lecture.

  2. "As part of the 'Transforming Rehabilitation' reforms introduced by then-Justice Secretary Chris Grayling in 2013-2014, the probation service in England and Wales began supervising an additional 40,000 to 50,000 offenders annually."

    The total annual probation caseload in England and Wales increased by 39% from ~175,000 in 2000, reaching 243,434 in 2008.

    Caseload at 30 June 2010: 239,041.

    Total annual caseload reached approximately 241,000 by the end of 2015, following a low of 217,359 at the end of 2014.

    Total caseload at end of September 2025: 246,502 offenders.

    https://data.justice.gov.uk/probation/offender-management/caseload-total

    So where did the additional 50,000 cases go, as promised by Grayling?

    Staffing crisis? Caseload crisis? What crises?

    Based on data from the UK Parliament, the total number of probation staff in England and Wales in 2000 was approximately 15,240.

    175,000 / 15,240 = 11.48

    At the end of 2010, according to Q4 2010/11 data, there were approximately 19,369 FTE total staff.

    239,041 / 19,369 = 12.34

    By December 2025, there were 21,407 full-time equivalent (FTE) staff.

    246,502 / 21,407 = 11.52
    ______________________________________________________
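    The cases-per-staff ratios above can be reproduced with a quick sketch. The caseload totals and FTE staff counts are the commenter's own figures quoted in this thread, not official statistics, so treat the output as illustrative only:

    ```python
    # Caseload-per-FTE-staff ratios from the figures quoted above.
    # {year: (total caseload, FTE staff)} -- commenter-supplied numbers.
    figures = {
        2000: (175_000, 15_240),
        2010: (239_041, 19_369),
        2025: (246_502, 21_407),
    }

    for year, (caseload, staff) in figures.items():
        # Cases per full-time-equivalent member of staff, to 2 d.p.
        print(f"{year}: {caseload / staff:.2f} cases per FTE")
    ```

    The point the ratios make is that cases per member of staff sit at roughly the same level in 2025 as in 2000, despite the promised extra 40,000-50,000 cases.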

    Smoke, mirrors, pisstake, scam.
