Publication


Featured research published by Jamie Grace.


Medical Law Review | 2013

Disclosure of confidential patient information and the duty to consult: the role of the Health and Social Care Information Centre

Jamie Grace; Mark J. Taylor

Health professionals currently have a responsibility to consult with a patient, wherever practicable, before disclosing confidential patient information for purposes not directly related to his or her care and treatment. The Health and Social Care Act 2012 has diluted that responsibility to consult, at least in relation to any information that the Health and Social Care Information Centre requires health professionals to disclose. This is at odds with other moves to support an individual's involvement in decisions that affect them. Moreover, a responsibility to consult can be shown to be a procedural aspect of the fundamental right to respect for private and family life as guaranteed by Article 8 of the European Convention on Human Rights (ECHR). The scope and nature of a procedural requirement for consultation can be revealed, at least in part, by considering the case law concerning disclosure in the field of criminality information sharing. If the Health and Social Care Act 2012 is to be adequately protected from a challenge for incompatibility with the ECHR, then practicable opportunities to provide information about the intended purposes of processing, and respect for any reasonable objection to disclosure, must be recognised beyond those explicitly provided for by the 2012 Act. The Code of Practice that the Information Centre is responsible for producing represents an opportunity to guarantee that adequate levels of consultation will be preserved, consistent with proposed changes to the NHS Constitution.


Journal of Criminal Law | 2015

Clare's Law, or the national Domestic Violence Disclosure Scheme: the contested legalities of criminality information sharing

Jamie Grace

Clare’s Law has been a PR success for the police in England and Wales—the police have engaged directly with the media over the national roll-out of the Domestic Violence Disclosure Scheme. But the precise operation of the Scheme, at a doctrinal level, is unclear, and warrants further scrutiny (and, I would argue, reform) before a crisis of confidence in the Scheme is precipitated by a challenge by way of judicial review. Human rights case law concerning the procedural rights of (suspected) domestic violence perpetrators is the medium through which this piece explores the manner in which the Scheme currently operates on the basis of Home Office guidance and policy.


Information & Communications Technology Law | 2018

Algorithmic Risk Assessment Policing Models: Lessons from the Durham HART Model and ‘Experimental’ Proportionality

Marion Oswald; Jamie Grace; Sheena Urwin; Geoffrey C. Barnes

As is common across the public sector, the UK police service is under pressure to do more with less, to target resources more efficiently and take steps to identify threats proactively; for example under risk-assessment schemes such as ‘Clare’s Law’ and ‘Sarah’s Law’. Algorithmic tools promise to improve a police force’s decision-making and prediction abilities by making better use of data (including intelligence), both from inside and outside the force. This article uses Durham Constabulary’s Harm Assessment Risk Tool (HART) as a case-study. HART is one of the first algorithmic models to be deployed by a UK police force in an operational capacity. Our article comments upon the potential benefits of such tools, explains the concept and method of HART and considers the results of the first validation of the model’s use and accuracy. The article then critiques the use of algorithmic tools within policing from a societal and legal perspective, focusing in particular upon substantive common law grounds for judicial review. It considers a concept of ‘experimental’ proportionality to permit the use of unproven algorithms in the public sector in a controlled and time-limited way, and as part of a combination of approaches to combat algorithmic opacity, proposes ‘ALGO-CARE’, a guidance framework of some of the key legal and practical concerns that should be considered in relation to the use of algorithmic risk assessment tools by the police. The article concludes that for the use of algorithmic tools in a policing context to result in a ‘better’ outcome, that is to say, a more efficient use of police resources in a landscape of more consistent, evidence-based decision-making, then an ‘experimental’ proportionality approach should be developed to ensure that new solutions from ‘big data’ can be found for criminal justice problems traditionally arising from clouded, non-augmented decision-making. Finally, this article notes that there is a sub-set of decisions around which there is too great an impact upon society and upon the welfare of individuals for them to be influenced by an emerging technology; to an extent, in fact, that they should be removed from the influence of algorithmic decision-making altogether.
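
As a purely illustrative aside (not drawn from the article itself), the short Python sketch below shows the general shape of a supervised risk-band classifier of the kind the abstract describes: a model trained on historical records that outputs a "low", "moderate" or "high" forecast for a new case. All feature names, data and thresholds are invented for demonstration and do not reflect HART's actual design, which the article itself describes and validates.

    # Illustrative sketch only: a toy risk-band classifier of the general kind
    # the article discusses. It is NOT the HART model; all features, data and
    # thresholds below are invented for demonstration.
    import numpy as np
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.model_selection import train_test_split

    rng = np.random.default_rng(0)

    # Hypothetical custody records: age at arrest, prior offences, years since
    # last offence (all values synthetic).
    X = rng.integers(low=[18, 0, 0], high=[70, 30, 15], size=(1000, 3))

    # Toy labelling rule: risk band (0 = low, 1 = moderate, 2 = high) loosely
    # tracks the prior-offence count, with noise added.
    y = np.digitize(X[:, 1] + rng.normal(0, 3, size=1000), bins=[5, 15])

    X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

    # A random-forest classifier is one common choice for this kind of tool.
    model = RandomForestClassifier(n_estimators=200, random_state=0)
    model.fit(X_train, y_train)
    print("Held-out accuracy:", model.score(X_test, y_test))

    bands = {0: "low", 1: "moderate", 2: "high"}
    new_case = [[34, 5, 2]]  # one hypothetical individual
    print("Forecast risk band:", bands[int(model.predict(new_case)[0])])

The point of the sketch is only that such tools reduce to a trained classifier mapping case features to a risk band; the article's legal analysis (experimental proportionality, algorithmic opacity, ALGO-CARE) concerns how and whether such outputs should be relied upon.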


Criminal Justice Matters | 2014

Disclosing domestic violence

Jamie Grace

In November 2013 the Home Secretary announced that from March 2014 the ‘right to ask’ and ‘right to know’ strands of the Domestic Violence Disclosure Scheme (the Scheme) would be operated nationally under existing police common law powers. The Scheme sees the police proactively (right to know) and reactively (right to ask) disclose ‘intelligence’ on (alleged) offenders to their partners, for example, supposedly in order that those partners can take better-informed decisions as to remaining in a relationship with, or continuing to live with, the ‘risky’ individual concerned.


Archive | 2017

Algorithmic risk assessment policing models: Lessons from the Durham Constabulary HART model

Marion Oswald; Jamie Grace; Sheena Urwin; Geoffrey C. Barnes



The Journal of Adult Protection | 2015

Better information sharing, or “share or be damned”?

Jamie Grace

Purpose – The purpose of this paper is to explore the ramifications of developments in surveillance policies and technologies for information sharing cultures in a “public protection routine”.

Design/methodology/approach – This conceptual paper uses a mixed theoretical, legal and policy-based approach to inform this exploration of the ramifications of developments in surveillance policies and technologies.

Findings – This conceptual paper concludes that developments in surveillance policies and technologies as part of the “public protection routine” will result in a damaging and hasty culture of “share or be damned” unless a more careful approach to new information sharing approaches is developed. Otherwise, an increasing bureaucratisation of risk management through surveillance will lead to a disregard for the fine balance between public protection, procedural rights and privacy.

Originality/value – The originality and value of this conceptual paper is considerable – as some of the case studies discussed...


The Police Journal | 2013

'Too well-travelled', not well-formed? The reform of 'criminality information sharing' in the UK

Jamie Grace


Journal of Criminal Law | 2014

Old convictions never die, they just fade away: the permanency of convictions and cautions for criminal offences in the UK

Jamie Grace


International Journal of Law Crime and Justice | 2013

Privacy, stigma and public protection: A socio-legal analysis of criminality information practices in the UK

Jamie Grace


Journal of Information Rights, Policy and Practice | 2016

Intelligence, policing and the use of algorithmic analysis: a freedom of information-based study

Marion Oswald; Jamie Grace

Collaboration


Dive into Jamie Grace's collaboration.

Top Co-Authors


Marion Oswald

University of Winchester
