Why using AI in policing decisions risks race and class bias

May 14, 2017

AI is rocking the world of policing — and the consequences are still unclear.

British police are poised to go live with a predictive artificial intelligence system that will help officers assess the risk of suspects re-offending.


It’s not Minority Report (yet) but certainly sounds scary. Just like the evil AIs in the movies, this tool has an acronym: HART, which stands for Harm Assessment Risk Tool, and it’s going live in Durham after a long trial.

The system, which classifies suspects at a low, medium, or high risk of committing a future offence, was tested in 2013 using data that Durham police gathered from 2008 to 2012.

Its results are mixed.

Forecasts that a suspect was low risk turned out to be accurate 98 percent of the time, while forecasts that they were high risk were accurate 88 percent of the time.

That’s because the tool was deliberately designed to be cautious: it errs toward classifying borderline suspects as medium or high risk rather than risk releasing someone who may go on to commit a crime.
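That cautious design can be illustrated with a minimal sketch. The cutoffs and function below are invented for illustration, not HART's actual model or parameters; the point is only that placing the low/medium boundary deliberately low pushes borderline cases into the medium or high bands:

```python
# Hypothetical sketch of a "cautious" risk classifier. The cutoffs are
# invented, not HART's real parameters: a symmetric three-band design
# might put the low/medium boundary near 0.33, so setting it at 0.2
# means a suspect needs a clearly low score to be banded as low risk.

def classify_risk(score, low_cutoff=0.2, high_cutoff=0.6):
    """Map an estimated re-offending probability to a risk band."""
    if score < low_cutoff:
        return "low"
    if score < high_cutoff:
        return "medium"
    return "high"

print(classify_risk(0.15))  # clearly low -> "low"
print(classify_risk(0.25))  # borderline -> "medium", erring on caution
print(classify_risk(0.70))  # -> "high"
```

With a symmetric boundary at 0.33, the 0.25 case would have been released as low risk; the asymmetric cutoff trades more medium/high labels for fewer missed re-offenders.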

A self-learning system 

According to Sheena Urwin, head of criminal justice at Durham Constabulary, during the testing HART didn’t impact officers’ decisions and, when live, it will “support officers’ decision making” rather than define it.

Urwin also explained to the BBC that suspects with no offending history would be less likely to be classed as high risk — unless they were arrested for serious crimes.

Police could use HART to decide whether to keep a suspect in custody longer, release them on bail before charge, or remand them in custody.

However, privacy and advocacy groups have expressed fears that the algorithm could replicate and amplify inherent biases around race, class, or gender.

“This can be hard to detect, particularly in self-learning systems, which carry greater risks,” Jim Killock, Executive Director of Open Rights Group, told Mashable.

“While this process is reported to be ‘advisory’, there could be a tendency for officers to trust the machine on the assumption that it is neutral.”

Racial bias?

Whenever the training data is systematically biased, the outcomes can be discriminatory, because learning models surface and reinforce assumptions that humans have tacitly made.

The Durham system includes data such as postcode and gender which go beyond a suspect’s offending history.

Even though the system is very accurate, let’s say 88 percent of the time, a “subset of the population can still have a much higher chance of being misclassified,” Frederike Kaltheuner, policy officer for Privacy International, told Mashable.

For instance, if minorities are more likely to be put in the wrong basket, a system that is accurate on paper can still be racially biased.

“It’s important to stress that accuracy and fairness are not necessarily the same thing,” Kaltheuner said.
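Kaltheuner's point can be made concrete with a toy calculation. The group sizes and counts below are invented for illustration (they are not Durham or ProPublica data), chosen only so the headline accuracy matches the article's 88 percent figure while the errors concentrate in the smaller group:

```python
# Synthetic illustration (invented numbers) of how a system can be
# accurate overall yet unfair to a subgroup: one headline accuracy,
# but misclassifications concentrated in the smaller group.

# (group, correct_predictions, total_predictions)
groups = [
    ("majority", 870, 900),   # ~97% accurate within this group
    ("minority", 10, 100),    # 10% accurate within this group
]

total_correct = sum(correct for _, correct, _ in groups)
total = sum(n for _, _, n in groups)
print(f"overall accuracy: {total_correct / total:.0%}")  # 88%

for name, correct, n in groups:
    print(f"{name}: {1 - correct / n:.0%} misclassified")
```

The overall accuracy is 88 percent, yet 90 percent of the minority group is misclassified: the aggregate number hides exactly the disparity that audits of such systems look for.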

Last year, an investigation by U.S. news site ProPublica shone a light on the alleged racial bias of an algorithm used by law enforcement to forecast the likelihood of a repeated offense.

Among other things, the algorithm was making overly negative predictions about black versus white suspects. The firm behind the system denied the allegations.

For example, ProPublica reported the cases of Brisha Borden, a black 18-year-old who took a child’s bicycle and scooter, and Vernon Prater, a white 41-year-old who was picked up for shoplifting $86.35 worth of tools.

© 2022, Lonestar Patent Services, Inc.