This summer, during his internship with the Department of Homeland Security, a University of New Mexico student created a tool that could predict where future crimes may occur. But could this software have biases of its own?
James Gentry is a junior working toward his degree in mechanical engineering. His software uses reported crime data from a given area to predict where future crimes may occur.
He knows how existing race and gender biases in data could influence the decisions that AI programs make.
"This data could possibly be biased because it doesn't know where all of the crimes happened. It only knows where police know where crimes happened," he said.
He has tried to remove information from the data that would replicate biases.
"It's all been stripped of human characteristics so there's no race, gender, sexuality. It's all only location, space, and time," he said, referring to the data.
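The article does not describe how Gentry's software actually works, but a minimal sketch of the approach he describes might look like the following: demographic fields are dropped from reported-crime records, and only location and time are kept before counting reports per grid cell to flag likely hotspots. The column names, grid size, and counting method here are assumptions for illustration, not his implementation.

```python
# A minimal, hypothetical sketch (not Gentry's actual code):
# strip demographic fields from reported-crime records, keep only
# location and time, then count reports per grid cell and time bucket.
import pandas as pd

# Hypothetical reported-crime records; real data would come from police reports.
reports = pd.DataFrame({
    "lat":  [35.084, 35.085, 35.091, 35.084, 35.090],
    "lon":  [-106.650, -106.651, -106.662, -106.650, -106.661],
    "time": pd.to_datetime([
        "2024-06-01 22:15", "2024-06-02 23:40", "2024-06-03 14:05",
        "2024-06-04 21:55", "2024-06-05 13:30",
    ]),
    # Fields like race or gender would be dropped before any modeling.
    "race":   ["A", "B", "A", "C", "B"],
    "gender": ["M", "F", "F", "M", "F"],
})

# Keep only location and time, as described in the article.
reports = reports[["lat", "lon", "time"]]

# Bin reports into a coarse spatial grid and a time-of-day bucket,
# then count reports per (cell, bucket) to flag likely hotspots.
cell_size = 0.01  # roughly 1 km grid cells, chosen only for illustration
reports["cell_lat"] = (reports["lat"] / cell_size).round().astype(int)
reports["cell_lon"] = (reports["lon"] / cell_size).round().astype(int)
reports["hour_bucket"] = reports["time"].dt.hour // 6  # four buckets per day

hotspots = (
    reports.groupby(["cell_lat", "cell_lon", "hour_bucket"])
    .size()
    .sort_values(ascending=False)
)
print(hotspots.head())
```

Even with demographic fields removed, a sketch like this still inherits any bias in where crimes were reported in the first place, which is the concern raised below.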
But there is a heavier police presence in areas with high populations of people of color and in low-income communities. The high number of crime reports in these communities is reflected in the criminal data.
However, Gentry is confident he can do this work in a responsible way.
"I'd love to continue developing the software and then possibly get it to local, state, or even federal authorities in order to help solve some of these issues that we as a country are facing," he said.