‘Bias Was Baked Into It’: US Activists Shut Down Police-School Data-Sharing Plan

In January, community activists in Minnesota successfully shut down a program that would have used data analytics to make decisions about which students received resources and support services from the school system. An activist told Sputnik the administrators had ignored community ideas and substituted “that new, shiny thing called ‘innovation.’”

A partnership called the Community Innovation Project was supposed to be introduced in Saint Paul, Minnesota, had activists not shut it down in January, KSTP reported at the time. The program would have established an integrated data collection system shared between school, law enforcement and municipal officials that activists said had minimal oversight and represented an unwelcome expansion of existing racial inequities in the city.

Radio Sputnik's By Any Means Necessary spoke Thursday with Marika Pfefferkorn, co-founder of the Coalition to Stop the Cradle to Prison Algorithm (CPA Coalition) and director of the Twin Cities Innovation Alliance (TCIA), about the program and the risks it posed to vulnerable students.

https://www.spreaker.com/user/radiosputnik/why-the-police-want-your-childs-report-c

"We became aware of a joint powers data sharing agreement between Ramsey County, the city of Saint Paul, and Saint Paul Public Schools" that had been framed as "a way to be more efficient, cost-effective and reach and resource more students and families," Pfefferkorn told hosts Eugene Puryear and Sean Blackmon.

"What we learned about the process was that they were planning on collecting expansive data that had no prohibition — meaning that they would be collecting this data for an ongoing basis, and they had not identified all the ways that they would be using the data. So that was one issue," Pfeffercorn noted.

"The second issue was that they were talking about integrating a tool called ‘predictive analytics,' where they design an algorithm to identify students who would be potential risks — or as we like to reframe what they're saying and not use code, is that they are identifying young people as ‘threats,'" she explained. "And then the third thing… is that it created a new governing body that structurally excluded anyone but the elected officials or systems representatives from having oversight to this data."

Pfefferkorn noted the origin of the data-sharing agreement was problematic as well. She said the report summarizing the community meetings left out the community's own recommendations, and instead gave the elected officials who read it the impression "that the community was really invested in this data-sharing agreement, when it was not the case." She also noted that the "systems folks" added their own recommendations to the reports, which included using big data and predictive analytics.

People "really did not understand what they had agreed to, what they had committed to," and as a result, couldn't even properly discuss "what they were trying to accomplish," she said, noting, "We were able to get them, after a long, drawn-out struggle… to pause."

Pfefferkorn said she believed that officials were "really attracted to that new, shiny thing called ‘innovation.' This, by definition, is not innovation, but that's what they were selling it as."

Noting she doesn't have a background in data science or statistics, Pfefferkorn told Sputnik she works on "school-to-prison pipeline and discipline disparities." In that work, she found that dozens of local school districts had discriminated against black and brown children in their use of out-of-school suspensions as punishment.

"So the intersection with predictive analytics is: as they were selling the predictive analytics, they said, ‘You know, we need to have some indicators that we could use to identify who the resources and support might go to.'" That metric, Pfeffercorn said, included suspensions as a way to rate students as higher risk.

"We looked at the other indicators that they were talking about and how to create a risk assessment, and we realized that bias was baked into it. So how could we even have a fair and just conversation when they did not have the know-how and sense to say we already have racial inequity in how we do discipline in Minnesota? And now we're going to further deepen that by not just using suspension data but, say a child had had an interaction with child protection services or the public housing human services department. Or not even you, the student, but a family member, a sibling who had interacted with the juvenile justice system: all of these things could potentially be used as an indicator to identify a risk score… and all of the indicators, again, have the potential for bias."

Pfefferkorn said they introduced the term "cradle-to-prison algorithm" to describe the new set of analytics because it references information dating back far beyond a child's enrollment in school, such as child protection services and foster care records. This, she said, only further amplifies existing disparities between black and brown children and white children.

The CPA Coalition, she noted, is made up of people who work on both the education and criminal justice sides of the school-to-prison pipeline, and "they flagged right away in the conversation that law enforcement does not need to have more information about students and families."

"Students are supposed to be protected," she said, "they are supposed to have a moat around them so that they can be young people, make mistakes, get second chances, but they're not criminalized in doing so."

Instead of the big data program, Pfefferkorn said the real solution to the problems administrators were trying to address lies in funding, supporting and expanding existing community programs with proven track records.

"That seems like a really common-sense kind of strategy, but we can't even do that one well yet, and yet now we want to do a mechanized process that identifies who gets help, how they get help and who knows about what's going on in their life," she said.
