Canadian Cops Will Scan Social Media to Predict Who Could Go Missing

Credit to Author: Nathan Munn | Date: Wed, 17 Apr 2019 15:27:23 +0000

Police in Canada are building a predictive policing system that will analyze social media posts, police records, and social services information to predict who might go missing, says a government report.

According to Defence Research and Development Canada (DRDC), an agency of the Department of National Defence, Saskatchewan is developing “predictive models” that will allow police and other public safety authorities to identify common “risk factors” and intervene before individuals go missing. Risk factors can include a history of running away or violence in the home, among dozens of others.
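The report doesn’t describe how these models are built, but risk-factor systems of this general kind are often implemented as statistical classifiers trained on historical cases. The sketch below is a minimal illustration of that idea only; the feature names, data, and choice of a logistic regression model are all assumptions for the example, not details of the SPPAL’s actual system.

```python
# Illustrative only: a hypothetical risk-scoring classifier of the general
# kind described in the report. Feature names and data are invented.
import numpy as np
from sklearn.linear_model import LogisticRegression

# Hypothetical historical records: each row is a past case, each column a
# binary "risk factor" (e.g. prior runaway episode, reported violence in
# the home, open child-welfare file).
rng = np.random.default_rng(0)
X = rng.integers(0, 2, size=(500, 3))
# Synthetic labels: 1 = the person later went missing, 0 = they did not.
y = rng.integers(0, 2, size=500)

model = LogisticRegression().fit(X, y)

# Scoring a new case: the model outputs a probability, which would then be
# thresholded to decide whether to flag someone for intervention.
new_case = np.array([[1, 0, 1]])  # prior runaway, no reported violence, open file
risk = model.predict_proba(new_case)[0, 1]
print(f"Estimated risk score: {risk:.2f}")
```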

A DRDC report published last month shows the Saskatchewan Police Predictive Analytics Lab (SPPAL)—a partnership between police, the provincial Ministry of Justice, and the University of Saskatchewan—is analyzing historical missing persons data with a focus on children in provincial care, habitual runaways, and missing Indigenous persons, and building tools to predict who might go missing. In the next phase of the project, SPPAL will add social service and social media data.

The report doesn’t specify what kind of predictive insights authorities expect to glean about individuals from social media posts, but police already use social media to monitor people and events for signs of crime, or, in the case of missing persons investigations, to discern when a person went missing. For example, police in Ontario made a missing woman’s case a priority after noticing that her usual patterns of social media activity had ceased.

The DRDC report states that municipal police services in Saskatchewan, as well as the Ministry of Social Services’ Child and Family programs and the regional RCMP, have agreed in principle to share information with SPPAL. In Saskatchewan, more than 70 percent of children in provincial care are Indigenous, and more than 100 long-term missing persons cases remain unsolved.

Tamir Israel, a lawyer with the Canadian Internet Policy and Public Interest Clinic (CIPPIC), told Motherboard that using predictive models to inform decisions on child welfare interventions is concerning.

“We know that predictive models are far from infallible in real-life settings, and that there will be false positives,” Israel said in an email. “The consequences of an intervention based on a false positive can be very serious.”

Israel said that the risk of false positives increases when predictive models use data of “questionable fidelity” such as social media posts. He pointed out that the high number of missing Indigenous women and children in Canada makes them and other marginalized groups especially vulnerable to flaws or biases concealed in predictive models.
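To see why false positives worry civil liberties lawyers, a quick back-of-the-envelope calculation helps: when the event being predicted is rare, even a fairly accurate model flags far more people incorrectly than correctly. All of the numbers below are hypothetical assumptions chosen for illustration, not figures from the DRDC report.

```python
# Illustrative arithmetic only; every figure here is an assumption, not data
# from the report. It shows how a seemingly accurate model can still produce
# mostly false positives when the predicted event is rare.
population = 100_000        # people screened by the model (assumed)
base_rate = 0.001           # 0.1% actually go missing (assumed)
sensitivity = 0.90          # model flags 90% of true cases (assumed)
false_positive_rate = 0.05  # model wrongly flags 5% of everyone else (assumed)

true_cases = population * base_rate
true_positives = true_cases * sensitivity
false_positives = (population - true_cases) * false_positive_rate

precision = true_positives / (true_positives + false_positives)
print(f"Share of flagged people who actually go missing: {precision:.1%}")
# ~1.8%: under these assumptions, roughly 55 of every 56 interventions
# would target someone the model flagged in error.
```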

Read more on Motherboard.
