After shooting, MSU adding AI surveillance to detect threats, count people – Bridge Michigan

The ongoing technology overhaul will give MSU those capabilities — and more — through a combination of new equipment and software that will utilize artificial intelligence. 

Among other things, the new system will monitor doors, elevators and other parts of campus in real time, with the ability to automatically verify a card-access picture ID against live video of an individual attempting to enter a building. 

“Dynamic graphical maps” will allow authorities to select any part of campus and automatically pull up any cameras with a view of that location, according to a “master service agreement” between Moss and MSU. 

License-plate reader cameras will be able to track vehicles across campus and add suspect vehicles to a “hotlist.”

The Genetec system that Moss will install must be configured to automatically focus security cameras on perimeter breaches, and detect or track specified objects or stopped vehicles, according to the documents.

A “people counter” function using “deep-learning models trained on person detection” will allow cameras to automatically count crowd numbers. The system, which can differentiate between adults and children and even people in wheelchairs, allows authorities to monitor the counts on a live dashboard.

That technology will help police ensure “safety and security” at large campus events “by being able to assess the overall population on campus,” said Whyte, with the department of public safety. 

Privacy concerns

While MSU is touting the upgrades, there are privacy concerns associated with such technology, said Mark Ackerman, a professor at the University of Michigan who studies how people and technology interact with each other.

AI-based surveillance systems are often “not very accurate in low lighting,” such as at night, and some have been shown to have an implicit bias against minorities, Ackerman said. “They’re very problematic about seeing certain kinds of hostile intent and … whether the police should respond or not respond.” 

The technology can also be used to identify “suggested threats,” such as a person known to participate in protests, rather than actual threats, he said.

For students seeking a sense of security after last year’s shooting, using security cameras is “important,” and “we have to have them,” said Devin Woodruff, a senior studying public policy and vice president for government affairs for the undergraduate student-body government.

But the use of AI is “a little bit of a concern,” said Woodruff, who is Black. “I just don’t want any discrimination happening.”
