
Local police can see into a home without ever stepping foot inside thanks to technology.

 

While supporters argue that such technology, namely robots, saves lives, critics contend it conflicts with the Fourth Amendment, which protects people against unreasonable searches and seizures.

A new robot

The Boulder County Hazardous Device Response Team, also known as the bomb squad, has been using military-grade robots since the mid-2000s. With its models aging, the team plans to ask for funding in 2020 to buy a new one, according to Longmont Master Police Officer Scott Pierce.

The bomb squad received its main robot, a Remotetech model, in 2005 with a grant from the Urban Areas Security Initiative Program, Pierce said.

The 500-pound robot was originally used in military operations. It has one “arm,” infrared, zoom and night vision cameras, and a speaker system. The squad’s second robot, made by Endeavor Robotics, weighs 60 pounds and is much more portable.

Typically they are used together, according to Pierce. The larger robot will hold a house door open and let the smaller one inside, and vice versa. The larger robot can get stuck in the snow or while going over clothing. Together, they have two “hands” and provide police with two viewpoints.

Pierce said engineers are working on building single robots with two hands, but so far they are not available.

The bomb squad robots are used in three circumstances: all calls for potential bombs, SWAT responses and examinations of hazardous materials.

While they aren’t indestructible, Pierce said no one has really tried to damage or destroy them so far. In the one exception, a man hit one of the robots with a samurai sword; he stopped once an officer spoke over the speaker and asked him to stop hitting the robot.

Most people are initially afraid of the robots, Pierce said. Once, officers used the robots to call a man up from the basement. When he came out and saw the robot, he screamed and laid down, Pierce said.

Both Longmont police and the Boulder County Sheriff’s Office plan to ask for a new robot in their 2020 budgets. Because the departments have an inter-governmental agreement, they would each pay $150,000 for the new technology.

The bomb squad has asked for a new robot in the past. Robots have a life span of about a decade, according to Boulder County Sheriff’s Office Sgt. David Salaman.

“The technology in these things are becoming so advanced,” Salaman said.

It’s becoming more and more difficult to find replacement parts for the larger robot, which is 14 years old.

Pierce said a replacement was a “low priority” last year for the governments that control the budgets, in part because the larger robot still works.

“When they stop making parts for it, we will probably get the funding,” he said.

Life saving, but privacy violating

The robots are used in nearly every call that requires bomb squad response. Longmont police use the robots often, both for safety and to maintain their skills in using them. Last year, the robots went on 70 calls. The year before, it was 100 calls.

“It’s basically a lifesaver because these systems go where you don’t want to,” Pierce said.

Robots are often used to enter homes first when a suspect won’t come out, examine bombs and look around a residence for any dangers. Pierce said the safety the robots provide is something officers couldn’t do without.

A Boulder County bomb squad robot was deployed to Santiago’s restaurant in Longmont to investigate a suspicious package on Jan. 4, 2018. (Lewis Geyer / Staff Photographer)

Operating the machines also is a “perishing skill,” he said, so officers need to practice with less serious calls so they are prepared for the more grave situations.

“It takes a lot of work to know how to maneuver,” Salaman said. “…You will always learn something new using the robot.”

Salaman estimates officers need at least 40 hours of practice with the devices to get used to how they work. The bomb squad and other officers who use them train twice per month.

Boulder County Sheriff’s Office Sgt. Randy Wilber said he thinks the robots are “invaluable” for law enforcement.

However, critics point out that technology such as robots and drones raises questions about privacy.

Denise Maes, public policy director at the American Civil Liberties Union in Denver, said the technology raises Fourth Amendment issues.

“It’s one thing if it’s used to dismantle a bomb in an open area or an abandoned building or something,” Maes said. “I think it’s a lot different when it’s peeking into people’s homes.”

Pierce said that with a search warrant officers can use a drone to look into a second-story window of a home.

But Maes pointed out that police could find evidence of another crime not related to the warrant, which raises legal issues.

While Maes sees the need to use robots in some situations that pose safety concerns, she thinks the notion of using them regularly is based on “false choices.”

“It’s either, ‘Hey, we can use this for our safety and invade your constitutional rights, or we could not be safe and respect your privacy,'” she said. “There just needs to be really tight guardrails around that. Let’s make sure it’s only in those extreme situations.”

Longmont Deputy Chief Jeff Satur stressed that a robot is never the first tool employed by the department and it always follows the rules of the Constitution.

“The Longmont Police Department has no intention of using a robot in violation of the Fourth Amendment. We would only send in a robot(s) or a drone after securing a search warrant, or under exigent circumstances or with consent of a resident,” he said via email. “Because of safety, a robot or drone can be a replacement for a person. If we can send a person lawfully into a residence, because we have obtained a search warrant, or have exigent circumstances or consent, then we can send a robot.”

Maes also dismissed as “ridiculous” the idea that police need to use this technology in the field to maintain their skills.

“They can do training on that,” she said, adding that police “don’t need the SWAT team in every case just so they can get training.”

Maes emphasized the need to use other, evidence-based tactics that work for police in situations that aren’t dire. She said the idea that people can have firearms in their homes and might pose a threat isn’t new, and there are other ways to de-escalate situations using intervention techniques and on-site mental health providers.

“We’re better apt to put our money into those things than robotics,” she said.

Concerns of the future

Maes also raised concerns about the pace of technology.

“The technology to invade our privacy is moving at such a greater pace than the laws are to protect our privacy,” she said.

The ACLU believes technology needs to be pulled back until those laws are in place, Maes said.

Adam Wandt, an assistant professor of public policy and deputy chair for academic technology at the John Jay College of Criminal Justice in New York, said he and his colleagues also are concerned about the pace of technology in policing.

The use of robots goes back decades, he said, and police began buying high-end military robots toward the end of the Iraq War. When Wandt was a training chief in emergency medical services, he said he wanted his team using technology often.

“And you want them using it in circumstances that are marginal, so that they’re trained properly to use that equipment when it’s very needed,” he said.

However, Wandt said he also believes in finding a balance between technology, police protection and citizens’ rights.

“When it comes to robotics, there’s really quick ways we can step over that line,” he said.

For example, Wandt doesn’t think present-day society is ready for automatically patrolling robots. There also are legal questions behind robots that can use lethal force. Is the robot using the force, or the officer behind it?

Wandt also believes pop culture has preconditioned society to fear robots, which means someone struggling with a mental health issue could react differently to a robot than they would a human officer.

Wandt said he believes face-to-face policing is best, but he understands the need to protect officers. Responding to a suicidal subject using a robot is a “textbook example,” he said, because it protects officers. But it also is not yet known how that affects the person who is struggling, he said.

When asked about the future of technology in policing, Wandt said he and his colleagues were afraid of a “1984” scenario, referencing the dystopian novel by George Orwell.

“We all openly admitted to ourselves that we are far past the ‘1984’ scenario, and we surpassed that scenario a long time ago,” he said.

With technology such as social media, the government and law enforcement are now able to place ideas in people’s minds and living rooms, which produces a “level of control” over the population beyond what Orwell envisioned, because it is covert, he said.

When it comes to artificial intelligence, or AI, Wandt said he and others “are all terrified.”

“Mixing AI into robots and allowing robots to make up their own mind of how to patrol or who to stop or who to kill — that’s when it’s starting to become very scary,” he said. “Because it’s very easy to pass bias and prejudice on to AI models. It’s actually hard to not pass that on.”

While people recognize this is an issue, particularly with racial profiling, Wandt said progress always surpasses controls, so it will probably take time to correct those errors. Once well-designed AI is developed, he believes it will be more just than a human being, because it won’t factor in characteristics like race or ethnicity.

“We should expect the same types of growing pains going from a medieval criminal justice system to the one we have today, that still doesn’t work well,” he said.

Madeline St. Amour: 303-684-5212, mstamour@prairiemountainmedia.com
