
How AI and ML are helping first responders

After reading about a startup founded by a combat veteran that was using artificial intelligence to help first responders with behavioral health, Adam Thiel, commissioner of the Philadelphia Fire Department, wanted in. Even though the department has a more traditional face-to-face employee assistance program with access to clinicians, Thiel believed “there was still a gap in getting people to raise their hand…because sometimes the tendency is just ‘this comes with the job,’ and firefighters and other department personnel too often try to be stoic,” he said.

This year, the department began using a behavioral health platform built by NeuroFlow to track, monitor, and engage firefighters remotely and, most importantly, anonymously. Using applied AI techniques, the technology aims to identify the most at-risk users based on red flags in their data.
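NeuroFlow has not published how its engine weighs that data, but the general idea of surfacing "red flags" from anonymized check-ins can be sketched in a few lines of Python. The field names, thresholds, and scoring below are illustrative assumptions, not the company's model.

# Hypothetical sketch of "red flag" detection on anonymized check-in data.
# NeuroFlow's actual model is not public; fields and thresholds are assumptions.
from dataclasses import dataclass

@dataclass
class CheckIn:
    user_id: str          # anonymized identifier
    mood_score: int       # 1 (very low) to 10 (very good), self-reported
    sleep_hours: float
    missed_checkins: int  # consecutive days without a check-in

def risk_flags(checkin: CheckIn) -> list[str]:
    """Return a list of red flags for a single check-in."""
    flags = []
    if checkin.mood_score <= 3:
        flags.append("persistently low mood")
    if checkin.sleep_hours < 4:
        flags.append("severe sleep disruption")
    if checkin.missed_checkins >= 7:
        flags.append("disengagement from the app")
    return flags

# Users whose check-ins raise multiple flags would be surfaced, still
# anonymously, for outreach by a clinician or peer counselor.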

AI-equipped robots and drones are valuable because they can go into hazardous areas so humans don't have to. But the technology is also being used in tools that detect mental health issues.

So far, just over 10% of the department’s roughly 3,000 firefighters, medics, support personnel, and dispatchers are using the NeuroFlow app, said Thiel, who is also Philadelphia’s director of the Office of Emergency Management. He is not surprised by the low uptake; he knows this is a tough sell for some people.


“Behavioral health treatment and awareness is largely about trust, and we knew it would be a big challenge asking someone to share their feelings on an app,” Thiel said. Although the system’s AI engine tailors the experience to the user, “it still requires people’s input, and we had to make sure people were comfortable with that.”

But Thiel knows deploying the app has been the right decision. “We have already had a number of documented instances where people were truly in a tough place and we would not have known that … without having this bridge,” he said. “We didn’t find that out when they were face to face with a counselor or peer; they self discovered this using the app and then we were able to offer them assistance.”

AI and ML help first responders control emergency robots

One of the significant ways AI technology is evolving to address the needs of first responders is in understanding non-verbal commands, said Joseph DelPreto, a PhD student at MIT, who is part of a team in the university’s Computer Science and Artificial Intelligence Laboratory (CSAIL) that has developed a system to provide more seamless human-robot collaboration. The system, called “Conduct-A-Bot,” uses human muscle signals from wearable sensors to pilot a robot’s movement.

This type of system could eventually target a range of applications for human-robot collaboration, including remote exploration, which would save first responders from having to go into dangerous spaces, DelPreto said.

“It’s good for a robot to work with people and have the robot do dull, dirty, and dangerous components of a job and a person do the more strategic, creative parts of the job.” A remotely controlled robot can be a first responder teammate, DelPreto said, adding that this capability is still in the early stages.


There is still a lot of training and learning that needs to be done, though. “There’s always pros and cons with machine learning,” DelPreto said. For example, if you’re trying to teach a robot how to detect gestures, machine learning algorithms need to be trained ahead of time on what a human expects.
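DelPreto's point about training ahead of time can be illustrated with a small sketch. This is not the Conduct-A-Bot code; the features, gesture labels, and synthetic data are assumptions. It simply shows how a classifier is fit to labeled windows of wearable muscle-sensor (EMG) data before it is ever used to pilot a robot.

# Minimal sketch of the "train ahead of time" step: map windows of EMG
# samples to gesture labels. Features, labels, and data are illustrative.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

def window_features(emg_window: np.ndarray) -> np.ndarray:
    """Simple per-channel features for one window (samples x channels)."""
    return np.concatenate([
        emg_window.mean(axis=0),         # mean activation per channel
        emg_window.std(axis=0),          # variability per channel
        np.abs(emg_window).max(axis=0),  # peak activation per channel
    ])

# Hypothetical pre-recorded training data: 200 windows of 100 samples from
# 8 channels, each labeled with the gesture the wearer was performing.
rng = np.random.default_rng(0)
windows = rng.normal(size=(200, 100, 8))
labels = rng.choice(["stop", "go_left", "go_right", "forward"], size=200)

X = np.stack([window_features(w) for w in windows])
clf = RandomForestClassifier(n_estimators=50).fit(X, labels)

# At run time, each new window gets the same features, and the predicted
# gesture becomes a robot command.
new_window = rng.normal(size=(100, 8))
print(clf.predict([window_features(new_window)])[0])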

“The main challenge is in adapting to circumstances you haven’t seen before; humans are good at that, and that’s hard for machine learning,” he said. “It’s important to scope the task you’re trying to solve with machine learning and be specific about what you want a robot to be able to do.”

That’s because it’s very hard for a robot to solve a general problem, DelPreto added, so it needs to be broken down “into a lot of small pieces.” A human then decides what tasks are better suited for the robot, and which are better suited for people.

“As you give a robot more autonomy, you have to think about computer vision problems and how [it can] capture and maneuver and manipulate objects,” DelPreto said. An autonomous system needs to learn how to interact with its environment. “We’re making a lot of progress in all these applications,” he noted, but “it’s hard to pin down a timeframe and hard to predict how these technologies will evolve.”


How AI can detect objects in a fire

Karen Panetta, an IEEE fellow working with the Boston Fire Department and other fire departments in the metro Boston area on how to use AI for object recognition, agrees that these are still early days for the technology. The goal is to be able to use a drone or robot that can see through fire and detect and locate objects, said Panetta, who is also the dean of graduate engineering at Tufts University.

“So if first responders are looking for victims they can find them and also use it to help commanders always know the location of their people,” she said. “Right now, everything is in the prototype stage.”

Panetta was motivated to develop a system after two Boston firefighters died in 2014; they had gone back into a burning building, and officials had no way to communicate to them that no one was left inside. “The fire hose exploded and melted, and people got thrown and it was heartbreaking that they couldn’t find them,” she said. When the firefighters were eventually located, “they were physically six feet from where they were searching.”

Her AI technology is trained on data coming from sensors that firefighters wear, and it recognizes objects that can be navigated in a fire, she said.
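Panetta's prototypes are not public, so the sketch below only gives a flavor of the problem: isolating body-temperature regions in a thermal frame as candidates for a trained recognition model. The temperature band and size threshold are illustrative assumptions.

# Illustrative only: pick out body-temperature regions in a thermal frame
# so a trained detector can examine them. Thresholds are assumptions.
import numpy as np
from scipy import ndimage

def candidate_victim_regions(thermal_frame: np.ndarray,
                             low_c: float = 30.0,
                             high_c: float = 40.0,
                             min_pixels: int = 50):
    """Return bounding slices of contiguous regions whose per-pixel
    temperature (degrees C) falls in a human-body range."""
    mask = (thermal_frame >= low_c) & (thermal_frame <= high_c)
    labeled, _ = ndimage.label(mask)
    regions = []
    for label_id, region_slice in enumerate(ndimage.find_objects(labeled), start=1):
        if (labeled[region_slice] == label_id).sum() >= min_pixels:
            regions.append(region_slice)
    return regions

# A real system would pass each candidate region to a recognition model;
# flames and hot debris overlap this range, so thresholding alone is
# nowhere near sufficient.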

All the imaging and sensors are working, but the prototypes can’t withstand heat right now, Panetta said. Pre-COVID-19, departments were testing the prototype by doing onsite burns, but that has been stopped.

AI holds great promise and will have a significant impact on first responders’ ability to gather information, Panetta said. Such systems need to learn from thousands of scenarios and best practices on how to do search and rescue and prevent fatalities and detect the presence of victims in hazardous conditions, she added.

Using AI for first response during COVID-19

For first responders, the bottom line is about safety and how a robot can be sent into a dangerous space to collect data so a person isn’t harmed, said Michael Perry, vice president of business development at robotics company Boston Dynamics, which has deployed its Spot robot at Brigham & Women’s Hospital in Boston. Because of a shortage of personal protective equipment, the hospital is using the four-legged robot to support frontline staff in triage tents and parking lots.

Boston Dynamics equipped Spot with an iPad and two-way radio to act as a remote connection between doctors and coronavirus patients. (Image: Boston Dynamics)

“AI’s main function is to eliminate as many barriers to a person operating a robot and collecting data,” Perry said. If first responders are trying to figure out what’s going on in a chemical spill, for example, they don’t want to think about how to move around a space to get time-sensitive information. “Instead, I want to think about what’s going on…in this environment and communicate that to the team that’s taking action.”


Boston Dynamics is working to simplify the process of driving the robot and to configure it with a wide variety of sensor payloads on Spot’s back, which can gather information such as gas concentrations, thermal data, and radiation levels, Perry said.
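Perry did not describe the underlying software, and the sketch below is not the Spot SDK; it simply illustrates the payload pattern he describes, in which the robot polls whatever sensors it is carrying and flags readings for the remote operator. The sensor names and thresholds are invented for illustration.

# Generic payload-polling sketch (not the Spot SDK). Sensor names,
# readings, and thresholds are placeholders.
from typing import Callable, Dict

def collect_readings(sensors: Dict[str, Callable[[], float]],
                     thresholds: Dict[str, float]) -> Dict[str, dict]:
    """Read every payload sensor once and mark readings that exceed
    their alert threshold for the remote operator."""
    report = {}
    for name, read in sensors.items():
        value = read()
        report[name] = {
            "value": value,
            "alert": value >= thresholds.get(name, float("inf")),
        }
    return report

sensors = {
    "gas_ppm": lambda: 412.0,        # placeholder readings
    "thermal_max_c": lambda: 61.5,
    "radiation_usv_h": lambda: 0.3,
}
thresholds = {"gas_ppm": 400.0, "thermal_max_c": 55.0, "radiation_usv_h": 1.0}

print(collect_readings(sensors, thresholds))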

Massachusetts State Police have used Spot to go in and look at a suspicious package found in a parking lot before sending in an officer, Perry added.

Boston Dynamics has made the payload and software design for the medical response version of Spot open and available to the public “so people can see how we connected payloads together and how we used AI to infer things — like respiratory rate and body temperature,” Perry said.
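The article does not detail that inference code, but one common way to estimate respiratory rate from camera data is to find the dominant low frequency in a signal of chest motion (or thermal fluctuation near the nose) over time. The sketch below assumes such a signal has already been extracted; it is not Boston Dynamics' published payload code.

# Estimate breaths per minute from a 1-D chest-motion signal by finding
# the dominant frequency in a plausible breathing band. Illustrative only.
import numpy as np

def respiratory_rate_bpm(signal: np.ndarray, fps: float) -> float:
    """Estimate breaths per minute from a motion/intensity signal
    sampled at `fps` frames per second."""
    signal = signal - signal.mean()
    spectrum = np.abs(np.fft.rfft(signal))
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fps)
    band = (freqs >= 0.1) & (freqs <= 0.7)   # 6-42 breaths per minute
    dominant_hz = freqs[band][np.argmax(spectrum[band])]
    return dominant_hz * 60.0

# Example: a synthetic 0.25 Hz breathing signal (15 breaths/min) at 30 fps.
t = np.arange(0, 60, 1 / 30)
chest = np.sin(2 * np.pi * 0.25 * t) + 0.1 * np.random.default_rng(1).normal(size=t.size)
print(round(respiratory_rate_bpm(chest, fps=30)))  # ~15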

AI is already saving lives in emergencies

Philadelphia’s Thiel calls himself “a Luddite when it comes to technology. I appreciate the promise of it,” but says he is also skeptical of technology as a panacea.

“What resonates with me is this product was rooted in human experience and continues to be informed and adjusted by human experience,” said Thiel, who is also a user of the NeuroFlow app.

“The real promise for AI in our business is decision support,” both in mental health and in physical use cases, he said. Right now, it is making strides in the former.   

“Based on our early experience using a product with an AI component around the behavioral health of our members, it’s not hyperbolic or an exaggeration to say we’ve been able to potentially save lives,” said Thiel, adding that Philadelphia has a high rate of structural fires and has already had 14 residents die in fires in 2020.

“People are so affected by their experience serving our city that they were experiencing very salient signs of [trauma]…and we’ve saved some of our members’ lives,” he said. “That’s my duty — making sure everyone goes home safely at the end of their shifts.”