December 17, 2024 | News release
Simulated populations, virtual reality and non-player characters sound like components of a modern video game. In fact, a team of researchers at the Department of Energy's Oak Ridge National Laboratory is using virtual reality to understand normal and abnormal human behavior in a given location - specifically, a nuclear reactor.
"The overarching vision is to proactively prepare for insider threats in secure facilities such as nuclear reactors, fuel depots and other critical infrastructure," said Gautam Malviya-Thakur, an ORNL group leader in location intelligence. "We intend to help situationally understand the underlying patterns of where people go in secure facilities: what paths they take, how much time they spend, what areas they access, and when and if they have the credentials to be there."
As people move through their communities, workplaces and even their homes, they tend to do similar activities in the same locations, creating a predictable blueprint for how they spend their days. That baseline of normal behavior makes it easier to detect anomalous behavior, such as a worker doing something they aren't authorized to do.
"We proposed a project to see how pattern-of-life data could be used to proactively detect human behavior that is outside of normal in secure facilities," said Debraj De, a location intelligence research staff member and principal investigator of this project. De has spent years using pattern-of-life data to understand populations in various settings to see how people typically behave.
Workers are no different. Their pattern of life generally includes parking in the same spot, entering through the same doors and frequenting the same restroom, along with the time of day and how long they spend in various areas. Over days, weeks and months, this spatio-temporal pattern becomes predictable, even normal, for that person and that place.
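The release doesn't describe how such spatio-temporal records are stored, but a minimal sketch, using hypothetical field names such as worker_id, area and hour_of_day, shows how repeated visits could be rolled up into a per-person, per-area baseline of what counts as normal:

```python
from collections import defaultdict
from dataclasses import dataclass
from statistics import mean

# Hypothetical record: one visit by one worker to one area of a facility.
@dataclass
class Visit:
    worker_id: str
    area: str         # e.g., "lobby", "control_room" (illustrative names)
    hour_of_day: int  # 0-23, when the visit started
    minutes: float    # how long the visit lasted

def build_baseline(visits):
    """Aggregate repeated visits into a per-worker, per-area profile of
    typical arrival hour and dwell time -- the 'normal' pattern of life."""
    grouped = defaultdict(list)
    for v in visits:
        grouped[(v.worker_id, v.area)].append(v)
    return {
        key: {
            "typical_hour": mean(v.hour_of_day for v in vs),
            "typical_minutes": mean(v.minutes for v in vs),
            "visit_count": len(vs),
        }
        for key, vs in grouped.items()
    }

# Example: weeks of visits collapse into a compact, predictable profile.
history = [Visit("w1", "control_room", 9, 45), Visit("w1", "control_room", 9, 50),
           Visit("w1", "lunchroom", 12, 30)]
print(build_baseline(history))
```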
De, Malviya-Thakur, co-investigator Chathika Gunaratne and the team faced the challenge of figuring out how to detect anomalous behavior in a secure facility with limited access. Using cameras and sensors to collect pattern-of-life data isn't always feasible, or even allowed, in nuclear facilities. Data about how people actually move through a secure facility would be difficult to obtain, and staging an experiment to collect that information inside one would be costly in time and money.
Instead, De's team created a small, simulated population of digital stand-ins for people, called agents, that move through a virtual nuclear reactor. The agents established a pattern of life that included finding their workstations, visiting the lunchroom and walking through the building. Within a few days of real-world time, a simulated population could amass a dataset covering years, even decades, of behavior.
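The release doesn't detail the simulation itself; the following is only a sketch, assuming an invented daily routine and illustrative area names, of how a few agents repeating a lightly randomized schedule can quickly generate years of synthetic movement records:

```python
import random

# Hypothetical daily routine for a simulated worker (agent): area visited,
# nominal start hour and dwell time in minutes. Names are illustrative only.
ROUTINE = [("entrance", 8, 5), ("workstation", 8, 180),
           ("lunchroom", 12, 45), ("workstation", 13, 200), ("entrance", 17, 5)]

def simulate_day(agent_id, day, jitter_min=10.0):
    """One agent's movement log for one simulated day, with small random
    variation so days resemble, but do not exactly repeat, each other."""
    log = []
    for area, hour, minutes in ROUTINE:
        start = hour * 60 + random.gauss(0, jitter_min)       # minutes since midnight
        dwell = max(1.0, minutes + random.gauss(0, jitter_min))
        log.append({"agent": agent_id, "day": day, "area": area,
                    "start_min": round(start, 1), "dwell_min": round(dwell, 1)})
    return log

# A few minutes of wall-clock time can yield years of simulated behavior:
dataset = [rec for day in range(365 * 2)            # two simulated years
           for agent in ("a1", "a2", "a3")
           for rec in simulate_day(agent, day)]
print(len(dataset), "simulated visit records")
```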
The researchers then created a virtual simulation of a nuclear reactor facility. Since floor plans of commercial nuclear power reactors are not publicly available, De's team created a digital twin of ORNL's High Flux Isotope Reactor, or HFIR, a DOE Office of Science user facility. The simulated population data was merged with the HFIR digital twin to create a virtual reality simulation of people working inside a nuclear reactor.
Using VR goggles, a user maneuvers through the virtual HFIR, leaving a trackable trail of activities that establishes a pattern of life. The research team gathered data about the user's behavior, looking to see whether the person followed the rules or deviated from the assigned tasks. The resulting data is used to train a machine learning algorithm to detect anomalous human behavior, with the goal of identifying threats to reactor operations before a situation happens.
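The release doesn't name the machine learning method used, so the sketch below is purely illustrative: it uses scikit-learn's IsolationForest, a generic unsupervised anomaly detector, trained on made-up visit features (area index, start time in minutes since midnight, dwell time in minutes):

```python
import numpy as np
from sklearn.ensemble import IsolationForest

# Each row is one visit reduced to features: [area index, start time in
# minutes since midnight, dwell time in minutes]. Values are illustrative.
normal_visits = np.array([
    [0, 485, 182], [0, 492, 175], [1, 730, 44], [1, 741, 50], [0, 801, 190],
])

# Unsupervised anomaly detector fit on the "normal" pattern-of-life data.
model = IsolationForest(contamination=0.05, random_state=0).fit(normal_visits)

# A 2 a.m. visit to an unusual area with a long dwell time should stand out.
candidates = np.array([[0, 490, 180],    # looks like the baseline
                       [2, 130, 220]])   # off-hours visit to an unusual area
print(model.predict(candidates))  # 1 = consistent with baseline, -1 = anomalous
```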
"We gathered human movement data for non-playable characters (NPCs) under sensor deployment restrictions at HFIR and built an agent-based model based on anecdotal evidence gathered during tours and discussions with facility managers," said Gunaratne, an ORNL artificial intelligence research scientist. Gunaratne brought his experience modeling human movement in urban spaces to the project, going through different scenarios to find the right answer to crowd control challenges.
An end user wearing the VR headset sees the facility and equipment around them, NPCs walking around, and a clipboard in their hand with a checklist of tasks similar to those a worker would perform throughout the day. Gunaratne said the user works through the checklist and responds to pop-up instructions based on where they are in the simulation.
Part of this work recently won the Best Demo Paper award at the 25th IEEE International Conference on Mobile Data Management, or MDM 2024, in Brussels, Belgium. Additional HFIR simulation findings, to be published in the Proceedings of the 2024 Interservice/Industry Training, Simulation and Education Conference and the Proceedings of the 2024 Winter Simulation Conference, describe how non-player characters are used to make the virtual environment feel realistic.
"In this situation with HFIR, we could try different what-if scenarios based on cost and safety risk, such as insider threat situations, to find out what shifts in normal behavior are exhibited during a non-typical event," Gunaratne said. "We could even observe if the cascading effects of one agent's anomalous behavior impacts other agents."
The team said this simulation can also be used to train emergency response personnel on what-if scenarios in restricted or secure campuses.
This project is funded by ORNL's Laboratory Directed Research & Development Program.
UT-Battelle manages ORNL for the Department of Energy's Office of Science, the single largest supporter of basic research in the physical sciences in the United States. The Office of Science is working to address some of the most pressing challenges of our time. For more information, please visit energy.gov/science. - Liz Neunsinger