Researchers develop new video game to teach chemistry and engineering

Polycraft World, a modification of the video game Minecraft, was developed by University of Texas at Dallas researchers to teach chemistry and engineering.

Now the game that allows players to build virtual worlds is serving as the foundation for federal research to develop smarter artificial intelligence (AI) technology.

UT Dallas researchers received a grant from the Defense Advanced Research Projects Agency (DARPA) to use Polycraft World to simulate dynamic and unexpected events that can be used to train AI systems -- computer systems that emulate human cognition -- to adapt to the unpredictable.

The simulated scenarios could include changing weather or unfamiliar terrain. In response to the COVID-19 pandemic, researchers have added the threat of an infectious disease outbreak.

The $1.68 million project is funded through DARPA's Science of Artificial Intelligence and Learning for Open-world Novelty (SAIL-ON) program, which was formed in 2019 to support research on the scientific principles, engineering techniques and algorithms needed to create advanced AI systems.

"The project is part of DARPA's suite of AI programs that are trying to figure out what the next generation of AI is going to look like," said principal investigator Dr. Eric Kildebeck BS'05, a research professor in UT Dallas' Center for Engineering Innovation (CEI).

"Our role centers around the concept of novelty. It's all about creating artificial intelligence agents that -- when they encounter things they've never seen before and they've never been trained to deal with -- respond appropriately."

Dr. Eric Kildebeck BS'05, Principal Investigator and Research Professor, Center for Engineering Innovation (CEI), University of Texas at Dallas

Current AI systems excel at performing tasks with rigid rules, such as playing chess. Self-driving cars, for example, know to stop at red stop signs because they are trained with data that includes what to do when approaching those signs. But they might not know how to respond if they approach something outside of those parameters, like a blue stop sign.

"AI now is highly specific to respond to what it has been trained on. But humans encounter things they've never seen before all the time," Kildebeck said.

"If a self-driving car came upon a blue stop sign, it might ignore it, or it might register the sign as something new and stop. The ability to reason and rationally adapt to things you've never seen before is the whole point of this program."
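The two behaviors Kildebeck describes can be sketched in a few lines. The following is a hypothetical illustration, not the DARPA or UT Dallas code: an agent trained only on known sign types either ignores a novel input or, preferably, registers it as new and falls back to a safe action.

```python
# Hypothetical sketch of a novelty-aware agent (illustrative only).
# The agent was "trained" on a fixed set of sign behaviors; anything
# outside that set is treated as novel and handled conservatively.

KNOWN_SIGNS = {"red": "stop"}  # behaviors from training data
SAFE_ACTION = "stop"           # conservative fallback for novel inputs

def respond_to_sign(color: str) -> str:
    """Return the trained action for a known sign, or fall back safely."""
    if color in KNOWN_SIGNS:
        return KNOWN_SIGNS[color]
    # Novelty detected: this sign color was never seen in training.
    # Rather than ignoring it, register it as new and take the safe action.
    return SAFE_ACTION

print(respond_to_sign("red"))   # trained case -> stop
print(respond_to_sign("blue"))  # novel case (blue stop sign) -> stop
```

Real open-world agents replace the lookup with learned novelty detection and reasoning, but the contrast is the same: recognize that an input is outside prior training, then adapt rather than fail silently.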

The UT Dallas researchers' work focuses on the first of three phases of DARPA's project -- building simulated scenarios in Polycraft World. Next, researchers at other institutions will develop algorithms to enable AI systems to respond to those challenges.

The UT Dallas researchers are not building military scenarios. Instead, in the third and final phase, the Department of Defense will adapt the researchers' work to reflect what military troops might face. The UT Dallas project began in December and will continue through mid-2021.

Kildebeck said Polycraft World provides an ideal platform for DARPA's project because it incorporates multiple fields of science, including polymer chemistry, biology, and medicine, to enable simulation of real-world scenarios.

Dr. Walter Voit BS'05, MS'06, who led the team that developed Polycraft World, is a co-principal investigator for the DARPA project.

"DARPA seeks to advance the state of the art in how artificial intelligence operates in open worlds where prior training has been limited," Voit said.

"We are excited at UT Dallas to be able to provide a comprehensive test environment based on Polycraft World to provide novel situations for some of the nation's most promising algorithms to see how they react."

Voit, associate professor of materials science and engineering and of mechanical engineering in the Erik Jonsson School of Engineering and Computer Science, and Kildebeck met as undergraduates at UT Dallas in the inaugural cohort of Eugene McDermott Scholars. Kildebeck earned an MD/Ph.D. in cancer biology and gene therapeutics from UT Southwestern Medical Center and is a member of the Eugene McDermott Graduate Fellows Program. His research area includes the application of genome engineering to the treatment of neurologic and autoimmune diseases.

The pair collaborates with Dr. Robert (Joey) Steininger, a research scientist at the CEI and a Eugene McDermott Graduate Fellow who earned a Ph.D. in cellular regulation and pharmacology from UT Southwestern.

The researchers are working to capture novel scenarios from the real world, including recording videos that can be digitized and incorporated into the game.

In the self-driving car example, the virtual car would need to learn to stop at different variations of a stop sign to prevent accidents.

"We're building a track that may have green stop signs, red stop signs, and blue stop signs all along the pathway," Steininger said, giving an example of a possible scenario.

"Other researchers will incorporate their algorithms into a self-driving car. As the car navigates through a virtual city, if it reaches its destination without accidents, it's a success."
