Naval Surface Warfare Center, Crane Division (NSWC Crane) is teaming up with Indiana academic institutions Indiana University (IU), the University of Notre Dame, and Purdue University to advance Trusted Artificial Intelligence (AI) research and workforce development. This initiative launched in June 2021 and is intended to scale over several years.
The Embedded Systems/Trusted AI initiative is part of the Scalable Asymmetric Lifecycle Engagement (SCALE) workforce development program, funded by the Trusted & Assured Microelectronics (T&AM) program of the Office of the Under Secretary of Defense for Research and Engineering (OUSD(R&E)).
Robert Walker, the Chief Technology Officer (CTO) at NSWC Crane, says this initiative is important and tailored to the needs of the Department of Defense (DoD) and the Defense Industrial Base (DIB).
“The Trusted AI SCALE program combines research and workforce development based on the unique needs of the DoD based on recommendations from the DIB,” says Walker. “By working directly with the academic partners, we are giving students real technical challenges that a typical undergraduate education doesn’t include in its curriculum. This is a high payoff effort, and we are excited to work with IU, Notre Dame, and Purdue as the program grows.”
The SCALE model's key aspects include immersive research and development (R&D) alongside student training, a consortium framework that rapidly scales and replicates, and educational pathways that align curriculum, research, and internships.
This Trusted AI initiative is one of five technical verticals of the SCALE model, each with its own consortium of academic institutions and targeted level of education: Radiation Effects (often referred to as Rad-Hard), Advanced Packaging, System on a Chip (SoC) Design, Supply Chain, and Embedded Systems/Trusted AI.
“Artificial intelligence has tremendous potential to benefit the defense and security of the United States,” said Peter Bermel, Elmore Associate Professor of Electrical and Computer Engineering at Purdue and SCALE principal investigator (PI). “Fielding useful AI systems requires us to simultaneously develop new tools while training new personnel. In this project, we will have the opportunity to perform interdisciplinary research in AI, train new students, and connect them with government and defense industry employers, to help address the major workforce challenges in the field.”
NSWC Crane serves as the T&AM workforce development co-lead with the Air Force Research Laboratory (AFRL), manages the overarching SCALE program, and is the Embedded Systems/Trusted AI vertical technical lead.
Alison Smith, the Indiana University Liaison and Trusted AI Program Manager at NSWC Crane, says trusted systems in the DoD are a critical need.
“NSWC Crane has national technical experts in Trusted and Assured Microelectronics providing insight into the program’s problem statements,” says Smith. “AI is a potential third offset for the DoD; however, further research in the focus areas of this effort is needed to harness AI effectively.”
Kara Perry, the Education and Workforce Development Co-Lead for T&AM at NSWC Crane, says this workforce development program is unique.
“The Trusted AI program is a graduate-level program, driven by difficult challenges the DoD faces,” says Perry. “Our university partners take our hard problems and turn them into projects to both train students and develop technical solutions. What’s unique about SCALE is that we develop the talent and technology in parallel. The intent is to scale this effort and bring in more universities and DoD and DIB stakeholders.”
The Trusted AI consortium, based on the SCALE model, is intended to produce the next generation of the Trusted AI workforce pipeline: a flexible workforce ready to work for the DoD, the DIB, or government. It is also intended to develop a Validation and Verification (V&V)/Test and Evaluation (T&E) framework to assess the level of trust in AI/machine learning (ML)-enabled solutions.
“As AI becomes increasingly pervasive in the technologies the world relies upon, failure of these systems is not an option for the modern soldier. Therefore, training future leaders in this field, including current undergraduates, graduate students, and postdoctoral scholars, will ensure we are building a modern and prepared workforce to help support and meet the Trusted AI needs of a complex world,” said Christopher Sweet, Associate Director for Cyberinfrastructure Development in the Center for Research Computing and Assistant Research Professor in Computer Science and Engineering at the University of Notre Dame. “Notre Dame is committed to this mission and excited to partner with our colleagues at IU, NSWC Crane, and Purdue on this globally impactful work.”
The Trusted AI SCALE academic leads, in collaboration with the NSWC Crane AI Development Laboratory (CrAIDL) technical leads, have identified five research themes required to develop the frameworks, methodologies, and tools necessary to assess the level of trust of AI/ML-integrated systems: Trust and Verifiability, Statistical Framework for Data/Model Analysis, Knowledge Graph Enhanced Natural Language Processing, Human-Machine Pairing, and Framework Infrastructure Development.
“These research goals are very difficult, and addressing them will require a highly interdisciplinary approach,” said David Crandall, Professor of Computer Science and the IU PI of the project. “One of the most exciting aspects of the project is that it brings together faculty and students across three universities and numerous disciplines, including Computer Science, Electrical and Computer Engineering, Informatics, Intelligent Systems Engineering, Psychology and Brain Sciences, and Statistics, sparking collaborations that would likely not have happened otherwise.”
“The security of AI is one of the most critical hurdles to its broad application,” said Jeff Zaleski, interim vice provost for research at IU Bloomington. “The partners working on this project form a strong multidisciplinary team with comprehensive expertise in engineering, computer science, law, policy, ethics, and more. Working together, I’m confident we’ll be able to successfully create trusted-AI technologies that will meet the DoD’s developing needs.”
— Sarah K. Miller, Center for Research Computing