Kent State University team participates in NASA SUITS Design Challenge

The National Aeronautics and Space Administration’s Spacesuit User Interface Technologies for Students Design Challenge took place on June 11.

Kent State University’s interdisciplinary team ATR_FLUX was one of ten teams nationwide that participated in the National Aeronautics and Space Administration’s (NASA) Spacesuit User Interface Technologies for Students (SUITS) Design Challenge Thursday. 

The challenge assigned teams of college and university students to design and create space suit information displays within augmented reality (AR) environments. 

The SUITS Design Challenge, part of NASA’s Artemis Student Challenges program, gave students experience and opportunities to shape the technology that will be used by the Artemis Program, which aims to land the first woman and the next man on the Moon in 2024.

Jong-Hoon Kim is an assistant professor of computer science at Kent State University and lead advisor for the ATR_FLUX team. He explained how new technologies under development could transform the space missions of the future.

“The challenge has to do with improving how easily astronauts can collaborate with the ground station while on a mission,” Kim said. “When we first landed on the moon, there was only voice communication. Now, researchers are developing technologies such as holograms and visual displays to help astronauts better interpret data and communicate.”

After submitting a 26-page proposal to NASA, Kent State’s ATR_FLUX team was chosen to present their work at NASA’s Johnson Space Center in Houston, Texas. Due to the COVID-19 pandemic, the event was reorganized as a virtual meeting.

The team, assembled through the Kent State Advanced Telerobotics Research (ATR) Lab, is composed of 12 students and four faculty advisors. For the past several months, they worked collaboratively to research and develop new assistive features for the space suits. 

These features include a helmet-based AR display system and a “telesuit,” capable of collecting motion data and biometric data (such as heart rate and blood pressure) from the astronauts and showing a 3-D visual representation of this information on the AR helmet display.

Irvin Cardenas is a Kent State University computer science Ph.D. student who led the ATR_FLUX team in the challenge. He said augmented reality is a technology that superimposes computer-generated images on a user’s view of the real world.

Cardenas explained how AR technology could be used to complete a simple task like building furniture. “Rather than opening up a manual and saying, ‘Oh, you put this here and use this screwdriver,’ you have a device that augments the way that you’re supposed to do things,” he said. “It could show you how to use this tool or even track and highlight in the room.”

For the spacesuit, the goal was to give astronauts immediate access to instructions, procedures, graphics, spacesuit status and health status information through this audiovisual display. This would allow astronauts to work more efficiently without constant direction from NASA’s mission control.

“Augmenting the real world for the astronauts can actually reduce mental stress or cognitive overload,” he said.

The team also developed a telesuit prototype to monitor physiological activity, better assist the user and enhance overall performance.

Members of the School of Fashion at Kent State collaborated on this aspect of the project with the goal of creating a proof-of-concept prototype in jacket form.

The concept’s features include stretch fabrics that support the piping for embedded biometric sensors monitoring the user’s muscle groups, as well as zippers along the sleeve that open where polymer-based strain sensors will be placed.

“These human-autonomy enabling technologies are necessary for the increased demands of lunar surface exploration,” said NASA officials on the agency’s website.

The team presented its project titled “Coactive and Collaborative Interaction Platform for xEMU” on Thursday to a panel of NASA and industry leaders via Google Hangouts. 

The coronavirus pandemic and the closing of campus created many unique challenges for the team to overcome.

“Essentially everyone had to go home. Most of our students don’t have these gaming computers that you need to present the graphics. So we had to see how we can deal with that,” Cardenas said. “But, we were able to allocate computers to a few people.”

Another challenge for the team was working remotely.

“It was sometimes hard to coordinate. We had to figure out a way to make this fit in everyone’s schedules because not only did they have to work on this, but also on university stuff,” he said. 

The team also encountered some last-minute issues when one of the team members experienced technical difficulties while attempting to export a real-life demonstration video of the spacesuit.

“One of our students left the country [due to the pandemic] and her computer wasn’t working and her internet wasn’t working, so we just had to re-shoot a lot of the stuff and edit it together,” said Cardenas.

Cardenas and other team members said the experience was positive overall despite the late predicament they faced. They said they felt their attention to detail helped their interface stand out among the other teams’ designs.

“It’s not just about showing an interface or a box that has numbers on it all over the place. Maybe it can show you what you need and what you don’t need and tell you if there is something wrong with the system,” he said.

Cardenas said the feedback from the panelists was positive and will guide the project’s next steps.

“Everyone was excited. The panelists gave us some ideas to integrate and start building upon for the next interfaces,” he said.

The team aims to go a step further than building the technology by gaining a better understanding of how it may affect the operator.

“We use a similar suit to control robots. So we’re assessing the operator of the robot to see whether they’re getting stressed or their heart rate is increasing,” Cardenas said. “We want to assess if an astronaut using this interface is to do a task, are they moving too much? What is their breaking point?” 

This information will help the team better modify the interface to alert the astronaut or crew.

Cardenas said the ATR_FLUX team is working to get permission to resume work on the project on campus soon.

Contact Connor Steffen at [email protected]