NASA has challenged teams of students to develop heads-up interactive displays to be utilized by astronauts during space walks. A team of students from Texas A&M University is one of 16 chosen to participate in the NASA Spacesuit User Interface Technologies for Students (SUITS) 2019 challenge.
The objective of the SUITS competition is to develop a user interface for spacesuit information displays using the Microsoft HoloLens, creating an augmented reality environment that enables astronauts to perform spacewalk tasks more effectively by presenting instructions within the display. The team has until April to develop its system, at which point it will present its prototype to a group of testers at Johnson Space Center in Houston.
Typically, astronauts follow written procedures when performing tasks on a mission, even tasks they have done many times, to ensure that every step is executed correctly and nothing is missed. During an extravehicular activity (EVA), or spacewalk, the crew cannot carry laptops, paper or anything else physical to follow detailed written procedures on their own. They rely solely on voice guidance from mission control or a fellow crewmember inside the spacecraft.
With a helmet-based interactive display, guiding information can be projected on the astronaut's helmet visor within his or her field of view, reducing dependence on voice-guided commands. Since the student teams don't have direct access to an actual space environment for research and development, virtual reality will be used to emulate space conditions and test the augmented reality algorithms in simulation.
The challenge requirements specify that all EVA task instructions be displayed, that the astronaut be able to communicate with the intravehicular (IVA) astronaut or ground control at any time, and that all hand gestures be operable with EVA-gloved hands.
The faculty advisors for the team are well versed in spaceflight. Dr. Greg Chamitoff, professor of practice in aerospace engineering, is a former shuttle and International Space Station astronaut who spent 198 days in space. “Visually guided actions would be much more efficient than the way we have operated in space for the past 60 years,” he says. “It would provide better situational awareness, improved clarity of desired actions and reduced likelihood of errors.”
Dr. Ana Diaz Artiles is an assistant professor in aerospace engineering whose research focuses on human spaceflight and space systems engineering, with a strong emphasis on extravehicular activity and human performance in altered gravity environments. “Having additional support from AR technology can be very helpful, providing additional information on a specific tool or piece of equipment, including how to use it and the next steps in the operational procedure,” she says.
Team members include doctoral student Neil McHenry and undergraduate students Brock Balthazor, Leah Davis, Ashu Mishra, Israel Gomez III and Tylor Herndon.