Services
BI & Data Analytics
AI & Automation
User Experience Design
User Research
User Interface Design
Data & AI
Customer Experience
Company Background
Project Artemis is part of NASA’s strategy to return to the Moon and establish a lasting human presence on the lunar surface, a major step in humanity’s effort to grow beyond Earth toward a Kardashev Type 2 civilization. Central to the mission is the Lunar Gateway, a station that will remain in long-term lunar orbit. Keeping the Gateway running requires a substantial number of on-station operators, who ensure the station functions effectively and that missions to the lunar surface proceed smoothly. Spacecraft and astronauts travel from Earth to the Lunar Gateway, a lunar lander carries crews from the Gateway down to the surface and back, and the same spacecraft that brought them from Earth then returns them home.
The Challenge
The primary challenge was modularizing every conceivable command into an easily navigable interface. These commands had to be user-friendly for the operator, yet high-level enough to communicate effectively with the robotic operator.
Commands needed to be comprehensible to users while remaining precisely parseable by the robot. This demanded meticulous attention to how the robot interprets instructions, taking into account its capabilities, its sensors, and the physical parameters that constrain command execution.
Modularizing the full spectrum of commands available to the robot proved a substantial undertaking. Each command had to be parsed and mapped onto specific actions the robotic operator could perform, which in turn required a thorough understanding of the many use cases the robot might encounter.
It also required a generic command language capable of breaking these commands down into forms that are easy to understand and represent, much as large language models and translation services decompose natural language.
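To make the idea of command modularization concrete, here is a minimal, hypothetical sketch of how a high-level command could be decomposed into primitive actions the robot already supports. The command and primitive names (MOVE_OBJECT, locate, grasp, and so on) are invented for illustration and are not the actual command set used in the project.

```python
from dataclasses import dataclass, field
from typing import List

# Hypothetical primitive actions a robotic operator can execute directly.
PRIMITIVES = {"locate", "navigate_to", "grasp", "release", "inspect"}

@dataclass
class Step:
    primitive: str   # one of PRIMITIVES
    target: str      # object or location the step refers to

@dataclass
class HighLevelCommand:
    name: str                              # user-facing command, e.g. "MOVE_OBJECT"
    steps: List[Step] = field(default_factory=list)

    def validate(self) -> None:
        """Check that every step maps onto a primitive the robot supports."""
        for step in self.steps:
            if step.primitive not in PRIMITIVES:
                raise ValueError(f"Unsupported primitive: {step.primitive}")

# Example: a user-level "move the brown box to storage" command decomposed
# into robot-executable primitives.
move_box = HighLevelCommand(
    name="MOVE_OBJECT",
    steps=[
        Step("locate", "brown_box"),
        Step("grasp", "brown_box"),
        Step("navigate_to", "storage_bay"),
        Step("release", "brown_box"),
    ],
)
move_box.validate()
```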
The Solution
Our solution was a software dashboard that provided a visual representation of the robot and its operational context, together with real-time readings from every sensor embedded in the robotic operators.
Integral to this solution was a command center with an advanced control panel, through which users issued high-level instructions to the robot, such as directing it to perform a specific task.
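As a purely illustrative sketch of the kind of data such a dashboard and control panel exchange, the example below models a sensor telemetry reading and a high-level command envelope. The field names, the sensor identifier, and the JSON encoding are assumptions made for this example, not the actual Gateway interfaces.

```python
import json
import time
from dataclasses import dataclass, asdict

@dataclass
class SensorReading:
    sensor_id: str      # e.g. "arm_joint_3_temp" (hypothetical)
    value: float
    unit: str
    timestamp: float

@dataclass
class CommandEnvelope:
    command: str        # high-level command name, e.g. "MOVE_OBJECT"
    target: str         # object or location referenced by the command
    issued_by: str      # operator console that issued it
    issued_at: float

# Telemetry shown on the dashboard (one reading per embedded sensor).
reading = SensorReading("arm_joint_3_temp", 41.7, "celsius", time.time())

# High-level instruction issued from the control panel.
command = CommandEnvelope("MOVE_OBJECT", "brown_box", "console_1", time.time())

# Both message types could be serialized to JSON for transport.
print(json.dumps(asdict(reading)))
print(json.dumps(asdict(command)))
```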
Upon receiving an instruction, the robot autonomously interpreted the directive, identified the objects involved, such as a brown box, and recognized the designated locations within its current context. It then executed the task without requiring real-time communication with the command center.
Once a task was complete, typically within a few seconds, the robot reported back to the command center to confirm successful completion, keeping the workflow seamless and efficient.
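The acknowledgement pattern described above can be illustrated with a minimal sketch: the command center enqueues a command, the robot executes it on its own, and a confirmation flows back only when the work is done. The queue-based transport and the message strings here are assumptions for illustration; the real system would use the mission's own communication links.

```python
import queue
import threading
import time

# Command center -> robot, and robot -> command center channels.
commands: "queue.Queue[str]" = queue.Queue()
acks: "queue.Queue[str]" = queue.Queue()

def robot_worker() -> None:
    """Execute queued commands autonomously, then acknowledge completion."""
    while True:
        command = commands.get()
        if command == "SHUTDOWN":
            break
        time.sleep(2)                    # stands in for autonomous execution
        acks.put(f"DONE: {command}")     # confirm completion to the command center

robot = threading.Thread(target=robot_worker, daemon=True)
robot.start()

commands.put("MOVE_OBJECT brown_box -> storage_bay")
print(acks.get())   # blocks until the robot reports: "DONE: MOVE_OBJECT brown_box -> storage_bay"
commands.put("SHUTDOWN")
```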
Results of Our Work
Our work has been officially endorsed by NASA and its subcontractors, and is positioned to be incorporated in various capacities into the upcoming Artemis Moon mission, underscoring the relevance and impact of our contributions.
We Achieved
Our results are slated for use aboard the Lunar Gateway, expected to launch in 2024, placing our work at the forefront of cutting-edge space exploration.