When the closest doctor is on another planet, real-time care isn’t an option. On a Mars mission, a single question-and-answer exchange with flight surgeons can take 40 minutes or more. NASA and Google’s answer is an AI medical assistant that can act without Mission Control.
The Crew Medical Officer Digital Assistant (CMO-DA) is an AI-powered medical assistant built to diagnose illnesses and guide treatment on its own, giving astronauts medical help on deep space missions when real-time contact with Earth is limited or unavailable.
Early tests show promising results. In simulated scenarios, the CMO-DA correctly identified ankle injuries with 88% accuracy, or almost 9 correct diagnoses out of 10. While not perfect, these numbers show real promise for a system that’s still being refined.
NASA created this AI doctor specifically to support its Artemis program to the Moon and future crewed missions to Mars. Long-duration flights need medical autonomy, and the AI doctor is part of the plan to keep crews healthy for months at a time.
The public-private partnership strengthens US leadership in both space exploration and artificial intelligence.
The deeper we travel into space, the more we need smart tools that can solve problems without help from Earth. On Mars, the nearest physician is, on average, 140 million miles away, with a communication delay of up to 20 minutes one way. That’s 40 minutes for a single question and answer. Astronauts will likely face health problems that demand treatment decisions faster than that, with no time to wait for advice from Earth.
Small health problems can escalate quickly on long trips. The CMO-DA is built to be the first responder and help any astronaut handle medical problems, whether they have medical training or not. This includes assessing symptoms, identifying conditions, and initiating treatment without direct guidance.
CMO-DA’s performance comes from a technical stack built on Google Cloud’s AI infrastructure and specialized medical training data. It can process complex information and provide reliable clinical support independently.
The system runs in Google Cloud’s Vertex AI environment. Thanks to that, the model can process multimodal inputs — speech, text, and even images — and deliver usable guidance quickly and consistently.
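NASA and Google haven’t published CMO-DA’s interfaces, but as a rough, purely illustrative sketch (all names and fields here are our assumptions, not the actual system), the three input types the article mentions could be bundled into a single structured payload for a hosted model like this:

```python
import json
from dataclasses import dataclass, field


@dataclass
class MultimodalRequest:
    """Illustrative container for the three input types mentioned above."""
    transcript: str = ""   # speech, already converted to text
    notes: str = ""        # free-text description typed by the crew
    image_refs: list = field(default_factory=list)  # e.g. paths to medical scans

    def to_payload(self) -> str:
        """Serialize the request into one JSON payload for a hosted model."""
        return json.dumps({
            "parts": [
                {"type": "speech_transcript", "text": self.transcript},
                {"type": "text", "text": self.notes},
                *({"type": "image", "uri": ref} for ref in self.image_refs),
            ]
        })


req = MultimodalRequest(
    transcript="My right ankle started hurting after the treadmill session.",
    notes="Swelling on the outer side, painful to put weight on it.",
    image_refs=["scans/ankle_right.png"],
)
payload = json.loads(req.to_payload())
print(len(payload["parts"]))  # 3 parts: transcript, notes, one image
```

In a real Vertex AI deployment, a payload like this would be passed to a hosted model endpoint; the point of the sketch is simply that speech, text, and images travel together as one request.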
Under the agreement, NASA owns the CMO-DA source code while using Google’s cloud services on a fixed-price subscription.
Care in microgravity isn’t the same as care on Earth. The model relies on advanced natural language processing (NLP) and machine learning, but is trained on spaceflight medical literature rather than general medical data, to make sure its insights are relevant for astronauts.
This training approach was validated using the Objective Structured Clinical Examination (OSCE), a standard for assessing clinical competence.
CMO-DA can analyze medical scans and interpret a crew member’s verbal description of their symptoms. It then maps those inputs to likely conditions and suggests next steps.
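As a toy illustration of that input-to-condition mapping (a hand-written lookup table, not the trained model NASA and Google actually use; every name and recommendation below is hypothetical), the triage step might be sketched as:

```python
# Purely illustrative rule table; the real CMO-DA relies on a trained
# model, not hand-written rules.
TRIAGE_RULES = {
    "ankle": ("possible sprain or fracture",
              "immobilize, apply compression, reassess in 24 hours"),
    "ear": ("possible otitis or barotrauma",
            "otoscope check, analgesics, monitor hearing"),
    "flank": ("possible renal colic",
              "hydrate, manage pain, run urinalysis"),
}


def triage(description: str) -> list:
    """Map a crew member's verbal description to likely conditions
    and suggested next steps."""
    matches = []
    for keyword, (condition, next_step) in TRIAGE_RULES.items():
        if keyword in description.lower():
            matches.append({"condition": condition, "next_step": next_step})
    return matches


result = triage("Sharp pain in my left ankle after landing badly")
print(result[0]["condition"])  # possible sprain or fracture
```

A trained model replaces the keyword lookup with learned pattern matching over far richer inputs, but the shape of the output, likely conditions paired with next steps, is the same idea.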
Before anyone uses this tech in real life, the AI doctor has to prove it can handle common problems in simulated scenarios. Early tests focused on conditions astronauts are likely to face and measured how well the system could spot them.
NASA and Google ran the assistant through simulated medical emergencies. The tests included scenarios like ankle injuries and ear pain, which are routine on Earth but far harder to assess and treat millions of miles from the nearest clinic.
CMO-DA posted promising results. It identified ankle injuries with 88% accuracy, ear pain with 80% accuracy, and flank pain with 74% accuracy. Those are early numbers, but they suggest the system can handle the kinds of calls a crew actually needs to make.
Next versions of the AI doctor will ingest data from medical devices and factor in space-specific conditions. The recommendations will adapt to microgravity’s effects on the body. Before anything heads to Mars, NASA plans to test the system on the International Space Station (ISS).
The technology developed for Mars missions could transform healthcare in remote areas on Earth, eventually helping people in isolated communities where access to medical assistance is rare. It can push telehealth forward and help lower the barrier to quality care in places that don’t have it.
At Revolgy, we work with Google Cloud and Vertex AI to build practical decision-support tools for teams based on Earth. If you’re exploring Vertex AI, we’re happy to talk through a pilot and what it would take to make it real. Contact us for a free consultation.
Find out more about the Vertex AI Platform and our AI solutions.