As ChatGPT moves to encompass the full scope of health care, others are taking a more nuanced approach. One is Headlamp Health, whose new intelligence platform, Lumos AI, aims to advance a research field that has long stalled for drug developers, clinical trial researchers and clinicians working to solve complex mental health challenges in even more complex patients.
With an advisory board that includes investor, entrepreneur and mental health advocate Zak Williams—the son of the late actor Robin Williams—Headlamp officially launched Lumos on Jan. 7. The platform is designed as a coordinated set of agentic A.I. layers meant to bring precision medicine to a space that has long lacked it.
“I never thought I’d go into the mental health space,” Williams told Observer. “But after my father died by suicide, and I was diagnosed with complex post-traumatic stress disorder, generalized anxiety disorder and depression, I found myself in need of solutions.”
That experience led Williams to work with Headlamp Health, where he advises on both the technology and its market positioning. He saw not only a need for reinvention in psychiatry, but also an opportunity to help others where he could.
Erwin Estigarribia, CEO of Headlamp Health, who previously focused on oncology and cardiology technology, has his own reasons for entering the psychiatry tech space. “I was exposed to the mental health side of medicine through family members and personal circumstances, and realized, holy smokes, the entire field is about 20 years behind cancer and cardiology,” Estigarribia told Observer.
Bringing precision medicine to psychiatry
Robin Williams suffered from the brain disease Lewy body dementia, a diagnosis discovered only through autopsy and later made public by his wife, Susan Schneider Williams. During his life, he sought treatment for what appeared—even to medical experts—to be unrelated symptoms, including tremors, delusions and high cortisol levels. Prior to his suicide, he was misdiagnosed with Parkinson’s disease. As many as half of the people with Lewy body dementia are misdiagnosed.
The problem extends far beyond one illness. Schizoaffective disorder is misdiagnosed 75 percent of the time, while even the more common major depressive disorder is misdiagnosed in more than half of cases.
As precision medicine becomes the standard in fields like oncology, psychiatry continues to lag behind. But multilayered A.I. systems are beginning to close the gap. Lumos AI has several core use cases: identifying patient subtypes most likely to benefit from a given therapy; making clinical trials more efficient and effective; de-risking drug development; and modeling how patients change over time.
To power that work, Headlamp has compiled at least 100 million data points—both proprietary and from external health data sources, spanning decades of research. These are fed into layered A.I. frameworks designed to answer a central question: What is the right therapy for the right patient at the right time?
Williams said much of the recent A.I. work in mental health has focused on automation, but Lumos is built differently. “It’s structured to help identify responder versus nonresponder populations way earlier in development,” he said. “Then, leveraging that longitudinal, real-world and behavioral data informs trial design and treatment matching.”
With clinicians and researchers kept in the loop, decisions come from the “better organization of data, which then leads to better inference and better causal reasoning,” Williams said.
Mental illness is largely episodic and invisible. “We can’t take a picture of depression [or] anxiety,” Estigarribia said. “Measuring it reliably in the blood is something that we’re not able to do due to the blood-brain barrier, which essentially isolates the organ of interest that we’re interested in studying.” Tools that better isolate and interpret the contributing factors behind psychiatric conditions could drive a sea change for millions of people simply trying to get through each day.
Roughly 49,000 people in the U.S. died by suicide in 2024, according to provisional federal data. Research suggests an average of 135 people are significantly affected by each suicide death—people who may themselves need mental health support.
In clinical settings, Estigarribia said Lumos AI’s suicide prevention impact was not the original goal, but has been a welcome outcome. “Being able to provide clinicians an A.I.-driven real-time view of their [patients] and highlight who is trending positive, negative or neutral since their last visit has actually led to several tragedies being averted.”
On the research side, as federal funding shrinks for the National Institutes of Health and other agencies, platforms like Lumos can help researchers find efficiencies that keep essential studies moving forward. Beyond the statistics, those advances translate into real changes in individual lives.
Improving life, not just delaying death
Other companies are also using A.I. to streamline clinical trials, from patient-matching platforms like BEKHealth to decentralized trial tools such as Datacubed Health. Headlamp, however, is targeting a narrower and less-served niche: working directly with neuroscience researchers, psychiatric drug developers and frontline clinicians, with psychiatry as its sole focus rather than the broader life sciences.
“Because we are the primary aggregator of all types of data, we want people to innovate on wearables, advanced imaging, blood biomarkers [and] cognitive therapies,” Estigarribia said. “We will collaborate, share data and work with anybody whose mission aligns with ours.” The key to tackling such large problems, he added, is to “stay humble, develop gratitude and be collaborative.”
Using A.I. to process sensitive psychiatric health data for clinical decision support carries risks, especially around privacy. As Alexander Tsiaras, founder and CEO of the A.I.-driven medical records platform StoryMD, previously told Observer regarding ChatGPT Health, strong encryption is now an industry standard. The real question, he said, is, “Once you have the data, can you trust them?”
For Williams, who is highly selective about his partnerships, Headlamp met his criteria, including in the area of trust. He evaluated the company and its technology by asking: “Are there good people involved with the organization? Do these people care deeply about how these outcomes are being delivered, how it’s improving the lives of folks, and is it contributing to the greater benefit of humanity?”
Another concern is the integrity of the A.I. itself. Williams pointed to the risk of semantic collapse, in which systems fail as data volume overwhelms reasoning. “There’s a critical need to shift from data volume to data reasoning, to focus on actionable insight,” he said, adding that this is precisely what Headlamp aims to do with Lumos.
Robin Williams, in his role as Patch Adams in the 1998 film about the real-life physician, once said, “Our job is improving the quality of life, not just delaying death.”
Through Headlamp, Estigarribia and his team are trying to live up to that idea. “If I don’t feel safe enough for [Lumos] to be used by my own mother, then it’s not something that we can deploy,” he said.