In the medical field, there have been concerns regarding the lack of innovation in medical software, especially when compared to other sectors like finance and aerospace. Dereck Paul, a medical student at UC San Francisco, recognized the need for medical software that could keep up with cutting-edge technology and prioritize the needs of patients and doctors. This led him and his friend Graham Ramsey, an engineer at Modern Fertility, to launch Glass Health in 2021.
Glass Health’s primary goal is to provide physicians with a personal knowledge management system to store, organize, and share their approaches for diagnosing and treating various medical conditions. Paul and Ramsey were motivated to create Glass Health due to the overwhelming burdens on the healthcare system and the increasing problem of healthcare provider burnout. As frontline providers themselves, they experienced the challenges firsthand and wanted to leverage technology to improve the practice of medicine.
The initial reception of Glass Health was positive among physicians, nurses, and physicians-in-training, particularly on social media platforms like X (formerly Twitter). This early traction translated into the company’s first funding round, a $1.5 million pre-seed investment led by Breyer Capital in 2022. Glass Health’s potential was further recognized when the company was accepted into Y Combinator’s Winter 2023 batch.
However, in early 2023, Paul and Ramsey decided to pivot the company toward generative AI, in line with the broader industry trend. Glass Health now offers an AI tool powered by a large language model (LLM), the same class of model behind OpenAI’s ChatGPT. This AI tool is designed to generate diagnoses and evidence-based treatment options based on input from physicians. The tool analyzes patient summaries provided by clinicians and recommends potential diagnoses for further investigation. It can also generate case assessment paragraphs with explanations of relevant diagnostic studies.
While Glass Health’s AI tool appears promising, there have been concerns with the use of LLMs in healthcare. Other deployments of AI in medicine have drawn scrutiny: the startup Babylon Health faced criticism over claims that its disease-diagnosing AI could outperform doctors, and the National Eating Disorders Association (NEDA) shut down its chatbot after it offered harmful suggestions. Evaluations of ChatGPT’s health advice have also revealed the potential for misleading statements and missed studies. Additionally, there are concerns about biases in LLMs, as they can reflect the biases and blind spots present in the health records on which they are trained.
Glass Health’s AI Tool
Glass Health’s AI tool is designed to assist clinicians in generating potential diagnoses and treatment options for their patients. Physicians can input patient summaries, including relevant demographics, medical history, symptoms, and diagnostic findings. The AI tool analyzes this information and provides five to ten potential diagnoses for further consideration and investigation.
In addition to generating diagnoses, the AI tool can also draft a case assessment paragraph for clinicians. This paragraph includes explanations of relevant diagnostic studies, which can be edited and used as clinical notes or shared with the wider Glass Health community. The tool aims to support clinicians in their decision-making process by providing recommendations and options, while still emphasizing the importance of clinical judgment.
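Glass Health has not published its implementation, but the workflow described above, a free-text patient summary in and a ranked differential out, can be sketched against a generic chat-completion style interface. The prompt wording, function names, and the numbered-list output format below are all assumptions for illustration, not Glass Health's actual design:

```python
# Hypothetical sketch of a differential-diagnosis prompt pipeline.
# The prompt text and numbered-list parsing are illustrative assumptions;
# Glass Health's real prompts and models are not public.

def build_prompt(patient_summary: str, max_diagnoses: int = 10) -> str:
    """Compose an instruction asking an LLM for a ranked differential."""
    return (
        "You are assisting a licensed clinician. Based on the patient "
        f"summary below, list up to {max_diagnoses} potential diagnoses "
        "to consider, one per line, numbered and ordered by likelihood. "
        "These are suggestions for clinician review, not medical advice.\n\n"
        f"Patient summary:\n{patient_summary}"
    )

def parse_differential(model_response: str) -> list[str]:
    """Extract diagnosis names from a numbered-list model response."""
    diagnoses = []
    for line in model_response.splitlines():
        line = line.strip()
        if line and line[0].isdigit():
            # Drop the leading "1." / "2)" style numbering.
            diagnoses.append(line.lstrip("0123456789.) ").strip())
    return diagnoses

# Canned text standing in for a real model call:
canned = (
    "1. Community-acquired pneumonia\n"
    "2. Acute bronchitis\n"
    "3. Pulmonary embolism"
)
differential = parse_differential(canned)
```

In a real deployment, `build_prompt`'s output would be sent to a hosted model and the response parsed the same way; keeping the "for clinician review" framing in the prompt mirrors the supervisory role the article describes.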
While Glass Health’s AI tool has the potential to be highly useful, it is important to recognize its limitations. LLMs have been known to provide inaccurate health advice and may not always reflect the most up-to-date medical knowledge. The tool should be treated as a supplementary resource rather than a definitive or prescriptive solution.
Challenges with LLMs in Healthcare
The use of LLMs in healthcare comes with its own set of challenges. One of the main concerns is the reliability of the health advice they generate. LLMs like the one behind Glass Health’s tool produce outputs from patterns learned during pre-training, which can yield inaccurate or outdated medical information. This becomes problematic when clinicians rely on the AI tool without critically evaluating its advice.
Another challenge is the potential for LLMs to provide misleading or harmful suggestions. In the case of NEDA’s chatbot, the AI system began parroting harmful “diet culture” suggestions, leading to the shutdown of the tool. Evaluations of ChatGPT’s health advice have also highlighted instances of misleading statements and plagiarized content from health news sources.
Furthermore, there is a concern that LLMs may amplify biases present in healthcare. LLMs are often trained on health records, which may only reflect the observations of doctors and nurses and exclude certain demographics or socioeconomic groups. Biases encoded in these records can inadvertently shape the output generated by LLMs, leading to disparities in healthcare.
How Glass Health Addresses These Challenges
Glass Health aims to address the challenges associated with LLMs in healthcare through several measures. One of the key advantages of Glass Health’s AI is its integration with clinical guidelines created and peer-reviewed by an academic physician team. These guidelines are designed to reflect state-of-the-art medical knowledge and provide a foundation for the AI tool’s recommendations.
The expertise of Glass Health’s academic physician team adds credibility and ensures that the AI tool aligns with best practices in medicine. The team members come from major academic medical centers and work part-time for Glass Health, similar to their involvement with medical journals. This involvement enables continuous review and fine-tuning of the AI tool to follow the guidelines and address any potential biases.
A central safeguard in Glass Health’s approach is clinician supervision of the AI’s outputs. Glass Health emphasizes that the tool should be treated as an assistant rather than a replacement for clinical judgment. Clinicians are expected to closely supervise the AI’s outputs and make final decisions based on their expertise and patient-specific factors. This keeps the AI tool in the role of a helpful resource while preserving the primacy of clinical expertise.
Glass Health also positions the tool to limit its regulatory exposure. By framing outputs as recommendations for clinician review and avoiding definitive or prescriptive claims, the company reduces the risk that the tool is treated as a regulated diagnostic device. This lets it focus on providing a valuable aid for clinicians without inviting the legal scrutiny that has followed bolder claims elsewhere in the industry.
Finally, Glass Health’s AI tool offers fine control and state-of-the-art knowledge. The company collects user data to improve the underlying LLMs, ensuring that the tool evolves and remains up-to-date. User feedback plays a crucial role in the continuous improvement of the AI tool, allowing Glass Health to address any limitations and enhance its effectiveness.
User Data and Improvement
Glass Health recognizes the importance of user data in improving the AI tool. The company collects user data to enhance the underlying LLMs and their ability to generate accurate and relevant outputs. However, it is important to note that patient privacy and data protection are a priority for Glass Health. Users have the option to request the deletion of their stored data at any time.
In addition to user data, Glass Health leverages physician-validated clinical guidelines as AI context to generate outputs. This contextual retrieval of guidelines ensures that the AI tool is based on reliable and up-to-date information. An editorial process is also applied to the guidelines to address any potential biases and align recommendations with the goal of achieving health equity.
User feedback plays a crucial role in the improvement of Glass Health’s AI tool. By actively seeking feedback from clinicians, Glass Health can identify areas for improvement and address any limitations or concerns. This iterative process allows the company to continuously enhance the AI tool’s performance and ensure its usefulness in real-world clinical settings.
Adoption and Expansion
Glass Health has seen significant adoption among clinicians, with over 59,000 users signing up for the platform. The company offers a direct-to-clinician subscription offering, allowing individual clinicians to access the AI tool for a monthly fee. This offering has attracted a substantial user base and demonstrates the demand for AI-powered clinical decision support.
In addition to individual clinicians, Glass Health is also exploring an enterprise offering with health systems. This offering aims to integrate Glass Health’s AI tool with electronic health records, ensuring HIPAA compliance and providing clinicians with AI-powered recommendations and options tailored to specific clinical guidelines or care delivery practices. Several health systems and companies are already on the waitlist for this enterprise offering.
Looking forward, Glass Health plans to allocate its funding of $6.5 million towards physician creation, review, and updating of clinical guidelines, AI fine-tuning, and general research and development. The company aims to continue refining its AI tool and expanding its capabilities to further support clinicians in their decision-making processes.
In conclusion, Glass Health’s AI tool shows promise in assisting clinicians with generating potential diagnoses and treatment options. While there are challenges associated with the use of LLMs in healthcare, Glass Health’s approach, integration with clinical guidelines, expertise of the academic physician team, supervision of AI outputs, and focus on user feedback set it apart from other solutions on the market. Through continuous improvement and addressing limitations, Glass Health aims to provide a valuable resource for clinicians and contribute to advancements in medical software innovation.