Google's newly launched AI-based dermatology symptom checker
Google's newest healthcare push is a computer vision application that helps people identify skin, hair and nail conditions.
It involves users taking three pictures of the problem area and answering a series of questions about their skin type and other symptoms. The tool then compares the photos and answers against the 288 conditions it was trained to recognize and presents users with a list of possible matches.
Google explains that the analysis is not intended to replace a visit to the doctor. Instead, the app is designed to equip its users with information so they can plan their next steps. Google is working with a research team at Stanford University to test how well the tool works in a healthcare setting.
Medical diagnosis is based on the recognition of patterns. And who is better at finding patterns in data than Big Tech? Google fed its AI around 65,000 images from clinical settings, and preliminary results are promising: according to Google, the AI ranked the correct condition among its top three suggestions 84 percent of the time on a test set of 1,000 cases.
While an average incorrect search query costs nothing, a medical false negative or false positive can trigger expensive follow-up care for the patient and for the healthcare system. With this in mind, it will be interesting to see whether users employ the tool as a supplement to or a replacement for a visit to the dermatologist. Startups such as K'ept Health and Dermanostic do not want to replace dermatologists, but simply to digitize them.
The AI model that powers Google's tool recently passed clinical validation and received a CE marking in the EU as a Class I medical device. The question remains, however, of how access to the healthcare market will be renegotiated through these new interfaces that operate between the patient and the regulated healthcare system.