Google Lens Helps Identify Skin Conditions Using Artificial Intelligence and Skin Tone Scale

Describing a mole or a rash on the skin with words alone is difficult, making it hard to get useful answers outside a clinical setting. With the help of machine learning, Google Lens image search can now assist in identifying skin conditions by surfacing visual matches to inform a person's search.

Visual Analysis of Skin Conditions

Google Lens is a visual search tool that lets users search what they see through their phone camera. The user takes a picture or uploads an image, and the app provides a wide range of visually similar results.

The same approach can be used to search other bodily concerns, such as unusual hair loss, bumps on the lip, or strange lines on the nails. However, Google warns that its results are informational only and not a formal diagnosis, so users should still seek advice from medical professionals.
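
Google has not disclosed how Lens matches images internally, but visual search of this kind is commonly built as nearest-neighbor retrieval over image embeddings. The following is a minimal sketch of that general idea, using hypothetical toy embeddings rather than a real feature extractor:

```python
import numpy as np

def find_visual_matches(query_vec, index_vecs, top_n=5):
    """Return indices of the top_n most similar images by cosine similarity.

    query_vec:  (d,) embedding of the user's photo
    index_vecs: (n_images, d) embeddings of the searchable image corpus
    """
    q = query_vec / np.linalg.norm(query_vec)
    idx = index_vecs / np.linalg.norm(index_vecs, axis=1, keepdims=True)
    sims = idx @ q                          # cosine similarity to every indexed image
    return np.argsort(sims)[::-1][:top_n]   # best matches first

# Hypothetical toy embeddings (real systems use learned image features)
rng = np.random.default_rng(0)
corpus = rng.normal(size=(1000, 64))
query = corpus[42] + rng.normal(scale=0.1, size=64)  # a near-duplicate of image 42
print(find_visual_matches(query, corpus))  # image 42 should rank first
```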

In recent years, Google has been exploring the use of AI image recognition to identify skin conditions. At its I/O developer conference in 2021, Google previewed a pioneering healthcare tool: a web application that uses artificial intelligence to help users identify conditions of the skin, hair, and nails from a combination of photos and survey responses. At the time, Google claimed the tool could identify 288 different conditions and, most of the time, include the correct condition among its top three suggestions.
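
Suggesting the correct condition "in the top three" corresponds to what machine learning practitioners call top-k accuracy. Google has not published its evaluation code, but a minimal sketch of how top-3 accuracy is computed, using hypothetical prediction data, looks like this:

```python
import numpy as np

def top_k_accuracy(probs, labels, k=3):
    """Fraction of samples whose true label appears among the k highest-scored classes.

    probs:  (n_samples, n_classes) predicted class probabilities
    labels: (n_samples,) integer indices of the true conditions
    """
    # Indices of the k highest-probability classes for each sample
    top_k = np.argsort(probs, axis=1)[:, -k:]
    # A sample counts as correct if its true label is among those k
    hits = (top_k == labels[:, None]).any(axis=1)
    return float(hits.mean())

# Hypothetical example: 4 samples, 5 possible conditions
probs = np.array([
    [0.05, 0.50, 0.25, 0.12, 0.08],
    [0.30, 0.10, 0.40, 0.15, 0.05],
    [0.10, 0.20, 0.30, 0.15, 0.25],
    [0.60, 0.12, 0.08, 0.15, 0.05],
])
labels = np.array([2, 0, 0, 0])
print(top_k_accuracy(probs, labels, k=3))  # prints 0.75
```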

This tool became DermAssist, a guided skin search app from Google Health. After answering a few questions and taking three images, the user receives personalized information about their skin concern.

The DermAssist tool remains in limited release while it undergoes further market analysis. Although it is CE-marked as a Class I medical device in the European Economic Area, it has not been evaluated by the U.S. Food and Drug Administration (FDA). In short, it is still intended mainly for informational purposes and does not provide a medical diagnosis.

How Accurate Are the Results?

Despite the innovation Google offers, experts advise that great caution must still be exercised when using AI diagnostic tools. One common criticism of AI-based recognition of skin conditions is its reduced accuracy on darker skin tones.

Research led by Dr. David Wen found that the image databases used to train such artificial intelligence systems represent only a limited range of skin types. Because of this, the scientists suggest further studies to ensure the technology can benefit all patients.

To address these complaints, Google collaborated with Harvard Professor Ellis Monk to adopt his Monk Skin Tone Scale (MST), a 10-point skin tone scale created to replace traditional skin tone scales, which are biased toward lighter tones. Google also promotes best practices for using the scale in artificial intelligence development.
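
One practical use of such a scale in AI development is auditing how evenly a training dataset covers all ten tones. A minimal sketch, assuming a hypothetical list of per-image MST annotations:

```python
from collections import Counter

# Hypothetical MST annotations (1 = lightest, 10 = darkest) for a training set
image_tones = [1, 2, 2, 3, 3, 3, 4, 5, 5, 6, 7, 9]

counts = Counter(image_tones)
total = len(image_tones)

# Report each tone's share of the dataset; gaps flag under-represented tones
for tone in range(1, 11):
    share = counts.get(tone, 0) / total
    print(f"MST {tone:>2}: {counts.get(tone, 0):>3} images ({share:.0%})")
```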

In 2021, Google claimed that its deep learning system identifies skin conditions most accurately for Black patients: at 87.9%, its accuracy for this group was higher than for any other ethnicity.
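
Figures like these come from stratifying evaluation accuracy by demographic group rather than reporting a single overall number. A minimal sketch of that bookkeeping, with hypothetical toy data:

```python
import numpy as np

def accuracy_by_group(preds, labels, groups):
    """Classification accuracy computed separately for each demographic group."""
    preds, labels, groups = map(np.asarray, (preds, labels, groups))
    return {
        str(g): float((preds[groups == g] == labels[groups == g]).mean())
        for g in np.unique(groups)
    }

# Hypothetical toy data: 8 predictions across three groups
preds  = [0, 1, 1, 2, 0, 2, 1, 0]
labels = [0, 1, 2, 2, 0, 2, 1, 1]
groups = ["A", "A", "A", "B", "B", "B", "C", "C"]
print(accuracy_by_group(preds, labels, groups))
# approximately {'A': 0.67, 'B': 1.0, 'C': 0.5}
```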

When asked about the tool's ability to work across a wide range of skin tones, the tech giant said it collaborates with organizations and clinicians that serve patients from a wide range of backgrounds. It also works with dermatologists experienced in various skin tones to organize the available thumbnail images.

Check out more news and information on Artificial Intelligence in Science Times.
