Published by Piel con Maria · skinscan.guide

How AI Skin Analysis Actually Works — And What It Can’t Do

Every week, a new app promises to “analyze your skin in seconds.” Some are dermatologist-backed. Some are effectively just filters with marketing copy pasted over them.

The technology behind AI skin analysis has advanced significantly in the last five years — but what it can actually do, and where it reliably falls short, is rarely explained honestly. Most apps lead with the impressive parts and bury the limitations.

Here’s what the research actually shows — including what AI cannot and should never claim to do.

Ana’s Three-Year Acne “Scarring” That Wasn’t

Ana lives in São Paulo. For three years she had been treating what she assumed were acne scars — small, rough bumps concentrated on her forehead and around her hairline that never quite went away. She had tried prescription retinoids, chemical exfoliants, and a course of antibiotics her general practitioner prescribed when she raised the issue.

Nothing worked. If anything, the retinoid seemed to make things flare.

When she ran an AI skin scan, the analysis flagged the pattern as consistent with fungal acne (Malassezia folliculitis) — a condition caused by an overgrowth of yeast in the hair follicles rather than bacterial infection. The distribution pattern, the uniform small size of the lesions, and the hairline location are all hallmarks that the model had been trained to recognize.

She brought the analysis to a dermatologist. The dermatologist confirmed the suspicion and prescribed an antifungal. Within six weeks, the bumps had cleared. The AI didn’t diagnose her — it pointed her in the right direction, which allowed her to ask the right question in the right consultation.

What AI Skin Analysis Actually Measures

Modern AI skin analysis uses convolutional neural networks (CNNs) — the same class of computer vision models used in medical imaging for radiology and pathology. A well-trained model processes a photograph of skin and identifies patterns at a pixel level that are statistically associated with specific conditions. The key inputs are:

Texture Pattern Recognition

The model maps surface texture variations across the image — identifying roughness gradients, follicular patterns, and surface irregularities that correspond to conditions like comedones, milia, keratosis pilaris, or fungal infections. A 2019 study by Brinker et al. in the European Journal of Cancer found that a CNN trained on 12,378 dermoscopic images matched or exceeded the average performance of board-certified dermatologists in classifying melanocytic lesions by texture and pattern — with an area under the ROC curve (AUC) of 0.86 versus the dermatologist group average of 0.82.
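To make texture analysis concrete, here is a toy sketch in plain Python (all names are hypothetical, and this is nothing like a production CNN): a crude roughness score built from the variance of local brightness gradients, the kind of low-level signal a trained model's early layers learn to respond to.

```python
# Illustrative sketch only: a real model learns thousands of texture filters
# from labeled images. This hand-written "roughness" score (variance of local
# brightness gradients) just shows the raw signal such filters pick up on.

def roughness_score(patch):
    """Variance of horizontal and vertical brightness gradients in a 2D grid."""
    grads = []
    for y in range(len(patch) - 1):
        for x in range(len(patch[0]) - 1):
            gx = patch[y][x + 1] - patch[y][x]   # horizontal gradient
            gy = patch[y + 1][x] - patch[y][x]   # vertical gradient
            grads.append(gx * gx + gy * gy)      # squared gradient magnitude
    mean = sum(grads) / len(grads)
    return sum((g - mean) ** 2 for g in grads) / len(grads)

# Synthetic 6x6 grayscale patches: uniform skin vs. skin with small raised bumps.
smooth = [[0.50] * 6 for _ in range(6)]
bumpy = [[0.50 + (0.30 if (x + y) % 3 == 0 else 0.0) for x in range(6)]
         for y in range(6)]

print(roughness_score(smooth) < roughness_score(bumpy))  # → True
```

A real model learns its filters from data rather than using one hand-written score, but the principle is the same: bumpy, follicular texture produces measurably different local gradients than smooth skin.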

Pigmentation Pattern Analysis

The model identifies distribution, edge definition, and color depth of pigmented areas. Melasma has a characteristic symmetric, bilateral distribution across the cheeks, forehead, and upper lip. Post-inflammatory hyperpigmentation (PIH) tends to follow the shape of prior lesions. Solar lentigines cluster on chronically sun-exposed areas with defined borders. These distribution signatures are learnable features. A 2023 systematic review by Senan et al. in Diagnostics analyzing 42 studies found that deep learning models achieved a weighted average accuracy of 87.4% in classifying pigmentary disorders from dermatoscopic images.
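As a toy illustration of why distribution is a learnable feature, assume a binary pigment mask and score how well pigment on one side mirrors the other (a hypothetical sketch, not any production model's logic):

```python
# Hypothetical sketch: melasma's bilateral distribution means pigment on one
# side of the face tends to mirror the other. A mirror-overlap score over a
# binary pigment mask captures that signature in the simplest possible way.

def bilateral_symmetry(mask):
    """Fraction of pigmented pixels whose left-right mirror is also pigmented."""
    width = len(mask[0])
    pigmented = mirrored = 0
    for row in mask:
        for x, v in enumerate(row):
            if v:
                pigmented += 1
                if row[width - 1 - x]:
                    mirrored += 1
    return mirrored / pigmented if pigmented else 0.0

# Bilateral pattern (pigment on both outer columns) vs. a one-sided patch.
bilateral = [[1, 0, 0, 1],
             [1, 0, 0, 1]]
one_sided = [[1, 1, 0, 0],
             [1, 1, 0, 0]]

print(bilateral_symmetry(bilateral), bilateral_symmetry(one_sided))  # → 1.0 0.0
```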

Inflammation Markers

Redness, papule density, and pustule presence are identifiable visual features. The model distinguishes between the diffuse redness of rosacea, the localized redness of active acne, and the patchy redness of perioral dermatitis. This distinction matters clinically: the treatments differ, and topical steroids in particular can significantly worsen both rosacea and perioral dermatitis.
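A minimal sketch of the underlying signal, assuming normalized RGB pixels and a hand-picked threshold (both hypothetical): diffuse redness covers a large share of the image, while a single inflamed lesion covers very little.

```python
# Toy sketch: flag pixels where the red channel dominates the green/blue
# average, then measure how much of the image they cover. Real models learn
# far richer inflammation features; this only exposes the raw signal.

def redness_spread(pixels, threshold=0.15):
    """Fraction of pixels whose red channel exceeds the green/blue mean by `threshold`."""
    flags = [(r - (g + b) / 2) > threshold for row in pixels for (r, g, b) in row]
    return sum(flags) / len(flags)

# Diffuse, rosacea-like flush vs. one localized inflamed papule.
diffuse = [[(0.80, 0.40, 0.40)] * 3 for _ in range(3)]
localized = [[(0.60, 0.55, 0.50)] * 3 for _ in range(3)]
localized[1][1] = (0.90, 0.30, 0.30)

print(redness_spread(diffuse))              # → 1.0
print(round(redness_spread(localized), 2))  # → 0.11
```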

Oil and Hydration Indicators

Surface light reflection patterns and pore appearance carry information about sebum levels and skin barrier function. Oily skin has characteristic specular highlights; dehydrated skin shows fine surface crinkling and dullness. These are learnable visual features — though lighting conditions significantly affect their reliability, which is why scan quality matters.
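As a rough sketch of the specular-highlight idea (normalized grayscale values, with an arbitrary hypothetical cutoff), the share of near-saturated pixels serves as a crude oiliness proxy:

```python
# Illustrative only: count near-saturated pixels as specular highlights. The
# 0.92 cutoff is an arbitrary assumption, and as the text notes, a change in
# lighting would swamp a naive measure like this one.

def specular_fraction(gray, cutoff=0.92):
    """Share of grayscale pixels (0..1) at or above the highlight cutoff."""
    flat = [v for row in gray for v in row]
    return sum(v >= cutoff for v in flat) / len(flat)

oily = [[0.55, 0.96, 0.58],
        [0.95, 0.60, 0.97],
        [0.57, 0.94, 0.56]]
matte = [[0.55, 0.58, 0.56],
         [0.57, 0.55, 0.59],
         [0.56, 0.58, 0.55]]

print(specular_fraction(matte), round(specular_fraction(oily), 2))  # → 0.0 0.44
```

The cutoff's fragility is exactly why scan quality and consistent lighting matter so much for this class of feature.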

What AI Can Do Well — And What It Cannot

What It Can Do

A well-trained skin analysis model can reliably identify the likely condition type from visible features — distinguishing PIH from melasma, bacterial acne from fungal, or dry from dehydrated. It can suggest relevant ingredient categories based on identified conditions (retinoids for PIH, antifungals for Malassezia, barrier repair for dehydrated skin). And it can track changes over time when used consistently — giving an objective record of whether a condition is improving, worsening, or stable.
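The tracking idea can be sketched with a least-squares slope over a series of severity scores (a hypothetical 0-to-1 scale, with an arbitrary stability tolerance):

```python
# Hypothetical sketch of objective progress tracking: fit a least-squares
# slope to weekly severity scores (0 = clear, 1 = severe) and bucket the
# direction. The 0.02 tolerance is an arbitrary choice for this sketch.

def trend(scores, tol=0.02):
    """Classify a score series as improving, worsening, or stable by its slope."""
    n = len(scores)
    mx = (n - 1) / 2                      # mean of the week indices 0..n-1
    my = sum(scores) / n
    num = sum((x - mx) * (y - my) for x, y in enumerate(scores))
    den = sum((x - mx) ** 2 for x in range(n))
    slope = num / den
    if slope < -tol:
        return "improving"
    if slope > tol:
        return "worsening"
    return "stable"

print(trend([0.80, 0.70, 0.60, 0.50]))  # → improving
print(trend([0.50, 0.51, 0.49, 0.50]))  # → stable
```

The point is not the arithmetic but the objectivity: a logged score series answers "is this working?" in a way that memory and mirrors cannot.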

For conditions where early identification matters — acne, hyperpigmentation, early signs of barrier damage — an AI scan can surface useful pattern information before a person would otherwise seek professional attention. For many users, the primary value is direction-setting: it helps you ask better questions when you do see a professional.

What It Cannot Do

AI skin analysis cannot diagnose. This is not a legal disclaimer — it is a fundamental technical limitation. Diagnosis requires integrating visual findings with patient history, systemic symptoms, medication history, laboratory results, and in some cases biopsy. A photograph does not contain this information. A model that outputs “diagnosis: rosacea” is overstating what it actually knows.

AI cannot reliably evaluate suspicious lesions. The dermatology research community has published extensively on AI for melanoma detection — and while performance has improved, the clinical consensus (including from the International Dermoscopy Society) is that AI should function as a triage tool, not a replacement for dermoscopy with a trained specialist. If you have a lesion that has changed in size, color, or shape, see a dermatologist.

AI cannot account for internal causes. Hormonal acne, thyroid-related skin changes, nutritional deficiencies, and medication side effects all manifest on the skin — but their causes are internal. A skin scan identifies the visible pattern; it cannot tell you whether your breakouts are driven by cortisol, androgens, or a newly started medication. That requires a conversation with a doctor.

A Note on Training Data and Skin Tone Bias

AI skin analysis models are only as good as their training data. The majority of publicly available dermatology image datasets are heavily skewed toward lighter Fitzpatrick skin types (I–III). A 2021 analysis by Daneshjou et al. in PLOS Medicine found that 56.5% of images in three large dermatology datasets lacked Fitzpatrick skin type labels, and those that were labeled were disproportionately lighter-skinned.

This matters because skin conditions present differently across Fitzpatrick types. Redness is harder to detect on darker skin. Hyperpigmentation patterns vary. Conditions common in Latin American, South Asian, and Black skin tones have been underrepresented in research literature, and by extension, in training datasets. A well-built skin analysis tool should be explicit about its training data composition — and any consumer tool should be evaluated with this limitation in mind.
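The kind of audit Daneshjou et al. performed can be sketched as a simple label-coverage check over dataset metadata (the field names here are hypothetical, not from any real dataset):

```python
# Hypothetical sketch of a training-data audit: how many images lack a
# Fitzpatrick skin type label, and how are the labeled ones distributed?
from collections import Counter

def fitzpatrick_coverage(records):
    """Return (fraction of records missing a Fitzpatrick label, Counter of labels)."""
    missing = sum(1 for r in records if r.get("fitzpatrick") is None)
    labeled = Counter(r["fitzpatrick"] for r in records
                      if r.get("fitzpatrick") is not None)
    return missing / len(records), labeled

dataset = [
    {"image": "a.jpg", "fitzpatrick": "II"},
    {"image": "b.jpg", "fitzpatrick": None},
    {"image": "c.jpg", "fitzpatrick": "III"},
    {"image": "d.jpg", "fitzpatrick": None},
    {"image": "e.jpg", "fitzpatrick": "II"},
]
missing_rate, counts = fitzpatrick_coverage(dataset)
print(missing_rate, dict(counts))  # → 0.4 {'II': 2, 'III': 1}
```

A high missing rate, or labeled counts skewed toward types I-III, is exactly the red flag the research literature describes.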

What Skin Scan Actually Does

Skin Scan uses a computer vision model trained specifically on a diverse range of skin tones and conditions common in Latin American and Mediterranean populations. It identifies likely condition patterns — not diagnoses — and pairs each finding with targeted ingredient guidance and, where relevant, a recommendation to consult a dermatologist.

The aim is to give you a useful, honest starting point — one that helps you make better decisions at the pharmacy, ask better questions in consultations, and track whether what you’re doing is actually working. Not to replace the professional relationship, but to make you a better participant in it.

“I thought I had acne scarring. The scan identified it as fungal acne. I brought the analysis to a dermatologist who confirmed it — completely different treatment. The bumps I had for three years were gone in six weeks.”

Ana, São Paulo

See What Your Skin Is Telling You

Sources

  • Brinker, T.J. et al. “Deep Learning Outperformed 136 of 157 Dermatologists in a Head-to-Head Dermoscopic Melanoma Image Classification Task.” European Journal of Cancer, 2019.
  • Esteva, A. et al. “Dermatologist-Level Classification of Skin Cancer with Deep Neural Networks.” Nature, 542, 115–118, 2017.
  • Senan, E.M. et al. “Diagnosis of Pigmentation Skin Diseases Using Deep Learning.” Diagnostics, 2023. (42-study systematic review.)
  • Daneshjou, R. et al. “Lack of Transparency and Potential Bias in Artificial Intelligence Data Sets and Algorithms.” PLOS Medicine, 18(7), 2021.
  • International Dermoscopy Society. Position statement on AI-assisted dermoscopy in clinical practice, 2022.
  • Tschandl, P. et al. “Human–Computer Collaboration for Skin Cancer Recognition.” Nature Medicine, 26, 1229–1234, 2020.
AI-powered skin analysis by Maria & specialist contributors including Valeria Beautycare. Expert guidance from skincare specialists across Latin America.


© 2026 Skin Scan · Immortal Ventures LLC
