After years of misdiagnoses and medical dead ends, 23‑year‑old Phoebe Tesoriere from Cardiff, Wales, did something many people have quietly wondered about: she asked ChatGPT what might be wrong with her. Within minutes, the AI suggested conditions that lined up with what specialists later confirmed.

If you’ve ever felt dismissed, confused, or stuck in a maze of referrals and vague answers, her story probably hits close to home. At the same time, it can feel unsettling: what does it mean when an AI chatbot picks up what multiple appointments missed?

In this article, we’ll unpack what happened in Phoebe’s case (as reported by outlets like the New York Post), what AI can and cannot safely do for your health, and how to use tools like ChatGPT in a way that supports—not replaces—your doctors.

Phoebe Tesoriere turned to ChatGPT after years of unclear answers about her symptoms. (Image credit: New York Post)

When Doctors Can’t Find the Answer: The Diagnostic Gap

Modern medicine is powerful, but diagnosis is still hard. Especially for complex, overlapping, or rare conditions, it’s common for patients to spend years searching for an explanation.

  • Symptoms may appear unrelated (fatigue, pain, brain fog, stomach issues).
  • Different specialists see only their “slice” of your health.
  • Lab tests can be inconclusive or look “normal” even when you feel awful.
  • Time‑pressured visits leave little room for deep pattern‑finding.

“Diagnostic error is a significant challenge in health care. Even in well‑resourced systems, a meaningful minority of patients experience delays or inaccuracies before reaching the right diagnosis.”

— Adapted from reports by the U.S. National Academies of Sciences, Engineering, and Medicine

That’s the gap where many people, like Phoebe, now turn to the internet—and increasingly, to AI tools—for help connecting the dots.


What Happened in Phoebe’s Case?

According to coverage by the New York Post and other outlets in April 2026, Phoebe had been struggling for years with debilitating symptoms. She saw multiple clinicians but kept receiving partial explanations and treatments that didn’t fully help.

Feeling desperate, she tried something new: she carefully typed out her symptom history and medical details into ChatGPT. The bot responded with a list of possible conditions and, crucially, an explanation of why those conditions might fit her pattern of symptoms.

  1. She gathered her symptoms and medical history.
  2. She described them to ChatGPT in detail.
  3. ChatGPT suggested condition(s) that had not been fully explored.
  4. She took those suggestions back to health professionals.
  5. Further testing with specialists led to a more accurate diagnosis.

How AI Like ChatGPT Can Help in Medical Diagnosis (Within Limits)

Tools like ChatGPT are essentially pattern‑matchers. They’ve been trained on huge amounts of text, including publicly available medical information, clinical guidelines, and patient stories. That gives them some unique strengths—alongside serious limitations.

Potential Benefits When Used Carefully

  • Pattern recognition: AI can consider multiple symptoms at once and suggest conditions that might explain the whole picture.
  • Information organizer: It can help you summarize your history into a clear, concise form to bring to your doctor.
  • Question generator: It can suggest questions to ask your clinician, so you feel more prepared for appointments.
  • Education aid: It can explain medical terms or tests in plain language.

Serious Limitations You Need to Know

  • It does not examine you or see your body language.
  • It cannot review your actual imaging, lab files, or real‑time vital signs.
  • It may “hallucinate” (make things up), inventing references or offering confident‑sounding but incorrect explanations.
  • It is not a licensed clinician and carries no legal or ethical responsibility for your care.

“AI can be a powerful adjunct to clinical care, but it should never be treated as a doctor. Its role is to inform conversations with health professionals, not replace them.”

— Typical position from major medical bodies such as the American Medical Association and the World Health Organization

The safest use of AI in health is as a conversation starter between you and your clinician, not as a replacement.

A Step‑by‑Step Guide to Using ChatGPT Safely for Health Questions

If you’re considering using ChatGPT or another AI to understand your symptoms, here’s a practical framework that balances curiosity with safety.

1. Protect Your Privacy

  • Avoid sharing your full name, address, phone number, or ID numbers.
  • Describe your age, gender, and relevant background instead of personal identifiers.
  • Remember that online tools are not the same as private medical records.

2. Describe Your Situation Clearly

Include the following (an illustrative example prompt appears after this list):

  • Key symptoms (what, where, how long, how severe).
  • What makes symptoms better or worse.
  • Past diagnoses, medications, and major tests already done.
  • Any urgent red‑flag signs (for these, seek urgent or emergency care right away rather than consulting an AI).
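
For example, a de‑identified message might look like this (all details below are invented for illustration, not drawn from Phoebe’s case):

“I’m a 28‑year‑old woman. For about two years I’ve had daily fatigue, joint pain in my hands and knees, and episodes of brain fog. Symptoms are worse in the mornings and after exercise. Blood tests for thyroid function and anemia came back normal. What are some possible explanations I could discuss with my doctor, and what questions should I ask?”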

3. Ask for Possibilities, Not a Diagnosis

Instead of asking “What’s my diagnosis?”, try:

  • “What are some possible explanations for these symptoms?”
  • “What questions should I ask my doctor about this?”
  • “What tests are commonly used to investigate these issues?”

4. Cross‑Check With Trusted Sources

Use AI answers as a map, then verify the information on reputable health sources such as the NHS website, MedlinePlus, or the Mayo Clinic.

5. Bring AI‑Generated Notes to Your Doctor

You might say:

“I used an AI tool to organize my symptoms. It suggested a few possibilities like X, Y, and Z. Can we talk through whether any of these make sense in my case?”

Using AI to organize your story can make medical visits more focused and effective.

Common Obstacles Patients Face—and How AI May Help (Or Hurt)

Phoebe’s experience is inspiring, but it also highlights very real emotional and practical challenges in the diagnostic journey.

Feeling Dismissed or Not Believed

Many patients—especially women, people of color, and those with chronic or “invisible” conditions—report feeling brushed off. An AI that responds calmly and thoroughly can feel validating, but it’s not a substitute for being heard by a real clinician.

Information Overload and Anxiety

Searching online can spiral into worst‑case scenarios. AI is similar: it may mention rare or serious diseases that are statistically unlikely, which can raise anxiety.

  • Use AI to focus on questions and next steps, not on self‑diagnosing a specific disease.
  • If a suggestion frightens you, write it down and calmly review it with your doctor.

Trust and Relationship With Your Clinician

Some people worry that bringing AI printouts to appointments might irritate their doctor. In reality, many clinicians appreciate well‑organized information—as long as it’s presented respectfully.

  • Frame AI suggestions as conversation starters, not ultimatums.
  • Ask: “How do these ideas compare with your thinking?”
  • Be open to hearing why some AI suggestions are unlikely or not relevant to you.

What Does the Science Say About AI and Diagnosis?

Research into medical AI is moving fast. While individual news stories can be dramatic, it’s important to look at broader data.

  • Studies in areas like dermatology and radiology have found that some AI tools can match or sometimes exceed expert performance for highly specific tasks (for example, classifying certain skin lesions from images).
  • Large language models like ChatGPT have shown promising accuracy in answering medical exam questions and generating differential diagnoses in controlled studies—but they still make non‑trivial errors.
  • Safety, bias, and accountability remain major concerns highlighted by organizations such as the World Health Organization.
  • In clinical practice, AI is increasingly used behind the scenes to support doctors, especially in imaging and pattern recognition, rather than to replace them.

Overall, experts tend to agree on two key points:

  1. AI can be very useful for pattern recognition and decision support.
  2. It must be embedded in a system where humans remain in charge and patients’ safety, privacy, and autonomy are protected.

Practical Steps If You’re Still Searching for a Diagnosis

Whether or not you choose to use ChatGPT, you deserve clear, compassionate care. Here are evidence‑informed steps you can take.

1. Build a Symptom Timeline

  • Note when each symptom started, how it has changed, and what affects it.
  • Include sleep, stress, diet changes, infections, travel, or injuries.
  • Consider using AI to help you format and condense this into 1–2 clear pages (see the sample entries after this list).
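
As a purely hypothetical illustration (dates and details invented), a few timeline entries might look like this:

  • March 2023: gradual fatigue begins, worse after activity.
  • June 2023: knee and hand pain; over‑the‑counter painkillers help only slightly.
  • November 2023: GP visit; thyroid and iron tests come back normal.
  • February 2024: brain fog and poor sleep; symptoms now daily.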

2. Gather Your Records

  • Lab results, imaging reports, discharge summaries, medication lists.
  • Ask your clinic or hospital for digital copies when possible.

3. Consider a Second (or Third) Opinion

It’s reasonable to seek another viewpoint—especially at major academic or specialty centers—if your symptoms remain unexplained or you feel your concerns aren’t being addressed.

4. Use AI as a Companion, Not the Captain

  • Let AI help with summaries, explanations, and brainstorming.
  • Keep your final decisions grounded in conversations with trained professionals.

A simple notebook or digital journal, supported by AI summaries, can become one of your most powerful health tools.

Key Safety Reminders Before You Rely on AI for Health

  • AI is not a doctor. It does not replace physical exams, tests, or clinical judgment.
  • Don’t delay urgent care. If something feels like an emergency, act first and research later.
  • Beware of miracle claims. No AI can guarantee a cure or “secret” diagnosis.
  • Use reputable platforms. Prefer tools backed by recognized organizations and strong privacy protections.
  • Stay critical. AI can be wrong, biased, or outdated. Always cross‑check.

Finding Answers in a New Era of Medicine

Phoebe Tesoriere’s story is powerful—not because “a chatbot beat the doctors,” but because it shows how a determined patient used every available tool to advocate for herself. AI helped her see new possibilities, and skilled clinicians turned those possibilities into a real, evidence‑based diagnosis.

You deserve that same combination of curiosity, support, and rigor. If you’re still searching for answers:

  • Keep a clear record of your symptoms and history.
  • Use AI to organize thoughts and generate questions, not to self‑diagnose.
  • Partner with clinicians you trust—and don’t be afraid to seek another opinion.

Your next step:

  1. Set aside 20–30 minutes to write a one‑page summary of your health story.
  2. Optionally, use an AI tool to help you refine and clarify it.
  3. Book an appointment with your clinician and bring that summary as the starting point for a deeper, more focused conversation.

You’re not alone in this. Technology is changing, medicine is evolving, and your voice—supported by the right tools—can make a real difference in your care.