AI for health research: how to not freak yourself out
Quote from Diane Park on March 15, 2026, 10:30 am
As a pharmacist and health writer, I want to address something I've been seeing: people using AI to research health symptoms and diagnose themselves. This is both useful and dangerous, so let me give you a framework.
DO use AI to:
- Understand medical terminology after a doctor visit
- Research questions to ask your doctor
- Learn about medications, side effects, and interactions
- Understand lab results ("my A1C is 5.8, what does that mean?")
- Find evidence-based lifestyle interventions for known conditions
DO NOT use AI to:
- Diagnose yourself ("I have a headache and fatigue, what's wrong with me?" will return everything from dehydration to brain cancer)
- Replace professional medical advice
- Make decisions about stopping or changing medications
- Interpret imaging results (MRI, CT, X-ray)
Key prompt for medical questions: Always add "cite your sources" or "link me to the relevant studies." If the AI can't provide a real, verifiable source, treat the information with extra skepticism.
AI is a phenomenal tool for health literacy. It's a terrible replacement for a healthcare provider.
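If you query AI tools programmatically rather than through a chat window, the "cite your sources" habit can be baked into a reusable prompt template so you never forget it. This is a minimal sketch; the function name and exact wording are illustrative, not tied to any particular AI service or API.

```python
def build_medical_prompt(question: str) -> str:
    """Wrap a health question with an explicit source-citation request.

    The wording here is illustrative; the point is to always append
    a demand for verifiable references before sending the question.
    """
    return (
        f"{question}\n\n"
        "Cite your sources and link me to the relevant studies. "
        "If you cannot provide a real, verifiable source for a claim, "
        "say so explicitly instead of guessing."
    )

# Example: wrapping a lab-result question from the list above.
prompt = build_medical_prompt("My A1C is 5.8. What does that mean?")
print(prompt)
```

The same verification rule applies afterward: if the response names a study you can't find, treat it as unverified.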
Quote from Sarah Chen on March 15, 2026, 2:00 pm
The "cite your sources" tip is so important. I've caught ChatGPT making up study names before. It'll confidently reference a "2023 study published in The Lancet" that literally doesn't exist. Always verify.
Perplexity is actually better for medical research because it shows its sources inline. You can click through and verify in real time.
Quote from Lisa Morales on March 16, 2026, 6:45 am
I definitely fell into the self-diagnosis trap early on with my dad. Every symptom I googled or asked ChatGPT about led me down a terrifying rabbit hole. Diane is right — use it to prepare for doctor visits, not replace them.
What HAS been helpful is asking AI to explain my dad's medications and potential interactions. He's on like 8 different things, and keeping track of what interacts with what is a full-time job.