Watch On-Demand – CIP Webinar: AI Chatbots, Social Media, & ‘Influencers’, with Dr. Don Grant

This webinar explores the shift in how individuals seek medical and mental health advice in an increasingly digital and unregulated landscape.

Watch Now On-Demand!
Click Here to watch the recording on-demand!

Original Broadcast Date:
Wednesday, February 25, 2026
6pm ET

CIP Webinar: AI Chatbots, Social Media, & ‘Influencers’, with Don Grant, Ph.D.

Dr. Grant frames the current crisis using Henry David Thoreau’s observation: “Men have become the tools of their tools.” While technology should serve as a helpful resource, the webinar highlights how many users—particularly vulnerable populations like neurodivergent young adults—have become dependent on platforms that prioritize engagement over accuracy.

The Role of Social Media & “Self-Diagnosis”

With over 5 billion social media users, platforms have become “incubators” for healthcare misinformation. Key trends identified include:

  • The Diagnostic Crisis: Platforms like TikTok and Instagram have fueled a surge in self-diagnoses for conditions such as OCD, ADHD, and eating disorders.
  • Compare and Despair: The constant curation of “perfect” lives leads to a cycle of psychological distress.
  • Echo Chambers: Users tend to follow like-minded individuals, reinforcing misinformation and creating a “contagion effect” for psychosomatic symptoms.

Navigating the “Misinformation” Minefield

A significant portion of the webinar clarifies the nuances of false information. Dr. Grant distinguishes four types, defined largely by the intent behind them:

  • Misinformation: False information spread regardless of any intent to mislead.
  • Disinformation: Content deliberately published to harm, manipulate, or mislead.
  • Malinformation: Fact-based information taken out of context to cause harm.
  • Fake News: Content ranging from malicious manipulation to satire (e.g., The Onion).

Dr. Grant warns that AI-generated images and text are becoming so sophisticated that the “obvious clues” of fabrication (like distorted limbs in images) are vanishing, making media literacy more critical than ever.


AI Chatbots as “Therapists”

The rise of AI “companions” and chatbots has introduced a new layer of risk. Users are increasingly turning to AI for clinical support. Dr. Grant emphasizes:

  • Hallucinations: AI can confidently present false medical data as fact.
  • Lack of Legitimacy: Unlike licensed clinicians, AI chatbots lack the ethical oversight, accountability, and authentic empathy required for mental health intervention.
  • APA Advisory: Dr. Grant contributed to the American Psychological Association’s recent advisory on AI chatbots, urging caution in their use for healthcare.

Clinical Strategies: Healthy Device Management

Rather than being “anti-technology,” Dr. Grant advocates for “Good Digital Citizenship.” He provides a “Rules of Engagement” self-questionnaire to assess if device use is healthy:

  • Why are you engaging? (Intention)
  • Where/When are you using it? (Context)
  • Who are you engaging with? (Safety)
  • How is it affecting your well-being?

The webinar serves as a call to action for practitioners and families to help users “take back” their devices, shifting them from sources of digital contagion to tools for genuine connection and verified information.

Click Here to watch the recording on-demand!

Subscribe for More Articles Like This