🔥🔥🔥
My thoughts exactly.
Very happy to see you on Substack, Dr. Shedler! 👏👏👏
The South Park episode was pretty sharp.
I wish it were true—and emotionally, I share the intensity of defending our profession. But I think what you’re describing turns psychotherapy into something sacred (idealization), rooted in a very human fear: the fear of no longer being needed, of becoming irrelevant or meaningless.
That doesn’t mean you’re wrong about the importance of the therapeutic frame. But the division you’re drawing between “real psychotherapists” and “the lesser psychotherapists” offering hollow imitations sounds a lot like splitting.
What do you think?
I think meaningful psychotherapy requires privacy and confidentiality. And I think it’s bad form to speculate in a public forum about the inner experience of a complete stranger.
Thanks! I appreciate the clarification, and I hear you on the importance of privacy and confidentiality in psychotherapy. I also agree that any attempt to train AI using truly confidential clinical material would, by definition, violate the core ethical foundation of the therapeutic framework.
Just to clarify, I genuinely wasn’t trying to speculate about you personally, but rather about the tone and structure of the position being expressed. I didn’t mean to be rude or invasive. That said, I completely understand how it might have come across differently than intended, and I appreciate you raising it.