From Post-Truth to Post-Empathy, or Not

Dr. Todd Essig
 

Will post-empathy follow post-truth as society shifts from a print to a digital information architecture? And if so, can psychoanalysis minimize the loss? Maybe. It’s up to us.


Post-truth and fake news can be seen both as reflections of the profound changes roiling journalism and as part of an emerging trend. According to a father-son pair of economists (Susskind & Susskind, 2015), all the professions, from journalism to law to consulting to healthcare (and, yes, that includes providing psychoanalytic care), face a transformed future. They argue that artificial intelligence (AI), big data, and global connectivity are transforming professional practice in ways far more profound than mere increased efficiency. Instead, the professions (we!) face a fundamentally transformed future in which complex, technologically-mediated systems will meet demand for professional services previously met by one-on-one bespoke solutions.

In this telling, post-truth and fake news are just one consequence of one profession – journalism – struggling to keep pace as society’s information architecture transforms from print-based to digital. But the transformation crosses all professions, so there will assuredly be other consequences in other domains, including ours. That is why I want to ask: What goes ‘post-’ or ‘fake’ next? What is the next loss after facts and truth? What other values and experiences are on the digital chopping block? And what can we learn from journalism’s troubles trying to adapt?

I would like to suggest that empathy, the essence of the I-Thou relations central to contemporary psychoanalysis (Buber, 1923/1970; Greenberg & Mitchell, 1983), is a prime candidate for going ‘post-’. Just as journalism is being transformed by the transition to a digital information architecture, with a consequent rise in fake news and post-truth, mental healthcare, including psychoanalytic care, is being similarly transformed, with empathy at risk. But I do not want merely to suggest we are on the precipice of post-empathy and artificial intimacy, though it is clear to me we are (Essig, Turkle & Russell, 2018). Rather, I want to suggest that psychoanalysis is at a crossroads where we can help undermine the rise of post-empathy and artificial intimacy before it’s too late, or not. We are in a privileged place where we can both celebrate the potentials of the cultural transition to a digital information architecture and, at the same time, protect the sanctity of direct, technologically un-mediated, in-person empathy and intimacy. Of course, getting such a ‘both-and’ stance accepted widely enough to help organize decisions throughout the psychoanalytic community will be difficult, perhaps impossible.

To be clear, under post-truth, facts and truth still exist (as empathy will under post-empathy). It’s just that the ‘facts of the matter’ no longer matter as they once did; truth is not valued as it once was. Traditional professional journalism, the ‘free press’ of the democratic ideal, was tasked with objectively filtering reality. Journalistic ethics were there to protect truth. Traditional media icons were thus to be believed when they presented inconvenient truths, and for some they still are. But social media, AI, decentralized distribution channels, corporate profit pressures, and the ease of creating indistinguishable fakery with a few clicks have created a media landscape where reality is now filtered to confirm prejudices and reinforce reference-group identity. Things just have to look ‘truth-y.’ Exploiting confirmation biases to ‘chase eyeballs’ is where the action is. Of course, there are still journalists chasing truth and real news. But today’s media icons are there to entertain, soothe, and confirm one’s prior beliefs. Inconvenient truths are dismissed as fake news. The original aspirational task is getting lost.

From 2007 until the pandemic I watched journalism lose its grip on truth from the inside. Alongside my practice, I wrote several hundred columns as a technology and mental health commentator, first for a news start-up called True/Slant and then at Forbes after Forbes acquired it (Essig, 2020). What I saw was a profession reeling from the digital transformation, where every version of reality was now available with a click. Writers and editors struggled with this newly vicious competition for attention. As one of them, I was paid fractions of a penny depending on how many, and what kind of, readers clicked on my articles.

Early on I started working with two other writers [1], trying to figure out the mysteries of SEO (search engine optimization) and how to plant the seeds for something going ‘viral.’ We closely followed our metrics, happy when numbers climbed into the tens or hundreds of thousands or more, and sad when a piece we were proud of never gained traction. What I learned while ‘chasing eyeballs’ is in retrospect pretty obvious: truths ignored have the same influence as truths never reported, and any truth or real news too discordant with someone’s prior beliefs and prejudices will get ignored. People were losing interest in facts when noticing them, let alone accepting their reality, was even mildly uncomfortable. After all, why do that when there is a seemingly just-as-plausible version a click away that makes you feel good about yourself, your life, and the decisions you are making? Apparently, in our post-truth world of fake news the only ‘paper of record’ that matters is the one that provides the ‘truths’ you want to hear.

I believe empathy faces a similar future of not being valued, of people becoming increasingly indifferent to the experience and consequences of genuine in-person intimacies. How will this happen? Empathy will go ‘post-’ step-by-sleepwalking-step, for similar reasons of comfort and convenience. Eventually, the time-consuming, often messy, and at times difficult human processes of empathy will seem anachronistic and inefficient compared to what can be achieved by big-data-fueled AI. These changes in what we value, paired with technological developments, are the twin engines of the emerging post-empathy world of artificial intimacy (Essig, Turkle & Russell, 2018). Which brings us to the central questions: Will we passively let the I-It drown the I-Thou in a flood of technologically-mediated simulations of empathy? And if not, what should we do?

Early versions of post-empathy already abound. Habit-forming simulations of empathic understanding appear in that jolt of recognition when, shopping online, something you didn’t even realize you wanted gets recommended; in that appreciation you feel when your Spotify account, rather than a friend, recommends a new artist who brings you joy; in that enjoyment of personalized TikTok videos, or when Netflix suggests your next favorite show (Pieraccini, 2021; Schrage, 2020).

But post-empathy is already well beyond those increasingly familiar and powerful recommendation engines. Relationship simulations already exist [2]. Replika [3] is an AI-fueled chatbot ‘friend.’ Its website promises ‘The AI companion who cares. Always here to listen and talk. Always on your side.’ People are already having emotionally intense exchanges with the program, including men creating AI girlfriends and then sometimes verbally abusing them (Bardhan, 2022). In the mental health world, consider Woebot [4], an AI-fueled chatbot that provides CBT treatment with no human therapist involved. Its website proudly announces ‘Welcome to the future of mental health.’ Or Elle [5], an AI-fueled video avatar adapted to treat veterans experiencing depression and post-traumatic stress disorder. Counseling – of a sort – with no human on the other side of the screen is already here. And as deep-fake and video-simulation technologies become increasingly photo-realistic, it will become easier and easier to confuse, and to not really care, whether a session is with another human or whether an AI program is the ‘provider.’ Soon, and I know this sounds like dystopian science fiction too absurd to take seriously as a likely, or even possible, future, people with a psychoanalytic turn of mind will choose teleanalysis – of a sort – by AI instead of psychoanalysis by a person.

Of course, no on-screen AI simulation will ever fully capture the creative, intuitive, genuine care a psychoanalyst provides, whether in person or on screen. But that objection misses the point, and the lessons of post-truth. For post-empathy to triumph, the technology does not need to be an indistinguishable simulation. Post-truth shows that many will seek AI-fueled on-screen treatment not because it is the same as human on-screen treatment but because, once you’re on screen, whatever is easier and more convenient becomes irresistible (Alter, 2017). Once people get used to treating therapy on screen as indistinguishable from seeking help in person, any significant overlap will be good enough. When the unique value of genuine, fully embodied empathy gets devalued, many will choose increasingly sophisticated, convenient, and inexpensive treatment by AI program, however limited, over treatment by a person, because we will have been trained to no longer value the rich, messy experience of fully and mutually embodied empathy.

Like those journalists still fighting to protect facts and truth, I believe psychoanalysts need to join the fight to protect richly embodied empathy, the kind built on the affect attunements, implicit imitations, interactional synchronies, relational affordances, and other processes only possible when people relate in person rather than on screen. And that’s where the ‘both-and’ stance comes in. I believe the only way to protect empathy is by both taking full advantage of all the possibilities afforded by the emerging digital information architecture and vigorously protecting the sanctity of empathic connections only possible when people are bodies together in the same place at the same time.

Our profession’s future just may depend on this ‘both-and.’ Two risks immediately present themselves. Should we fail to fully appreciate the differences between what takes place on screen and what can only take place in person, we inadvertently take a step towards the post-empathy world of AI-based mental healthcare, of psychoanalysis without an analyst. Both organizationally and clinically, when providing teleanalysis or teletherapy the inevitable losses and limitations of on-screen relating should be explicitly recognized [6]. At the same time, we can also accelerate the move to a post-empathy world from the other direction by failing to appreciate the richness and depth of experience possible on screen. Then we would become like a news organization insisting that paper and ink are necessary for something to be a newspaper. To thrive, psychoanalysis just may need to become radically ‘both-and,’ both embracing technology’s promise and fully appreciating all that is possible only when people are in the room together.

Unfortunately, the ‘both-and’ does not yet have wide currency in psychoanalysis. Instead, too many focus on one aspect or the other. Many advocate what can be seen as a refusal to participate in the world as it is. As a member of the IPA Task Force on Remote Treatment in Training, I heard many colleagues draw a hard line in the sand: if bodies are not in the room together, nothing of genuine psychoanalytic value takes place. Like monks asserting, after the invention of the printing press, that only hand-copied scriptures had value, refusing to accept the possibilities of digital information for psychoanalytic care will likely only further separate us from the people who would benefit from the care we provide.

Others are diving head-first into all the technological possibilities of the present moment, apparently unaware of how shallow the water can be. This is what APsaA has done with its recently approved Standards and Guidelines, which explicitly recognize only the benefits of distance education for psychoanalytic training and make candidate teleanalysis one choice among many. Shared embodiment is thereby made essentially irrelevant, perhaps even an unnecessary inconvenience, bringing us one step closer to a post-empathy world.

To conclude, the lesson from post-truth and fake news is that if we are to work against the likely rise of post-empathy, then the place to do it, where we should plant our flag and fight, is not the obvious difference between a program-generated screen image and a person-generated screen image. It is the difference between being on screen and being in person. And the way to do that is to embrace both all the possibilities of providing psychoanalytic care on screen and all the inevitable losses, limitations, and perils of doing so. If we don’t protect in-person empathy, who will?
[1] Jeff McMahon writes about green technology. His work can be found at http://forbes.com/sites/jeffmcmahon. David DiSalvo writes about science and health. His work can be found at http://forbes.com/sites/daviddisalvo. My archive is at https://www.forbes.com/sites/toddessig
[2] For a particularly chilling vision of where this is going, see David Levy’s Love and Sex with Robots (2007), which describes the path to a near future where the lonely, or just the interested, will fall in love with their ‘emotionally intelligent’ sexbots (see also Knafo & Bosco, 2016).
[3] Found at https://replika.ai
[4] Found at https://woebothealth.com
[5] Found at https://ict.usc.edu/prototypes/
[6] For a description of the paradoxical clinical dance of simultaneously being immersed in technologically-mediated psychoanalytic relating and aware of its limitations and losses see Essig & Russell, 2021.
 
References
Alter, A. (2017). Irresistible: The Rise of Addictive Technology and the Business of Keeping Us Hooked. Harmondsworth: Penguin.
Bardhan, A. (2022). Men are creating AI girlfriends and then verbally abusing them. Futurism. Retrieved January 20, 2022, from https://futurism.com/chatbot-abuse
Buber, M. (1970). I and Thou (W. Kaufmann, Trans.). New York, NY: Charles Scribner’s Sons. (Original work published 1923).
Essig, T. (2020). ‘Training Done? Write!’ A response to Alexander Stein. Psychoanalytic Perspectives, 17(2), 173-182.
Essig, T., & Russell, G. I. (2021). A report from the field: Providing psychoanalytic care during the pandemic. Psychoanalytic Perspectives, 18(2), 157-177.
Essig, T., Turkle, S., & Russell, G. I. (2018). Sleepwalking towards artificial intimacy: How psychotherapy is failing the future. Forbes. https://www.forbes.com/sites/toddessig/2018/06/07/sleepwalking-towards-artificial-intimacy-how-psychotherapy-is-failing-the-future/
Greenberg, J. R., & Mitchell, S.A. (1983). Object Relations in Psychoanalytic Theory. Cambridge, MA: Harvard University Press. 
Knafo, D., & Bosco, R.L. (2016). The Age of Perversion: Desire and Technology in Psychoanalysis and Culture. London: Routledge.
Levy, D. (2007). Love and Sex with Robots: The Evolution of Human-Robot Relationships. London: HarperCollins.
Pieraccini, R. (2021). AI Assistants. Cambridge, MA: MIT Press.
Schrage, M. (2020). Recommendation Engines. Cambridge, MA: MIT Press.
Susskind, R. E., & Susskind, D. (2015). The Future of the Professions: How Technology Will Transform the Work of Human Experts. Oxford: Oxford University Press.
 
