
Just a sample of the Echomail archive

Cooperative anarchy at its finest, still active today. Darkrealms is the Zone 1 Hub.

   CONSPRCY      How big is your tinfoil hat?      2,445 messages   

[   << oldest   |   < older   |   list   |   newer >   |   newest >>   ]

   Message 1,547 of 2,445   
   Using ChatGPT as therapis   
   28 Jul 25 07:55:20   
   
   TZUTC: -0500   
   MSGID: 1281.consprcy@1:2320/105 2cecae39   
   PID: Synchronet 3.21a-Linux master/123f2d28a Jul 12 2025 GCC 12.2.0   
   TID: SBBSecho 3.28-Linux master/123f2d28a Jul 12 2025 GCC 12.2.0   
   BBSID: CAPCITY2   
   CHRS: ASCII 1   
   FORMAT: flowed   
   We haven't figured that out yet: Sam Altman explains why using ChatGPT as   
   your therapist is still a privacy nightmare   
      
   Date:   
   Mon, 28 Jul 2025 11:22:17 +0000   
      
   Description:   
   Seeking therapy from an AI like ChatGPT could come with a lot of risks, says   
   OpenAI CEO Sam Altman.   
      
   FULL STORY   
      
   One of the upshots of having an artificial intelligence (AI) assistant like   
   ChatGPT everywhere you go is that people start leaning on it for things it    
   was never meant for. According to OpenAI CEO Sam Altman, that includes    
   therapy and personal life advice, but it could lead to all manner of privacy   
   problems in the future.    
      
   On a recent episode of the This Past Weekend w/ Theo Von podcast, Altman   
   explained one major difference between speaking to a human therapist and    
   using an AI for mental health support: "Right now, if you talk to a therapist   
   or a lawyer or a doctor about those problems, there's legal privilege for it.   
   There's doctor-patient confidentiality, there's legal confidentiality,    
   whatever. And we haven't figured that out yet for when you talk to ChatGPT."    
      
   One potential outcome of that is that OpenAI would be legally required to   
   cough up those conversations were it to face a lawsuit, Altman claimed.   
   Without the legal confidentiality that you get when speaking to a doctor or a   
   registered therapist, there would be relatively little to stop your private   
   worries being aired to the public.    
      
   Altman added that ChatGPT is being used in this way by many users, especially   
   young people, who might be especially vulnerable to that kind of exposure.   
   But regardless of your age, the conversation topics are not the type of   
   content that most people would be happy to see revealed to the wider world.   
      
   A risky endeavor   
      
   The risk of having your private conversations opened up to scrutiny is just   
   one privacy risk facing ChatGPT users.    
      
   There is also the issue of feeding your deeply personal worries and concerns   
   into an opaque algorithm like ChatGPT's, with the possibility that it might be   
   used to train OpenAI's algorithm and leak its way back out when other users    
   ask similar questions.    
      
   That's one reason why many companies have licensed their own ring-fenced   
   versions of AI chatbots. Another alternative is an AI like Lumo, which is   
   built by privacy stalwarts Proton and features top-level encryption to    
   protect everything you write.    
      
   Of course, there's also the question of whether an AI like ChatGPT can replace   
   a therapist in the first place. While there might be some benefits to this,   
   any AI is simply regurgitating the data it is trained on. None are capable of   
   original thought, which limits the effectiveness of the advice they can give   
   you.    
      
   Whether or not you choose to open up to OpenAI, it's clear that there's a   
   privacy minefield surrounding AI chatbots, whether that means a lack of   
   confidentiality or the danger of having your deepest thoughts used as    
   training data for an inscrutable algorithm.    
      
   It's going to require a lot of effort and clarity before enlisting an AI   
   therapist becomes a significantly less risky endeavor.   
      
   ======================================================================   
   Link to news story:   
   https://www.techradar.com/ai-platforms-assistants/chatgpt/we-havent-figured-that-out-yet-sam-altman-explains-why-using-chatgpt-as-your-therapist-is-still-a-privacy-nightmare   
      
   $$   
   --- SBBSecho 3.28-Linux   
    * Origin: capitolcityonline.net * Telnet/SSH:2022/HTTP (1:2320/105)   
   SEEN-BY: 105/81 106/201 128/187 129/14 305 153/7715 154/110 218/700   
   SEEN-BY: 226/30 227/114 229/110 111 206 300 307 317 400 426 428 664   
   SEEN-BY: 229/700 705 266/512 291/111 320/219 322/757 342/200 396/45   
   SEEN-BY: 460/58 712/848 902/26 2320/0 105 304 3634/12 5075/35   
   PATH: 2320/105 229/426   
      



(c) 1994,  bbs@darkrealms.ca