'Nabla, a French start-up specializing in healthcare technology, tested GPT-3 as a medical chatbot, though OpenAI itself warned against such use. As expected, GPT-3 showed several limitations. For example, while testing GPT-3 responses about mental health issues, the AI advised a simulated patient to commit suicide.'
https://en.wikipedia.org/wiki/GPT-3#Applications
* Origin: fsxNet Usenet Gateway (21:1/5)