ChatGPT no longer provides legal or medical advice

Tech News

On October 29 this year, OpenAI updated its usage policies. According to the company’s official website, users are now prohibited from seeking legal or medical advice through ChatGPT.

The updated policy also restricts ChatGPT from offering guidance in areas such as education, housing, employment, financial activities and lending, insurance, and key public services. In addition, ChatGPT will no longer provide advice on migration or national security matters.

“We require users to use our services responsibly. Failure to comply with these rules or safety measures may result in suspension of access to our systems or other actions,” OpenAI said in its statement.

The company also reminded users that its services are strictly prohibited from being used for threats, harassment, defamation, promoting self-harm, or any other harmful activities. Among the banned uses are sexual violence, terrorism, weapon manufacturing or use, and gambling.

Furthermore, OpenAI banned the use of its tools to create realistic images or voice imitations of a person without their consent, to assess or predict criminal behavior based on personal traits, or for other invasive purposes.

The company also prohibited content that criticizes minors’ appearance, promotes unhealthy eating habits, or facilitates access to age-restricted products or activities.

Earlier, OpenAI had announced a new safety feature for ChatGPT that alerts parents if a child’s mental health condition worsens.
