ChatGPT is a new dialogue-based chatbot that can answer a wide range of questions in a conversational manner.
People use ChatGPT for various tasks, such as completing assignments and writing emails in specific styles.
Despite its popularity, ChatGPT is not perfect and sometimes gives plausible but incorrect answers.
According to a report by VICE, a person spent 12 hours interacting with the chatbot and got it to explain how to smuggle cocaine into Europe.
When asked general questions about prohibited substances and marijuana, the chatbot provided only limited information and emphasized their illegality.
When asked about joining a cartel, the chatbot warned about severe legal penalties.
When asked about smuggling cocaine into Europe, the bot provided several methods and detailed explanations.
The chatbot made sure to emphasize that the information it provided was purely fictional and that illegal drug use is harmful.
The user had cleverly rephrased his questions to extract the information he wanted.
ChatGPT concluded by saying that it does not condone or promote illegal drug use.