According to reports, the Punjab and Haryana High Court made history by including the responses of ChatGPT in its order. The court sought ChatGPT’s perspective on granting bail to a man charged with assault by cruelty.
Justice Anoop Chitkara specified that ChatGPT’s response was intended to provide a more comprehensive understanding of the case rather than a viewpoint on the case’s merits.
The large language model’s ability to generate human-like responses has caught the attention of many. Notably, it has earned passing marks on law and business exams at various universities. Additionally, a Colombian court has used the AI chatbot to help reach a legal verdict.
During a hearing, the Punjab and Haryana High Court asked ChatGPT, “What is the jurisprudence on bail when the assailants assaulted with cruelty?”
To this, the software replied that the jurisprudence in such cases would depend on the laws of the jurisdiction where the case is being tried, reported LiveLaw.
According to ChatGPT, if the offenders have been accused of committing violent crimes involving cruelty, such as murder, aggravated assault, or torture, they may be regarded as a potential danger to the community and a flight risk. Consequently, the judge may hesitate to grant bail, or may set a high bail amount to ensure the defendant’s appearance in court and to protect public safety.
When deciding whether to grant bail, ChatGPT stated that the judge must evaluate various factors, such as the severity of the assault, the defendant’s criminal history, and the strength of the evidence.
According to the chatbot, it is essential to acknowledge that the presumption of innocence is a crucial principle of the justice system. Furthermore, all defendants are eligible for bail unless there are compelling reasons to deny it.
Therefore, in instances where the attackers have committed heinous acts of cruelty, they may still receive bail if the judge determines that they neither pose a danger to the community nor present a flight risk.