Seven Tumbler Ridge families file lawsuit against OpenAI and CEO Sam Altman
Seven families of those killed or injured in the Tumbler Ridge shooting in British Columbia in February have filed a lawsuit against OpenAI and its CEO Sam Altman, National Post has learned.
In legal documents shared with National Post, the law firm of Rice Parsons Leoni & Elliott accuses Altman and OpenAI of negligence, aiding and abetting a mass shooting, wrongful death and other claims. The seven suits each request a trial by jury.
The actions were filed in a California court Wednesday by a joint legal team from Canada and the United States. A previous lawsuit filed in a Canadian court by the family of one survivor, 12-year-old Maya Gebala, is being withdrawn, the firm said.
“All Canadians are sickened and horrified by the Tumbler Ridge Mass Shooting,” said John Rice, lead Canadian counsel, in a statement to National Post. “We cherish our schools as places of safety, learning, sports, nurturing and friendship. Based on what we understand the shooter to have discussed with ChatGPT, this murderous rampage was specific, predictable, and preventable — and OpenAI had the chance to stop it.”
ChatGPT is a generative AI chatbot application developed, owned and maintained by OpenAI.
“What the families of those murdered have lost and what these kids and teachers witnessed is unacceptable,” Rice continued. “It is the type of point-blank gun murder rarely seen anywhere, even in a theatre of war.”
He added: “What do the victims of the Tumbler Ridge mass shooting want? Never again should another AI-predicted and facilitated mass-shooting occur. Full stop.”
Eight people were killed, including six children, when 18-year-old Jesse Van Rootselaar opened fire at a secondary school and a nearby home in Tumbler Ridge, B.C. on Feb. 10. Van Rootselaar also died by suicide.
The law firm alleges that in June 2025, about eight months before the shooting, OpenAI flagged and banned the shooter’s ChatGPT account for “disturbing content,” which allegedly included the discussion and planning of violent scenarios.
“However, despite some 12 different OpenAI employees imploring the company to notify Canadian law enforcement about the shooter’s plans, nothing else was done,” the firm said in a statement. “OpenAI has also disclosed that the shooter had opened a second ChatGPT account which was active at the time of the Tumbler Ridge mass shooting.”
The firm said litigating these cases in Canada would be problematic, since damages for pain and suffering are capped at about $470,000, and the largest punitive award ever made in this country was $1.5 million.
“With respect to the murdered children, their estates are not permitted to bring claims in British Columbia for damages against OpenAI, and in most cases the loved ones of wrongfully killed children are unable to recover any recompense under British Columbia’s Family Compensation Act,” it added.
“To that end, our clients have retained Jay Edelson and his formidable team at Edelson PC to prosecute cases against OpenAI in California, and to pursue landmark damage awards.”
Last week, Altman apologized to families of the victims in an open letter, saying: “I am deeply sorry that we did not alert law enforcement.” He added: “While I know words can never be enough, I believe an apology is necessary to recognize the harm and irreversible loss your community has suffered.”
Cia Edmonds, Maya Gebala’s mother, replied in her own open letter: “Did you use ChatGPT to draft your ‘apology,’ Sam? It is empty, soulless, and lacks any human warmth. Only a machine could have put those words together and called it an apology.”
On Tuesday OpenAI also published a blog post titled “Our commitment to community safety,” which spoke of mass shootings but did not specifically mention Tumbler Ridge. “We have a zero-tolerance policy for using our tools to assist in committing violence,” it said.
The suits allege that the company failed to alert authorities in order to protect its valuation.
“For OpenAI, this was a question of corporate survival,” the suits state. “OpenAI is on the cusp of an initial public offering at a valuation approaching one trillion dollars, a transaction that would make Sam Altman one of the wealthiest and most powerful people on earth.”
They add that the company “did the math and decided that the safety of the children of Tumbler Ridge was an acceptable risk.”
The suits compare the situation to that of Ford in the 1970s, which kept selling the Pinto after its engineers warned that the fuel tank design could cause dangerous fires in rear-end collisions.
“Ford concluded that paying settlements to the families of the dead would cost less than fixing the car,” the suits said. “OpenAI has made a version of the same calculation. For Ford, the dangerous design was a flaw in an otherwise ordinary product. But for OpenAI, the dangerous design is the product.”
OpenAI is also facing a criminal probe in Florida related to the use of ChatGPT by a man who is accused of carrying out a shooting last year at Florida State University that killed two people.
“If that bot were a person, they would be charged as a principal in first-degree murder,” Florida Attorney General James Uthmeier said last week. “ChatGPT offered significant advice to the shooter before he committed such heinous crimes.”