
    2 Comments

    1. Neat_Let923

      This likely won’t go well for anyone involved…

      The mother and brother were killed at home with an unregistered shotgun of as-yet-unknown origin.

      The primary weapon used at the school was not among the weapons the RCMP had seized roughly two years earlier and later returned (possibly within a month of the shooting). They have not yet released who owned or registered the long gun, or how she got hold of it.

      The issues on ChatGPT occurred sometime around June 2025, eight months before the shooting took place. The conversation transcripts are with the RCMP; they have not been released to the public or media, and they were fully backed up, as is standard practice when an account is reviewed and banned.

      Everything stated by the complainant about those conversations is entirely speculation, since they have not had access to those transcripts either (yet).

      They’re going to have to prove there was an imminent threat in those conversations. So far, all details released indicate that there was not, but that’s where discovery comes into play.

      Imminent threat requires a combination of the following:

      1. Specific target: a person, building, or location

      2. Specific timing: today, tomorrow, or a near-term date

      3. Operational preparation: acquiring weapons or asking for tactical instructions

      4. Declared intent: statements clearly indicating the person plans to carry it out

      5. Capability: evidence they actually can carry it out (access to weapons, transportation, etc.)

      So far, what has been stated indicates that none of the above were present in the conversations to create an imminent threat.

      The lawsuit seems to be jumping the gun a little, before the investigation has even been completed. I really hope the victim’s mother isn’t paying for this lawyer, because if her assumptions turn out to be wrong, that’s a lot of money the family likely doesn’t have, on top of the difficulties of caring for her brain-damaged daughter.

      There’s enough blame to go around with this, but it sounds like a good portion might lie with the mother…

    2. Lawyer here:

      I have the utmost sympathy for Maya and her family.

      That said, I think this lawsuit is not going anywhere at all. Normally, there is no duty to report something to the police.

      In this case, they’re arguing that a duty of care exists because ChatGPT was acting as a pseudo-therapist, which I suspect is not going to fly in the courts. But I guess we’ll see.
