Cumbria firm develops predator detection technology to protect women

    https://www.bbc.co.uk/news/articles/c0jvy4xn0l4o

    Von SliceIndividual6347

    15 Comments

    1. Lazy_Crab_3584 on

      This is just gonna be the AI version of the meme where Peter Griffin is checked against the 'safe' colours card, isn't it?

    2. Silencer-1995 on

      We can't even get cars to safely drive themselves, and now you want to use AI to work out if someone is autistic or a sexual predator? I dunno man.

      Maybe we just employ more security guards and place them in areas women find intimidating in their local communities, and then hold a national dialogue about how we address this problem in the long term.

    3. ItsSuperDefective on

      Oh yay, now a computer gets to decide I’m a rapist because I act slightly unusually.

    4. How does it know what 'moving in an unexpected way' means?

      This sounds like a great crowd control tool for events at stadiums and managing customer/passenger traffic etc, but I am not sold on how it is supposed to judge malicious intentions.

    5. PS1_Hagrid_Guy on

      >On a dark winter evening, a woman waits for a train on a deserted platform. A man arrives and sits right beside her, making her feel uncomfortable and unsafe.

      Thank God the latest technology is here to ensure we clamp down on criminal malcontents intent on menacing society by *checks notes* sitting on a public bench

    6. Ok-Milk-8853 on

      It’s tricky, but if you look at the treeline you can usually see a little bit of visual distortion.
      Also if you manage to make it bleed, it’s a vivid green color.

      That’s how I spot Predator anyway.

    7. Useful_Promotion_521 on

      But what happens if I say in front of a networked microphone something like "The Metropolitan Police urgently needs reform"? This AI is going to label me a predator twice.

    8. This is like something from Brass Eye, let alone Black Mirror.

      Also, train stations are a stupid example given how many are unmanned these days. Who's going to be responding to the automated alerts?

    9. JackStrawWitchita on

      Of course it would be great if this technology helps protect women, but we also need to be concerned about false positives. We've already seen how police AI facial recognition tools are falsely targeting people of colour. Would this new technology also falsely accuse innocents while missing actual perpetrators?

    10. Striking_Smile6594 on

      This feels very dodgy to me and will almost certainly lead to lots of people being arrested because the 'computer' decided they were a wrong'un.

      This reminds me of those 'Life 360' type apps; they do nothing to keep you safe, rather they exploit and escalate people's fears and normalise us being surveilled 24/7.

    11. Consistent-Pirate-23 on

      Oh yes, because AI is renowned for being accurate.

      How long until it makes comically bad errors, or profiles people based on something that looks a bit too much like a characteristic we can't discriminate on?

    12. As nice an idea as this tech is, I highly doubt it’s going to work. You just know this AI is somehow going to miss people with actual criminal records for sexual assault and just target some innocent guy in a tracksuit because he looks moody.

    13. I'm sure the non-existent security staff at the station are going to jump onto the CCTV and attend in person because a notification says one person sat next to another on a bench.

      Seems like interesting yet overbearing technology, but they really could have thought up a better example.

    14. No_Atmosphere8146 on

      "Cumbria firm sued for mistakenly identifying neurodivergent man as potential sexual predator."
