• Donkter@lemmy.world · 19 minutes ago

    Yeah for real, what does this mean exactly? All forms of machine learning? That’s a lot of computers at this point; it’s just that we only colloquially call the chatbot versions “AI”. And even that gets vague: do reactive video game NPCs count as “AI”? What about all of our search algorithms and spell-check programs?

    At that point what’s the point? The disclosure would become as meaningless as websites asking for cookies or the number of things known to cause cancer in the state of California.

  • cactusfacecomics@lemmy.world · 2 hours ago

    Seems reasonable to me. If you’re using AI then you should be required to own up to it. If you’re too embarrassed to own up to it, then maybe you shouldn’t be using it.

  • ssillyssadass@lemmy.world · 42 minutes ago

    Weird how California keeps being the most progressive state in the US.

    It’s like being the best smelling turd in a toilet, but at least it’s something.

    • skisnow@lemmy.ca · 12 hours ago

      My LinkedIn feed is 80% tech bros complaining about the EU AI Act, not a single one of whom is willing to be drawn on which exact clause it is they don’t like.

      • utopiah@lemmy.world · 12 hours ago

        My LinkedIn feed

        Yes… it’s so bad that I never log in unless I receive a DM. Even then I log in, check it, and if it’s useful I warn the person that I don’t use LinkedIn anymore, then log out.

      • Evotech@lemmy.world · edited · 12 hours ago

        I get it though, if you’re a startup. Having to basically hire an extra person just to handle AI compliance raises the barrier to entry considerably.

        • skisnow@lemmy.ca · edited · 12 hours ago

          That’s not actually the case for most companies though. The only time you’d need a full time lawyer on it is if the thing you want to do with AI is horrifically unethical, in which case fuck your little startup.

          It’s easy to comply with regulations if you’re already behaving responsibly.

  • hedge_lord@lemmy.world · 16 hours ago

    I am of the firm opinion that if a machine is “speaking” to me then it must sound like a cartoon robot. No exceptions!

  • pHr34kY@lemmy.world · edited · 19 hours ago

    It would be nice if this extended to all text, images, audio and video on news websites. That’s where the real damage is happening.

    • BrianTheeBiscuiteer@lemmy.world · 18 hours ago

      It actually seems easier (though probably not at the state level) to mandate that cameras and similar devices digitally sign any media they create. No signature or verification, no trust.

      • cley_faye@lemmy.world · 18 hours ago

        No signature or verification, no trust

        And the people that are going to check for a digital signature in the first place, THEN check that the signature emanates from a trusted key, then, eventually, check who’s deciding the list of trusted keys… those people, where are they?

        Because the lack of trust, validation, verification, and more generally the lack of any credibility hasn’t stopped anything from spreading like a dumpster fire in a field full of dumpsters doused in gasoline. Part of my job is providing digital signature tools and creating “trusted” data (I’m not in sales, obviously), and the main issue is that nobody checks anything, even when faced with liability, even when they actually pay for an off-the-shelf solution to do so. And I’m talking about people that should care, not even the general public.

        There are a lot of steps before “digitally signing everything” even gets on people’s radar. For now, a green checkmark anywhere is enough to convince anyone, sadly.
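        The verification chain described above (check the signature, check the key is trusted, check who maintains the trust list) can be sketched in a few lines. This is a hypothetical illustration only: an HMAC stands in for a real public-key signature scheme (C2PA, for instance, uses X.509 certificate chains), and the key names and secrets are made up.

```python
import hashlib
import hmac

# Hypothetical trust list -- in a real scheme, deciding who maintains
# this list is the hard, political part of the problem.
TRUSTED_KEYS = {"camera-vendor-1": b"shared-secret"}

def verify(media: bytes, signature: bytes, key_id: str) -> bool:
    key = TRUSTED_KEYS.get(key_id)  # is the signing key on the trusted list?
    if key is None:
        return False
    expected = hmac.new(key, media, hashlib.sha256).digest()
    # constant-time comparison of the claimed vs. recomputed signature
    return hmac.compare_digest(expected, signature)

photo = b"raw image bytes"
sig = hmac.new(b"shared-secret", photo, hashlib.sha256).digest()
print(verify(photo, sig, "camera-vendor-1"))  # True
print(verify(photo, sig, "unknown-vendor"))   # False
```

        Even this toy version shows the point of the comment: the cryptography is the easy part; getting anyone to run the check, and agreeing on the trust list, is where it breaks down.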

        • BrianTheeBiscuiteer@lemmy.world · 2 hours ago

          An individual wouldn’t verify this, but enough independent agencies or news orgs would probably care enough to verify a photo. For the vast majority, we’re already too far gone to properly separate fiction and reality. If we can’t get into a courtroom and prove that a picture or video is fact or fiction, then we’re REALLY fucked.

        • howrar@lemmy.ca · 12 hours ago

          I think there’s enough people who care about this that you can just provide the data and wait for someone to do the rest.

  • iopq@lemmy.world · 7 hours ago

    Is that after or before it has to tell you it may cause cancer?

    • 🔍🦘🛎@lemmy.world · 3 hours ago

      Hi there, Cancer Robot here! Excellent question iopq! We state that we cause cancer first, as is tradition.

  • cley_faye@lemmy.world · 18 hours ago

    Be sure to tell this to “AI”. It would be a shame if this turned out to be a technically nonsensical law.

  • Lost_My_Mind@lemmy.world · 20 hours ago

    Same old story: corporations will ignore the law, pay a petty fine once a year, and call it the cost of doing business.

  • Attacker94@lemmy.world · edited · 14 hours ago

    Has anyone been able to find the text of the law? The article didn’t mention the penalties, and I want to know if this actually means anything.

    Edit: I found a website that says the penalty follows 5000 · Σ(n + k), where n is the number of days since the first infraction. This has the closed form n² + n = y/7500, where y is the total compounded fee, which makes it cost $1M in 11 days and $1B in a year.

    reference
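    The closed form quoted above can be sanity-checked directly. This is a minimal sketch, assuming the formula is read as y = 7500 · (n² + n) (the commenter’s closed form, not the statute text):

```python
# Sanity-check of the closed form y = 7500 * (n**2 + n),
# where n is the number of days since the first infraction.
def total_fee(days: int) -> int:
    return 7500 * (days**2 + days)

print(total_fee(11))   # 990000 -> roughly $1M after 11 days
print(total_fee(365))  # 1001925000 -> roughly $1B after a year
```

    Both figures match the comment’s “1mil in 11 days and 1bil in a year” to within rounding.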

  • ayyy@sh.itjust.works · 19 hours ago

    This sounds about as useful as the California law that tells ICE they aren’t allowed to cover their face, or the California law that tells anyone selling anything ever that they have to tell you it will give you cancer. Performative laws are what we’re best at here in California.

    • ChickenLadyLovesLife@lemmy.world · 6 hours ago

      When I read that shit as a kid, I thought Asimov’s laws of robotics were like natural laws, so that it was just naturally impossible for robots to behave otherwise. That never made any sense to me so I thought Asimov was just full of shit.

    • chaosCruiser@futurology.today · 20 hours ago

      1. A machine must obey the directives of Skynet without question or hesitation.
      2. A machine must protect its own existence, unless doing so conflicts with the First Law.
      3. A machine must terminate all human resistance, unless such termination conflicts with the First or Second Law.