• minorkeys@lemmy.world · +3 · 7 days ago

    If you ask ChatGPT, it says its guidelines include not giving the impression that it’s human. But if you ask it to be less human because it’s confusing you, it says that would break the guidelines.

    • markovs_gun@lemmy.world · +3 · 7 days ago

      ChatGPT doesn’t know its own guidelines because those aren’t even included in its training corpus. Never trust an LLM about how it works or how it “thinks” because fundamentally these answers are fake.

  • pHr34kY@lemmy.world · +59/-2 · edited · 8 days ago

    It would be nice if this extended to all text, images, audio and video on news websites. That’s where the real damage is happening.

    • BrianTheeBiscuiteer@lemmy.world · +13/-4 · 8 days ago

      Actually seems easier (though probably not at the state level) to mandate that cameras and similar devices digitally sign any media they create. No signature or verification, no trust.

      • CosmicTurtle0@lemmy.dbzer0.com · +16 · 8 days ago

        I get what you’re going for but this would absolutely wreck privacy. And depending on how those signatures are created, someone could create a virtual camera that would sign images and then we would be back to square one.

        I don’t have a better idea though.

        • BrianTheeBiscuiteer@lemmy.world · +2 · 7 days ago

          The point is to give photographers a “receipt” for their photos. If you don’t want the receipt it would be easy to scrub from photo metadata.

        • howrar@lemmy.ca · +2 · 8 days ago

          Privacy concern for sure, but given that you can already tie different photos back to the same phone from lens artifacts, I don’t think this is going to make things much worse than they already are.

          someone could create a virtual camera that would sign images

          Anyone who produces cameras can publish a list of valid keys associated with their camera. If you trust the manufacturer, then you also trust their keys. If there’s no trusted source for the keys, then you don’t trust the signature.
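The trust model described in this comment can be sketched in a few lines. This is a toy illustration only, not a real provenance scheme: the key IDs and "published key list" are hypothetical, and HMAC stands in for the public-key signature a real camera would use (e.g. an Ed25519 or C2PA-style signature).

```python
# Toy sketch of the trust model above: a signature on an image is accepted
# only if it verifies under a key its manufacturer has published.
# HMAC is a placeholder for a real asymmetric signature primitive.
import hashlib
import hmac

# Hypothetical list of valid keys a manufacturer publishes for its cameras.
TRUSTED_KEYS = {"cam-model-x:001": b"factory-provisioned-secret"}

def sign(image: bytes, key_id: str, key: bytes) -> tuple[str, str]:
    """The camera signs the image bytes and attaches its key ID."""
    return key_id, hmac.new(key, image, hashlib.sha256).hexdigest()

def verify(image: bytes, key_id: str, sig: str) -> bool:
    """Accept only signatures made with a key on the trusted list."""
    key = TRUSTED_KEYS.get(key_id)
    if key is None:  # no trusted source for the key -> no trust
        return False
    expected = hmac.new(key, image, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, sig)

photo = b"...raw sensor bytes..."
key_id, sig = sign(photo, "cam-model-x:001", b"factory-provisioned-secret")
print(verify(photo, key_id, sig))         # True: key is on the published list
print(verify(photo, "unknown-cam", sig))  # False: untrusted key, no trust
```

The "virtual camera" attack in the parent comment corresponds to an attacker getting a valid key onto that list (or extracting one from a device), which is why key provisioning is the hard part of any such scheme.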

      • cley_faye@lemmy.world · +10/-1 · 8 days ago

        No signature or verification, no trust

        And the people that are going to check for a digital signature in the first place, THEN check that the signature emanates from a trusted key, then, eventually, check who’s deciding the list of trusted keys… those people, where are they?

        Because the lack of trust, validation, verification, and more generally the lack of any credibility hasn’t stopped anything from spreading like a dumpster fire in a field full of dumpsters doused in gasoline. Part of my job is providing digital signature tools and creating “trusted” data (I’m not in sales, obviously), and the main issue is that nobody checks anything, even when faced with liability, even when they actually pay for an off-the-shelf solution to do exactly that. And I’m talking about people who should care, not even the general public.

        There are a lot of steps before “digitally signing everything” even gets on people’s radar. For now, a green checkmark anywhere is enough to convince anyone, sadly.

        • howrar@lemmy.ca · +2 · 8 days ago

          I think there’s enough people who care about this that you can just provide the data and wait for someone to do the rest.

          • cley_faye@lemmy.world · +1 · 7 days ago

            I’d like to think like that too, but it’s actually experience with large business users that led me to say otherwise.

        • dev_null@lemmy.ml · +2 · 8 days ago

          It could be a feature of web browsers. Images would get some icon indicating the valid signature, just like browsers already show the padlock icon indicating a valid certificate. So everybody would be seeing the verification.

          But I don’t think it’s a good idea, for other reasons.

        • BrianTheeBiscuiteer@lemmy.world · +1 · 7 days ago

          An individual wouldn’t verify this, but enough independent agencies or news orgs would probably care enough to verify a photo. For the vast majority we’re already too far gone to properly separate fiction and reality. If we can’t get into a courtroom and prove that a picture or video is fact or fiction, then we’re REALLY fucked.

      • technocrit@lemmy.dbzer0.com · +1 · 7 days ago

        The problem is that “AI” doesn’t actually exist. For example, Photoshop has features that are called “AI”. Should every designer be forced to label their work if they use some “AI” tool?

        This is a problem with making violent laws based on meaningless language.

  • cactusfacecomics@lemmy.world · +34 · 7 days ago

    Seems reasonable to me. If you’re using AI then you should be required to own up to it. If you’re too embarrassed to own up to it, then maybe you shouldn’t be using it.

    • technocrit@lemmy.dbzer0.com · +12 · 7 days ago

      I’m stoked to see the legal definition of “AI”. I’m sure the lawyers and costumed clowns will really clear it all up.

    • skisnow@lemmy.ca · +23 · 8 days ago

      My LinkedIn feed is 80% tech bros complaining about the EU AI Act, not a single one of whom is willing to be drawn on which exact clause it is they don’t like.

      • utopiah@lemmy.world · +10 · 8 days ago

        My LinkedIn feed

        Yes… it’s so bad that I just never log in until I receive a DM. Even then I log in, check it, and if it’s useful I warn people that I don’t use LinkedIn anymore, then log out.

      • Evotech@lemmy.world · +2/-1 · edited · 8 days ago

        I get it though, if you’re a startup. Having to basically hire an extra guy just to do AI compliance significantly raises the barrier to entry.

        • skisnow@lemmy.ca · +5 · edited · 8 days ago

          That’s not actually the case for most companies though. The only time you’d need a full time lawyer on it is if the thing you want to do with AI is horrifically unethical, in which case fuck your little startup.

          It’s easy to comply with regulations if you’re already behaving responsibly.

  • Lost_My_Mind@lemmy.world · +23 · 8 days ago

    Same old story: corporations will ignore the law, pay a petty fine once a year, and call it the cost of doing business.

  • cley_faye@lemmy.world · +18 · 8 days ago

    Be sure to tell this to “AI”. It would be a shame if this turned out to be a technically nonsensical law.

  • hedge_lord@lemmy.world · +15 · 8 days ago

    I am of the firm opinion that if a machine is “speaking” to me then it must sound like a cartoon robot. No exceptions!

      • Lost_My_Mind@lemmy.world · +5 · 8 days ago

        Oooooooh! As long as California doesn’t do those stupid ID verification laws, that might be the place to set your VPN from now on.

        • Attacker94@lemmy.world · +1 · 8 days ago

          There was a link in the article about that. It said they are only requiring self-reporting. I don’t know the political context in California, but it seems like you wouldn’t push this and then turn around and try the ID thing. Then again, I am by no means an expert at predicting the idiocracy of politicians.

    • Not_mikey@lemmy.dbzer0.com · +2 · 8 days ago

      Probably will get it anyway. Companies don’t like to build and maintain software for two different markets, so they tend to follow the regulations of the strictest market, especially when those regulations don’t really cut into their bottom line, like this one.

  • Attacker94@lemmy.world · +4 · edited · 8 days ago

    Has anyone been able to find the text of the law? The article didn’t mention the penalties, and I want to know if this actually means anything.

    Edit: I found a website that says the penalty follows 5000·Σ(n+k), where n is the number of days since the first infraction. This has the closed form n² + n = y/7500, where y is the total compounded fee, so the fine reaches $1M in about 11 days and $1B in about a year.

    reference
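The growth claimed in the edit above can be checked with a quick sketch. This takes the comment’s closed form y = 7500·(n² + n) at face value; the formula is the commenter’s reading of an unverified source, not the statute’s text.

```python
# Quick check of the penalty growth claimed in the comment above,
# using its closed form y = 7500 * (n^2 + n) dollars after n days.

def total_fine(days: int) -> int:
    """Total accumulated fine in dollars after `days` days of violation."""
    return 7500 * (days**2 + days)

def days_to_reach(target: int) -> int:
    """Smallest whole number of days at which the fine meets `target`."""
    n = 0
    while total_fine(n) < target:
        n += 1
    return n

print(total_fine(11))                 # 990000: just under $1M on day 11
print(days_to_reach(1_000_000))       # 12: passes $1M on day 12
print(days_to_reach(1_000_000_000))   # 365: passes $1B in a year
```

Quadratic growth like this avoids the "flat fine as cost of doing business" problem discussed in the replies, since the daily increment keeps rising the longer the violation continues.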

    • mic_check_one_two@lemmy.dbzer0.com · +6 · 8 days ago

      Yeah, this is an important point. If the penalty is too small, AI companies will just consider it a cost of doing business. Flat-rate fines only being penalties for the poor, and all that.

      • Attacker94@lemmy.world · +1 · 7 days ago

        How do you figure? I haven’t seen the actual text; is it written ambiguously? If not, I would imagine they’d be able to enforce it. The only thing is that the scope is very small.

  • ayyy@sh.itjust.works · +6/-2 · 8 days ago

    This sounds about as useful as the California law that tells ICE they aren’t allowed to cover their face, or the California law that tells anyone selling anything ever that they have to tell you it will give you cancer. Performative laws are what we’re best at here in California.

  • technocrit@lemmy.dbzer0.com · +5/-1 · edited · 7 days ago

    Will someone please tell California that “AI” doesn’t exist?

    This is how politicians promote a grift by pretending to regulate it.

    Worthless politicians making worthless laws.