Black Mirror creator unafraid of AI because it’s “boring”
Charlie Brooker doesn’t think AI is taking his job any time soon because it only produces trash

  • Flying Squid · 49 · 2 years ago

    Movie and TV executives don’t care about boring. Reality shows are boring. They just care if they make money.

      • @lloram239@feddit.de · 6 · edited · 2 years ago

        AI is nowhere near the point where it can…

        ChatGPT is 10 months old, not even a whole year, and it was never fine-tuned for story writing in the first place. A little bit premature to proclaim what AI can and can’t do, don’t you think?

            • @NoMoreCocaine@lemmy.world · 3 · 2 years ago

              Yes. Honestly, it’s crazy how much people read into ChatGPT, when in practice it’s effectively just a dice roller that relies on an incredibly big dataset to guess the most likely next word.

              There’s been some research about this: the fact that people assign intelligence to the things ML does. It doesn’t compute for us that something can appear to make sense without actually having any intelligence. To humans, the appearance of intelligence is enough to assume intelligence, even if it’s just the result of a complicated dice roller (a toy sketch of that idea follows below).
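              A minimal sketch of that “dice roller” idea, just to make it concrete. The context string and probabilities below are invented for illustration; a real LLM computes a probability for every token in its vocabulary from learned parameters and then samples, one token at a time.

              ```python
              import random

              # Toy "dice roller": a made-up probability table for the next word given a
              # context. A real LLM computes these probabilities for every token in its
              # vocabulary from billions of learned parameters, not a lookup table.
              next_word_probs = {
                  "The cat sat on the": {"mat": 0.55, "sofa": 0.25, "roof": 0.15, "moon": 0.05},
              }

              def sample_next_word(context: str) -> str:
                  """Weighted random draw over the candidate next words."""
                  candidates = next_word_probs[context]
                  return random.choices(list(candidates), weights=list(candidates.values()), k=1)[0]

              print(sample_next_word("The cat sat on the"))  # usually "mat", occasionally "moon"
              ```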

          • @lloram239@feddit.de · 1 · 2 years ago

            And that’s exactly why we should be scared. ChatGPT is just the popular tip of the AI iceberg; there is a whole lot more in the works across all kinds of domains. The underlying AI algorithms are what allow you to slap something like ChatGPT together in a few months.

        • @matter@lemmy.world · 1 · 2 years ago

          AI has been in development for 50 years, and the best we can do so far is a Dunning-Kruger simulator. Sure, who knows what it “can do” at some point, but I wouldn’t hold my breath.

          • @lloram239@feddit.de · 0 · edited · 2 years ago

            The recent deep learning efforts only started around 2012 with AlexNet. They were based on ideas that had been around since the 1980s, but those had previously been abandoned because they just didn’t produce usable results on the hardware available at the time. Once programmable consumer GPUs came around, that changed.

            Most of the other AI research that has been happening since the 1950s was a dead end, as it relied on hand-crafted feature detection, symbolic logic and the like written by humans, which, as the last 10 years have shown, performs substantially worse than techniques that learn directly from the data without a human in the loop.

            That’s the beauty of it. Most of this AI stuff is quite simple on the software side of things; all the magic happens in the data, which also means it can rapidly expand into any area where you have data available for training.

            You smug idiots are proud of yourselves for spotting a hand with an extra finger in an AI image, completely overlooking that three years of AI image generation just made 50 years of computer graphics research obsolete. And even ChatGPT is already capable of holding more insightful conversations than you AI haters are.

          • @lloram239@feddit.de · 1 · 2 years ago

            (it’s not, the underlying tech is much older than that).

            ChatGPT was released in Nov 2022. Plain GPT-1/2/3 had neither the chat interface nor the level of training data and fine-tuning that ChatGPT/GPT-3.5 had, and in turn they were much less capable. They literally couldn’t function the way ChatGPT does. Even the original Google paper this is all based on only goes back to 2017.

            LLMs are physically incapable

            Yeah, LLMs won’t ever improve, because technology improving has never happened before in history… The stupid in your argument hurts.

            Besides, GPT-4 can already handle 32,768 tokens; that’s enough for your average movie script, even without any special tricks (of which there are plenty). A rough token-count check is sketched below.
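            As a rough illustration, here is one way to check whether a script actually fits in a 32,768-token window, using OpenAI’s tiktoken tokenizer. The file name screenplay.txt is hypothetical, and the exact limit varies by model and API tier.

            ```python
            import tiktoken  # OpenAI's tokenizer library

            CONTEXT_WINDOW = 32_768          # the 32k context size mentioned above
            SCRIPT_PATH = "screenplay.txt"   # hypothetical file name

            enc = tiktoken.get_encoding("cl100k_base")  # encoding used by GPT-3.5/GPT-4 era models

            with open(SCRIPT_PATH, encoding="utf-8") as f:
                script = f.read()

            n_tokens = len(enc.encode(script))
            print(f"{n_tokens} tokens; fits in a {CONTEXT_WINDOW}-token window: {n_tokens <= CONTEXT_WINDOW}")
            ```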

      • danque · 4 · edited · 2 years ago

        Depends on the AI, though. With koboldcpp you can create memories for the AI to keep coming back to, and even text personalities (like bitchy and sassy responses) when using TavernAI together with Kobold (rough sketch below).
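        A hedged sketch of that memory idea against a locally running koboldcpp server: a persistent persona/memory block is simply prepended to every prompt so the model keeps returning to it. The endpoint URL, JSON fields, and persona text are assumptions based on a typical local setup, not a verified API reference; check your own instance’s docs.

        ```python
        import requests

        # Assumed default endpoint for a local koboldcpp server; verify against your instance.
        KOBOLD_URL = "http://localhost:5001/api/v1/generate"

        # Persistent "memory"/persona block, prepended to every prompt
        # (the sassy-personality idea mentioned above).
        MEMORY = "[Character: a sassy AI who answers in short, bitchy, sarcastic replies.]\n"

        def ask(user_message: str) -> str:
            prompt = MEMORY + f"User: {user_message}\nAI:"
            resp = requests.post(KOBOLD_URL, json={"prompt": prompt, "max_length": 120}, timeout=120)
            resp.raise_for_status()
            # Assumed KoboldAI-style response shape: {"results": [{"text": "..."}]}
            return resp.json()["results"][0]["text"]

        print(ask("What do you think of my screenplay?"))
        ```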

      • @jandar_fett@lemmy.world · 3 · 2 years ago

        This. You have to baby it, and if you want it to do something different you have to tell it a hundred times in a hundred different ways before it stops producing the same stuff with the same structure and slight differences. It is a nightmare.

      • Flying Squid · 2 · 2 years ago

        I agree, but at some point it will advance to the level where it can write boring, predictable scripts.

  • @homoludens@feddit.de · 48 · 2 years ago

    Bold of him to assume that companies would not just publish the trash - and that people would not watch it anyway.

      • @drislands@lemmy.world · 6 · 2 years ago

        100% correct. The IRC channel I hang out in has a bot utilizing ChatGPT and it does a summary of the most recent conversations when someone joins.

        Sometimes, it does a great job! It impresses me how well it’s able to summarize multiple ongoing conversations in a succinct way.

        …and oftentimes, it gets shit quite wrong. Not the actual topics, those it’s good at – but it is outright terrible at correctly indicating who actually said what.

        Granted, this is all to be expected – it’s an LLM, not really AI. (A rough sketch of this kind of bot is below.)
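        For context, a bot like the one described might work roughly like this sketch: flatten the recent backlog into one prompt and ask a chat model to summarize it. The model name, prompt wording, and helper shape are hypothetical; because the model only sees flattened text, mixing up who said what is a very plausible failure mode.

        ```python
        from openai import OpenAI

        client = OpenAI()  # reads OPENAI_API_KEY from the environment

        def summarize_backlog(lines: list[str]) -> str:
            """Summarize recent IRC lines such as ["<alice> hi", "<bob> hey"]."""
            backlog = "\n".join(lines)
            response = client.chat.completions.create(
                model="gpt-3.5-turbo",  # assumed model choice
                messages=[
                    {"role": "system", "content": "Summarize the ongoing IRC conversations, attributing points to the right nicks."},
                    {"role": "user", "content": backlog},
                ],
            )
            return response.choices[0].message.content
        ```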

      • @banneryear1868@lemmy.world · 1 · 2 years ago

        They wouldn’t have AI produce the whole show like that; it’s more like feeding it a context to create dialogue within set parameters.

          • MycoPete · 4 · edited · 2 years ago

            The Big Bang Theory ran for 12 years…

          • @banneryear1868@lemmy.world · 1 · 2 years ago

            Overall yeah, but there’s a thing about how the more niche and specialized the content is to a person’s interests, the more they’re willing to sacrifice on quality. So I think what will happen, because this will also reduce production costs so much (in theory), is that we’ll get these incredibly specific shows made for smaller and smaller target audiences. I’m hoping this ends up generating some hilarious content that just seems absurd to people who aren’t targeted. Instead of catering to universal human experiences it will be like, “a show centered around a support group for people who love black licorice, and the challenges they face in their relationships with people who hate black licorice.”

    • @banneryear1868@lemmy.world · 3 · 2 years ago

      Yeah, as if there isn’t already complete trash; presumably it will just be cheaper and easier to produce, so expect more ubiquitous and niche trash!

        • @kameecoding@lemmy.world · 7 · 2 years ago

          I mean, it’s literally never “computer bad” / “technology bad”; it’s always “humans bad, using this technology this time.”

      • @Pregnenolone@lemmy.world · 5 · 2 years ago

        I thought this season was way better than the season before it. I was glad they went with something different like the horror theme. The season prior was a shit show of boring tropes.

    • @ItsMeSpez@lemmy.world · 3 · 2 years ago

      I just recently started rewatching some of the older episodes and realized that “Be Right Back” was inadvertently an LLM episode. Having a computer absorb the online presence of a loved one so you can talk with them after they’ve passed is honestly something that seems within reach for these models.

  • @deafboy@lemmy.world · 39 · 2 years ago

    “I was frightened a second ago; now I’m bored because this is so derivative.” - Me, while watching some of the Black Mirror episodes, proudly made by fellow humans.

  • @thefloweracidic@lemmy.world · 25 · 2 years ago

    I’m not too worried about AI. Isn’t the next iteration of GPT closed source? Technology is made best as a research or passion project, but once profits become the focus everything goes downhill. That, and when you consider the global supply chain required to manufacture the chips that AI depends on, well, things aren’t looking too great in that department either.

    TL;DR: humans will shit all over the prospect of scary intelligent AI well before we get there.

    • @darth_helmet@sh.itjust.works · 15 · 2 years ago

      “Open”AI is entirely proprietary and closed-source.

      Meta’s Llama series is kind of open source: the weights are available, but the training data and full training recipe aren’t published, so the models can’t really be reproduced from scratch without a ton of manual effort.

      These and many other companies in the hype-space are using the same published research from a few years ago, which is why they have similar qualities.

  • @quindraco@lemm.ee · 21 · 2 years ago

    Did he not watch the latest season? Fuck, one episode was literally devoid of sci-fi entirely. The latest season only had one good episode in it.

      • @MeatsOfRage@lemmy.world · 19 · 2 years ago

        Eh, I never bought this. The show has always been wildly uneven. The first season was very strong with two great episodes, but the first episode is not great. The second season is pretty bad overall, with the exception of Be Right Back. White Christmas was a good one-off. The first Netflix season had some really strong showings with Nosedive, Shut Up and Dance, and my personal favorite of the whole series, San Junipero. I even thought season 4 was pretty decent overall. Also, Bandersnatch was actually pretty cool, especially if you really dug into all the paths.

        The last two seasons, 5 and 6, were pretty bad overall, with only one good episode each and some particularly bad ones. But honestly, Charlie is probably just running out of ideas and I can’t really blame him at this point. I suspect he’s just trying different things. Sometimes big swings work and sometimes you just get Mazey Day.

        • @linearchaos@lemmy.world · 5 · edited · 2 years ago

          I think he’s trying to steal the lightning from the episodes that resonated well with audiences to make new episodes, and it’s just not hitting quite as hard.

          The old formula was Amazing Stories meets The Twilight Zone meets Tales from the Crypt, where 8 out of 10 times the protagonist gets right fucked for just trying to be a good person.

          It feels to me like the takeaway from San Junipero was that people like happy endings, so they’re trying to apply that to everything.

          The reason you got Mazey Day is that he got access to Miley Cyrus and had to write something she would be happy with.

          • @lloram239@feddit.de · 3 · 2 years ago

            The biggest problem is just that they ran out of ideas; they have recycled the cookie brain-upload thing over and over again. The first few seasons were far more creative and covered a broad spectrum of technological and social issues without relying on a single gimmick.

        • @dangblingus@lemmy.world · 2 · 2 years ago

          Nah. Even in season 3 there’s a distinct lack of the “okay, this is supremely fucked up” feeling that you got in series 1 and 2. There are some fantastic episodes in the first couple of Netflix seasons, but they dialed back the dread factor in favor of exploring new sci-fi concepts. Not complaining, but the OG Channel 4 episodes are way darker.

        • @Cheesus@lemmy.world · 1 · 2 years ago

          It comes down to what you like about the show. Do you like the darker sides of technology that raise questions? Seasons 1 and 2 are great. Do you like sci-fi happy endings with better production value? The Netflix seasons are better.

        • @aesthelete@lemmy.world · 1 · 2 years ago

          my personal favorite of the whole series San Junipero.

          Me too. It’s literally the only episode I rewatched. 🙌

      • kratoz29 · 3 · 2 years ago

        I thought it was a Netflix original 😂

        Where did this TV show air then, if not Netflix?

  • @egeres@lemmy.world · 11 · edited · 2 years ago

    Maybe the 5th episode of the 6th season was written by an AI and they were playing some 4D chess game with our minds all along, because otherwise I wonder how such fucking trash got the green light to be produced 🤗

    Edit: Typo

    • @gronjo45@lemm.ee · 1 · 2 years ago

      This. I think the only one I really thought was good was the Aaron Paul one where they went into space… I might be somewhat neo-Ludditish, but that episode shows some true terrors of those who want to eradicate technologies and the individuals associated with them. Cold ending…

      • @egeres@lemmy.world · 3 · 2 years ago

        By far my favorite episode of this season. It felt like a refreshing 50s sci-fi comic, like reading something new from Asimov. The retro aesthetic was a nice artistic decision, a reminder that tech doesn’t have to be super advanced to tell a good story. On top of this, they subverted my expectations at least three times:

        spoiler

        First, when the guy who draws sees Aaron’s wife, I immediately thought the story would be that she cheats on him and they both play mind games on Aaron, who eventually loses his family. But no, she does feel something for the other guy but, to my surprise, never cheats on Aaron.

        spoiler

        When the guy started to paint the house I thought, “of course, he paints the wife naked because they have sex, and then Aaron discovers this.” Indeed that happens, but interestingly, not because the other astronaut had sex with Aaron’s wife.

        spoiler

        By the end it was veeery clear to me that the other guy would either kill Aaron or trap him in some way to take control of him and live his life. It was obvious to me that the other astronaut was going to eject Aaron from the ship and cast him away into space, then report that Aaron had gone missing. I was extreeemely confident about this in the scene where the door takes a long time to open, but no. Actually, yes, the other guy does fuck Aaron over, but by killing his family so that he learns to value what he has. I found that quite unexpected and interesting.

        What I didn’t get from the episode was what they were trying to say with the child; I’m not sure what that was meant to communicate.

  • @psmgx@lemmy.world · 5 · 2 years ago

    It’s only producing trash now. Already there is a decent jump in quality from GPT-3 to 4, and it’s only gonna get better.

    Plus it can do a lot of heavy lifting – tell it to make 20 scripts with different prompts and then a single writer or team can whittle them down (a rough sketch of that loop is below). That’s how a lot of scripts end up in production anyways, but now you ain’t gotta deal with writers and can make rapid, drastic changes.
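    A rough sketch of that “generate 20, whittle them down” workflow. The premise, style angles, model name, and file naming are made up for illustration; a real pipeline would add outlines, revision passes, and human review.

    ```python
    from openai import OpenAI

    client = OpenAI()  # reads OPENAI_API_KEY from the environment
    PREMISE = "A standalone thriller about a small town that loses all connectivity overnight."
    ANGLES = ["character drama", "dark comedy", "slow-burn horror", "procedural mystery"]

    for i, angle in enumerate(ANGLES * 5, start=1):  # 20 drafts total
        response = client.chat.completions.create(
            model="gpt-4",
            temperature=1.0,  # more randomness, more varied drafts to whittle down
            messages=[{"role": "user", "content": f"Write a one-page treatment: {PREMISE} Style: {angle}."}],
        )
        with open(f"draft_{i:02d}.txt", "w", encoding="utf-8") as f:
            f.write(response.choices[0].message.content)
    ```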

    • Monkeytennis · 4 · 2 years ago

      I also find the “just look how bad the hands are heh heh heh” thing so dumb … it’s going to learn how to draw hands pretty quickly

    • @Evotech@lemmy.world · 2 · 2 years ago

      The problem at the moment is that ChatGPT has pretty terrible memory. It couldn’t write a coherent show if it wanted to.

  • @ILikeBoobies@lemmy.ca · 0 · 2 years ago

    Most of telly is trash already. If it’s cheap enough for entry, it can saturate the market and there will be no need for the expensive “good” writers.

  • gregorum · -6 · 2 years ago

    Didn’t he just make an episode of Black Mirror depicting the opposite?