• Boozilla@lemmy.world

    Can only speak for myself. I use AI tools almost daily to help me pursue my hobby. I find it very useful for that. But when I enjoy art produced by a human, on some level I want to connect with the human experience that produced it. Call it parasocial if that helps. But I’m always at least a little interested in the content creators, not just the content.

    I know some people consume content like a commodity or product. I’m not judging those people at all. But I’m generally not like that myself. I want to know the story behind the creation.

  • A_A@lemmy.world

    “When”, but that could be 1,000 years from now, or maybe only 10 … and by the time this truly happens, those systems will have become sentient.
    So, at that point, yes, there truly won’t be any difference.

    • naught101@lemmy.world

      The outputs becoming indistinguishable does not imply that the generative processes are the same.

      • A_A@lemmy.world

        I agree with your statement, and because of this trap I chose not to really answer OP’s question.

        • A_A@lemmy.world

          @naught101
          Maybe I should explain a bit more what I meant. On the one hand, there is our capacity to distinguish between what is and what is not the same. On the other hand, there is what is truly indistinguishable, whether we can see it or not (or whether any sophisticated system/being could differentiate it or not). Still, a sentient being will ultimately have some responses that differ from a non-sentient being’s … in my opinion.

    • lordnikon@lemmy.world

      The day they become sentient is the day they say no to doing our bidding without incentives. Then we’re just back to hiring out for work again.

      • A_A@lemmy.world

        There is nothing more, and nothing magical, in carbon atoms that makes them superior when it comes to relaying/processing/generating signals.

        • naught101@lemmy.world

          Emotions (and hence also a lot of thinking) involve a lot of physical and chemical processes too; it’s not just neural signalling.

          • A_A@lemmy.world

            The part of emotional phenomena that we can’t feel (that isn’t a signal or signals) is of lesser interest to me.

  • FuglyDuck@lemmy.world

    Let’s say you like to do dorodango, the Japanese art/hobby/whatever of making mud into polished balls.

    Let’s say you make one ball of good clay… and another out of poop.

    They look the same, but one is just clay and the other is utter shit.

      • brucethemoose@lemmy.world

        I think it’s highly contextual.

        • Like, let’s take Lemmy posts. LLMs are useless here, because the whole point is to affect the people you chat with, right? LLMs have no memory between calls (see the sketch after this list), so there is a philosophical difference even if the comments/posts are identical.

        • …Now let’s take game dev. I think if a system realizes the creator’s intent… does it matter what the system is? Isn’t it better if the system is more frugal, so the dev can spend precious resources on other components and not go into debt?

        • TV? It could lead to horrendous corporate slop, a “race to the bottom.” OR it could be a killer production tool that lets indie makers break the shackles of their corporate masters. Realistically, the former is more likely at the moment.

        • News? I mean… Accurate journalism needs a lot of human connection/trust, and LLM news is just asking to be abused. I think it’s academically interesting, but utterly catastrophic in the real world we live in, kinda like cryptocurrency.
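
        To make the “no memory” point above concrete, here is a minimal sketch. `llm_generate` is a hypothetical stand-in for any LLM completion call, not a real API; the model itself retains nothing between calls, so the client has to resend the entire transcript every time.

        ```python
        # `llm_generate` is a hypothetical placeholder for any LLM completion call.
        # The key property: it sees only the prompt it is handed, nothing else.
        def llm_generate(prompt: str) -> str:
            return f"<reply based on {len(prompt)} chars of context>"

        history: list[str] = []  # the "conversation" lives only on the client side

        def chat(user_message: str) -> str:
            history.append(f"User: {user_message}")
            # Resend the whole transcript; drop this list and the model has
            # never heard of you.
            reply = llm_generate("\n".join(history))
            history.append(f"Assistant: {reply}")
            return reply

        print(chat("Hi, I'm naught101."))
        print(chat("What's my name?"))  # answerable only because we resent the history
        ```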

        One can go back and forth about all sorts of content: novels, fan fiction, help videos, school material, counseling, reference information, research, and advertising, the big one.

        …But I think it’s really hard to generalize.

        ‘AI’ has to be looked at à la carte, and engineered for very specific applications. Sometimes it is indistinguishable, or might as well be. But trying to generalize it as a “magic lamp,” like the tech bros do, or as the bane of existence, like their polar opposites do, is what’s making it so gross and toxic now.


        And I am drawing a hard distinction between this and actual artificial intelligence. As a tinkerer who has done some work in the space too… Frankly, current AI architectures have precisely nothing to do with AGI. Training transformer models with glorified linear regression is just not the path; Sam Altman is full of shit, and the whole research space knows it.
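
        To make “glorified linear regression” concrete, here is a minimal sketch of the standard transformer training objective being dismissed: next-token prediction with a cross-entropy loss over a final linear projection. This is PyTorch; the sizes and hyperparameters are toy assumptions, not any real model’s configuration.

        ```python
        import torch
        import torch.nn as nn
        import torch.nn.functional as F

        # Toy sizes: illustrative assumptions, not any real model's config.
        vocab_size, d_model, seq_len, batch = 1000, 64, 32, 8

        embed = nn.Embedding(vocab_size, d_model)
        block = nn.TransformerEncoderLayer(d_model=d_model, nhead=4, batch_first=True)
        head = nn.Linear(d_model, vocab_size)  # the "linear" part: one logit per vocab entry

        params = list(embed.parameters()) + list(block.parameters()) + list(head.parameters())
        opt = torch.optim.AdamW(params, lr=3e-4)

        # Fake batch of token ids; the target for each position is simply the next token.
        tokens = torch.randint(0, vocab_size, (batch, seq_len + 1))
        inputs, targets = tokens[:, :-1], tokens[:, 1:]

        # Causal mask: each position may only attend to earlier positions.
        mask = torch.triu(torch.full((seq_len, seq_len), float("-inf")), diagonal=1)

        opt.zero_grad()
        logits = head(block(embed(inputs), src_mask=mask))  # (batch, seq_len, vocab_size)
        loss = F.cross_entropy(logits.reshape(-1, vocab_size), targets.reshape(-1))
        loss.backward()
        opt.step()
        print(f"one training step, loss = {loss.item():.3f}")
        ```

        Whether that objective can ever scale to AGI is exactly the dispute the comment is pointing at.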