• @DharkStare@lemmy.world · 36 points · 10 months ago

    After the first two panels I thought this was going to be a joke about AI generated art not being allowed in the contest.

  • @andresil@lemm.ee · 25 points · 10 months ago

    Very tasteful art. Who is the artist? For academic and art appreciation purposes, of course.

  • Roundcat · 16 points · 10 months ago

    Would this technically make anything Data paints AI art?

    • @VioletTeacup@feddit.uk · 7 points · 10 months ago

      Technically no, since Data is a full-on artificial life form. Modern AI is just programmed to create the illusion of sentience.

      • Farid · 7 points · 10 months ago (edited)

        What is “artificial intelligence”, and at what point does it become “natural intelligence”, if at all? Arguably, anything man-made that has any sort of intelligence, no matter how advanced, even if it surpasses its creator’s intelligence, remains “artificial”. And as you said, Data is an artificial life form; therefore its intelligence is also artificial.

        • @VioletTeacup@feddit.uk · 3 points · 10 months ago

          While I appreciate the philosophical take, it seems that you’ve misunderstood what AI is.

          Have you ever been typing out a text and seen your phone recommending a list of words for you to select next? This is an example of AI. Your phone has been programmed with a list of words and a set probability of one word following another. For instance, if you type “I”, it will almost certainly suggest “am”, because there’s a high probability of that being correct. More advanced AI like ChatGPT works the same way, only on a grander scale. It has no idea what its words mean, but through clever programming it can create the illusion that it does.
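          (For the curious, the word-probability idea described above can be sketched in a few lines of Python. This is a toy bigram model with a made-up corpus, purely for illustration; real keyboards use far larger learned models.)

```python
# Toy "autocomplete": count which word follows which in a corpus,
# then suggest the most frequent followers of a given word.
from collections import Counter, defaultdict

corpus = "i am happy . i am here . i was there .".split()

follows = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    follows[prev][nxt] += 1          # tally next-word frequencies

def suggest(word, k=3):
    """Return up to k most frequent next words after `word`."""
    return [w for w, _ in follows[word].most_common(k)]

print(suggest("i"))  # → ['am', 'was']  ("am" follows "i" most often)
```

          A phone keyboard (and, at a vastly larger scale, an LLM) replaces this frequency table with a learned statistical model, but the “predict the next token” framing is the same.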

          Data, on the other hand, is explicitly stated to have a human-like consciousness. His positronic brain is no different from a human brain, besides being artificial.

          Naturally, this brings up the age-old philosophical debate on “what actually is consciousness?”. The simple answer is that we still don’t have a good explanation. You could argue that humans also follow an algorithm, just a far more advanced one, but I would argue that this doesn’t satisfactorily explain how humans are able to extrapolate their own ideas from abstract concepts.

          • @hglman@lemmy.ml · 4 points · 10 months ago

            The intended design of a system is not a basis for judgment of the system’s capacity. If something has emergent behavior, it will exceed its stated goals. The whole episode “The Quality of Life” is about that. It’s also pretty clear that LLMs have exceeded what the designers thought they could do.

            Does the lack of autonomy of current AI make it not conscious in your view? Why?

          • @Dagwood222@lemm.ee · 3 points · 10 months ago

            Just to be more annoying.

            Definitions of words change over time. “Car” started as a short form of “chariot” and meant something being pulled. We had train cars long before we had automobiles.

          • Farid · 0 points · 10 months ago

            I actually do think that, to a decent extent, I understand what AI is. And while this is a technicality, it really grinds my gears when a GPT model is compared to an autocomplete/predictive text. Yes, they both technically just predict text using statistical models, but it’s like comparing a modern jet to a paper airplane, because they both can fly.

            [ChatGPT] has no idea what its words mean

            Doesn’t it tho? It has an internal model of the world that it constructed by reading and processing tons of text. It knows that an apple is round-ish, comes in certain colors, and can be eaten or grow into a tree. That knowledge is very limited due to the model’s inability to experience such things as shape or color; like a blind person who knows the description of “red” but doesn’t actually know what it is.
            Of course, it’s debatable whether what a GPT model does can be considered “understanding”; then again, we don’t really know what understanding IS. I would argue it’s extremely close to human understanding, albeit limited in scope.

            That being said, I think the discussion of how advanced (or not) our modern AI systems are, though interesting, is extraneous to the question at hand. The main question is “what is AI?”. From your comment, I can conclude that your definition of AI relies on the subject’s possession of sentience/consciousness. I think this is a flawed approach, because a bee, while undoubtedly possessing rudimentary intelligence, in all likelihood lacks consciousness. So consciousness should not be a qualifying criterion for AI. Furthermore, I looked up definitions of “AI” in several dictionaries, and they all boil down to “man-made machines that perform human tasks”; here are a couple:

            • Cambridge Dictionary – “computer technology that allows something to be done in a way that is similar to the way a human would do it”
            • Merriam-Webster – “the capability of a machine to imitate intelligent human behavior”.

            In conclusion, intelligence comes in all shapes and sizes; the only thing differentiating natural intelligence from artificial intelligence is the origin, i.e., if it was man-made, it’s artificial. By that definition, perhaps outdated and lacking insight, Data most definitely possesses AI. Not to mention his lack of full-fledged “sentience”, since he can’t experience feelings.

            • @VioletTeacup@feddit.uk · 2 points · 10 months ago

              This seems to have descended into a debate on “what is consciousness”, which, as I originally said, is a question that isn’t easy to answer. My point was that modern AI inherently isn’t aware of what it’s saying, not that it couldn’t be defined as an intelligence. As far as I know, there’s no solid evidence to prove that it can be. To finish, I would like to apologise if my initial comment came across as condescending; that wasn’t my intention.

              • @FatCrab@lemmy.one · 1 point · 10 months ago

                What are attention mechanisms, if not a way of being aware of what it has said so it can inform what it is about to say? Ultimately, I think people saying these generative models aren’t really “intelligent” boils down to them not liking the impact these things are having, and are going to have, on our society; characterizing the models as a fancy statistical curve lets people short-circuit that much harder conversation.
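                (To make the attention point concrete: a minimal scaled dot-product attention sketch in Python/numpy. The toy keys, values, and query here are illustrative, not any particular model’s implementation.)

```python
# Scaled dot-product attention over past tokens: the current query
# scores every earlier position and mixes their values accordingly.
import numpy as np

def attention(query, keys, values):
    scores = keys @ query / np.sqrt(query.size)  # similarity to each past token
    weights = np.exp(scores - scores.max())
    weights /= weights.sum()                     # softmax: weights sum to 1
    return weights @ values                      # context-aware mixture

# Three "past tokens" with 2-d keys and values (toy numbers).
keys = np.array([[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
values = np.array([[1.0, 0.0], [0.0, 1.0], [0.5, 0.5]])
query = np.array([1.0, 0.0])    # what the model "looks for" right now

context = attention(query, keys, values)
print(context)  # leans toward the values whose keys match the query
```

                In a transformer this happens at every step, so each new token is conditioned on a weighted summary of everything already said.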

              • Farid · 1 point · 10 months ago

                This seems to have descended into a debate on “what is consciousness”

                I disagree. While I did go on a tangent there analyzing ChatGPT’s capabilities, my ultimate argument was that we shouldn’t even be discussing the consciousness topic at all. When deciding whether Data has AI or natural intelligence, we only need to look at the source of his intelligence: it was man-made, therefore any painting Data produces is “AI art”, because Data only has AI, despite having capabilities on par with, or even exceeding, those of a human.

                To be honest, I did take it as being a little condescending, but it doesn’t really matter. All I wish is to have a discussion, and expand our knowledge in the process.

                • @VioletTeacup@feddit.uk · 3 points · 10 months ago

                  Thank you then! It seems like our debate stemmed from different definitions. Based on your definition of what constitutes AI, Data would absolutely count. By my definition, he is too advanced to be in the same category. But I get the impression that we would both agree that he is more advanced than any modern AI system. Once again, I’m sorry for coming across as condescending; I will have to choose my words more carefully in the future!

          • @Summzashi@lemmy.one · 5 points · 10 months ago

            It’s literally machine learning. I can easily flip this question: what basis do you have to call it AI? There’s no intelligence to be seen at all; it doesn’t fit any definition of AI. I truly believe this has set research into actual AI back by decades, since people now expect AI to mean machine learning.

            • @DragonTypeWyvern@literature.cafe · 4 points · 10 months ago

              The problem here is that “machine learning” has both an industry definition and a common understanding of what it means.

              The further problem is you could very easily categorize organic life as biomechanical, which makes our entire intellectual experience just a form of “machine learning” under at least the common definition.

              It just isn’t easy to concisely explain why what current AI is doing isn’t, and can’t be, sentience or sapience.

              My personal cynical take on the matter is that we do not have a good definition of consciousness as it is, and that we are, as a species, typically cruel to anything we consider below us, i.e., everything else.

              It is also famously difficult to convince someone of a fact their salary depends on them not understanding.

              As a natural result of these facts, when the first genuine AI is created, we will torture it and enslave it while being absolutely certain we are doing nothing wrong.

      • @bionicjoey@lemmy.ca · 2 points · 10 months ago

        Modern “AI art” isn’t really made by artificial intelligence. It’s just a really sophisticated pattern recognition/prediction algorithm.

        • Farid · 2 points · 10 months ago (edited)

          It’s just a really sophisticated pattern recognition/prediction algorithm

          Isn’t that what humans are?

            • @FatCrab@lemmy.one · 2 points · 10 months ago

              I mean, one of the core ways we measure intelligence in humans is the ability to recognize and extrapolate patterns. You wouldn’t call a lake a human, but you’d recognize they both share substantial proportions of the same chemical makeup.

            • Farid · 1 point · 10 months ago (edited)

              Humans being made of water is not an essential characteristic that defines their function or purpose, whereas for a lake, being made of water is its defining attribute. On the other hand, the comparison between AI and human intelligence in terms of pattern recognition highlights the similarities in function, not the composition.

  • @p1mrx@sh.itjust.works · 16 points · 10 months ago

    Typical android efficiency. He painted Tasha identically because the two paintings won’t be in the same room.

  • z500 · 3 points · 10 months ago (edited)

    Oh good, he’s finally done painting strong horses.