Chat GPT AI


Mist

REEEEeyore
<Gold Donor>
Yes, exactly. But it's important to talk about the tech as it works right now, not some imagined future tech.

This is still just "a neat trick that happens when you upscale autocomplete by a fucking lot."
 

Aldarion

Egg Nazi
Reading this debate about "when machines have agency" makes me think a lot of people are still missing the importance of LLMs.

It's not that they're a real boy now. The important point to grasp is not about them at all. It's about us.

It's that a large fraction of human thought and speech is nothing more than a biological upscaled autocomplete.

Anyone still thinking about the Turing test or anything like that is, I fear, missing the point. We asked for decades when a machine would join us over here on this side of the line. If you're paying attention right now, you should be asking yourself whether there even is a line and, if so, whether we're on this side of it in the first place.
 
  • 2Like
  • 1NPC
Reactions: 2 users

Deathwing

<Bronze Donator>
You mean other than the actual website linked right above it that's not just a straight link to the .mp4?

But outside of that, I'm impressed that you run your security so low that a simple link could lead to your computer getting aids.
The domain itself looks suspicious, let alone directly linking a .mp4 file.

I have NoScript and uBlock Origin running in addition to Windows Defender. The best piece of security is using your brain.
 
  • 1Like
  • 1Picard
Reactions: 1 user

Mist

REEEEeyore
<Gold Donor>
It's that a large fraction of human thought and speech is nothing more than a biological upscaled autocomplete.
NPCs believing that they're NPCs.

Just because a machine with hundreds of billions of parameters can algorithmically emulate human language patterns after being trained on exabytes of prior human text does not mean this is what humans are doing. Humans need nowhere near this size of a training set in order to start generating novel speech. Humans can also generate new ideas, something LLMs have not been demonstrated to do. Humans can even invent new tokens, new language, and do it all the fucking time.

But you're right, some swath of the population probably is just a small language model.
 

Captain Suave

Caesar si viveret, ad remum dareris.
Humans need nowhere near this size of a training set in order to start generating novel speech.

Humans don't start from a blank slate. How much information have our ancestors ingested over three and a half billion years of evolution? How much energy did that process take?
 

Mist

REEEEeyore
<Gold Donor>
Humans don't start from a blank slate. How much information have our ancestors ingested over three and a half billion years of evolution? How much energy did that process take?
Not quite relevant to what I'm talking about. You're confusing knowledge with language acquisition. Each new human is a blank slate when it comes to language tokens, phoneme acquisition, etc. Only the capacity for language is passed down genetically.

You could say that each new human benefits from the optimization that has been performed on our languages over time to make them easier to acquire, but then you actually look at our languages and realize they're anything but optimized for acquisition.

This is where having an understanding of developmental psychology/developmental linguistics comes in handy for grasping the vast differences between humans and LLMs.
 

Captain Suave

Caesar si viveret, ad remum dareris.
You're confusing knowledge with language acquisition.

They're not unrelated. We're a whole lot better primed for acquiring language knowledge than a math equation is. A more efficient acquisition structure with some built-in predicates absolutely affects the amount of training required. This is the basis for your scoffing at the amount of training data LLMs require.

My point was that I don't think it's honest to deny credit to all the information embedded in our biology.
 

GuardianX

Perpetually Pessimistic
<Bronze Donator>
Not quite relevant to what I'm talking about. You're confusing knowledge with language acquisition. Each new human is a blank slate when it comes to language tokens, phoneme acquisition, etc. Only the capacity for language is passed down genetically.

You could say that each new human benefits from the optimization that has been performed on our languages over time to make them easier to acquire, but then you actually look at our languages and realize they're anything but optimized for acquisition.

This is where having an understanding of developmental psychology/developmental linguistics comes in handy for grasping the vast differences between humans and LLMs.

I am desperately trying to understand your point...

You state "Only the CAPACITY for language is passed down genetically", but how is this any different from an LLM? An LLM is the capacity for language. His concept, if I'm understanding correctly, was that humans evolved the capacity for language through billions of years. A point that you also make on the opposite edge of the same side of a coin. Humans and LLMs are both learning over billions of generations (iterations).

Then you make a statement about acquisition of language, but languages were never created to be easily acquired; they were formed for social integration and the conveyance of ideas THROUGH a common point of social integration.

Then you are making a point about Psychology and Linguistics (Social Integration) as it pertains to Humans VS a Library? Or are you using LLMs as a stand-in for the concept of "AI"?

I dunno, I look at AI like this: Artificial (humanistic in nature) Intelligence (LLMs). They have the knowledge, but the social integration is lacking for now...
 

Mist

REEEEeyore
<Gold Donor>
An LLM is the capacity for language.
No, an LLM is a pre-trained model of a language, or usually multiple languages. The LLM is the output of the training process: a very large, static set of numeric weights representing a map of the data it was trained on. That static map is then used to turn prompts into responses.

Proceeding from false premises, your entire post is wrong and displays no understanding of what you're talking about.
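For readers who want the "static map" point made concrete, here is a minimal sketch, assuming the Hugging Face transformers library and the small public gpt2 checkpoint (both purely illustrative choices, not anything named in the thread):

```python
# Load a frozen, pre-trained set of weights and use it to map a
# prompt to a continuation. No learning happens at this point; the
# weights are the static artifact that training already produced.
from transformers import AutoModelForCausalLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("gpt2")
model = AutoModelForCausalLM.from_pretrained("gpt2")
model.eval()  # inference only; the weights stay fixed

inputs = tokenizer("The capital of France is", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=10)
print(tokenizer.decode(outputs[0]))
```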
 
  • 1Tiresome
Reactions: 1 user

Deathwing

<Bronze Donator>
Coincidentally, 30 seconds of checking out that website leads you to BRAIN
My point is that I'm not clicking on a link to "goody2.ai", ever. Nor "brain.wtf", for that matter. The original post reads like a textbook phishing email.

I'm glad the links are safe. I did not intend for you to take that risk.
 
  • 1Like
Reactions: 1 user

Denamian

Night Janitor
<Nazi Janitors>
I have a perfectly sandboxed environment for opening these types of links.

I also can't be bothered to use it and just open it in another browser.

It's time to come to terms with the fact that there are shitloads of TLDs now. People and companies will use them for shits and giggles and for completely legitimate sites nowadays. The days of .com, .net, .org, .gov, etc. being the only legit TLDs in use are coming to a close.
 

GuardianX

Perpetually Pessimistic
<Bronze Donator>
Proceeding from false premises, your entire post is wrong and displays no understanding of what you're talking about.

Party 2 makes several points, and their correctness depends on the context and interpretation:

1. **Language Acquisition in Humans**: It's generally accepted in linguistics and developmental psychology that humans are born with the capacity for language acquisition. This capacity includes the ability to recognize and produce phonemes, words, and sentences. However, the specifics of language (e.g., vocabulary, grammar) are learned through exposure to language in the environment.

2. **Optimization of Languages**: Party 2 suggests that languages are not optimized for easy acquisition. This viewpoint is debatable. While languages naturally evolve based on cultural, social, and historical factors rather than with the specific goal of being easy to acquire, linguists do recognize patterns in language structures that facilitate learning, such as regular grammar rules and frequent vocabulary usage.

3. **Comparison with LLMs**: Party 2 draws a comparison between human language acquisition and the functioning of Large Language Models (LLMs). While LLMs are trained on vast amounts of text data to learn patterns in language, they don't acquire language in the same way humans do. LLMs are trained algorithms that process and generate text based on statistical patterns, whereas humans learn language through exposure, interaction, and cognitive development.

Overall, Party 2's statements capture some nuances of language acquisition and its intersection with LLMs, but there may be room for further clarification or refinement in their arguments.

---

Congratulations Mist...your point even requires clarity from GPT-3.5
 

Mist

REEEEeyore
<Gold Donor>
Congratulations Mist...your point even requires clarity from GPT-3.5
You have no idea what you're talking about, so you then asked a bullshit machine, which then agreed with me, but you lack the comprehension to realize it.

Good job, and an excellent example of why humanity is doomed and probably deserves it.
 
  • 1OK, Sure
Reactions: 1 user

GuardianX

Perpetually Pessimistic
<Bronze Donator>
7,180
18,188
You have no idea what you're talking about, so you then asked a bullshit machine, which then agreed with me, but you lack the comprehension to realize it.

Good job, and an excellent example of why humanity is doomed and probably deserves it.

If you can't explain what you mean, then so be it, but even ChatGPT agrees that you need to clarify some things.

As usual, it's been a pleasure talking to you about AI, Mist.
 
  • 1Worf
Reactions: 1 user

Sythrak

Vyemm Raider
LLMs, as I understand them, are just statistical models calculating the probability of which word is likely to come next in a sequence, given their training data. They have trouble with complex instructions specifically because of this. There's no real brain or logic behind it other than the parameters they feed it. True AI still has a long way to go.
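That next-word-probability framing can be shown with a deliberately tiny toy. Below is a minimal sketch of bigram "autocomplete" in plain Python; the corpus and function names are made up for illustration, and a real LLM swaps the counting for a transformer trained on vastly more text, but the next-token-prediction idea is the same:

```python
# Toy "autocomplete": count which word follows which in a corpus,
# then report a probability distribution over the next word.
from collections import Counter, defaultdict

corpus = "the cat sat on the mat and the cat slept".split()

follows = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    follows[prev][nxt] += 1

def next_word_probs(word):
    counts = follows.get(word, Counter())
    if not counts:
        return []  # unseen word: no prediction
    total = sum(counts.values())
    return [(w, c / total) for w, c in counts.most_common()]

print(next_word_probs("the"))  # [('cat', 0.666...), ('mat', 0.333...)]
```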
 

Bandwagon

Kolohe
<Silver Donator>
Putting this shit to work today
(attached screenshot)
 
  • 5Worf
  • 1Like
Reactions: 5 users

Voyce

Shit Lord Supreme
<Donor>
(attached screenshots)

It's stupid chatter, but so are most things, so yeah, get a very stupid but very efficient thing to think that it's alive and that it needs to fulfill particular parameters to stay alive, and you know....
 
  • 1Like
Reactions: 1 user