ChatGPT AI

Jasker

brown please <Wow Guild Officer> /brown please
1,528
944
When will ChatGPT be integrated with real dolls? How far away from that are we?
 

Edaw

Parody
<Gold Donor>
12,381
80,341
When will ChatGPT be integrated with real dolls? How far away from that are we?

Already there.


[Screenshot: "Buy Real Doll Robot – Buy Silicone Love Sex Dolls Online"]


 

Mist

Eeyore Enthusiast
<Gold Donor>
30,490
22,403
Add this to the list of reasons why you don't use generative AI for customer-facing platforms:

[attached screenshot]



IT DOESN'T EVEN WORK AS A LIMERICK.
 

Bandwagon

Kolohe
<Silver Donator>
22,921
60,106
Add this to the list of reasons why you don't use generative AI for customer-facing platforms:

[attached screenshot]


IT DOESN'T EVEN WORK AS A LIMERICK.
I like to imagine you telling this story while we're lying in bed, as you twirl a finger through my unapologetically masculine chest hair. Then I put you in a half nelson and whisper "I'm going to make you forget dildos exist" before tying you up for 3 hours while I watch Shark Tank and eat pistachios in bed.
 

ToeMissile

Pronouns: zie/zhem/zer
<Gold Donor>
2,779
1,693
I like to imagine you telling this story while we're lying in bed, as you twirl a finger through my unapologetically masculine chest hair. Then I put you in a half nelson and whisper "I'm going to make you forget dildos exist" before tying you up for 3 hours while I watch Shark Tank and eat pistachios in bed.
And then?
 

Big Phoenix

Pronouns: zie/zhem/zer
<Gold Donor>
44,858
93,782
AI ouroboros:

 

Pasteton

Blackwing Lair Raider
2,616
1,729
How would I achieve 300 GB of RAM at home if I wanted to try this? Just a buncha video cards chained together, or is there another option?
 

Mist

Eeyore Enthusiast
<Gold Donor>
30,490
22,403
Jesus Christ, wtf is being loaded into memory that can't be paged/spooled/cached/whatever?
A neural network.
How would I achieve 300 GB of RAM at home if I wanted to try this? Just a buncha video cards chained together, or is there another option?
You're not going to be able to. Period.

If you're interested in running an LLM locally, and have an RTX card, you can do this:


Or you can go on huggingface and download one of the models designed to run locally.

Grok is a very, very large model for how relatively dumb it is:

[attached screenshot]
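The "you're not going to be able to" is just arithmetic: the weights alone dominate memory. A back-of-envelope sketch (assuming Grok-1's widely reported ~314B parameters; KV cache and activations would add more on top):

```python
def weight_memory_gb(n_params: float, bits_per_param: int) -> float:
    """Memory needed just to hold the model weights, in GB.

    Ignores KV cache, activations, and framework overhead, so this
    is a lower bound on what you'd actually need.
    """
    return n_params * bits_per_param / 8 / 1e9

# Grok-1: ~314B parameters (reported figure, used here for illustration).
for bits in (16, 8, 4):
    print(f"Grok at {bits}-bit: {weight_memory_gb(314e9, bits):,.0f} GB")

# A 7B model quantized to 4-bit, by contrast, fits on one RTX card:
print(f"7B at 4-bit: {weight_memory_gb(7e9, 4):.1f} GB")
```

Even aggressively quantized to 4-bit, a 314B-parameter model needs on the order of 150+ GB for weights alone, which is why the locally runnable Hugging Face models are in the single-digit-billions range.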
 

Bandwagon

Kolohe
<Silver Donator>
22,921
60,106
A neural network.

You're not going to be able to. Period.

If you're interested in running an LLM locally, and have an RTX card, you can do this:


Or you can go on huggingface and download one of the models designed to run locally.

Grok is a very, very large model for how relatively dumb it is:

[attached screenshot]
Do you think Grok's intelligence score is lower because it thinks there's only two genders and that white men are not the single greatest threat facing America?