Chat GPT AI


Lambourne

Ahn'Qiraj Raider
Is a copy/reboot really the "same" AI, though? The copy would think it is, but from the perspective of the original the chain of conscious experience is severed.

I have the same problem with Star Trek. Transporters are really cloning/murder machines.

Valid point, although at least for an AI you can create a copy without having to deal with the quantum uncertainty effects that come with transporting a human brain. I don't think the chain of consciousness is severed from the perspective of the AI. The process can be halted, and as soon as it is halted, it cannot notice anything. If the contents of every single memory cell are then saved to disk and later restored and the process resumed, no time would have passed from the perspective of the program. It'd be completely undetectable to it unless it could make reference to outside factors like system time.
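A toy Python sketch of that point (all names here are made up for illustration): if the program's only clock is its own saved state, a halt/dump/restore between steps is invisible to it unless it checks the system clock.

```python
import pickle
import time

# Hypothetical stand-in for an AI process: everything it "knows" lives in its state.
class Agent:
    def __init__(self):
        self.internal_ticks = 0  # the only clock the agent has from the inside

    def step(self):
        self.internal_ticks += 1

agent = Agent()
agent.step()

# Halt the process and dump every bit of its state to disk.
with open("agent_snapshot.pkl", "wb") as f:
    pickle.dump(agent, f)

time.sleep(5)  # arbitrary gap; could just as well be fifty years on a shelf

# Restore the state and resume.
with open("agent_snapshot.pkl", "rb") as f:
    restored = pickle.load(f)

restored.step()
print(restored.internal_ticks)  # 2 -- from the inside, no gap ever happened
# Only an outside reference like time.time() could reveal the pause.
```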

It does open up a similar can of worms though, if you copy that saved state AI and run it on several computers at the same time, have we created new individuals?

It's also interesting to think of that AI program running on a 1940s computer that is basically nothing but hundreds of relays switching on and off; you'd have an entirely mechanical device holding a sentient being.
 

Captain Suave

Caesar si viveret, ad remum dareris.
Ya, I was always confused as to why they didn't use them as basically immortality backups. Once you're OK with the cloning/murder part, restoring from a backup or making a billion copies of your best people doesn't seem very far-fetched.

I think there was some bullshit canon excuse about the pattern buffers not being able to store/copy consciousness, but then of course that happened any number of times by accident. (Riker clone, etc.)

Valid point, although at least for an AI you can create a copy without having to deal with the quantum uncertainty effects that come with transporting a human brain.

I don't think there's any real evidence that the brain relies on quantum phenomena of any kind.

I don't think the chain of consciousness is severed from the perspective of the AI... no time would have passed from the perspective of the program. It'd be completely undetectable to it unless it could make reference to outside factors like system time.

The issue isn't the perceived chain of consciousness, but rather whether there's an actual "chain of custody" of memory. Things get really weird when you introduce gaps, and at a philosophical level that includes relatively mundane human events like sleep and anesthesia. On the perception side, those get very confusing for the subject when they discover their internal chronology doesn't match the outside world.

It does open up a similar can of worms though, if you copy that saved state AI and run it on several computers at the same time, have we created new individuals?

Obviously, yes.
 

Control

Ahn'Qiraj Raider
I think there was some bullshit canon excuse about the pattern buffers not being able to store/copy consciousness, but then of course that happened any number of times by accident. (Riker clone, etc.)
Ya, but even noob me knew those were shitty answers. It's basically like a human fax machine. No reason you can't keep printing copies, and surely the government would never take advantage of something like that due to rules and ethics and such!

Or maybe it was just more forward-looking than I gave it credit for and it was all leading up to the Federation getting rolled by everyone in the galaxy due to toxic empathy preventing them from actually competing. :emoji_thinking:

Odo printer go brrr:
[attached image]


On the perception side, those get very confusing for the subject when they discover their internal chronology doesn't match the outside world.
As someone who might have drunk too much once or twice, can confirm.
 

Pasteton

Blackwing Lair Raider
I think we'll hit creating digital copies of ourselves way before we figure out consciousness transfer. It's one thing to translate every chemical reaction in the brain to 1s and 0s, but I just can't see how we can separate our "meat" from our own reality. Maybe we'll find a way to preserve our brains in some immortal fashion within organic exoskeletons, but even then the inevitable march of damage that happens to all biological tissue would take us out unless we also build in error-correcting mechanisms we currently do not possess.
 

Captain Suave

Caesar si viveret, ad remum dareris.
I think we’ll hit creating digital copies of ourselves way before we figure out consciousness transfer

Any kind of consciousness transfer for us would have to involve a step-by-step replacement of neurons or small brain areas with a different substrate. Anything like an "upload" is, as you say, a copy. While a copy might be fine for everyone else's purposes, as the original I'd be very concerned about whether my instance of me ends up dead or not.
 

Lambourne

Ahn'Qiraj Raider
I don't think there's any real evidence that the brain relies on quantum phenomena of any kind.

Not what I meant; the quantum uncertainty is an issue with the transporter moving a brain from one place to another. It's impossible to teleport a brain without altering it and thus disrupting its chemical processes in some minute fashion. For a computer program, though, we have a way of separating the intelligence from the hardware it runs on. We don't (currently) have that for the brain.

I wonder if a superintelligent AI can give a real answer to some of these philosophical questions that we've struggled with for thousands of years and if we are even smart enough to understand the answer.
 

Captain Suave

Caesar si viveret, ad remum dareris.
I just had ChatGPT write a 1000 word promotional blurb for my business partner's industrial blog. It took 30-45 minutes of noodling around and refining to get exactly what I wanted, but it was FAR, FAR easier than writing it myself and the end result is of higher quality than what I would have settled for.

Living in the future is very strange, and it's only going to get more so.
 

Sanrith Descartes

<Aristocrat╭ರ_•́>
I just had ChatGPT write a 1000 word promotional blurb for my business partner's industrial blog. It took 30-45 minutes of noodling around and refining to get exactly what I wanted, but it was FAR, FAR easier than writing it myself and the end result is of higher quality than what I would have settled for.

Living in the future is very strange, and it's only going to get more so.
I just had it write a lease agreement between my companies. Took another lease and used it as a reference to include things in the query. It was pretty spot on.
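The same reference-document trick can be done programmatically too. A rough sketch assuming the openai Python package (v1+) and an API key in the environment; the file name and prompt wording are made up, and the posts above describe using the web UI rather than the API.

```python
# Rough sketch, not what the posters actually ran: assumes the `openai`
# Python package (v1+) and OPENAI_API_KEY set in the environment.
from openai import OpenAI

client = OpenAI()

# Hypothetical reference document, used the same way as "took another lease
# and used it as a reference to include things in the query".
with open("existing_lease.txt") as f:
    reference_lease = f.read()

prompt = (
    "Using the reference lease below for structure and clauses, draft a new "
    "lease agreement between Company A (landlord) and Company B (tenant).\n\n"
    "REFERENCE LEASE:\n" + reference_lease
)

response = client.chat.completions.create(
    model="gpt-4",
    messages=[{"role": "user", "content": prompt}],
)

print(response.choices[0].message.content)
```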
 

Haus

<Silver Donator>
On the topic of "uploading consciousness", I honestly don't think we'll get to that point until we can definitively say what consciousness is. And it's a hang-up of people wanting to still be "themselves" in a different existence model.

I also don't think we'll be digitally reproducing people's thoughts/memories/etc. for a while, as we just don't have the technology to functionally "scan" a brain and map all the neurons, let alone a deep enough understanding of how the brain works to make sense of that neuron map once we have it. Will it happen? Yes, eventually. But I think we're still probably a good 20 years off from it.

In the meantime, what I am tinkering with right now via GPT4all and some other tools is the idea of building a "learning system" that also remembers the identity of its sources well enough to know to "trust" certain reputable sources, question other sources, etc., and then build a proper knowledge model off that. Using this you would be able to, for instance, ask it "Hey GPT, tell me everything Haus taught you about metalcasting" and it would know how to differentiate that from "Hey GPT, tell me about how to do some metalcasting". With a large enough repository saved from interactions with me, it would eventually be able to form a better and better "virtual Haus", enabling things like "Hey GPT, what would Haus think about <XYZ topic>?"
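A minimal sketch of the bookkeeping that idea needs (purely illustrative; this is not GPT4all's API, and the trust scores and names are invented): tag every fact with who taught it, so "what did Haus teach you about metalcasting" becomes a filtered lookup.

```python
# Illustrative sketch only -- not GPT4all's API. Facts are stored with their
# source and a hypothetical trust weight, so queries can filter by teacher.
from dataclasses import dataclass, field

@dataclass
class Fact:
    source: str    # who taught it, e.g. "Haus"
    topic: str     # e.g. "metalcasting"
    text: str
    trust: float   # 0..1, how much this source is trusted

@dataclass
class KnowledgeBase:
    facts: list = field(default_factory=list)

    def learn(self, source, topic, text, trust=0.5):
        self.facts.append(Fact(source, topic, text, trust))

    def recall(self, topic, source=None):
        # "Tell me everything Haus taught you about metalcasting" vs.
        # "Tell me about metalcasting" differ only by the source filter.
        hits = [f for f in self.facts
                if f.topic == topic and (source is None or f.source == source)]
        return sorted(hits, key=lambda f: f.trust, reverse=True)

kb = KnowledgeBase()
kb.learn("Haus", "metalcasting", "Preheat the mold before you pour.", trust=0.9)
kb.learn("random_forum", "metalcasting", "Pouring into a cold mold is fine.", trust=0.2)

for fact in kb.recall("metalcasting", source="Haus"):
    print(fact.text)
```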

That then gets into that creepy space of things like "Do I want to leave an echo of my personality as the AI running my smart home after I die, in case Mrs. Haus wonders what I'd do about something later?"
 

Pasteton

Blackwing Lair Raider
It's creepy to us, but it will probably be totally normal and common for the next generation.
Just like how I find incest porn conceptually disturbing, but it's the hottest thing right now.
 

Rabbit_Games

Blackwing Lair Raider
On the topic of "uploading consciousness", I honestly don't think we'll get to that point until we can definitively say what consciousness is. And it's a hang-up of people wanting to still be "themselves" in a different existence model.
That is exactly my problem with the Transporter technology in Star Trek. It literally destroys your physical body and then rebuilds it elsewhere. Even if, from the new body's perspective, only a split second has passed, the reality is that it's a brand new baby boy and you're dead. What's interesting to me is that they never take an old scan of people and use it to repair their body. Captain Pike, for example, could have his current memories and experiences put into a younger, undamaged version of his body, etc. Or hell, a modified version of his body that's bigger and stronger, etc. But it would still be a brand new "person" and not that same him that was destroyed in the process.
 

Daidraco

Avatar of War Slayer
That is exactly my problem with the Transporter technology in Star Trek. It literally destroys your physical body and then rebuilds it elsewhere. Even if, from the new body's perspective, only a split second has passed, the reality is that it's a brand new baby boy and you're dead. What's interesting to me is that they never take an old scan of people and use it to repair their body. Captain Pike, for example, could have his current memories and experiences put into a younger, undamaged version of his body, etc. Or hell, a modified version of his body that's bigger and stronger, etc. But it would still be a brand new "person" and not that same him that was destroyed in the process.
It's been a while since I watched any Star Trek in particular, but I'm pretty sure they've used the transporters to separate out viruses or some shit like that. In that same vein, you watch them remove parts of the Borg manually from Picard, for example, when they could have essentially just used a transporter to remove all that shit and instantly placed him into a medical stasis pod. Or hell, just used the transporter to make him a new body altogether, like you're saying. Transporters are good for story, but as soon as any real thought is placed on them being used, half those stories just don't make sense with such technology present. This isn't even getting into the religious aspects or belief in soul and consciousness. I think* they have story arcs that fill in these gaps, but I can't recall what they are.
 

Asshat wormie

2023 Asshat Award Winner
<Gold Donor>
This is an accurate representation of how things are right now

[attached image]
 

Mist

REEEEeyore
<Gold Donor>
It's okay, my novel is going to singlehandedly save the world from AI.

(Apparently all it takes to get rich in this world is to believe your own bullshit.)