Science!! Fucking magnets, how do they work?


iannis

Musty Nester
31,351
17,656
They brute-forced an evolution simulation, using hand-picking against set pattern-matching criteria as the form of selection pressure.

The thing I don't understand is how they ever managed to grow beyond their initial parameters. And I'm not sure if that's AI, or an incredibly powerful (and clever) trial-and-error test.

I mean either way, neat. And I suppose it's impressive enough that if they want to call it AI (and there is probably an argument, beyond the jargon definition of the word, to be made that it's a sort of rudimentary intelligence) then dat koo.

That page does make it sound more like it was a very powerful sort + computation model. But then again, the model found 2 unsuspected proteins which, by definition, could not have been in the initial parameters. That is an impressive bit of induction. So mostly I'm curious how it grew beyond those initial parameters.
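The setup described above, breeding candidates and culling them against a fixed pattern-matching criterion, is basically a genetic algorithm. Here's a minimal sketch of that idea; the target pattern, rates, and sizes are made up for illustration and have nothing to do with the actual study:

```python
# Toy genetic algorithm: candidate "genomes" are bitstrings, and the
# selection pressure is a fixed pattern-matching score against a target.
import random

TARGET = [1, 0, 1, 1, 0, 0, 1, 0, 1, 1, 1, 0]

def fitness(genome):
    # The "set pattern-matching criteria": count positions matching the target.
    return sum(g == t for g, t in zip(genome, TARGET))

def evolve(pop_size=40, generations=200, mutation_rate=0.05, seed=1):
    rng = random.Random(seed)
    pop = [[rng.randint(0, 1) for _ in TARGET] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        if fitness(pop[0]) == len(TARGET):
            break                                # pattern fully matched
        parents = pop[: pop_size // 2]           # cull the bottom half
        children = []
        while len(children) < pop_size:
            a, b = rng.sample(parents, 2)
            cut = rng.randrange(1, len(TARGET))  # one-point crossover
            child = [1 - g if rng.random() < mutation_rate else g
                     for g in a[:cut] + b[cut:]]
            children.append(child)
        pop = children
    return max(pop, key=fitness)

best = evolve()
```

The interesting part is exactly what's being asked: nothing here can "grow beyond" the criteria baked into `fitness`, which is why finding proteins outside the initial parameters is the surprising bit.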
 

The Ancient_sl

shitlord
7,386
16
I am pretty sure what is happening is that you can determine whether the 1st photon is a wave or a particle by collapsing the second one way or the other AFTER you've already measured the first one. The people running the experiment are affecting the "past" in the "future".
That's the way I understood the diagram, but I don't understand why measuring the "1st photon" doesn't collapse it the way measuring the "second one" does.
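For what it's worth, the statistics can be sketched with a toy model: the unconditioned pattern on the first screen is flat, and fringes only show up when you sort those same hits by which eraser detector the second photon hit in coincidence. This is my own illustrative sketch assuming the textbook cos²/sin² coincidence fringes, not the actual experiment's analysis:

```python
# Toy delayed-choice eraser: each signal photon lands at phase x on
# screen D0; its idler then fires eraser detector D1 with probability
# cos^2(x/2) or D2 with probability sin^2(x/2). Unconditioned D0 counts
# are flat; sorting by idler detector recovers complementary fringes.
import math
import random

def simulate(n=20000, bins=8, seed=0):
    rng = random.Random(seed)
    d1, d2 = [0] * bins, [0] * bins
    for _ in range(n):
        x = rng.uniform(0.0, 2.0 * math.pi)           # landing phase at D0
        b = min(int(x / (2.0 * math.pi) * bins), bins - 1)
        if rng.random() < math.cos(x / 2.0) ** 2:
            d1[b] += 1                                # coincidence with D1
        else:
            d2[b] += 1                                # coincidence with D2
    total = [a + c for a, c in zip(d1, d2)]           # ignore idler: flat
    return d1, d2, total

d1, d2, total = simulate()
```

The point of the toy: `total` never changes no matter what you do with the idler, which is why nothing actually reaches back and edits the "past"; the fringes only exist in the coincidence sorting.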
 

Tuco

I got Tuco'd!
<Gold Donor>
47,388
80,836
I dunno why people doubt that a Singularity style future is almost certainly at play, but it clearly is. We're rapidly approaching it. I dunno if it'll be good, bad, or just different, but it is happening.
The problem people like me have with the discussion of technological singularity is that we often see it declared inevitable by people who have no technological background, view it as an instantaneous change and relate every AI improvement to a fantasy version of the singularity.

When you work in a field you see the complexity of the field and all the different subtle, yet important, implications of each improvement and discovery in that field. Then you see a layman reduce it to some small set of pop science ideas and your only response is "No.".


For AI, the discussion centers around science fiction-like ideas of singularity and robot takeovers.

For self-driving freightliner trucks, the discussion centers around shit like "Is Obama not going to let me drive my car anymore?" or "What will happen to insurance companies?". Whereas what I want to know is: "What's the sensor payload? Is it vision or lidar based, or both?", "I wonder if they use ROS at all?", "Have they tested their vehicle in snow and rain?"

For anthropology, I dunno. Maybe evolution? Some dorks find evidence of tool use in Serbia and date it to a time when we didn't know there were hominids in Serbia. For anthropologists I'm sure that discovery has massive impacts in ways I don't know or care about as a layman. But then laymen read the 'news' articles about it and everything centers around "DARWIN PROVED RIGHT/WRONG".

When you look at that, being an expert in the field, you can't help but say, "No. Plz."
 

Asshat wormie

2023 Asshat Award Winner
<Gold Donor>
16,820
30,968
I dunno why people doubt that a Singularity style future is almost certainly at play, but it clearly is. We're rapidly approaching it. I dunno if it'll be good, bad, or just different, but it is happening.
Neural networks are powerful for solving some things, but they sure as fuck are nowhere near "rapidly" approaching strong AI. Plz (this "Plz" was typed before Tuco's "No. Plz" comment above was read).
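To put a point on that: at bottom, a "neural network" is a weighted pattern fitter. A single perceptron learning AND, which is a toy of my own and has nothing to do with the protein study, shows the whole mechanism in a dozen lines:

```python
# A single perceptron learning AND: weights nudged toward the target
# output by the classic perceptron update rule. Powerful scaled up,
# but still curve fitting, not strong AI.
def train_perceptron(samples, epochs=20, lr=0.1):
    w = [0.0, 0.0]
    b = 0.0
    for _ in range(epochs):
        for (x1, x2), target in samples:
            out = 1 if w[0] * x1 + w[1] * x2 + b > 0 else 0
            err = target - out          # 0 when correct, +/-1 when wrong
            w[0] += lr * err * x1
            w[1] += lr * err * x2
            b += lr * err
    return w, b

AND = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]
w, b = train_perceptron(AND)

def predict(x1, x2):
    return 1 if w[0] * x1 + w[1] * x2 + b > 0 else 0
```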
 

hodj

Vox Populi Jihadi
<Silver Donator>
31,672
18,377
It would definitely be gradual, but we already see the gradualism forming today with the advent of unification of robotics with human biology in highly technically capable prosthetics as the starting point.

Unification of humankind with its tools is a logical next step. Complex tool production and use has driven much of our social and cultural evolution, and helped insulate us from much of the impact of natural selection. The key is how rapid the increase has been, first with the advent of agriculture 12k-9k years ago, then with the advent of the industrial era 350 years ago, and now with the advent of the technological era of the past 80 years or so.

Barring extinction due to too rapid growth and warfare and the like, it does seem inevitable that we will grow more integrated with our technology, that our technology will become more and more like ourselves in terms of capabilities. The positive feedback loop is already recognizable.

Kurzweil's pie-in-the-sky fantasy, where one instant we have no AI and the next we have AI more intelligent than ourselves, gifting us beneficent medical advances that we may never have dreamed of otherwise, may be unreasonable, yes. But some of his core hypotheses have already been borne out, and in fact achieved at a faster rate than he predicted, such as the advent of the single-atom transistor coming a full decade ahead of schedule.

The key is the integration of these technologies with organic systems.
 

iannis

Musty Nester
31,351
17,656
Things are really only inevitable in retrospect. Instead of The Techno Singularity we could achieve The Techno Wall. The point at which the specialization of our technologies becomes so divergent that we are no longer able to integrate them, and the new integration sciences which spring up in order to address the problem ultimately fail -- becoming hyper specialized themselves.

I mean progress is more likely. But me... I'm an optimist. It really isn't inevitable. Nothing is ordained. QUANTUM EXPERIMENT CALLBACK.
 

hodj

Vox Populi Jihadi
<Silver Donator>
31,672
18,377
Things are really only inevitable in retrospect. Instead of The Techno Singularity we could achieve The Techno Wall. The point at which the specialization of our technologies becomes so divergent that we are no longer able to integrate them, and the new integration sciences which spring up in order to address the problem ultimately fail -- becoming hyper specialized themselves.
This reads like babble, Iannis.

No offense. We can already integrate electronics directly with the mind, and we can already control prosthetic limbs through those interfaces. How could increased specialization of these technologies result in an inability to integrate them further?
 

iannis

Musty Nester
31,351
17,656
Oh none taken. I dunno man. And neither do you.

Nothing is inevitable. Except death and taxes and lazy assholes, I suppose.

A powerful narrative is only a powerful narrative. It's a wonderful dream, don't get me wrong. But there could be hard methodological limitations which we can't even guess at yet, or physical ones. And since we can't guess what they might be, it's difficult to explain what they might be.

Or sure. There might not.
 

hodj

Vox Populi Jihadi
<Silver Donator>
31,672
18,377
There's really no reason to presume hard-coded limits, though. I think we can make some guesses at where the current limits lie, mostly related to having to interact with the very, very small in very, very precise ways that we are not yet capable of. But there's no reason to presume that future advances won't allow us to do just that, and plenty of justification for presuming the opposite, since every single advance we've made so far has brought us closer and closer to that goal. While of course nothing is absolutely certain, the preponderance of the evidence suggests strongly that, barring extinction, the future holds only more integration with our tools, not less.

Lold at the Silicon Valley clip.
 

Asshat wormie

2023 Asshat Award Winner
<Gold Donor>
16,820
30,968
It would definitely be gradual, but we already see the gradualism forming today with the advent of unification of robotics with human biology in highly technically capable prosthetics as the starting point.

Unification of humankind with its tools is a logical next step. Complex tool production and use has driven much of our social and cultural evolution, and helped insulate us from much of the impact of natural selection. The key is how rapid the increase has been, first with the advent of agriculture 12k-9k years ago, then with the advent of the industrial era 350 years ago, and now with the advent of the technological era of the past 80 years or so.

Barring extinction due to too rapid growth and warfare and the like, it does seem inevitable that we will grow more integrated with our technology, that our technology will become more and more like ourselves in terms of capabilities. The positive feedback loop is already recognizable.

Kurzweil's pie-in-the-sky fantasy, where one instant we have no AI and the next we have AI more intelligent than ourselves, gifting us beneficent medical advances that we may never have dreamed of otherwise, may be unreasonable, yes. But some of his core hypotheses have already been borne out, and in fact achieved at a faster rate than he predicted, such as the advent of the single-atom transistor coming a full decade ahead of schedule.

The key is the integration of these technologies with organic systems.

[attached image: rrr_img_99885.png]
 

hodj

Vox Populi Jihadi
<Silver Donator>
31,672
18,377
Human population growth, and human technological growth and innovation, have been increasing exponentially for a century at minimum.

All evidence points to continued increases, no evidence points to a decline. In such situations, it is reasonable to assume that these trends will continue, unless absolute catastrophe strikes: an asteroid, nuclear war, or an unstoppable human-annihilating plague erupting from nowhere.

Just as climate change denialists think hockey-stick graphs aren't reliable for extrapolating continued temperature increases because Jesus will return in our lifetime to save us from the worst effects, posting shitty memes from the internet doesn't negate Moore's law and the simple, observable evidence.
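For the record, the hockey-stick extrapolation amounts to fitting a line to log(quantity) versus year and projecting it forward. A rough sketch; the transistor counts below are illustrative orders of magnitude, not exact data:

```python
# Fit an exponential trend (line in log space) and extrapolate,
# Moore's-law style. Counts are rough illustrative figures.
import math

years = [1971, 1980, 1990, 2000, 2010]
counts = [2.3e3, 3.0e4, 1.2e6, 4.2e7, 1.2e9]

def fit_log_linear(xs, ys):
    # Ordinary least squares on (x, log y).
    logs = [math.log(y) for y in ys]
    n = len(xs)
    mx = sum(xs) / n
    my = sum(logs) / n
    slope = (sum((x - mx) * (l - my) for x, l in zip(xs, logs))
             / sum((x - mx) ** 2 for x in xs))
    return slope, my - slope * mx

slope, intercept = fit_log_linear(years, counts)
doubling_time = math.log(2) / slope              # years per doubling

def project(year):
    return math.exp(intercept + slope * year)
```

Whether that projection keeps holding is exactly what's in dispute here; the fit itself only says what the curve has done so far.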
 

Asshat wormie

2023 Asshat Award Winner
<Gold Donor>
16,820
30,968
Human population growth, and human technological growth and innovation, have been increasing exponentially for a century at minimum.

All evidence points to continued increases, no evidence points to a decline. In such situations, it is reasonable to assume that these trends will continue, unless absolute catastrophe strikes: an asteroid, nuclear war, or an unstoppable human-annihilating plague erupting from nowhere.

Just as climate change denialists think hockey-stick graphs aren't reliable for extrapolating continued temperature increases because Jesus will return in our lifetime to save us from the worst effects, posting shitty memes from the internet doesn't negate Moore's law and the simple, observable evidence.
And what observable evidence do you have that we are moving anywhere close to strong AI?
 

hodj

Vox Populi Jihadi
<Silver Donator>
31,672
18,377
And what observable evidence do you have that we are moving anywhere close to strong AI?
I didn't make that claim in the first place, so you can back that strawman right the fuck up and burn him in someone else's front yard.
 

Asshat wormie

2023 Asshat Award Winner
<Gold Donor>
16,820
30,968
I didn't make that claim in the first place, so you can back that strawman right the fuck up and burn him in someone else's front yard.
I dunno why people doubt that a Singularity style future is almost certainly at play, but it clearly is. We're rapidly approaching it. I dunno if it'll be good, bad, or just different, but it is happening.
Come again?
 

hodj

Vox Populi Jihadi
<Silver Donator>
31,672
18,377
See, wormie, here's your problem: You're reading in my statement what you want to read.

Not what I actually said.

So the only person that needs to come again is yourself. You need to come read that statement again until you realize at no point is it a claim that we are currently approaching "strong AI" whatever the fuck you think that means.

And for the record, here is the definition of a technological Singularity

Technological singularity - Wikipedia, the free encyclopedia

A technological singularity is a fundamental change in the nature of human civilization, technology, and application of intelligence, to the point where the state of the culture is completely unpredictable to humans existing prior to the change, and humans after the change cannot relate fully to humans existing prior to it. The first use of the term "singularity" in this context was made in 1958 by the Hungarian-born mathematician and physicist John von Neumann. In the same year, Stanislaw Ulam described "ever accelerating progress of technology and changes in the mode of human life, which gives the appearance of approaching some essential singularity in the history of the race beyond which human affairs, as we know them, could not continue".[1] The term "singularity" in the technological sense was popularized by mathematician, computer scientist and science fiction author Vernor Vinge.
 

Asshat wormie

2023 Asshat Award Winner
<Gold Donor>
16,820
30,968
See, wormie, here's your problem: You're reading in my statement what you want to read.

Not what I actually said.

So the only person that needs to come again is yourself. You need to come read that statement again until you realize at no point is it a claim that we are currently approaching "strong AI" whatever the fuck you think that means.
Let me see if I have this right. In a discussion about a computer's ability to come up, independently, with solutions to problems (using neural networks, in the study that began this conversation), you mention the singularity and how we are moving towards it. And yet AI of any kind is not what you are talking about? I am curious: what do you think the singularity is, and how was it postulated to exhibit itself?
 

hodj

Vox Populi Jihadi
<Silver Donator>
31,672
18,377
AI is one of many possibilities, among the others are brain-computer interfacing, and human biological integration with computing, as well as advances in nanotechnology, etc.

The term Singularity is a broad overarching term for human cultural evolution to a new stage of existence not able to be fully predicted by those who come before the advance.

So again, you need to stop reading what you want to read into what others are saying, and instead attempt to grasp what they are actually saying.

From the same article

Vinge argues that artificial intelligence, human biological enhancement, or brain-computer interfaces could be possible causes of the singularity.[2]
Non-AI singularity
Some writers use "the singularity" in a broader way to refer to any radical changes in our society brought about by new technologies such as molecular nanotechnology,[16][17][18] although Vinge and other prominent writers specifically state that without superintelligence, such changes would not qualify as a true singularity.[7] Many writers also tie the singularity to observations of exponential growth in various technologies (with Moore's Law being the most prominent example), using such observations as a basis for predicting that the singularity is likely to happen sometime within the 21st century.[17][19]
The exponential growth in computing technology suggested by Moore's Law is commonly cited as a reason to expect a singularity in the relatively near future, and a number of authors have proposed generalizations of Moore's Law. Computer scientist and futurist Hans Moravec proposed in a 1998 book[24] that the exponential growth curve could be extended back through earlier computing technologies prior to the integrated circuit. Futurist Ray Kurzweil postulates a law of accelerating returns in which the speed of technological change (and more generally, all evolutionary processes[25]) increases exponentially, generalizing Moore's Law in the same manner as Moravec's proposal, and also including material technology (especially as applied to nanotechnology), medical technology and others.[26]
Some singularity proponents argue its inevitability through extrapolation of past trends, especially those pertaining to shortening gaps between improvements to technology. In one of the first uses of the term "singularity" in the context of technological progress, Stanislaw Ulam (1958) tells of a conversation with John von Neumann about accelerating change:
One conversation centered on the ever accelerating progress of technology and changes in the mode of human life, which gives the appearance of approaching some essential singularity in the history of the race beyond which human affairs, as we know them, could not continue.[1]
Hawkins (1983) writes that "mindsteps", dramatic and irreversible changes to paradigms or world views, are accelerating in frequency as quantified in his mindstep equation. He cites the inventions of writing, mathematics, and the computer as examples of such changes.
So forth and so on.

Here's what happened, you read the word Singularity, leaped about three steps down the line to a conclusion "Advanced AI takes over the world!!!!!!" and attempted to start a retarded argument over it.
 

Asshat wormie

2023 Asshat Award Winner
<Gold Donor>
16,820
30,968
AI is one of many possibilities, among the others are brain-computer interfacing, and human biological integration with computing, as well as advances in nanotechnology, etc.

The term Singularity is a broad overarching term for human cultural evolution to a new stage of existence not able to be fully predicted by those who come before the advance.

So again, you need to stop reading what you want to read into what others are saying, and instead attempt to grasp what they are actually saying.

From the same article

[quoted passage snipped]

So forth and so on.

Here's what happened, you read the word Singularity, leaped about three steps down the line to a conclusion "Advanced AI takes over the world!!!!!!" and attempted to start a retarded argument over it.
Citing wikipedia is nice, especially when you copy paste portions of the article that support your views and do not provide the portions that do not support it. Anyway, in a conversation about computers solving problems on their own, maybe singularity is not a term you should be throwing around?
 

hodj

Vox Populi Jihadi
<Silver Donator>
31,672
18,377
The accusation that I'm "selectively citing" parts of the wikipedia article is simply you being upset that I'm not going to engage you in an argument based on a strawman of my statements that you invented in your mind.

Yes, some concepts of the Singularity involve advanced AI. Not all of them. The assumption that I was speaking specifically, rather than broadly, isn't supported by any statement I've made, including the first one. And trying to blame me for you being a presumptuous faggot is just more of the same.

I can't fix you wanting to hear/read what you want to in other people's statements, wormie. That's not my fault. That's your fault. Period. Thank you.