> Did you make up the 500k number on the spot or did you know the Saturn V carries almost exactly 500k gallons of kerosene+lox?

I went to Houston a couple weeks ago!
So no practical breakthrough, only some execs blowing smoke because they think quantum computing is close and they want some poor engineer to program an API to bridge communications between legacy systems and said still very theoretical quantum computers. I'll take pure hype clickbait articles for 500, Alex!
It will be exciting when they demo the thing playing Crysis 15 on max settings.
> You mean a whole mess of shit to kill us with?

I've always felt like A.I. would so rapidly outpace us that it would just be indifferent to humanity. We might be collateral damage, in the sense that you drive over ants backing out of your driveway, but you don't aim for the anthills intentionally.
Any A.I. (or aliens with the tech to travel here) would so far outpace us that extinguishing our species would be a waste of their time.
Maybe I'm wrong, but given a large enough technological/evolutionary gap I don't see a practical reason to kill us rather than ignore us.

It is true that we would be pretty much insignificant to any AI unless it somehow perceives us as a threat of any kind, which is not a stretch.
I doubt we would look like a threat, unless we tried to attack it. Which we would totally do.
Only takes time for us to get there.
For all we know it would consider an attempted firmware update as a threat to its existence. I would hate to be the first nerd sent to patch that thing for Y2.1K
Why would it ask us to patch it?
You would let it update its own code? Are you some kind of INSANE?
> As long as they give us some cool shit to distract us, like 100% immersive virtual reality, ignore away! Fuck, hook me up to the matrix for all I care. Ignorance is bliss.

That can be arranged.
According to a survey published in the journal Nature last summer, more than 70% of researchers have tried and failed to reproduce another scientist's experiments.
> did you all see the bbc article on reproducibility in papers? (being able to reproduce scientific paper results...

In theory yes. Can't control for stochasticity though, so I'd say 80% is acceptable.
Most scientists 'can't replicate studies by their peers' - BBC News
Nature has introduced a reproducibility checklist for submitting authors, designed to "improve reliability and rigour."
And here is the reason: science is about being able to reproduce the experiment 100% of the time. If you do A, B, and C and get D as a result, every person in the world SHOULD be able to do the exact same steps and get D.
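Side note on the stochasticity point above: here's a minimal Python sketch (my own toy example, not from the article or anyone in this thread) of why a noisy "experiment" won't repeat exactly unless the randomness is pinned down with a seed, which is the computational analogue of controlling your sources of variation.

```python
import numpy as np

def noisy_experiment(seed=None):
    """Toy 'experiment': estimate a mean from 100 noisy measurements."""
    rng = np.random.default_rng(seed)  # fixed seed -> repeatable; None -> fresh entropy each run
    measurements = 5.0 + rng.normal(0.0, 1.0, size=100)  # true value 5.0 plus Gaussian noise
    return measurements.mean()

# Same steps, unseeded: the result ("D") drifts a little every run.
print(noisy_experiment(), noisy_experiment())

# Same steps with the seed pinned: anyone repeating them gets the identical number.
print(noisy_experiment(seed=42), noisy_experiment(seed=42))
```

For wet-lab work there's no seed to pin, which is presumably why "80% is acceptable" is the pragmatic answer.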