500 ml of water is $0.0001 here and 140 Wh is 1.68 cents. That's for 100 words of text.
So closer to 1.69 cents at my rates. Even cheaper! I wonder what percentage of an African child slave would die in the process.
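If anyone wants to sanity-check that arithmetic, here's a rough Python back-of-envelope. The 500 ml and 140 Wh per 100 words come from the picture; the utility rates are my assumptions inferred from the numbers quoted above (140 Wh at 1.68 cents works out to roughly $0.12/kWh, and 500 ml at $0.0001 to roughly $0.20 per cubic meter), so swap in your own.

# Back-of-envelope check of the "1.69 cents per 100 words" figure above.
# The 500 ml / 140 Wh inputs come from the picture; the rates are assumptions
# inferred from the thread's numbers, not anyone's actual bill.

electricity_rate_per_kwh = 0.12   # USD per kWh (assumed local business rate)
water_rate_per_m3 = 0.20          # USD per cubic meter (assumed local rate)

energy_kwh = 140 / 1000           # 140 Wh from the picture
water_m3 = 0.5 / 1000             # 500 ml from the picture

energy_cost = energy_kwh * electricity_rate_per_kwh   # ~$0.0168
water_cost = water_m3 * water_rate_per_m3             # ~$0.0001

print(f"per 100 words: ~{(energy_cost + water_cost) * 100:.2f} cents")  # ~1.69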
If writing 100 words takes 5 minutes, that's a couple pennies in food and ~$7 of my time. Let's pretend it's $0.10 in compute. I'd have to be able to write the email in under five seconds to break even. Good trade.
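Spelling that break-even calculation out, using the same assumed numbers as the post above ($7 for five minutes of my time, a dime of compute):

# Rough version of the break-even math above. The $7 per 5 minutes and the
# $0.10 compute figure are the poster's assumptions, not measured costs.

my_time_cost = 7.00          # USD for 5 minutes of my time (assumed)
minutes_to_write = 5
compute_cost = 0.10          # USD, deliberately pessimistic guess

cost_per_second = my_time_cost / (minutes_to_write * 60)   # ~$0.023/s
break_even_seconds = compute_cost / cost_per_second

print(f"I'd have to write the email in under {break_even_seconds:.1f} s to beat the AI")
# ~4.3 seconds, hence "under five seconds to break even"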
I'm not saying that's not a good trade. It was just a frame of reference.
I'm just saying that scaling up to either coding or rendering an entire game using inference would be... a lot more than that.
Ya, I don't believe this random picture you posted. So it costs the 'economy' (or tech companies??) hundreds of dollars' worth of energy and water (?? cooling?) to get an AI to take a few minutes to spit out a short novel? It doesn't even make sense. God, imagine what the millions of images being generated constantly cost - must be in the trillions. There would be zero way to actually fund the startup plays if it were true that it's almost 2 bucks per 100 words, a metric that really doesn't make sense.
Is this like if I eat a steak I just deleted 1,000 gallons of water and made 200 lbs of CO2 or something? (Because water / nature of course isn't fungible /boggle)
I didn't believe it either, but I did the research and got shit on by people way smarter than me and it turned out they were right. (If it helps you believe it, they were men!)
The dollar cost of what's in the picture is not high, as Furry said, until you start adding it up over millions of prompts. But all of these companies are bleeding money, massively. OpenAI has stated in public documents that they intend to push the subscription fee up to $44 within 5 years, and even then they will be bleeding money:
OpenAI might raise the price of ChatGPT to $44 by 2029 | TechCrunch
OpenAI is considering raising the price of its AI-powered chatbot, ChatGPT, significantly — to $22 by 2025 and $44 by 2029. (techcrunch.com)
Just doesn't make any logical sense to me. I understand startups bleed money, but if those numbers are true they'd be bleeding billions a day and using oceans' worth of water every month. According to the numbers, I could spend that in 30 minutes going down a few different lines of questioning. Really stands out as a 'lies, damned lies, and statistics' type of funny-number bullshit.
They wouldn't be bleeding billions a day at 1.69 cents every 100 words, but they are bleeding billions a year. I think Furry's math is off by a bit, but it still probably costs them less than 5 cents every 100 words.
For instance, they have tricks like prompt caching, because a lot of people ask the same shit.
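To put rough numbers on the "billions a day vs. billions a year" point: the per-response costs below are the thread's own figures, but the daily query volume is a made-up illustrative number, not a real OpenAI stat.

# Quick scale check: the per-response cost is tiny, the volume is what adds up.
# cost_low is the utility-rate math from earlier in the thread; cost_high is
# the "probably less than 5 cents" guess; daily_responses is hypothetical.

cost_low = 0.0169    # USD per ~100-word response
cost_high = 0.05     # USD per ~100-word response

# How many 100-word responses per day it would take to burn $1B per day:
print(f"{1e9 / cost_low:,.0f} responses/day to lose $1B a day at 1.69 cents")
# ~59 billion per day, which is why "billions a day" doesn't add up

# A hypothetical few hundred million responses a day is a different story:
daily_responses = 300e6
print(f"~${daily_responses * cost_high * 365 / 1e9:.1f}B per year at 5 cents each")
# ~$5.5B a year, so "billions a year" is the plausible order of magnitude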
The picture is based on this study:
AI programs consume large volumes of scarce water
UCR study is the first to estimate the huge water footprint from running artificial intelligence queries, which rely on cloud computations done in racks of servers that must be kept cool in warehouse-sized data processing centers. (news.ucr.edu)
I'm sure those people know everything and have no reason to just push out meaningless slop as part of an agenda. Truth is, I don't care how much AI costs other people. If it's free to me, I'll use it. If I gotta pay, NOPE. That said, I suspect there are a lot of people who will pay for AI. So I dunno, I wish them luck. Personally, I don't have any real use cases for AI past fucking with it for fun just to see what it will do. I already have tens of thousands of unread work emails, 'cause I just can't be bothered to respond in the first place.
But I've set up AI on my computer, so I'm probably not the typical user.
So does the 1.69 factor in prompt caching?
1.69 cents was going off what the things in the picture graphic would cost me to buy at my local business utility rates. Obviously your mileage will vary depending on where you are, and a company can business-expense / get other deals that might not be just what's slapped on the utility company's page.
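For what it's worth, here's how much that per-100-words figure moves if you plug in different rates or assume some share of prompts gets answered from cache. Only the 500 ml / 140 Wh inputs come from the picture; every rate pair and the cache-hit fraction below are hypothetical examples, just to show the sensitivity.

# Sensitivity sketch for the per-100-words cost. The 500 ml / 140 Wh inputs
# are from the picture; the rates and cache-hit fraction are hypothetical.

ENERGY_KWH = 0.140   # 140 Wh per 100 words (from the picture)
WATER_M3 = 0.0005    # 500 ml per 100 words (from the picture)

scenarios = {
    "my business rates (above)":      {"kwh": 0.12, "m3": 0.20},
    "pricier grid/water (made up)":   {"kwh": 0.30, "m3": 2.00},
    "bulk datacenter deal (made up)": {"kwh": 0.05, "m3": 0.10},
}

cache_hit_rate = 0.30   # pretend 30% of prompts are served from cache

for name, r in scenarios.items():
    raw = ENERGY_KWH * r["kwh"] + WATER_M3 * r["m3"]   # USD per 100 words
    cached = raw * (1 - cache_hit_rate)                # crude effective average
    print(f"{name}: {raw*100:.2f} cents raw, {cached*100:.2f} cents with caching")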
Maybe not a billion a day, but the number of queries and requests for words, images, videos, etc. a day has got to be mind-bogglingly voluminous. And the number of queries is probably growing exponentially as people start to realize how easy it is to use "AI".
A totally uneducated but logical bet I'd make is that it costs them less than a cent for their AI to spit out a 100-word email. Anything more would be ruinous. 1.69 cents is laughable.