Bitcoin is estimated to consume 172 TWh, which is way more than Google and Microsoft combined.
https://digiconomist.net/bitcoin-energy-consumption
172 TWh per year
Your statement was as useful as the following: a VW Polo consumes 3000 liters of fuel.
*Edit: Downvote me all you want 😂 if I am right, I am right.
Your point?
The data in the article was for one year. This is the same unit.
The comment said 172 TWh without specifying a timeframe whatsoever. Is it per year? Per day? Per month?
It was about the comment about bitcoin, not the post itself.
That’s the same timeframe as the one used in the article, and sure, they could have made it explicit again, but implicitly it makes sense because it’s the one that’s useful for a direct comparison.
Turns out, the implicit timeframe that should be clear after reading the article was the right one, and it’s pretty damning for bitcoin as is. So again, I am not sure what point you want to make.
I’m on the side of Retiring@lemmy.ml here, since I read the comments before the article. Without the article’s context I had no idea whether this meant all-time usage, per year, or per month.
Since the link is right there though, which says per year, it’s really not a huge deal.
The article also gives per-year figures.
Yes it is. But your comment still doesn’t make sense until you add “per year”.
The downvotes aren’t because you’re wrong, they’re because you’re being obnoxious about being right.
So, is Watt-hours/unit-time no longer a meaningful unit?
Because, if so, you better tell every power company I’ve had, because that’s how they’ve billed me.
Watt-hours are a unit of energy. Saying that Bitcoin uses x Wh tells you nothing about how much it actually consumes, because without stating the time over which Bitcoin uses that energy, you cannot compare it to anything. I could state that Bitcoin uses 5 Wh, and that would also be correct.

It’s the same as saying: Bob eats 5 apples, Alice eats 2000 apples. Can you compare the two? No, because what I forgot to mention is that Bob eats 5 apples a week and Alice eats 2000 apples in 3 years. Now I can compare the two.
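To make the analogy concrete, here is a minimal sketch (the numbers are just the ones from the apples example) of normalizing both consumption figures to a common time base before comparing them:

```python
# Normalize consumption to a common time base (apples per week)
# before comparing. Raw totals over unstated periods are meaningless.
DAYS_PER_WEEK = 7

bob_apples, bob_days = 5, 7               # 5 apples in one week
alice_apples, alice_days = 2000, 3 * 365  # 2000 apples in 3 years

bob_rate = bob_apples / bob_days * DAYS_PER_WEEK
alice_rate = alice_apples / alice_days * DAYS_PER_WEEK

print(f"Bob: {bob_rate:.1f} apples/week, Alice: {alice_rate:.1f} apples/week")
```

Once both figures carry the same denominator, the comparison is trivial; the same goes for TWh, which only becomes comparable once you attach "per year" to it.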
Do you get my point?
Yes, bitcoin is trash. But most modern cryptos use far less energy. For example, the second-largest crypto, Ethereum, uses almost no energy compared to bitcoin or AI.
“AI” cannot say the same at all. And, unlike crypto, there’s no realistic improvement in sight; it just keeps getting worse.
PoS requires significant staker profits to work, which would create the same inequality as the dollar has. It’s basically dollar bonds but without regulations.
There’s more to “AI” than just ChatGPT…
I think you’re mixing up what AI actually means here; you would probably like this video: https://www.youtube.com/watch?v=nGIpdiQrFDU
But in brief, what about DLSS? The ML models for that get improved with every driver update.
STT models like Whisper that are great at transcribing/translating.
Object recognition models for drones to keep the camera centered on you and for object avoidance.
ML models for finding new cures.
Models in astronomy for finding planets… Etc.
You’re trying to tell me that everything “AI” is trash and not getting better?