The issue is that “AI” has become a marketing buzzword instead of anything meaningful. When someone says “AI” these days, what they’re actually referring to is “machine learning”. Take LLMs, for example: what’s actually happening (at a very basic level, and please correct me if I’m wrong, people) is that, given one or more words/tokens, the model tries to calculate the most probable next word/token, based on its model (trained on ridiculously large amounts of text written by humans). It does this well enough, and at a large enough scale, that the output is coherent, comprehensive, and useful.
While the results are undeniably impressive, this is not intelligence in the traditional sense; there is no reasoning or comprehension, and definitely no consciousness or awareness here. To grossly oversimplify, LLMs are really, really good word calculators, and they can be very useful. But leave it to tech bros to make them sound like the second coming and shove them where they don’t belong just to get more VC money.
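To make that concrete, here’s a toy sketch in plain Python (just a hand-built bigram table over a made-up sentence, so nothing like a real transformer, which uses a neural network over subword tokens and samples from a learned probability distribution). Still, the “look at the text so far, pick a likely next token, append, repeat” loop is the same basic shape:

```python
# Toy "next-word predictor": count word bigrams in a tiny corpus and always
# pick the most frequent follower. This is a gross simplification (real LLMs
# use neural networks over subword tokens and sample from a distribution),
# but the "predict the next token from what came before" loop is the idea.
from collections import Counter, defaultdict

corpus = "the cat sat on the mat and the cat slept on the mat".split()

# Count how often each word follows each other word.
followers = defaultdict(Counter)
for current, nxt in zip(corpus, corpus[1:]):
    followers[current][nxt] += 1

def next_word(word):
    """Return the most frequently observed next word, or None if unseen."""
    counts = followers.get(word)
    return counts.most_common(1)[0][0] if counts else None

# Greedily generate a few words starting from "the".
word, output = "the", ["the"]
for _ in range(5):
    word = next_word(word)
    if word is None:
        break
    output.append(word)

print(" ".join(output))  # e.g. "the cat sat on the cat"
```

Scale that same idea up to billions of parameters and trillions of tokens, and you get the coherent output described above.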
Sure, but people seem to buy into that very buzzwordiness and ignore the usefulness of the technology as a whole because “AI bad.”
True. Even I’ve been guilty of that at times. It’s just hard right now to see the positives through the countless downsides and the fact that the biggest application we’re moving towards seems to be taking value from talented people and putting it back into the pockets of companies that were already hoarding wealth and treating their workers like shit.
So usually when people say “AI is the next big thing”, I say “Eh, idk how useful an automated idiot would be”, because it’s easier than getting into the weeds of the topic with someone who’s probably not interested haha.
Edit: Exhibit A
There’s some sampling bias at play because you don’t hear about the less flashy examples. I use machine learning for particle physics, but there’s no marketing or outrage about it.
I love how ppl who don’t have a clue what AI is or how it works say dumb shit like this all the time.
I also love making sweeping generalizations about a stranger’s knowledge on this forum. The smaller the data sample the better!
The base comment was very broad.
There is no AI. It’s all shitty LLMs. But keep sucking those techbro cheesy balls. They will never invite you to the table.
Honest question, but aren’t LLMs a form of AI, and thus… maybe not AI as people expect, but still AI?
No, they are autocomplete functions of varying effectiveness. There is no “intelligence”.
Ah, Mr Dunning-Kruger, it’s nice to meet you.
There you go, talking into the mirror once more.
Almost as if it’s artificial.