• 0 Posts
  • 6 Comments
Joined 2 years ago
Cake day: October 8th, 2023


  • I’ll take a step back. These LLMs are interesting, and they’re being trained in interesting new ways. They are becoming more ‘accurate’, I guess, though ‘accuracy’ is very subjective and can be manipulated.

    Machine learning is still the same though.

    LLMs will still never expand beyond their inputs.

    My point is it’s not early anymore. We are near or past the peak of LLM development. The extreme amount of resources being thrown at it is the sign that we are near the end.

    That sub should not be used to justify anything, just like any subreddit at any point in time.


  • It’s not easy to solve because it’s not possible to solve. ML has been around since before computers; it’s not magically going to get efficient. The models are already optimized.

    Revenue isn’t profit. These companies are the biggest cost sinks ever.

    Heating a single building is a joke marketing tactic compared to the actual energy impact these LLM energy sinks have.

    I’m an automation engineer; LLMs suck at anything cutting edge. An LLM is basically a mainstream-knowledge reproducer with no original outputs, meaning it can’t do anything that isn’t already done.


  • None of this is true.

    I’ve worked on data centers monitoring power consumption. We need to stop calling LLM power sinks the same thing as data centers; it’s basically whitewashing the power-sucking environmental disasters that they are.

    Machine learning is what you are describing. LLMs being paraded around as AI is destructive marketing and nothing more.

    LLMs are somewhat useful at dumb tasks, and they do a pretty dumb job at them. They feel like when I was new at my job and could have produced mediocre bullshit for decades, but I was too naive to know it sucked. You can’t see how much they suck yet because you lack experience in the areas you use them in.

    Your two cost saving points are pulled from nowhere just like how LLM inference works.


  • The current machine learning models (AI for the stupid) rely on input data, which is running out.

    Processing power per watt is stagnating; Moore’s law hasn’t been true for years.

    Who will pay for these services? The dot-com bubble destroyed everyone who invested in it, and those that “survived” sprouted from the corpse of that recession. LLMs will probably survive, but not in the way you assume.

    Nvidia helping OpenAI survive is a sign that the bubble is here and ready to pop.