Why do people host LLMs at home, when training your own model on the same amount of internet data will never come anywhere close to the efficiency of just sending a paid prompt to a high-quality official model?

inb4 privacy concerns or a proof of concept

those are out of the discussion. I want someone to prove their LLM can be as insightful and accurate as a paid one. I don't care about anything other than the quality of the generated answers.

  • blumlaut@hounds.online
    10 hours ago

    @Brylant@discuss.online if you are willing to dismiss all of the reasons someone would prefer self-hosting an LLM, then of course a paid one will fare better. You self-host LLMs for privacy, or to tweak things you can't tweak with external services. Plus, in some cases it can be cheaper if you don't need a giant model to do a few basic things.