

Sweet!


If I’m understanding correctly, this is like the old trick of using Google Translate to bypass web blacklists.
But it won’t work on networks that implement whitelists.
Are you the only user of this SearXNG instance?
So, if you passed the rate limiting on to your users, would you get rate limited less by the search providers that SearXNG uses?


Love this idea!
Apologies if you’ve written about this elsewhere, but do you have a write-up of what inspired this project? In particular, why a self-hosted solution rather than client software?
My guess is:
Thanks!


Don’t be mad when people don’t like LLM code. You can release something for free, but calling people ungrateful for not liking it seems a bit… entitled.


Don’t want what? I’m not sure what you’re talking about.
I was just saying it’s fine to browse Lemmy and read comments to find out what a video is about, not that we should have low-effort video posts.


Live and let live I say.


That’s what Lemmy is for. Sometimes you’ll take the risk, watch the video, and comment, if it interests you more.
Other times you’ll let others do it for you, or skim past if it interests you less.
That’s how I use Lemmy, anyway.


Respectfully disagree.


A GPU is a processor, right? It’s the graphics card that will increase in price.


I run Nextcloud on a Raspberry Pi and don’t use the other apps, just the file storage. It seems like the best-supported option to me.
It’s understandable why you’d want to self-host. I also use Proton, and for me it’s something I’d rather pay for so I don’t have to administer it myself. I also hope they keep improving the auto-fill experience.