+1 for Niagara. It takes a few days to get used to, but it's the launcher every power user didn't know they wanted. Lifetime purchase options and a very responsive, passionate dev.
Dran
- 5 Posts
- 634 Comments
Dran to Technology • Judge orders Anna's Archive to delete scraped data; no one thinks it will comply (English)
13 · 26 days ago

https://en.wikipedia.org/wiki/Markov_chain
Before the advent of AI, I wrote a slack bot called slackbutt that made Markov chains of random lengths between 2 and 4 out of the chat history of the channel. It was surprisingly coherent. Making an “llm” like that would be trivial.
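A chat-history Markov bot like that really is only a few dozen lines. A minimal sketch (the message list and function names here are my own, not the original bot's):

```python
# Minimal order-n Markov text generator over a list of chat messages.
import random
from collections import defaultdict

def build_chain(messages, order=2):
    """Map each n-word prefix to the list of words that followed it."""
    chain = defaultdict(list)
    for msg in messages:
        words = msg.split()
        for i in range(len(words) - order):
            prefix = tuple(words[i:i + order])
            chain[prefix].append(words[i + order])
    return chain

def generate(chain, max_words=20):
    """Walk the chain from a random starting prefix."""
    prefix = random.choice(list(chain))
    out = list(prefix)
    while len(out) < max_words:
        followers = chain.get(tuple(out[-len(prefix):]))
        if not followers:
            break
        out.append(random.choice(followers))
    return " ".join(out)

history = ["the bot was surprisingly coherent",
           "the bot was surprisingly fast today"]
chain = build_chain(history, order=2)
print(generate(chain))
```

With a real channel's worth of history and a prefix length picked randomly between 2 and 4 per message, the output stays locally coherent because every transition was literally said by someone at some point.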
It definitely can be disabled post-install, but it's much simpler to leave it out at install time, with the added benefit of not pulling in 2-5 GB of other things that won't be relevant to your use case. It's not that the disk waste is a big deal, but any issues you run into will be that much easier to troubleshoot with fewer moving parts.
That wasn't quite the takeaway I was going for. You can get a lot done on 8 GB of RAM. I was just trying to point out that it would probably be your first bottleneck as you started to scale out, and that you should consider using the server headless to make the RAM you have go that much further.
All of those would be perfectly cromulent nodes for small containers. The first issue you'll run into is the low RAM. Some homelab projects would cause you to exceed 8 GB, but the good news is that if you're using an external backend via NFS, you can always scale out (more nodes) or up (more compute per node) later with minimal headache.
If you're going to be memory constrained, don't waste 1-2 GB on a GUI; install Ubuntu/Debian/whatever headless.
Dran to Linux Gaming • PC Gamer: "I'm brave enough to say it: Linux is good now, and if you want to feel like you actually own your PC, make 2026 the year of Linux on (your) desktop" (English)
6 · 1 month ago

Compsci labs or everywhere?
Dran to Privacy@lemmy.ml • I didn't realize my LG TV was spying on me until I turned off this setting
42 · 2 months ago

Not OP, but I think this guy is remembering a scene from Silicon Valley, not from reality. That said, it's probably not that far off: Amazon smart devices absolutely have this "feature" in production today, and it's opt-out, not opt-in.
https://en.wikipedia.org/wiki/Amazon_Sidewalk
Dran to Technology • Half-Life 3 Reportedly Delayed Due to Steam Machine Price, Leak Claims (English)
10 · 2 months ago

Not the same chips, but DDR5, GDDR7, and HBM2 are made from the same wafers in the same plants. The issue is wafer allocation and production time skewing toward the higher-margin parts. DDR5, additionally, is increasingly being made as the server ECC variant, which companies are buying in droves for cost-efficient MoE inference.
Have you played Baldur's Gate 3 and Expedition 33 yet?
It means nothing, except that the meaninglessness of it annoys boomers, so the zoomers and alphas keep doing it to annoy the boomers harder. It’s literally that simple
What is the flag for this?
CGNAT does have a designated range by spec: 100.64.0.0/10, which covers addresses from 100.64.0.0 to 100.127.255.255. Technically they could be using any other private address space instead, but that would be very uncommon in a modern ISP.
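A quick way to test whether an address falls in that shared space, using only Python's stdlib `ipaddress` module (the sample addresses below are just illustrations):

```python
# Check whether an IP lies in the CGNAT shared address space
# (100.64.0.0/10, per RFC 6598).
import ipaddress

CGNAT = ipaddress.ip_network("100.64.0.0/10")

def is_cgnat(addr: str) -> bool:
    return ipaddress.ip_address(addr) in CGNAT

print(is_cgnat("100.64.0.1"))        # True  (start of the range)
print(is_cgnat("100.127.255.255"))   # True  (end of the range)
print(is_cgnat("100.128.0.0"))       # False (just past the /10)
print(is_cgnat("192.168.1.1"))       # False (RFC 1918 private, not CGNAT)
```

If your "WAN" address on the router lands in that block, you're almost certainly behind carrier-grade NAT.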
Dran to Technology • I built an AI app that helps people choose what to watch in seconds (English)
10 · 2 months ago

No offence, but the problem is that an app forces me to trust you; a website does not. I have tighter and easier control over a web request than I do over an app, and even if an app doesn't have these permissions today, an update (or an update after a sale) could trivially and silently introduce them.
With a website, it's obvious when the deal changes: you put up a login wall to harvest data, I stop using the site; you put trackers and ads into the UI, I block them at the DNS level.
Dran to Technology • I built an AI app that helps people choose what to watch in seconds (English)
19 · 2 months ago

First instinct: being an app gives me over-permissive data-collection scam vibes. I will not be installing it, even though I might otherwise find a website of similar capability useful.
Unfortunately, not only does it have to be companies, but unless you already produce HDMI-certified products, your membership application will be denied. It would take a lot of fuckery to create that many corporations without having all of their membership applications denied. Also, I'm not sure it's even a voting democracy in the traditional sense, even if you could get in.
Dran to Linux@lemmy.ml • Have Nvidia drivers on Linux gotten worse over later generations?
2 · 2 months ago

I suspect the difference in experiences is due more to X11/Pulse (my custom systems) vs Wayland/PipeWire (Bazzite) than to any particular GPU vendor or driver branch. Which I guess is a roundabout way of saying:
Maybe? Probably?
Judging by the ProtonDB entry for CS2, I strongly suspect I would have at least the audio issue regardless of GPU.
Dran to Linux@lemmy.ml • Have Nvidia drivers on Linux gotten worse over later generations?
31 · 2 months ago

Appreciate the recommended fixes. I did find similar advice and was able to work through some of the issues with CS2, but I did that on instinct, and it wasn't until I was halfway through troubleshooting the second of the two games I tried that I realized the machine wasn't where I needed it to be for a remote-support hand-me-down.
I did briefly entertain the idea of setting up RustDesk on it, but the atomic nature plus Wayland made unattended access (read: "help, I broke it and I can't log in") not really viable. By the time I got to "hrm, I could probably set up a reverse SSH tunnel into my homelab for persistent support?" I decided Windows was probably the play here.
Dran to Linux@lemmy.ml • Have Nvidia drivers on Linux gotten worse over later generations?
32 · 2 months ago

Like the other guy said, I think this is a Bazzite-induced problem. I have other Linux systems at home: my daily driver and my wife's daily driver are both highly custom Ubuntu Server derivatives, we both have Nvidia GPUs (3050, 5070), and neither of us has similar issues.
The reason I wanted to try Bazzite was that I didn't want to remotely support something super custom.
Dran to Linux@lemmy.ml • Have Nvidia drivers on Linux gotten worse over later generations?
52 · 2 months ago

I just went to repurpose some old hardware for my nephew (4790K + 32 GB DDR3 + RTX 3050), which I thought would make a very passable Bazzite box. I put two drives in the test rig: one with Bazzite (Nvidia + KDE) and one with Win11 running via the Rufus TPM-bypass hacks.
CS2 ran at ~40 fps in Bazzite, with no sound once you got in game; Win11 ran at ~100 fps.
Helldivers 2 ran at ~50 fps in Bazzite with constant frame drops, even after letting it precompile shaders. On Windows it was a very playable 70 fps.
I mainline Linux myself, and I wanted Bazzite to be the set-and-forget answer, but it really wasn't. I can't in good faith hand that build over to a 12-year-old with Bazzite on it, and that was super disappointing.
There are server chips like the E7-8891 v3 that lived in a weird middle ground of supporting both DDR3 and DDR4. On paper it's about on par with a Ryzen 5 5500, and they're about $20 on US eBay. I've been toying with the idea of buying an aftermarket/used server board to see if it holds up the way it appears to on paper: $20 for a CPU (could even slot two), $80 for a board, $40 for 32 GB of DDR3 in quad channel. ~$160 for a set of core components doesn't seem that bad in modern times, especially if you can use quad/oct channel to offset the bandwidth difference between DDR3 and DDR4.
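The channel-count offset checks out on the back of an envelope. A quick peak-bandwidth calculation (the DDR3-1600 and DDR4-3200 speeds below are my example figures, not from the comment):

```python
# Peak memory bandwidth: transfers/s * 8-byte channel width * channel count.
def bandwidth_gbs(mt_per_s, channels, bus_bytes=8):
    return mt_per_s * bus_bytes * channels / 1000  # GB/s

ddr3_quad = bandwidth_gbs(1600, 4)   # DDR3-1600, quad channel
ddr4_dual = bandwidth_gbs(3200, 2)   # DDR4-3200, dual channel
print(ddr3_quad, ddr4_dual)          # 51.2 51.2 -> dead even on paper
```

So four channels of cheap DDR3-1600 match two channels of DDR4-3200 in raw throughput, though latency and per-module speed still favor the newer parts.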
I think finding a cooler and a case would be the hardest part.