An algorithm, among the many other things they ruin
What a joke! How about talking about the good things algorithms do, like everything a computer does?
I’ve been using glauth + Authelia for a couple years with no issues and almost zero maintenance.
Yes, absolutely. Ideally there would be an automated check that runs periodically and alerts if things don’t work as expected.
Monitoring whether the backup task succeeded is important, but that’s the easy part of ensuring it works.
A backup is only working if it can be restored. If you don’t test that you can restore it in case of disaster, you don’t really know if it’s working.
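A minimal restore check can be sketched like this (paths and the tar-based backup are just examples; a real check would pull the latest backup from wherever it’s actually stored):

```shell
# Hypothetical restore test: take a "backup", restore it to a fresh
# location, and verify the contents match. Run it periodically and
# alert if it ever fails.
set -e
src=$(mktemp -d); dst=$(mktemp -d)
echo "important data" > "$src/file.txt"
tar -czf "$src.tgz" -C "$src" .      # the "backup"
tar -xzf "$src.tgz" -C "$dst"        # restore it elsewhere
diff -r "$src" "$dst" && echo "restore OK"
```

Hook something like this into cron or a systemd timer and have the failure path send the alert.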
Ah got it. I didn’t know there was a free tier!
How do you use ChatGPT anonymously? It requires a valid login linked to a payment method. It doesn’t get any less anonymous than that.
The main “instability” I’ve found with testing or sid is just that, because new packages are added quickly, you’ll sometimes have dependency clashes. Pretty much every time, the package manager will keep things sane and avoid upgrading a package that would cause an incompatibility. The main issue is if at some point you decide to install something whose dependencies conflict with something you already have installed. Those are usually solvable with a little aptitude-fu, as long as there are versions available to sort things out neatly.
A better first step to newer packages is probably stable with backports, though.
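For example (a sketch, assuming Debian 12 “bookworm”; adjust the codename to your release):

```shell
# Enable the backports repository:
echo 'deb http://deb.debian.org/debian bookworm-backports main' \
  | sudo tee /etc/apt/sources.list.d/backports.list
sudo apt update
# Backports are never pulled in by default; you opt in per package
# with the target-release flag:
sudo apt install -t bookworm-backports linux-image-amd64
```

Everything else on the system stays plain stable; only the packages you explicitly request come from backports.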
There’s not much use in going Ubuntu or Mint, unless you have specific issues with Debian that don’t happen with those. Even then, it may be one apt install away from a fix.
If you want to try out BSD, power to you. I wouldn’t experiment on a backup computer though, unless by backup you just mean you want to have the spare hardware and will format it with Debian if you ever need to make it your main computer anyway.
Otherwise, just run Debian!
Stability is no longer an advantage when you are cherry picking from Sid lol.
This makes no sense. When 95% of the system is based on Debian stable, you get pretty much full stability of the base OS. All you need to pull in from the other releases is Mesa and related packages.
Perhaps the kernel as well, but I suspect they’re compiling their own with relevant parameters and features for the SD anyway, so not even that.
Why would they manually package them? Just grab the packages you need from testing or sid. This way you keep the solid Debian stable base OS and still bring in the latest and greatest of the things that matter for gaming.
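Apt pinning makes this straightforward. A sketch (the package globs are examples; real Valve/SteamOS packaging may differ):

```
# /etc/apt/preferences.d/mesa-testing — keep stable as the default,
# but let the graphics stack come from testing.
Package: *
Pin: release a=stable
Pin-Priority: 900

Package: mesa-* libgl1-mesa-* libglx-mesa*
Pin: release a=testing
Pin-Priority: 990
```

With both releases in sources.list, apt resolves everything else from stable and only the pinned packages (plus their required dependencies) from testing.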
I don’t think I’ve ever come across a DNS provider that blocks wildcards.
I’ve been using wildcard DNS, and certificates to accompany it, both at home and professionally in large-scale services (think hundreds to thousands of applications) for many years without an issue.
The problem described in that forum is real (and in fact is pretty much how the recent attack on Fritz!Box users works) but in practice I’ve never seen it being an issue in a service VM or container. A very easy way to avoid it completely is to just not declare your host domain the same as the one in DNS.
If they’re all resolving to the same IP and using a reverse proxy for name-based routing, there’s no need for multiple A records. A single wildcard should suffice.
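The setup can be sketched like this (domains, IPs, and ports are placeholders):

```nginx
# DNS: one wildcard record instead of an A record per service:
#   *.home.example.com.  300  IN  A  203.0.113.10
# The reverse proxy then routes on the Host header:
server {
    listen 80;
    server_name app1.home.example.com;
    location / { proxy_pass http://127.0.0.1:8081; }
}
server {
    listen 80;
    server_name app2.home.example.com;
    location / { proxy_pass http://127.0.0.1:8082; }
}
```

Adding a new service is then just a new server block; no DNS change needed.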
That “something else” already exists and is called Matrix. I hope more people and projects migrate there!
Ok so it’s unknown.
Whilst I agree that it’s unlikely it was an RCE in EAC, as has been floating around, nothing can be entirely discarded yet.
I do agree that it’s likely safe to play Halo. That said, if the hack happened due to calls made from Apex to EAC, that means EAC’s APIs made it possible (still unlikely to be an RCE, though). With that in mind, bugs or malicious code in any game that interacts with the EAC APIs could cause the same issue.
This is one of the dangers of kernel-level anti-cheat systems.
It should be safe(r) on Linux though, as it has no direct access to the kernel.
It’s not clear (to me) if EAC was a factor in the hack.
Regardless, on Linux it runs under Proton, so it should be entirely in userspace. On Windows it runs in the kernel, which makes it a lot more dangerous.
Has this been established? Have EA published their findings somewhere?
Yeah I don’t think that’s gonna work. It uses Wayland which AFAIK is not supported by the proprietary Nvidia driver. No idea about the open source one but I don’t think that’s ready for prime time yet anyway.
The games I play on my hardware tend to perform the same or a little better on Linux.
I’m not saying this is true generally but it is for my relatively small sample.
For reference, I have a recent Radeon GPU. Games like Cyberpunk 2077, Baldur’s Gate 3 and even Starfield (which I haven’t played in a while because 🥱) all fit this experience.
The open source driver for Nvidia seems to be catching up lately, so hopefully everyone will soon have a prime time on Linux!
It’s been 2 years since I built my gaming rig. I’ve booted Windows on it once, and at this point I don’t even have a Windows partition anymore.
Ah NFS… It’s so good when it works! When it doesn’t though, figuring out why is like trying to navigate someone else’s house in pitch dark.