Maybe.
Linux won because it worked. Hurd was stuck in research and development hell and was never able to catch up.
> However, Linus’s kernel was more elaborate than GNU Hurd, so it was incorporated.
Quite the opposite.
GNU Hurd was a microkernel, using lots of cutting edge research, and necessitating a lot of additional complexity in userspace. This complexity also made it very difficult to get good performance.
Linux, on the other hand, was just a bog standard Unix monolithic kernel. Once they got a libc working on it, most existing Unix userspace, including the GNU userspace, was easy to port.
Linux won because it was simple, not elaborate.
Zsh
No plugin manager. Zsh has a builtin plugin system (`autoload`) and ships with most things you want (like Git integration).
My config: http://github.com/cbarrick/dotfiles
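If you want a flavor of the builtin approach, here’s a minimal sketch using only what ships with zsh (the prompt layout is just an example, not taken from that config):

```zsh
# ~/.zshrc — completion and a Git-aware prompt with zsh builtins only
autoload -Uz compinit && compinit        # initialize the completion system
autoload -Uz vcs_info                    # builtin VCS (Git) integration
zstyle ':vcs_info:git:*' formats '(%b)'  # display the current branch
precmd() { vcs_info }                    # refresh Git info before each prompt
setopt prompt_subst                      # allow parameter expansion in PROMPT
PROMPT='%~ ${vcs_info_msg_0_} %# '       # e.g. "~/src (main) %"
```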
Exactly.
My take is that the issue isn’t with tmpfiles.d, but rather the decision to use it for creating home directories.
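For context, tmpfiles.d entries are one-line declarations like this (hypothetical path and user; see tmpfiles.d(5)):

```
# /etc/tmpfiles.d/home-alice.conf — "d" creates the directory with the
# given mode and owner if it doesn't already exist
d /home/alice 0700 alice alice -
```

The format itself is fine; using it for home directories is the questionable part.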
Yeah, but I want both GPU compute and Wayland for my desktop.
Long term, I expect Vulkan to be the replacement to CUDA. ROCm isn’t going anywhere…
We just need fundamental Vulkan libraries to be developed that can replace the CUDA equivalents:

- `cuFFT` -> `vkFFT` (this definitely exists)
- `cuBLAS` -> `vkBLAS` (is anyone working on this?)
- `cuDNN` -> `vkDNN` (this definitely doesn’t exist)

At that point, adding Vulkan support to XLA (Jax and TensorFlow) or ATen (PyTorch) wouldn’t be that difficult.
Unfortunately, those of us doing scientific compute don’t have a real alternative.
ROCm just isn’t as widely supported as CUDA, and neither is Vulkan for GPGPU use cases.
AMD dropped the ball on GPGPU, and Nvidia is eating their lunch. Linux desktop users be damned.
There’s a Wikipedia article on multiple encryption that talks about this, but the arguments are not that compelling to me.
The main thing is mostly about protecting your data from flawed implementations. Like, AES has not been broken theoretically, but a particular implementation may be broken. By stacking implementations from multiple vendors, you reduce the chance of being exposed by a vulnerability in one of them.
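To sketch the layering idea concretely, you can cascade two independent implementations from the command line (file names and passphrases below are placeholders):

```sh
# layer 1: encrypt with OpenSSL's AES implementation
openssl enc -aes-256-cbc -pbkdf2 -salt -in secret.txt -out secret.aes \
  -pass pass:passphrase1
# layer 2: encrypt again with GnuPG's independent AES implementation
gpg --batch --pinentry-mode loopback --symmetric --cipher-algo AES256 \
  --passphrase passphrase2 -o secret.aes.gpg secret.aes
# decrypt in reverse order
gpg --batch --pinentry-mode loopback --passphrase passphrase2 \
  -d -o secret.aes secret.aes.gpg
openssl enc -d -aes-256-cbc -pbkdf2 -in secret.aes -out secret.txt \
  -pass pass:passphrase1
```

A bug in either implementation alone still leaves the other layer intact.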
That’s way overkill for most businesses. That’s like nation state level paranoia.
No, you don’t split the file. You split the master decryption key.
Each user just needs to remember their own password, and SSS can reconstruct the master key when enough users enter their passwords.
> multiple people to agree on decrypting a file
For that, you would use Shamir’s Secret Sharing algorithm rather than multiple encryption.
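If you want to play with this without rolling your own crypto, the `ssss` utility (packaged in Debian and most other distros) implements Shamir’s scheme. A sketch of a 3-of-5 policy:

```sh
# split a master key into 5 shares, any 3 of which reconstruct it
echo "example-master-key" | ssss-split -t 3 -n 5
# later: any 3 shareholders enter their shares to recover the key
ssss-combine -t 3
```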
AES has been accelerated on all Intel CPUs since Broadwell, was common as far back as Sandy Bridge, and has been available since Westmere.
AMD has had AES acceleration since Bulldozer.
But the commenter is right that adding a second layer of encryption is useless in everything except very specific circumstances.
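On Linux you can check whether your CPU advertises the AES instructions:

```sh
grep -m1 -wo aes /proc/cpuinfo   # prints "aes" if AES-NI is available
```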
Does Hyprland support scrolling tiling?
That seems to be the new hotness in tiling WMs.
> (Also, you can change the default editor `visudo` uses, but I don’t remember the command because I won’t be changing it until I get a grip on vim and can make a decision about which editor I want to use.)
It just uses your preferred editor, which you set with the `EDITOR` environment variable. In fact, any program that opens an editor should use this to determine the user’s preference.
I set mine to VS Code:
export EDITOR="code -nw"
Examples of programs that use this variable include `visudo`, `crontab -e`, and `git commit`.
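As an illustration of the convention, here’s a tiny sketch of how a program can honor it (the fallback to `vi` is just a common choice, not a rule):

```sh
#!/bin/sh
# open a scratch file in the user's preferred editor, then print the result
tmpfile=$(mktemp)
# ${EDITOR:-vi} is deliberately unquoted: the variable may carry flags,
# e.g. EDITOR="code -nw"
${EDITOR:-vi} "$tmpfile"
cat "$tmpfile"
rm -f "$tmpfile"
```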
> After two years of development and some deliberation, AMD decided that there is no business case for running CUDA applications on AMD GPUs. One of the terms of my contract with AMD was that if AMD did not find it fit for further development, I could release it. Which brings us to today.
Yeah, if you’re not up to the challenge of maintaining your own system, Arch isn’t for you.
Nothing. They’re mostly the same thing.
The Ubuntu version will sometimes print “ads” to your terminal :P.
For a prod server, I’d choose Debian over Ubuntu if I didn’t have paid support, because I’m not a fan of Canonical. If I needed paid support, I’d choose Ubuntu, because Debian is strictly a community distro. (That community happens to include major companies, like Google.)
See the other thread.
TL;DR: Useful abstractions and a hell of a dependency solver.
Sure, but the interface is probably just as important as the actual logic behind it, isn’t it?
The logic is why I love Apt. It’s the most robust dependency resolver I’ve used.
But also, I don’t have any issues with the CLI. Having a distinction between `apt-get` and `apt-cache` and `apt-mark` doesn’t feel weird to me. You’re practically just separating the top-level subcommands by a dash instead of a space. The `apt` command is really just a convenience thing, and there are specialized tools for the more advanced things. Which is fine by me.

Also, the top-level `apt` command doesn’t guarantee a stable CLI, so for scripting you’re supposed to use `apt-get` and friends anyway.
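Concretely, the split looks like this (package name is just an example):

```sh
sudo apt-get install htop   # apt-get: installing, removing, upgrading
apt-cache show htop         # apt-cache: querying package metadata
apt-cache depends htop      # ...including dependency information
sudo apt-mark manual htop   # apt-mark: managing auto/manual install flags
```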
Honestly I would consider that one of the fundamental things a package manager must do.
You’d be surprised. Homebrew (the de facto standard package manager for macOS) doesn’t do this. Though, you can at least look up the “leaf” packages, which are not dependencies of any other package.
And, most language-specific package managers can’t do this. E.g. if you install software with `pip` or `cargo`.
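The Homebrew leaf lookup, for reference:

```sh
brew leaves   # installed formulae that aren't dependencies of another formula
```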
> you have to remember the extra step after it’s finished installing
If the package is in use, it shouldn’t be an orphan.
For example, what if you race with a cleanup job that is removing orphans? (Debian is hyper stable, so I often enable unattended upgrades with autoremove. I’m not so comfortable doing that on Arch ;)
What you’ve described is just an `apt-get install` when you start and an `apt-get remove` when you’re done. Or, more properly, setting it as a build dependency in your source package to let Apt handle it.
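In other words, something like this (with `build-essential` standing in for whatever toolchain you need):

```sh
sudo apt-get install build-essential   # pull in the toolchain
./configure && make && sudo make install
sudo apt-get remove build-essential    # drop it when you're done
sudo apt-get autoremove                # ...plus anything it dragged in
```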
But also, why uninstall build tools?
This one, at least the version constraints, is another I’d consider essential tbh. The rest are great though, I agree.
Yeah, version constraints are common. But most other package managers bail with an error when they encounter a conflict. Apt is really good about solving conflicts and proposing solutions. Often it will propose multiple solutions to your conflict for you to choose from.
Again, it’s the solver part of Apt that makes it the best IMO.
+1
From an order of magnitude perspective, the max is terabytes. No “normal” users are dealing with petabytes. And if you are dealing with petabytes, you’re not using some random poster’s program from reddit.
For a concrete cap, I’d say 256 tebibytes…