I think your reply would have been more useful if you’d given some pointers about how, instead of just “do it right”.
It’s fine, but not going to be the cheapest.
Cheap to buy: Any old PC desktop, really. Most will run Linux and Windows fine, depending on what you want. Anywhere from free to £100. If you have an old desktop or laptop already, use that to start with.
Cheap to run: Any mini PC. I run a Lenovo ThinkCentre M53 for low power duties. Cost £40 and runs silently at 10 watts, idle. (I have a secondary, much beefier server for other stuff that runs at around 100 watts, which lives in the garage.)
But plenty of people do run Mac Minis as home servers, often on Linux. They're fine - just do your homework on the CPU capability, how much RAM you can add, and whether you're okay with external disks if you can't fit enough inside.
As far as I can read from that, they're still maintainers - they've just had their credit removed from the contributors page, no?
Still a strange thing to do, and I look forward to an explanation.
Fair play - it’s good that there’s choice and if it works for you, great. I also totally get the fun of building something yourself.
The local storage is a big one if you don't have a NAS or home server on the network. Although, if you're linked into the *arrs, then I'd think most people already do. It's nice when new episodes just turn up automagically in Jellyfin.
I tried Kodi before but I found the commercialisation of it very jarring. Jellyfin is entirely free - your fifth point might give it extra credit for that. The Jellyfin app doesn't (afaik) feed any info to anyone, but you do need to load it from the Amazon Fire menu, so you can't entirely skip their advertising. It's the only thing I use the Fire Stick for, and the price is cheap compared to anything else - it cost £25 and works on any TV. Being a dongle, there's no noise either.
Why not? It's a computer that displays TV? At 4K with 5.1 audio, that's not too shabby, no?
I made a PC specifically for streaming video back before sticks were a thing, but it was expensive, noisy and not very good in comparison, and I don't miss it. What about a stick is inferior to what you're talking about? Genuine question - educate me, please. What software, what hardware, why choose it over something else?
Using the native Jellyfin app available for Amazon's Fire TV Stick.
Not really suitable for a Home Theatre PC
Not sure where you got that idea, but it's absolutely what I use it for. That I can also watch content from multiple sources is part of the appeal. Plus there's no constant upsell like with Kodi and Emby.
'Scuse the cut and paste, but this is something I recently thought quite hard about and blogged, so I'm stealing my own content:
What to back up? This is a core question to ask when you start planning. I think it’s quite simply answered by asking the secondary question: “Can I get the data again?” Don’t back up stuff you downloaded from the public internet unless it’s particularly rare. No TV, no Movies, no software installers. Don’t hoard data you can replace. Do back up stuff you’ve personally created and that doesn’t exist elsewhere, or stuff that would cause you a lot of effort or upset if it wasn’t available. Letters you’ve written, pictures you’ve taken, code you authored, configurations and systems that took you a lot of time to set up and fine tune.
If you want to be able to restore a full system, that's something else and is best dealt with by imaging – I'm talking about individual file backups here!
Backup Scenario
Multiple household computers. Home Linux servers. Many services running natively and in Docker. A couple of Windows computers.
Daily Backups
Once a day, automate backups of your important files.
On my Linux machines, that's directories like /etc, /root, /docker-data, and some shared files.
On my Windows machines, that's some mapping data, Word documents, pictures, geocaching files, generated backups and so on.
You work out the files and get an idea of how much space you need to set aside.
Then, with automated methods, have these files copied or zipped up to a common directory on an always-available server. Let’s call that /backup.
These should be versioned, so that older ones get expired automatically. You can do that with bash scripts or with automated backup software (I use backup-manager for local machines, and BackupPC or robocopy for Windows ones).
How many copies you keep depends on your preferences – 3 is a sound number, but choose what you want and what disk space you have. More than 1 is a good idea since you may not notice the next day if something is missing or broken.
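For illustration, here's a minimal bash sketch of that kind of nightly job - the paths, per-hostname layout and three-copy retention are just examples matching the scenario above, not the exact scripts I run:

```bash
#!/bin/bash
# Nightly backup sketch: archive important directories into /backup,
# keeping only the three newest archives for this machine.
DEST="/backup/$(hostname)"
mkdir -p "$DEST"
tar -czf "$DEST/files-$(date +%F).tar.gz" /etc /root /docker-data
# Expire old versions: list newest first, delete everything after the third.
ls -1t "$DEST"/files-*.tar.gz | tail -n +4 | xargs -r rm --
```

Drop something like that into root's crontab and you're done - the retention count and the list of directories are where you'd tailor it.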
Monthly Backups – Make them Offline if possible
I puzzled a long time over the best way to do offline backups. For years I would manually copy the contents of /backup to large HDDs once a month. That took an hour or two for a few terabytes.
Now, I attach an external USB hard drive to my server, with a smart power socket controlled by Home Assistant.
This means it’s “cold storage”. The computer can’t access it unless the switch is turned on – something no ransomware knows about. But I can write a script that turns on the power, waits a minute for it to spin up, then mounts the drive and copies the data. When it’s finished, it’ll then unmount the drive and turn off the switch, and lastly, email me to say “Oi, change the drives, human”.
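As a sketch of what that script might look like (assuming a Home Assistant switch entity called switch.backup_drive and a long-lived access token - the entity name, addresses and paths are all placeholders, not my actual setup):

```bash
#!/bin/bash
# Cold-storage sketch: power the drive on via a Home Assistant smart
# socket, copy the data across, then power it off and nag the human.
HA="http://homeassistant.local:8123"
TOKEN="your-long-lived-access-token"   # placeholder

ha_switch() {  # usage: ha_switch turn_on|turn_off
  curl -s -X POST "$HA/api/services/switch/$1" \
    -H "Authorization: Bearer $TOKEN" \
    -H "Content-Type: application/json" \
    -d '{"entity_id": "switch.backup_drive"}' > /dev/null
}

ha_switch turn_on
sleep 60                               # let the drive spin up
mkdir -p /mnt/offline
mount /dev/disk/by-label/OFFLINE /mnt/offline
rsync -a /backup/ /mnt/offline/
umount /mnt/offline
ha_switch turn_off
# Assumes a configured local mailer (msmtp, mailutils, etc.)
echo "Oi, change the drives, human" | mail -s "Monthly offline backup done" me@example.com
```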
Once I get that email, I open my safe (fireproof and in a different physical building) and take out the oldest of three USB caddies. Swap it with the one on the server and put that one away. Classic Grandfather/Father/Son backups.
Once a year, I relabel the oldest of those caddies as "Annual backup, 2024" and buy a new one. That way no monthly drive will be older than three years, and I have a (probably still viable) backup for each year.
BTW – I use USB3 HDD caddies (and do test for speed – they vary hugely) because I keep a fair bit of data. But you can also use one of the large-capacity USB thumb drives or microSD cards for this. It doesn't really matter how slowly it writes, since you'll be asleep when it's backing up. But you do really want it to be reasonably fast to read data from, and also large enough for your data – the above system gets considerably less simple if you need multiple disks.
Error Check: Of course with automated systems, you need additional automated systems to ensure they’re working! When you complete a backup, touch a file to give you a timestamp of when it was done – online and offline. I find using “tree” to catalogue the files is worthwhile too, so you know what’s on there.
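Something as simple as this at the end of the backup job does the trick (paths again just examples):

```bash
# Touch a dated marker file, then catalogue what was backed up
# so you can see at a glance what's on the drive.
touch "/mnt/offline/backup-done-$(date +%F)"
tree /backup > /mnt/offline/contents.txt
```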
Lastly – test your backups. Once or twice a year, pick a backup at random and ensure you can copy and unpack the files. Ensure they are what you expect and free from errors.
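A spot check can be as light as reading back a randomly-picked archive without extracting it - this assumes the tar layout from the sketch earlier:

```bash
# Pick a random archive and verify it reads back cleanly;
# a non-zero exit from tar means the archive is damaged.
PICK=$(ls /backup/*/files-*.tar.gz | shuf -n 1)
tar -tzf "$PICK" > /dev/null && echo "$PICK reads back OK"
```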
Surely y'all have monitoring and alerts for excessive CPU load already?
Especially during the transition from 2 to 3. Let’s hope that’s all behind us.
Same. Just…works?
Why did we have to learn what modelines were to get a picture on screen?
This just makes me worried about relying on uBO, but more because of what happens if the author just fucks off because someone else pissed them off.
That is very concerning to me, also.
Large parts of the internet relying on one or two tiny one-man FOSS projects? (uBO and AdGuard are often cited as the only two reliable-ish and safe adblockers.)
If he can't be bothered with that nonsense, how secure is uBO's future? How secure is the future of adblocking?
I would bet that advertising companies are rubbing their hands now and planning to ramp up the pressure on these poor devs.
I think Ubuntu made sense back in the day when Debian wasn’t as user-friendly.
This is a very good point.
When Ubuntu launched, it was a big moment for Linux. Before then, setting up a Linux GUI was a lot of pain (remember setting modelines for individual monitors and the endless fiddling that took - and forget about multiple monitors). Ubuntu made the GUI easy - it just worked out of the box for most people. It jumped Linux forward a huge way as a desktop, and adoption grew a lot. They also physically posted you a set of CDs or a DVD for free! And they did a bunch of stuff for educational use, and for getting computers across Africa.
That was all pretty amazing at the time and all very positive.
But then everyone else caught up on usability, and they turned into a corporate entity. Somewhere along the way they stopped listening to their users, or at least the users felt they had no voice, and a lot more Linux distros appeared.
I used to write to DVDs, but the failure rate was astronomical - like 50% after 5 years, some with physical separation of the silvering. Plus they're so relatively small today that they're not worth using.
I’ve gone through many iterations and currently my home setup is this:
Having the HDDs in the safe means that total failure/ransomware takes, at most, a month's worth. I can survive that. The safe is also fireproof and in a different building from the server.
This sort of thing doesn't need high-capacity HDDs either - USB drives and microSD cards are very capable now. If you're limited on physical space and don't mind slower write times (which is generally OK when automating), microSDs with clear labelling are just as good. You're not going to kill them through excessive writes for decades.
I also have a bunch of other stuff that isn't critical - media files, music. None of that is unique and it can all be replaced. All of it is backed up to a secondary "live" directory on the same PC - mostly in case of my own incompetence in deleting something I actually wanted. But none of it is essential - I think it's important to be clear about what you "must save" versus what's "nice to save".
The key thing is to sit back and work out a system that's right for you. And it always, ALWAYS should be as automated as you can make it - humans are lazy sods and easily justify not doing stuff. Computers are great at remembering to do repetitive tasks, so use that.
Include checks to ensure the backed-up data is both what you expect it to be and recoverable - so set a calendar reminder to actually /read/ from a backup drive once or twice a year.
I’m inclined to give Linux more benefit of the doubt than, say, Windows. That’s because of the motives behind it.
Microsoft have a very long history of making design choices in their software that users don't like, quite often because those choices suit their own interests more than their customers'. They are a commercial business that exists to benefit itself, after all. Same with Apple. Money spoils everything pure. You mention privacy, but that's just one more example of someone wanting to benefit financially from you - just in a less transparent and more open-ended way than paying them some cash.
Linux, because that monetary incentive is far smaller, is usually designed simply "to be better". The developers are often primary users of the software themselves. Sure - sometimes developers make choices that confuse users, but that overarching driving business interest just isn't there.
If it did, then the world would be a perfect place by now. Indeed, many things are better - but there’s enough people hard at work sowing discontent and hate to ensure it isn’t.
Bless you for being an optimist, but I don’t think it works like that. I really wish it did though.
Agree - after they started bundling adware in downloads (2013ish?), all the decent projects seemed to move to GitHub en masse.
Those projects that stayed were mostly already stagnant, or the maintainers didn't use git and didn't want to learn, or had some other reason to accept advertising on their work.
You're making the mistake of assuming his intention is to improve things.
He doesn’t have to make things better, he just has to say he has made them better. That’s all he’s ever done and it’s worked.