• 2 Posts
  • 256 Comments
Joined 4 years ago
Cake day: January 21st, 2021










  • kevincox@lemmy.ml to Selfhosted@lemmy.world · Mini pc arriving tomorrow
    3 months ago

    IMHO Arch is actually a great choice. They do have a minimum update frequency you need to maintain (I don’t recall exactly; I think it is somewhere between 1 and 3 months), but if you keep up with that and read the news before updates (a rough sketch of that workflow is below; you’re usually fine if you don’t, since the update will typically just refuse to run until you intervene), things are pretty seamless. I had many Arch machines running for >5 years with no issues and no reason to expect that to change. That span covers many major version upgrades on other distros, which are often not as seamless.

    That being said, I am on NixOS now, which takes this to the next level. I am running nixos-unstable, but thanks to the way NixOS is structured I don’t need to worry about legacy cruft accumulating over many years of updates.

    And after all of that, I don’t think it really matters. Any major distro you pick, whether stable, release-based or LTS, will be fine. They all have some sort of upgrade path these days (unlike in the past, when some distros just recommended a re-install for major upgrades).
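
    As an illustration of that news-check habit, here is my own rough sketch (not an official tool); it assumes Python and the official Arch news RSS feed at https://archlinux.org/feeds/news/:

    ```python
    # Sketch: show recent Arch news items before running a system update.
    import subprocess
    import urllib.request
    import xml.etree.ElementTree as ET

    NEWS_FEED = "https://archlinux.org/feeds/news/"  # official Arch news RSS

    def recent_news(limit=5):
        """Return (title, date) pairs for the most recent Arch news items."""
        with urllib.request.urlopen(NEWS_FEED) as resp:
            tree = ET.parse(resp)
        items = tree.findall(".//item")[:limit]
        return [(i.findtext("title"), i.findtext("pubDate")) for i in items]

    if __name__ == "__main__":
        for title, date in recent_news():
            print(f"{date}  {title}")
        # Only proceed with the update once the news has been reviewed.
        if input("Proceed with pacman -Syu? [y/N] ").lower() == "y":
            subprocess.run(["sudo", "pacman", "-Syu"], check=True)
    ```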


  • That’s true. And I’m not saying B2 is bad; it is just something you should be aware of.

    Their automatic replication isn’t quite as seamless as GCS or S3, though. For example, deletes aren’t replicated, so you will need a cleanup strategy. Plus, once you 2x or 3x the cost to keep extra copies, B2 isn’t as competitive on price (a rough cost comparison is sketched below). My point is that it is very easy to compare apples to oranges when looking at cloud storage providers, and it is important to be aware of that.

    For me B2 is a great fit and I am happy with it, but I don’t want to mislead people.
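
    To make the apples-to-oranges point concrete, here is a rough sketch with placeholder per-GB prices (illustrative values, not current quotes from any provider):

    ```python
    # Sketch: single-region storage you replicate yourself vs. a provider
    # that is multi-region by default. All prices are placeholders.
    PRICE_PER_GB_MONTH = {
        "b2_single_region": 0.006,   # placeholder
        "s3_standard": 0.023,        # placeholder
        "gcs_multi_region": 0.026,   # placeholder
    }

    def monthly_cost(provider: str, gb: int, copies: int = 1) -> float:
        """Cost of storing `gb` gigabytes, keeping `copies` independent replicas."""
        return PRICE_PER_GB_MONTH[provider] * gb * copies

    gb = 1000  # 1 TB of backups
    print(f"B2, one region:    ${monthly_cost('b2_single_region', gb):.2f}")
    print(f"B2, three regions: ${monthly_cost('b2_single_region', gb, copies=3):.2f}")
    print(f"S3 standard:       ${monthly_cost('s3_standard', gb):.2f}")
    print(f"GCS multi-region:  ${monthly_cost('gcs_multi_region', gb):.2f}")
    ```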


  • I think it depends on your needs. IIUC their storage is “single location”, so a very significant natural disaster could take it offline or maybe even lose it. Something like S3 or Google Cloud Storage (depending on which durability you select) is multi-location (as in significantly distinct geographical regions). So it is still very likely that you will never lose any data, but in extreme cases you potentially could.

    If I were storing my only copy of something it would matter a lot more (although even then you are best off storing with multiple providers, for social reasons as much as technical ones), but for a backup it is fine.







  • “there will be scaling with all of its negative consequences on perceived quality”

    In theory this is true. If you had a nice high-bitrate 1080p video, it might look better on a 1080p display than any quality of 1440p video would, due to losses while scaling. But in almost all cases, selecting a higher resolution will provide better perceived quality because of the higher bitrate, even if it isn’t an integer multiple of the displayed size.

    It would also be more bandwidth-efficient to target the output size directly. But streaming services want to keep the number of different versions small; often this is already >4 resolutions and 2-3 codecs. If they also wanted low/medium/high for each resolution, that would be a significant cost (the encoding itself, storage, and a reduction in cache hits). So they sort of squish resolution and quality together into one scale: 1080p isn’t just 1080p, it also serves as a general “medium” quality, and if you want “high” you need to go to 1440p or 2160p even if your output is only 1080p.
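
    Here is a toy sketch of that combined scale (the renditions and bitrates are made-up numbers, not any service’s actual ladder):

    ```python
    # Sketch: one combined resolution/quality ladder instead of a separate
    # low/med/high tier per resolution. Each rung bumps both resolution and
    # bitrate, so picking a higher resolution is also how you pick quality.
    from dataclasses import dataclass

    @dataclass
    class Rendition:
        height: int        # output height in pixels
        bitrate_kbps: int  # video bitrate

    LADDER = [
        Rendition(480, 1_500),
        Rendition(720, 3_000),
        Rendition(1080, 6_000),   # doubles as the "medium" tier
        Rendition(1440, 10_000),  # effectively "high", even on a 1080p display
        Rendition(2160, 16_000),
    ]

    def pick(bandwidth_kbps: int) -> Rendition:
        """Choose the highest rung whose bitrate fits the available bandwidth."""
        fitting = [r for r in LADDER if r.bitrate_kbps <= bandwidth_kbps]
        return fitting[-1] if fitting else LADDER[0]

    print(pick(12_000))  # Rendition(height=1440, bitrate_kbps=10000)
    ```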


  • “the reason no one posts the bitrates is because it’s not exactly interesting information for the general population.”

    But they post resolutions, which are arguably less interesting. The “general public” has been taught to use resolution as a proxy for quality. For TVs and other screens this is mostly true, but for video it isn’t the best metric (lossless video aside).

    Bitrate is probably a better metric, but even then it isn’t great: different codecs and encoding settings can produce much better quality at the same bitrate. Still, I think in most cases it correlates better with quality than resolution does (a rough bits-per-pixel comparison is sketched below).

    The ideal would probably be some sort of actual perceptual-quality metric, but none of those are perfect either. Maybe we should just go back to Low/Med/High for quality descriptions.
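
    As a rough illustration of why per-pixel bitrate says more than resolution alone (the bitrates here are made-up examples):

    ```python
    # Sketch: normalize bitrate by pixels per second to compare streams of
    # different resolutions on roughly equal footing.
    def bits_per_pixel(width: int, height: int, fps: float, bitrate_kbps: int) -> float:
        """Average bits spent on each pixel of each frame."""
        return (bitrate_kbps * 1000) / (width * height * fps)

    # A starved "1080p" stream vs. a well-fed 720p stream (illustrative numbers).
    print(f"1080p @ 3 Mbps: {bits_per_pixel(1920, 1080, 30, 3000):.3f} bits/pixel")
    print(f" 720p @ 4 Mbps: {bits_per_pixel(1280, 720, 30, 4000):.3f} bits/pixel")
    ```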