• 0 Posts
  • 39 Comments
Joined 1 year ago
Cake day: June 6th, 2023


  • I played around with Mandrake and Debian around the turn of the century. After a bit of a break, I started dual-booting Ubuntu in the Windows Vista/x86 OS X era. I jumped to Xubuntu and started running Linux by itself on several machines around 2012.

    I largely shifted to Arch around the time that Snaps came out, because they weren’t playing nice with some of my low-end machines. Nowadays, mainly Arch. Exceptions: Fedora on my M1, Debian Bookworm on an old x86 tablet, and whenever I set up WSL on a Windows machine.






  • I agree.

    A part of me misses the days of running a rock-solid professional server OS for business alongside a cobbled-together, similar OS for home computers and older hardware.

    Cobbled-together became good enough. Then it became better in some cases. Then it became better in most cases. Now I haven’t bothered with a non-Linux OS for over half a decade.


  • I always assumed that a lot of this boils down to semantics and trademark law.

    OpenIndiana is a direct code-line descendant of Unix System V, by way of Solaris and then OpenSolaris. Thank you for that, Sun Microsystems. I understand (but haven’t checked) that a lot of its code these days is simply ported over from BSD or Linux. If you compare the source code to an old copy of the Lions book, you’re probably not going to see any line-by-line overlap. Thank goodness - we shouldn’t literally be running old operating systems from the '80s. I don’t think that OpenIndiana is UNIX-certified by the Open Group, which holds the trademark.

    The BSDs started out as a sort of ‘Ship of Theseus’ rebuild of an academic-licensed copy of Unix around the time that AT&T was getting litigious and corporate Unixes (Unices?) were starting to Balkanize.

    GNU/Linux started out as a work-alike (it functions the same, but with totally different code) inspired by MINIX, which in turn was an education-licensed Unix work-alike designed to teach basic operating system principles to students. I think that one or more Linux-based operating systems have obtained UNIX certification from the Open Group, just like Apple did for macOS (paying money and passing some tests). It doesn’t seem like any of them are still paying to keep up the certification. Does it matter if they did at one point?

    Going back to proprietary corporate Unixes, I believe that IBM AIX and HP-UX still exist as products. They started out as UNIX, have been developed continuously since then, and are both UNIX-certified. By now, their codebases probably diverge substantially both from one another and from all of the Unix-likes. IBM also has a mainframe OS with a fascinating history that has nothing to do with UNIX. It is UNIX-certified because it passes the right tests and IBM paid for certification, but it is not UNIX code and doesn’t descend from UNIX code.

    Simple as.





  • Ubuntu isn’t my favorite, but I used Xubuntu for many years. A lot of noise gets thrown around about Snaps, but from an end-user perspective they tend to work fine unless you’re on very low-spec hardware. That beats adding a half-dozen repositories that may or may not be around for long. A lot of developers work to make sure that their software runs well on Ubuntu, and the LTS releases tend to be a good option if you don’t want any significant changes for a long time.

    Even with their regular releases, I daisy-chained upgrades on an old Core 2 laptop for something like seven years without any major (computer-becomes-a-paperweight) issues. Sometimes (as with Snaps) Ubuntu insists on going its own way, which can result in errors and annoyances that don’t pop up in other distributions. I’ve had to deal with some minor issues with Ubuntu over the years (broken repositories, upgrades causing hiccups, temporarily falling back to older kernels), but I think you’ll run into issues like that regardless of which distro you pick.



  • I didn’t, but only because my solution wasn’t novel or generalized for other people. I made a script to fire up tmux on a ‘primary’ computer with key-based SSH access to my other computers, load up a set of windows and panes, and SSH into each computer. One window would hold the computers in one section of my home, another window the computers elsewhere. The only challenge was getting a baseline grasp of the tmux scripting syntax.

    I initially set it up to run htop on each computer (the dashboard goal, plus an easy way to terminate programs), but the basic setup was flexible. I could set other programs to run by default and send commands to each computer from any device that could SSH into the primary computer. Automating updates on a computer-by-computer basis is a better solution, but the setup let me quickly oversee and interactively start multiple system updates at once from a phone, tablet, or laptop. A rough sketch of the idea is below.
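    Here’s a minimal sketch of the idea (hostnames and window names are just placeholders, and I’m driving the tmux CLI from Python here rather than pasting my actual script). It only needs tmux on the primary computer and key-based SSH access to each host.

```python
#!/usr/bin/env python3
"""Minimal sketch of a tmux 'dashboard': one window per location, one pane per host."""
import subprocess

SESSION = "dashboard"  # placeholder session name

# Windows grouped by location; hostnames are placeholders.
WINDOWS = {
    "office": ["desktop.lan", "nas.lan"],
    "living-room": ["htpc.lan", "pi.lan"],
}


def tmux(*args: str) -> None:
    """Run one tmux command on the primary computer."""
    subprocess.run(["tmux", *args], check=True)


def build_session() -> None:
    # Create a detached session; the first window is created along with it.
    first = next(iter(WINDOWS))
    tmux("new-session", "-d", "-s", SESSION, "-n", first)

    for name, hosts in WINDOWS.items():
        if name != first:
            tmux("new-window", "-t", SESSION, "-n", name)
        target = f"{SESSION}:{name}"
        for i, host in enumerate(hosts):
            if i > 0:
                tmux("split-window", "-t", target)  # one extra pane per host
            # SSH into the host and start htop; swap in any command you like.
            tmux("send-keys", "-t", target, f"ssh -t {host} htop", "C-m")
        tmux("select-layout", "-t", target, "tiled")


if __name__ == "__main__":
    build_session()
    print(f"Attach from any device with: tmux attach -t {SESSION}")
```

    Attaching to that session over SSH from a phone, tablet, or laptop then gives the interactive htop-per-host view I described.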




  • Bob Smith@sopuli.xyz to Linux@lemmy.ml · Linux tablet?

    At that price range, be sure to carefully check compatibility for your favorite distribution and for any hardware that you intend to use.

    For what it’s worth, I have an old HP Stream 7 that currently runs Debian Bookworm. I think that it cost about $100 new. I can use it as a PDF reader and to sync files, but there are plenty of tradeoffs due to the 1 GB of RAM, the weak Atom processor, the small amount of built-in storage, the mediocre touchscreen, and the generally poor quality of touchscreen interfaces among low-resource window managers. Neither camera works, and several distributions can’t support the built-in audio. Screen rotation is a crapshoot. Forget about low-power standby. Some of these issues are unique to my tablet, but some of them are problems that people tend to run into when they try to shoehorn Linux onto a tablet that wasn’t built with Linux in mind. Something like a PineTab would be a better bet.

    I saw another person suggest a secondhand Surface. If you go this route, carefully research the exact model number to verify that the hardware supports Linux and that there is a clean way of installing your preferred distribution.

    Another thing worth mentioning: installing Linux on hardware like this can be a special kind of hell. Most distributions don’t have a touchscreen-friendly installer. For my cheap tablet, that meant cobbling together a flash drive, a powered USB hub, a USB keyboard, a USB Ethernet adapter, and a USB OTG cable for the single micro-USB port on the tablet. Then I had to race the decade-old tablet battery to the finish line during the install process. Plus something about a 32-bit EFI bootloader combined with a 64-bit processor (a quick check for that mismatch is sketched below).
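    If you want to see whether a tablet has that 64-bit-CPU-with-32-bit-UEFI combination, something like this quick check from a Linux live environment will tell you. This is just an illustrative sketch, not from my install notes; it reads the firmware word size that the kernel exposes under /sys on EFI boots.

```python
#!/usr/bin/env python3
"""Sketch: detect a 64-bit CPU paired with 32-bit UEFI firmware on Linux."""
import platform
from pathlib import Path

# On an EFI boot, the kernel exposes the firmware word size here ("32" or "64").
FW_PLATFORM_SIZE = Path("/sys/firmware/efi/fw_platform_size")


def main() -> None:
    cpu = platform.machine()  # e.g. "x86_64" on a 64-bit Atom
    if not FW_PLATFORM_SIZE.exists():
        print("No EFI firmware info found - probably booted in legacy/BIOS mode.")
        return

    firmware_bits = FW_PLATFORM_SIZE.read_text().strip()
    print(f"CPU architecture:   {cpu}")
    print(f"UEFI firmware size: {firmware_bits}-bit")

    if cpu == "x86_64" and firmware_bits == "32":
        # The classic cheap-tablet combo: the kernel and userland can be 64-bit,
        # but the firmware will only load a 32-bit EFI bootloader
        # (e.g. GRUB built as bootia32.efi).
        print("64-bit CPU with 32-bit UEFI: a 32-bit EFI bootloader is required.")


if __name__ == "__main__":
    main()
```

    If it reports 32-bit firmware, the standard 64-bit EFI images on most installers won’t boot as-is, which is the bootloader headache I mentioned.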




  • Arch seems to target users who are inclined to read the wiki and manpages, so it doesn’t surprise me that beginners run across some saltiness if they approach people who aren’t focused on beginners. Even the installation process seems to be designed as a screening mechanism. It wasn’t a big hurdle when I first tried it out, but it was a small one.

    There are plenty of distributions that focus on people who are just getting started. For whatever it might be worth, this includes several distros based on Arch. I usually suggest Mint or Xubuntu over Debian for people with no prior exposure to Linux. Even though I like it personally, I try not to suggest vanilla Arch to anybody. They can try it if they want to, but there are plenty of reasons to try something else instead.