• 0 Posts
  • 85 Comments
Joined 1 year ago
Cake day: June 21st, 2023


  • My rant was not about your meme. But people actually use this argument seriously, and that frustrates me.

    And I will admit that learning a new system has a time cost, but once you reach experience parity, the time cost per problem is lower and the number of problems is smaller. In that way, the “time spent” is an investment rather than a waste.

    So A+ meme, it triggered me in all the ways it was supposed to.



  • bisby@lemmy.world to linuxmemes@lemmy.world · low effort maymay · 8 days ago

    The thing I hate about the “value your time” argument is that Windows is shit.

    Let’s be generous for a minute and assume that Windows and Linux have the same number of problems. Someone who has been on Windows for the past 30 years has 30 years of acquired knowledge and will probably know quickly how to solve a given problem on Windows, but not on Linux. Someone who has been on Linux for the past 30 years has 30 years of acquired knowledge and will probably know quickly how to solve it on Linux, but not on Windows.

    So the entire argument is just “I have muscle memory tied to Windows, and I already know how to solve those problems, but I don’t know how to solve the Linux ones, so they take me a lot of research and time to solve, therefore all Linux problems always take a lot more time to solve.”

    On Windows, I have to spend time fighting BSODs, figuring out where to download software that isn’t bloated with viruses, and running registry hacks to get rid of start menu ads and stop Microsoft from phoning home. I don’t have to do any of those things on Linux.

    On Linux, my biggest issue today was figuring out how to change the keybinding for taking a screenshot… and that was an easy issue, but it’s also not even possible on Windows.

    So I guess they’re different types of problems. My “wasted” time goes toward customizing my OS/environment so it works the way I want it to, not toward fighting to claw back any ounce of control.



  • “I thought that passing everything through it would allow the USB to feed/write the video stream without any other processing”

    Unfortunately, no. It captures the signal and turns it into something the computer can digest, but that isn’t something that just proxies straight through to Twitch. OBS is always going to do some rendering and re-encoding of its own.
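
    To make that concrete, here is a rough sketch of the same pipeline outside OBS, driving ffmpeg from Python. The device path, bitrate, and ingest URL are assumptions (and audio is omitted entirely); the point is just that the capture card hands the PC frames, and an encoder on the PC still has to turn them into an RTMP stream.

    ```python
    # Minimal sketch of what OBS does under the hood: take frames from a
    # capture device, encode them, and push them to an RTMP ingest.
    # Assumptions: the card shows up as /dev/video0 (hypothetical path),
    # your ffmpeg build has h264_nvenc, and the ingest URL and key are
    # placeholders you would fill in yourself.
    import subprocess

    STREAM_KEY = "YOUR_TWITCH_STREAM_KEY"  # placeholder

    subprocess.run([
        "ffmpeg",
        "-f", "v4l2", "-i", "/dev/video0",  # frames from the capture card
        "-c:v", "h264_nvenc",               # the encode step you can't skip
        "-b:v", "6000k",                    # assumed bitrate cap
        "-f", "flv",                        # RTMP expects an FLV container
        f"rtmp://live.twitch.tv/app/{STREAM_KEY}",
    ])
    ```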

    A few tips:

    If you open OBS settings, there is an “Output” section. You can change the output mode to “Advanced” and then select a “Video Encoder”… this is where you would find NVENC (there might be a way to do it in the simple output mode too, but I don’t have an Nvidia GPU to confirm).

    You’ll most likely want to change the output resolution in the “Video” section of the settings down to 1280x720. Twitch limits your bandwidth anyway, and people tend to find that 1080p at low bandwidth doesn’t look any better than 720p at the same bandwidth (fewer compression artifacts, because the encoder doesn’t have to compress as much, if at all).
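
    If it helps to see why, the back-of-the-envelope math is below: at a fixed bitrate, 720p gets more than twice as many bits per pixel as 1080p, so the encoder doesn’t have to squeeze nearly as hard. The 6000 kbps cap is an assumption (it’s the commonly cited Twitch limit), as is 60 fps.

    ```python
    # Bits-per-pixel at a fixed bitrate for 1080p60 vs 720p60.
    # 6000 kbps is an assumed cap; adjust for your actual settings.
    BITRATE = 6000 * 1000  # bits per second

    for width, height in [(1920, 1080), (1280, 720)]:
        pixels_per_second = width * height * 60
        print(f"{width}x{height}: {BITRATE / pixels_per_second:.3f} bits/pixel")

    # 1920x1080: 0.048 bits/pixel
    # 1280x720:  0.109 bits/pixel -> more than twice the bit budget per pixel
    ```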

    Twitch has an option for bandwidth tests (or at least it used to). It makes their servers accept the stream without actually putting you live on the site, so you can see how your computer handles streaming. On the main OBS dashboard, you’ll see “30.00 / 30.00 FPS” in the bottom-right corner (or whatever framerate you’ve selected). There’s also a CPU meter down there.

    In the Docks menu there’s also a Stats dock. It tells you how many frames were missed due to rendering or encoding lag; if you have 0 missed frames, your PC is handling the encoding just fine. It also lists how many frames were dropped due to NETWORK issues, which indicates a problem between you and Twitch/YouTube: your computer is rendering the frames just fine, but Twitch isn’t receiving them.

    Use the Stats dock to figure out where you are losing frames and then fix that (if it’s rendering/encoding, NVENC or your CPU is struggling; if it’s network, your ISP is struggling). If you aren’t losing frames, you have nothing to worry about. The dock also shows CPU and memory usage, but realistically, if you’re using a 3080 with NVENC, those will probably be very low.
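
    If you’d rather log those numbers over a whole test stream instead of watching the dock, OBS’s built-in obs-websocket server exposes the same counters. Below is a sketch using the third-party obsws-python client; the method and attribute names mirror the v5 protocol’s GetStats/GetStreamStatus requests, but treat the exact names as assumptions and check them against your client version.

    ```python
    # Sketch: poll the Stats-dock counters over obs-websocket (v5).
    # Assumes the websocket server is enabled in OBS (Tools menu) and
    # `pip install obsws-python`. Field names follow the protocol's
    # GetStats/GetStreamStatus responses; verify against your version.
    import time
    import obsws_python as obs

    cl = obs.ReqClient(host="localhost", port=4455, password="YOUR_PASSWORD")

    while True:
        stats = cl.get_stats()           # rendering/encoding lag counters
        stream = cl.get_stream_status()  # network (dropped frame) counters
        print(
            f"render skipped {stats.render_skipped_frames}/{stats.render_total_frames} | "
            f"encode skipped {stats.output_skipped_frames}/{stats.output_total_frames} | "
            f"network dropped {stream.output_skipped_frames}/{stream.output_total_frames}"
        )
        time.sleep(5)
    ```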


  • Even with the Elgato doing “video encoding”, how does it get to Twitch/YouTube? It doesn’t do THAT kind of encoding. It encodes the HDMI capture into a local format that is basically a webcam stream; it still has to be broadcast from OBS. And even if you use the Elgato as a video source, OBS is going to re-encode it into what it wants to broadcast. There isn’t really a way around OBS’s video encoding cost unless you have a device that streams to the internet directly from the capture card (it doesn’t seem like Elgato makes one; someone else might, but that’s not really what capture cards are for).


  • They can. But Elgato also makes a “Camlink” in addition to the “HD60” series, and the Camlink dongles create a UVC device, which can be used as a webcam with no further tweaking necessary. Using a full desktop capture card as a webcam is slightly overkill, but it absolutely works.
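
    If you want to verify the “it’s just a webcam” part, a UVC device can be opened by any generic video API with no Elgato software involved. Here’s a quick sketch with OpenCV; the device index is an assumption (it depends on what else is plugged in).

    ```python
    # Sketch: confirm a UVC device (e.g. a Camlink) acts like a plain
    # webcam. Requires `pip install opencv-python`. Index 0 is an
    # assumption; bump it if you have other cameras attached.
    import cv2

    cap = cv2.VideoCapture(0)
    if not cap.isOpened():
        raise SystemExit("No UVC device found at index 0")

    ok, frame = cap.read()
    if ok:
        h, w = frame.shape[:2]
        print(f"Got a {w}x{h} frame -- the OS just sees a webcam")
    cap.release()
    ```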


  • Streamers use a capture device to stream from a second computer with an extra GPU, so that stream encoding doesn’t interfere with their gaming performance. You don’t want the encode hurting your framerate.

    I’ve never heard of anyone using a multi-device setup for internet bandwidth reasons (I’m sure it’s happened, but I’d have to believe it’s generally not why people use multiple devices).


  • bisby@lemmy.world to Linux Gaming@lemmy.world · Streaming on Linux · edited · 23 days ago

    … What exactly do you need the Elgato for? All the Elgato does is capture external HDMI signals.

    If you had 2 PCs, you would use the Elgato to send the gaming PC’s screen to the streaming PC. If you had an Xbox, you would use it to capture the Xbox’s screen on your PC for streaming.

    If you have 1 PC, you don’t need an Elgato; KDE already knows what your PC screen looks like, since KDE is the thing laying it out.

    What you should be doing is just “open OBS, set up your scenes, and start streaming.” The only thing you might want to do is go into the video settings and set it to use NVENC (I think you can do that on Linux) to offload the encoding to your GPU, which has dedicated encoding hardware, instead of your CPU.
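
    One quick way to sanity-check that your driver stack exposes NVENC at all is to ask ffmpeg which encoders it sees (sketch below). OBS does its own detection, so treat this as a hint rather than anything authoritative; it assumes ffmpeg is installed.

    ```python
    # Sketch: list the NVENC encoders ffmpeg can see on this machine.
    # Only a rough hint that the NVIDIA encode stack works; OBS detects
    # NVENC on its own. Assumes ffmpeg is installed and on PATH.
    import subprocess

    out = subprocess.run(
        ["ffmpeg", "-hide_banner", "-encoders"],
        capture_output=True, text=True,
    ).stdout

    nvenc = [line.strip() for line in out.splitlines() if "nvenc" in line]
    print("\n".join(nvenc) if nvenc else "No NVENC encoders found")
    ```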

    Everything else should just work the same as it does on Windows.

    To be clear: the Elgato HD60 X does not do any streaming… it is a video capture device. OBS does all the streaming, and it already has access to everything it needs to capture by nature of being on the PC. You can just capture your desktop in OBS without the Elgato.



  • You’re right. There are multiple definitions of the word “stable”, and “unchanging” is a valid one of them.

    It’s just that everywhere else I’ve seen it in computing, it refers to a build of something being not-crashy enough to actually ship: “can’t be knocked over” stability. And everyone I’ve ever talked to outside of Lemmy has assumed that was what “stable” meant to Debian. But it doesn’t. It just means “versions won’t change, so you won’t have version compatibility issues; you’ll also be left with software that is several months to a year old and wasn’t even up to date when the release shipped, but at least you don’t have to think about compatibility issues!”


  • “Debian aims for rock solid stability”

    To be clear, Debian “stability” refers to “unchanging packages”, not “doesn’t crash”. Debian would rather ship a known bug for a year than update the package, unless it’s explicitly a security bug (and even then, only for certain packages).

    So if you hit a crash in Debian, you will keep hitting that crash until the next version of Debian comes out a year or so from now. That’s not what I’d consider “stable”, but rather “consistent”.





  • IMO it doesn’t matter. People don’t read the news file on updates. Should they? Yes. Do they? No. Should they have to? Also no.

    Linus’s point is to never blame the end user for something the kernel changed. If you want software to have widespread adoption, adding homework to simple updates isn’t how you do it. People don’t want a hobby or something to babysit; they want an operating system. Debian will go out of their way to make in-release updates go as smoothly as possible, but they are willing to throw out entire parts of functioning packages between releases.

    But this isn’t even about breaking things for the end user. This will create an excessive amount of noise on the upstream repo. People will say “Hey! My keepassxc broke!” and report it to keepassxc, not to Debian, and keepassxc just has to constantly reply “no, Debian changed this on you; this is not a bug.” If Debian had to deal with the fallout of their own decisions, I would say “yeah, I’m not sure I agree with the decision, but oh well”… but they are increasing the workload for other teams.

    It is already happening. The Debian dev’s stance is “this will be painful for a year.” But it will be painful for keepassxc, NOT Debian. The keepassxc devs asked them not to do this. Debian’s response might as well be “I’m inflicting this pain on you even though you’ve asked me not to. But on the plus side, it won’t hurt me at all, and it will only last a year for you.” If they really have that much disdain for the project, they should just stop packaging it altogether.

    So yeah, Debian has the legal right to do whatever they want, because keepassxc is open source. But “just because I can, and you can’t legally stop me, and it’s extra work for you, not me” is kind of a jerk move. This is what drives FOSS contributors to burn out and abandon otherwise good projects.


  • It’ll also break all your keepassxc plugins soon, because Debian version-to-version compatibility is not a priority. They also don’t care if their breakage triggers a ton of upstream bug reports, because it will only “be painful for a year”.

    Linus has a strict “don’t break userspace” policy for the kernel; Debian has a “break things whenever you want, and just blame the user for not reading the news file” policy.


  • Oh look. Debian changed the keepassxc package, and now the keepassxc repo is getting all the bug reports for it. Their stance is “it will go away in a year or so”.

    Regardless of whether or not it is a good idea, it’s undeniable that Debian makes a lot of decisions that negatively impact their upstream. And since it’s someone else’s problem, oh well.

    There is a reason upstream repo maintainers wind up angry about problems that someone else caused.