According to Dave Plummer, a retired Windows engineer, there are bugs in some Windows components simply because he intended them as temporary solutions. The CPU and hard drive usage numbers, for example, had to be massaged to stay below 100%. And when Task Manager stops responding, you can press Ctrl+Shift+Esc to queue up a new instance if the old one doesn’t revive itself. That stuff hasn’t changed since 1996.
He also wrote the format dialog, which has a 32GB size limit for the FAT32 format for the same reason: it wasn’t supposed to be permanent, but it hasn’t changed in over 20 years. The thinking at the time was that cluster slack would leave a large drive, like a terabyte, more than 99% wasted space, so 32GB was arbitrarily chosen as the limit.
FYI, Dave was involved in some scareware bullshit as one of the main actors and was sued for it. Fuck this guy.
Damn, seems pretty legit… https://www.atg.wa.gov/news/news-releases/attorney-general-s-office-sues-settles-washington-based-softwareonlinecom
Thanks for looking this up ❤️
Very unfortunate to hear. I wonder how much of his YT channel was lies?
I agree. I really liked his videos until I came across this info (back then, on Reddit).
I went to go disable my NIC.
It needed a reboot to take effect.
The fuck? I only want to turn it off because I’m testing something and need a change of IP to test an application, and I’m feeling lazy, so I turn off the NIC to go to Wi-Fi. Good enough? Nope.
So stand up and unplug the cord.
Cool. Switched over. Test didn’t work as expected. Plug the cord back in.
Next day the computer reboots for updates and I’ve got no internet. Go crazy trying to figure out what it was, then remember it needed a reboot to disable the NIC.
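If anyone wants to skip the GUI for this next time, the command-line route looks roughly like the sketch below. It’s only an illustration: it assumes the wired adapter is literally named "Ethernet" (list yours with `netsh interface show interface`) and that you’re running from an elevated prompt; whether it avoids the reboot requirement on a given machine is a separate question.

```python
# Rough sketch: toggling a network adapter from the command line on Windows.
# Assumptions: the adapter really is named "Ethernet" and this runs elevated.
import subprocess

ADAPTER = "Ethernet"  # assumed name; check `netsh interface show interface`

def set_adapter(enabled: bool) -> None:
    """Enable or disable the named adapter via netsh."""
    state = "enabled" if enabled else "disabled"
    subprocess.run(
        ["netsh", "interface", "set", "interface",
         f"name={ADAPTER}", f"admin={state}"],
        check=True,
    )

set_adapter(False)   # drop the wired NIC, let Wi-Fi take over for the test
# ... run the test with the new IP ...
set_adapter(True)    # bring the wired NIC back afterwards
```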
I was thinking about this recently. So it is a bug, not a feature.
If it has been a bug for 20+ years, we can safely say it’s a feature for backwards compatibility.
I mean, it was intentional in a way, so the definition of “bug” is hazy, but the properly functioning alternative would be the exFAT format.
But the problem isn’t in FAT32 itself, as you can format larger disks in that format just fine.
Yes, the final line of my comment explains that. It’s just that the cluster size in FAT32 has a lower bound, so files smaller than a cluster still take up a whole cluster, and that can lead to cluster slack where the vast majority of the space is wasted.
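To put rough numbers on the cluster slack point, here’s a quick back-of-the-envelope sketch. The 32KB cluster size and the tiny file sizes are made-up illustration values, not measurements of a real volume:

```python
# Back-of-the-envelope cluster slack: a file always occupies whole clusters,
# so anything smaller than a cluster wastes most of its allocation.
CLUSTER = 32 * 1024  # assumed cluster size in bytes for a big FAT32 volume

def slack_fraction(file_size: int, cluster: int = CLUSTER) -> float:
    """Fraction of the allocated space that is wasted for one file."""
    clusters_used = max(1, -(-file_size // cluster))  # ceiling division
    allocated = clusters_used * cluster
    return (allocated - file_size) / allocated

for size in (200, 4 * 1024, 30 * 1024, 100 * 1024):
    print(f"{size:>7} B file -> {slack_fraction(size):.1%} of its allocation wasted")

# A volume full of ~200-byte files ends up with ~99.4% slack, which is the
# "more than 99% wasted space" scenario described above.
```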
Dave’s YouTube channel is great for these stories from back in the day. Link for the lazy: https://www.youtube.com/@DavesGarage
The limit was 4GB, though.
The limit on formatting drives as FAT32 on Windows is 32GB, though; for anything above 32GB you have to go find a third-party tool to format the larger disk as FAT32.
They’re talking about the overall volume size, not the per-file size limit.
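For what it’s worth, the two numbers come from different places: the 4GB figure is FAT32’s per-file limit (a 32-bit size field), while the 32GB figure is just what the built-in format dialog will agree to create. A quick bit of arithmetic to keep them straight:

```python
# FAT32 stores a file's size in a 32-bit field, so the largest
# representable file is 2**32 - 1 bytes (just under 4 GiB).
max_file_bytes = 2**32 - 1
print(f"per-file limit: {max_file_bytes} bytes (~{max_file_bytes / 2**30:.2f} GiB)")

# The 32GB figure is only the cap the built-in format dialog enforces on
# the volume size; the on-disk format can describe far larger volumes,
# which is why third-party tools can format them as FAT32 just fine.
dialog_volume_cap = 32 * 2**30
print(f"format dialog volume cap: {dialog_volume_cap} bytes (32 GiB)")
```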