And we're not even getting into the difference between GB and Gb. There was a Reddit topic just yesterday asking "is the data usage GB the same as file size GB?"
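For anyone who wants the actual numbers behind the GB/Gb/GiB confusion, here's a quick sketch (the 4.5 GB file size is just a made-up example):

```python
# Illustration only: why "gigs" can mean three different numbers
# depending on who is counting.
file_size_bytes = 4_500_000_000  # a "4.5 GB" download

GB = 10**9       # gigabyte, decimal: what ISPs and drive makers use
GiB = 2**30      # gibibyte, binary: what Windows actually shows as "GB"
Gb = 10**9 / 8   # gigabit expressed in bytes: network speeds are in bits

print(f"{file_size_bytes / GB:.2f} GB (decimal)")     # 4.50
print(f"{file_size_bytes / GiB:.2f} GiB (binary)")    # 4.19
print(f"{file_size_bytes / Gb:.2f} Gb on the wire")   # 36.00
```

So the same file is "4.5 GB" on the data cap, "4.19 GB" in Explorer, and takes 36 gigabits of transfer. No wonder people ask.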
> So is using at least 4.5 Gigs of Memory while idle normal for modern Windows?

Yes. Especially depending on what features you haven't disabled.
> So is using at least 4.5 Gigs of Memory while idle normal for modern Windows?

I agree with Pat. It's a rare day I idle at less than 10 gigs lately, though usually my multiple browser tabs and running Idle Champions 24/7 keep me around 20 gigs. I'm glad I upgraded years ago to 16 gigs; it originally came with just 4.
On the subject, I'm probably upgrading to 32 gigs anyway.
> What version of Windows was it where users were freaking out because the OS was consistently reporting that all of the physical RAM was constantly in use, no matter how much RAM a system had or what programs were running, because Windows was counting all the past disk activity it had cached, regardless of whether it was still being actively used? Was that XP? Or did that not start until Vista? Because I remember a lot of confusion over the matter, both over the apparently maxed-out RAM usage and then over whether the caching was still happening or not after Microsoft changed the way that RAM usage was tracked.

I think that was actually Windows 10, around 2020. XP was pretty reliable and idled low. But starting with 7, Windows started "idling" at higher memory usage, and also started obfuscating a lot of system processes behind nebulous application names (svchost, for one) so that you couldn't really tell what your RAM was being used by.
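The whole freakout comes down to whether the disk cache counts as "used" or as "available." A toy illustration with made-up numbers (not real Windows figures):

```python
# Hypothetical 8 GB machine, to show the two accounting conventions.
# Cached pages are "in use" by one definition but instantly
# reclaimable by the other.
total_ram = 8_192   # MB
app_memory = 2_048  # MB actually held by running programs
disk_cache = 5_632  # MB of past disk reads kept around "just in case"
truly_free = total_ram - app_memory - disk_cache  # 512 MB

# Old-style reporting: cache counts as used, so RAM looks maxed out.
used_with_cache = app_memory + disk_cache   # 7680 MB "in use"

# Later reporting: cache counts toward what's available.
available = truly_free + disk_cache         # 6144 MB "available"

print(f"in use (cache counted as used): {used_with_cache} MB of {total_ram}")
print(f"available (cache reclaimable): {available} MB of {total_ram}")
```

Same machine, same moment, and one number says "almost full" while the other says "mostly free." That's the confusion in a nutshell.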
> Everything I've seen about 11 has convinced me I've bought my last Windows PC.

I want to get excited about the next gen of tech. Since Socket 7, I've always been right on top of the next CPU and motherboard chipset releases, but now it's just so... meh. Oh, you support DDR5 now? So what. x64 CPUs have kinda plateaued, and Microsoft is basically being openly hostile to anyone whose hardware is more than 5 years old. Why is that, Microsoft? Why did you decide that would be the cutoff? What was so significant about the 2017-2018 timeframe that caused EVERYBODY in computing (Microsoft, Apple, even most automakers) to significantly switch up whatever they were doing around then?
> Sadly, running XP these days is pretty much impossible on any Internet-connected device. Turns into a swamp of bots and zombieware within hours.

Well yeah, that happens when an OS stops getting security updates. But imagine if, instead of Vista/7, Microsoft had just redone XP's paradigms, drivers, and user experience in 64-bit and supported that, instead of deciding to redecorate, relocate, and obfuscate every useful thing in the Control Panel for no reason other than to give programmers something to do. And don't even get me started on 10/11's bullshit "Settings" app that I have to circumvent every time I actually want to make a change to my hardware configuration.
> Again, the reason XP was killed (not discontinued, IMO killed) was because it did not have built-in protected media pathways, which therefore made it easier to circumvent DRM and rip media using machines running XP. That's it. That's the whole reason.
> --Patrick

And now everyone is hostile to any physical media, and to any retention of the ability to watch anything older than 5 years.
> Again, the reason XP was killed (not discontinued, IMO killed) was because it did not have built-in protected media pathways, which therefore made it easier to circumvent DRM and rip media using machines running XP. That's it. That's the whole reason.
> --Patrick

And as I've always said, Copyright law as it stands, especially the DMCA, is the bane of everything. Half the headaches in my job come from bogus HDCP shit. If companies spent half the effort just making their prices fair and their products easy to purchase and use, piracy would dry up. Steam sales have kept me clean on software piracy for 10+ years now, whereas before 2010 I was probably the biggest, most unrepentant software pirate in existence. Now I'm 100% legit. During the golden age of Netflix, I stopped bothering to pirate shows and movies, too. But corporate greed put paid to all that.
Just so you know, there are a lot of devices available that I'm not allowed to professionally recommend that will call themselves something like "EDID minders" when what they really are is HDCP terminators with HDMI signal passthrough. Sometimes they also call themselves "HDCP Bypass." 1080p ones are generally 15 bucks on Amazon; 4K is a little more expensive. HDCP IS THE DEVIL.
My A/V stack works only because of the obfuscation of running my audio through so many different analog systems that the system can't tell what the fuck is going on beyond the DAC being fed by USB and being "trusted." This has also allowed me to literally feed the signal from my Switch into my computer, so I can leave it docked and play god damn Nintendo games while talking to people on Discord. I AM NOT PROUD.
> ...call themselves something like "EDID minders" when what they really are is HDCP terminators with HDMI signal passthrough. Sometimes they also call themselves "HDCP Bypass."

"Signal regeneration."
> And as I've always said, Copyright law as it stands, especially the DMCA, is the bane of everything.

Copyright law as it's used is the bane of everything. If governments would remember that consumers have a personal-use right to make backups of the media they've bought, and told companies that infringing on that right is illegal, things would be better.
> Just remember that there's nothing illegal about making backup copies of your media. The DMCA only says that it's illegal to break the copy protection on that media to make your backups. Completely different thing. Make all the backups you want. Just don't break the copy protection.

Please tell me that you're posting this knowing full well the absurdity of telling people they're free to do something as long as they don't do one of the essential steps necessary to do the thing.