Tech minor victory thread

GasBandit

Staff member
I have accomplished 5 of the 8 miracles that needed to happen for my Programming exam to be ready to submit. I still have 2 weeks left til my deadline.

For the first time I feel like I might actually not go down in flames on this.
 

GasBandit

Staff member
... It's ready. I don't know if I'll pass, but at least I completed it. Submitting it today, 11 days ahead of my deadline (though today is the last day I'd have to work on it before "real work" takes back over again on Monday).

Re-reading the scope documentation, I see that if they find problems, they'll give me 30 more days to fix them.

I don't know if I'll pass first try (probably not), but man, I had serious doubts I'd even get this far.
 
Hot take: The entire time, it wasn't actually a test of your existing skills, it was crafted to spur you to independently acquire the required skills/training without any investment of effort on their part.

--Patrick
 

GasBandit

Staff member
> Hot take: The entire time, it wasn't actually a test of your existing skills, it was crafted to spur you to independently acquire the required skills/training without any investment of effort on their part.
>
> --Patrick
Oh, even more so: to even be eligible to take this exam, I had to attend three separate, very expensive, week-long training sessions over the last few years.

They are making money off me (well, my company) HAND OVER FIST on this transaction.

You know, in addition to all the Crestron gear we're reselling for them as a dealer.
 

GasBandit

Staff member
Got a response. They accepted my submission, and say I should hear back in 60-90 days.

Nothing like being kept in suspense, amirite?

But yeah, it feels like a long time. To be fair, though, it took me 120 hours over 80 days to write, and it reads like the computer-code equivalent of a mad priest's eldritch screed against God, so it's not unreasonable for them to need a similar amount of time to evaluate it.
 
I cannot see myself changing anytime soon. A 4090 is 2500 Canadian dollars and it still struggles to run new games at 4K 60 fps with all the new raytracing/pathtracing etc all relying on DLSS to get decent performance.

My ancient Ryzen and 2080 are doing just fine in the 1080p department. I even downgraded and sold my 1440p monitor many years ago because I hated the performance hit.
 

GasBandit

Staff member
I'm loving my refurbed 1080ti. There has not been another card since the 10 series that I thought was worth snagging, especially not the way I felt I had to leap on the 1060 when it first came out.
 
> I cannot see myself changing anytime soon. A 4090 is 2500 Canadian dollars and it still struggles to run new games at 4K 60 fps with all the new raytracing/pathtracing etc all relying on DLSS to get decent performance.
>
> My ancient Ryzen and 2080 are doing just fine in the 1080p department. I even downgraded and sold my 1440p monitor many years ago because I hated the performance hit.
Game devs and GPU manufacturers have been focusing on things that most gamers really don't care about. You can see it in console development, where every new game that comes out touts 4K raytraced super-fidelity but runs at a paltry 30fps, or relies on upscaling technologies that make everything look incredibly blurry, when most people who play games would prefer lower resolution and higher framerates.
 

GasBandit

Staff member
> Waiting for it to come out that the GPU mfrs have secretly been paying devs to keep adding more dancing bologna to their games in order to force gamers to keep upgrading.
>
> --Patrick
I am 100% certain this happened with Darktide. It's using the Vermintide 2 engine. It is at no level more graphically complex than Vermintide 2. And yet it needs Windows 10 instead of 7 and way more GPU power.
 
> I am 100% certain this happened with Darktide. It's using the Vermintide 2 engine. It is at no level more graphically complex than Vermintide 2. And yet it needs Windows 10 instead of 7 and way more GPU power.
I can't find it at the moment, but I know I previously posted something like, "Wouldn't it be hilarious if the whole reason for games having increased system requirements wasn't because of the game/engine itself, but because it's required by the newest flavor of DRM that's been woven into it?"

--Patrick
 

GasBandit

Staff member
> I can't find it at the moment, but I know I previously posted something like, "Wouldn't it be hilarious if the whole reason for games having increased system requirements wasn't because of the game/engine itself, but because it's required by the newest flavor of DRM that's been woven into it?"
>
> --Patrick
If a single FLOP from my *GPU* is being used to process DRM, I will see code monkeys flayed alive and hanging by their feet.
 
Wasn't there something about people working to create a completely separate VM that would reside in GPU VRAM and, it was suggested, could do things just like what you describe?

--Patrick
 
Last night I finally switched out my old router for that purpose-built pfSense box I've been talking about.
Today I finally switched from pfSense to OPNsense, after the pfPeople announced in 2021 that they would be hamstringing the free version of their product. Yeah, I know it took almost an extra year to finally do it, but whatever. It's almost the same thing, except for some very slight differences, which meant we were without Internet an extra 90 minutes while I looked up the new places to find the equivalent preferences I needed to change in order to get everything working again, as close to my original intent as I wanted.
Now I just have to restring one wired connection to upgrade it from CAT5 to CAT6 so I can flip that line from 100 to 1000 speed without straining my NIC, but that means having to go crawl around in the basement rafters, so...maybe I'll get it done before another year goes by?
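For anyone doing the same cable swap: a quick way to confirm whether the link actually negotiated gigabit afterward is to check the NIC's reported speed. This is just a sketch assuming a Linux box; `eth0` is a placeholder interface name, and the `ethtool` output below is simulated so the snippet is self-contained.

```shell
# On a real machine you'd query the NIC directly, e.g.:
#   ethtool eth0 | awk '/Speed/ {print $2}'
# Here we parse simulated ethtool output for illustration.
sample_output='Settings for eth0:
	Speed: 1000Mb/s
	Duplex: Full'

# Extract the "Speed:" field. A run that only trained at Fast Ethernet
# would report 100Mb/s here instead of 1000Mb/s.
speed=$(printf '%s\n' "$sample_output" | awk '/Speed/ {print $2}')
echo "Negotiated link speed: $speed"
```

If the link still shows 100Mb/s after the restring, the usual suspects are a bad termination on one pair or an old switch port, since gigabit needs all four pairs.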

--Patrick
 