Looking for suggestions. As you know, my wedding will be livestreamed over YouTube. I'd like to record the stream, sound and all, for posterity. However, everyone I know who's a bit tech savvy will either be in the room or unavailable.
What software would be the absolute easiest to explain to a 60-year-old so they can record the stream in a probably-OK way?
I mean, yes, some sort of FRAPS setup, maybe, but I've never used one before, so I couldn't even explain it. Any good/easy options?

OBS is your best bet. Run it on the computer transmitting the stream and just use it to record. You can then upload the recording separately later.

I sadly don't have any control over the PC broadcasting, as that's done by City Hall.
OBS is still your best bet. Does Fraps even still exist? >.>
Dei is right. It's just a shame you can't record before it's sent... recording what you receive means re-compressing an already-compressed image. It's the modern equivalent of dubbing a dub.
If you can tune into the live feed before it begins, you can set up OBS to record your screen, let it run, and edit the recording later.
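If whoever is helping has a command line handy, another option (not OBS, to be clear, just a sketch, and it assumes yt-dlp and ffmpeg are installed; the URL and filename below are placeholders) is to grab the stream straight from YouTube instead of screen-capturing it:

```python
# Sketch: save a YouTube livestream directly with yt-dlp instead of
# recording the screen. Assumes yt-dlp (and ffmpeg) are installed;
# the URL below is a placeholder for the real stream link.
import subprocess

STREAM_URL = "https://www.youtube.com/watch?v=VIDEO_ID"  # placeholder

subprocess.run([
    "yt-dlp",
    "--live-from-start",             # capture from the start of the broadcast
    "-o", "wedding_stream.%(ext)s",  # output filename template (placeholder)
    STREAM_URL,
], check=True)
```

Since this saves what YouTube is actually serving, it also sidesteps the dubbing-a-dub problem of re-encoding a screen capture.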
I knew they were going to announce something soon, but I thought it was a GPU, not a CPU. Oh well. I still like my Ryzen 9 3900X.

It's not a bad processor, really. Still one of the top ten consumer processors of all time, and even when games move too far ahead of it, the twelve cores mean it'll still be a productivity beast for quite a while after.
I've been an Nvidia user since I got my first gaming PC. I don't know enough about hardware or specs or the nitty-gritty to really explain why I buy their stuff; I've just had good experiences with their hardware, so I prefer to continue with that. But now, with this stunt, I'm going to be looking at all of their future video cards with a fry_squinting.jpg.exe expression. I mean, now I'm going to have to actually learn the differences between AMD and Nvidia cards.

I've run AMD in the past. Often they're just as good for less money. Really, right now, the only reasons to go Nvidia are if you HAVE to have ray tracing, or if you need NVENC for streaming (it can really take a load off your CPU in those cases).
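To make the NVENC point concrete, here's a rough sketch of the difference in ffmpeg terms (assuming an ffmpeg build with NVENC support; input.mp4 is a placeholder file):

```python
# Sketch: CPU vs. GPU video encoding. libx264 runs entirely on the CPU;
# h264_nvenc hands the work to the dedicated encoder block on an Nvidia
# card, leaving the CPU free for the game being streamed.
# Assumes an ffmpeg build with NVENC support; input.mp4 is a placeholder.
import subprocess

# CPU encode: all the work lands on the processor.
subprocess.run(
    ["ffmpeg", "-i", "input.mp4", "-c:v", "libx264", "cpu_out.mp4"],
    check=True,
)

# GPU encode: offloaded to the card's hardware encoder.
subprocess.run(
    ["ffmpeg", "-i", "input.mp4", "-c:v", "h264_nvenc", "gpu_out.mp4"],
    check=True,
)
```

(AMD cards have their own hardware encoder, exposed in ffmpeg as h264_amf, so this mainly favors Nvidia if your streaming setup is already built around NVENC.)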
I constantly hear about driver issues with AMD, which makes me more leery of them.

That was definitely a thing 15 or so years ago. I've gone back and forth between AMD and Nvidia several times, but I didn't have any driver problems with the most recent AMD card I had (which I bought in 2013). I used it until the Nvidia 10 series came out in 2016, and when I upgraded to a GTX 1060, I gave the AMD card to a friend, who used it for a couple more years without problems.
Whatcha wanna bet this was that one PR hack overstepping his bounds?

It WAS one PR hack overstepping his bounds.

Eh, he could still be a scapegoat, and the only bounds that were overstepped were stating Nvidia's company line explicitly to a reviewer.

Until proven otherwise, I'll stick with Hanlon's Razor.
The 11th gen Intel CPUs have no valid reason to exist.

I've seen some of these reviews, and I agree. There are some feature enhancements (PCIe 4.0, ABT, an Xe iGPU, AVX-512 support), and they can overclock to 7 GHz under hyper-exotic cooling, but any improvement in real-world performance is firmly in the why-did-they-bother category: per-core performance is ~13% faster than the 10th-gen model it replaces, but it has two fewer cores (8c vs. 10c).
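Back-of-the-envelope, using just those two figures (and the generous assumption that multithreaded work scales perfectly across cores, which it never quite does):

```python
# Does +13% per-core speed offset dropping from 10 cores to 8?
# Figures come from the post above; perfect core scaling is assumed.
per_core_gain = 1.13  # 11th gen core ~13% faster than 10th gen
old_cores = 10        # 10th gen flagship core count
new_cores = 8         # 11th gen flagship core count

relative = per_core_gain * new_cores / old_cores
print(f"All-core throughput vs. 10th gen: {relative:.2f}x")  # ~0.90x
```

So in fully multithreaded workloads the new part comes out roughly 10% behind the chip it replaces, which is exactly why the per-core gains read as why-did-they-bother.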