Yeah, thanks a lot, Gas. Aaand so much for my earlier attempts not to spoil it for the one person who hasn't played it yet. heh.
--Patrick
> Yeah, thanks a lot, Gas.
> --Patrick
Hey don't look at me, it was the other guy who spoiled it.
They did this last E3, too. Microsoft is notorious for running their event stuff on top-of-the-line PCs, mostly because the One uses a modified Windows 8 OS.
> I commented on this in the E3 thread, but I really like the look of Cuphead. It definitely feels like you're playing in an old 1930's cartoon!
The incredible thing is that it was basically made like one. The animation is hand-drawn and the backgrounds are hand-painted. About the only change from the old-school methods is they colorized the characters in Photoshop.
> The incredible thing is that it was basically made like one. The animation is hand-drawn and the backgrounds are hand-painted. About the only change from the old-school methods is they colorized the characters in Photoshop.
They attempted to hand-color everything too, apparently, but they said it would have taken another few years to get the game done and they didn't want to hold it up even longer.
> Hey don't look at me, it was the other guy who spoiled it.
I was gonna say.
> Also, Sephiroth's Supernova was the final push that made me actually hate the game.
They actually made this way longer in the American release. It used to just be a simple animation. Then again, they also introduced Diamond, Ruby, and Emerald Weapon in the US version... and Diamond drops a weapon that used to be dummied out. So it's not all bad.
> They actually made this way longer in the American release. It used to just be a simple animation. Then again, they also introduced Diamond, Ruby, and Emerald Weapon in the US version... and Diamond drops a weapon that used to be dummied out. So it's not all bad.
To be fair, Diamond still existed in the original since it's plot-centric; you just didn't actually get a chance to fight it.
> A few folks have found a neat reference to Batman: The Animated Series.
Or possibly the pre-New 52 Batgirl comics with Stephanie Brown, where she had a sidekick who wore the Grey Ghost outfit and name for a while.
> That's what my Skyrim says right before it crashes to black.
Maybe you need to delete all your Stones mp3s.
> And now I keep wanting to make a pun about working the bumpers and using the D-pad...
What? No comments about the analog stick?
> What? No comments about the analog stick?
Yeah, ATI PC cards have less trouble if they only have to render at 30fps in 720p, too.
I assume whoever did those is probably partial to NVIDIA (since XB1 and PS4 both also use ATI graphics).
--Patrick
> Yeah, ATI PC cards have less trouble if they only have to render at 30fps in 720p, too.
Yep.
> The WiiU having no hands was a cute touch. She actually looks cute, too, in a Velma Dinkley-like way.
She has 4 hands, just no forearms.
No kidding. I'm on my second RMA. And this is XFX, which is a reasonably reputable brand, too. Supposedly.
Speaking as an ATI user (for the moment), that second one is truth.
> No kidding. I'm on my second RMA. And this is XFX, which is a reasonably reputable brand, too. Supposedly.
XFX has usually been pretty reputable, though they have had a few stinkers.
> No kidding. I'm on my second RMA. And this is XFX, which is a reasonably reputable brand, too. Supposedly.
They make the best silk purses they can out of those sows' ears.
> My AMD cards have been relatively painless, other than janky NVIDIA technologies which barely work on their own cards.
All but one of the ATI cards I've ever owned have been flaky. Even the card I own right now (which is the same model as the one giving Fade so much trouble) routinely experiences driver crashes while doing nothing at all: just a split second of the screen blipping to a blue screen of death and then immediately back to the desktop, as if nothing ever happened, once the driver restarts. It bothered me a lot at first, but I guess I got used to it over time. You can bet your ass the next card I buy is NVIDIA again, though. I only have ATI now because it was a gift.
> It's a bit of a shame, really.
You'd think they ought to be able to outclass Intel on the on-board graphics side of things, at least. But even there, all I hear is that Intel's latest are leaps and bounds better.
AMD's GPU architecture was designed to be very strong at GPGPU computing, but they haven't been able to get their power consumption and heat under control, and so they have to make compromises.
Their biggest trouble is that they're fighting on two fronts: against NVIDIA for GPUs and against Intel for CPUs. They've come up with some pretty impressive stuff, but they end up so rushed trying to keep up on both ends that they're never really able to refine it. And that's why Intel+NVIDIA currently rules the roost.
--Patrick
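(Side note for anyone wondering what "GPGPU computing" actually means: it's using the graphics card for general-purpose number crunching instead of drawing pixels. Below is a minimal, hypothetical CUDA sketch of the kind of massively data-parallel work these architectures are built for; the kernel and names are just illustrative, not from anything posted in this thread.)

// vecadd.cu - add two big arrays, one GPU thread per element.
#include <cstdio>

__global__ void vecAdd(const float* a, const float* b, float* c, int n) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;  // global thread index
    if (i < n) c[i] = a[i] + b[i];                  // each thread handles one element
}

int main() {
    const int n = 1 << 20;                  // about a million elements
    const size_t bytes = n * sizeof(float);
    float *a, *b, *c;
    cudaMallocManaged(&a, bytes);           // unified memory keeps the demo short
    cudaMallocManaged(&b, bytes);
    cudaMallocManaged(&c, bytes);
    for (int i = 0; i < n; ++i) { a[i] = 1.0f; b[i] = 2.0f; }

    const int threads = 256;
    const int blocks = (n + threads - 1) / threads;
    vecAdd<<<blocks, threads>>>(a, b, c, n);  // launch thousands of threads at once
    cudaDeviceSynchronize();                  // wait for the GPU to finish

    printf("c[0] = %.1f\n", c[0]);            // expect 3.0
    cudaFree(a); cudaFree(b); cudaFree(c);
    return 0;
}

(The point: the GPU runs that one-line addition across thousands of threads simultaneously, which is why a compute-heavy design can excel at this kind of work while still losing on power and heat.)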
> You'd think they ought to be able to outclass Intel on the on-board graphics side of things, at least. But even there, all I hear is that Intel's latest are leaps and bounds better.
They had consistently outclassed Intel in the IGP sector, but with the latest 6xxx-series integrated GPUs introduced in June, Intel has finally been able to surpass AMD's integrated graphics for gaming performance, which was one of AMD's last major strongholds.