Need a Temporary Gaming Video Card


Cuyval Dar

You don't buy a laptop graphics card; you buy a new laptop. That 8200M isn't upgradeable anyway.


Strange, the video card hierarchy shows the 9500 GT lower than the 7800 I have right now. :blue:
It's because there are different versions of both. The 7800 GT is better than the 9500 GT with DDR2 RAM, but about the same as the 9500 GT with DDR3.

And the 7800 GTX is just better.

Just something to keep in mind about the hierarchy chart (as with all benchmarks, really): different cards will perform differently on different tests.

I like Tom's because they use both 3DMark, which just pushes the card's capabilities from a math perspective, and current games at varying levels of video settings.

3DMark is a synthetic benchmark POS that has no bearing on real-world performance. The only reason Futuremark is still around as a company is because Intel bribes them.
 
Strange, the video card hierarchy shows the 9500 GT lower than the 7800 I have right now. :blue:
Kinda my point. ;)

Take my advice for what you will; my main computers still get by on an ATI FireGL X3 (X800XT equiv) and a Quadro FX 4000 (6800 Ultra equiv). The wife's machine has a 7800GS AGP installed. Everything else I know (i.e., newer cards), I know from specs and reviews.

--Patrick
 

Necronic

Staff member
Well, previously the Nvidia and ATI cards would be neck and neck, and PhysX pushed Nvidia over the edge for me.
Nvidia has been severely lagging behind since the 4870X2 came out; that beast blew the doors off of anything Nvidia had, and the only way they could come close to matching it at the time was duct-taping two inferior cards together, and even that still ended up well below the 4870X2.

That said, Nvidia manufacturers generally have massively better customer support than ATI manufacturers. EVGA's and BFG's lifetime warranties blow Sapphire's shitty little 2-year warranty away.

PatrThom said:
a Quadro FX 4000 (6800 Ultra equiv)
Damn, that is a slightly expensive card. I always find it funny when I read Newegg reviews on those and people say "Gaming performance was terrible, waste of moneys!!!" Makes me giggle. So you do CAD or something? A friend of mine put one of those into a laptop for some engineering work. Brutally expensive piece of equipment.
 
So I'm getting curious

Two 5870s or One 5970?

From what I've been seeing in benchmarks, the way to go is dual 5870s?
 
Gelato GPU rendering with it, but I just never set aside the time to pursue my 3D modeling dreams.
Two 5870s or One 5970?
I always prefer to recommend multiple single cards unless you really need to have 3 or 4 GPUs in the machine. Single cards are usually clocked faster/overclock better, have more RAM, and are a lot easier to swap out/troubleshoot if one goes bad. Also, for those games where dual GPU doesn't give you any boost (or even gives a penalty!), you can just turn CF off (or remove a card to save power/heat). I've always been a big fan of modularity, and will happily sacrifice 1 or 2fps for the versatility and/or ability to enormously simplify troubleshooting.

--Patrick
 

Cuyval Dar

When Hydra motherboards hit the market, that recommendation may change.
 
It was a tossup between the GTX 275 and the Radeon 4890, but I decided I'd give ATI a shot again before jumping straight to Nvidia.
 
ATI cards are generally better for anything not specifically optimized for GeForce or PhysX. The only reason I use a GeForce is because of City of Heroes.
 
I would be interested in hearing how well it works, Shego. I would expect to get about 2/3 the performance/fps of your GTX295 with it (on average). I know that card has issues (the entire high end of the 4xxx/RV770 series, actually), but they seem to be the sort of issues that don't come up under 'normal usage.' If you DO run into these problems, they can be solved by slightly underclocking your card.

--Patrick
 

Cuyval Dar

Yeah, stress test programs that I've never heard of seem like a really accurate benchmark for stability and reliability. Now, show me issues that crop up in real-world scenarios, and then we're talking.


Oh, and if that wasn't clear enough: I call bullshit.


Originally Posted by Tetedeiench
UPDATE: the 4870 in the test is a 4870 PCS+ from PowerColor, whose VRM is a 4-phase digital design instead of a 3-phase one, and that's why it is not crashing.

Sorry, I thought it was an Asus design. It is not. My mistake.

Eastcoasthandle:
So you basically designed a new test that's built for a 4-phase digital VRM instead of a 3-phase one?
Just what I thought.
 
Honestly, I pumped up the graphics in L4D2 pretty high (not 16x AA or anything, but on par with my GTX 295) and I have to say, it runs smooth as butter. Granted, my load times are a little longer, but that's because this older system has an older motherboard and less RAM.

On the graphics side, I can safely say this will easily tide me over until my main system comes back, and I'm really looking forward to Crossfiring two 5870s in a few months.
 

figmentPez

Staff member
Little of column C, little of column D.
Ah, I see. You sold nude pictures of yourself to some poor sap, and now you're going to kill him and any other man who saw them. Devious. :twisted:


Link really doesn't care for me.
23" LG W2361VG-PF

At least, that's what comes up at the link for me. Its native resolution is 1920x1080, so I'm not sure why Shego would be playing at 1680x1050 on it.
 

Cuyval Dar

The aspect ratio is better at 1920x1080. I say that from experience; I prefer the 16:9 aspect ratio. It just seems better for gaming.
 
Yeah but frame rates are hurt by larger resolutions.
True, but playing the game at a different aspect ratio than the native one for your monitor can affect performance.

In this case, probably not by much, but the fps difference between 1680x1050 and 1920x1080 probably isn't that much either (at least in L4D2).
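
For a rough sense of the numbers being thrown around, here's a quick back-of-the-envelope sketch (plain Python; the resolutions are just the two mentioned above, nothing else is assumed):

# Pixel counts and aspect ratios for the two resolutions under discussion.
for name, (w, h) in {"1680x1050": (1680, 1050), "1920x1080": (1920, 1080)}.items():
    print(f"{name}: {w * h:,} pixels, aspect ratio {w / h:.2f}:1")

# 1680x1050 -> 1,764,000 pixels at 1.60:1 (16:10)
# 1920x1080 -> 2,073,600 pixels at 1.78:1 (16:9)

So the native 1920x1080 pushes roughly 18% more pixels per frame, but it's also the only one of the two that matches the panel's 16:9 shape.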
 
I'll give it a try then, I've just always played all my games at 1680.
Testing on an actual game is really the best way to check. :p

When I installed Dawn of War on my old desktop PC with a spanking-new 9600 GT, it recommended extremely low graphics settings as "optimal". I tried it out, then turned all the settings to max and tried it again. It barely affected performance at all, and it looked much better.
 

figmentPez

Staff member
I'll test when I get my main system back, this one is having a hard enough time as it is.
It could be CPU bound. My 2.5 GHz Core 2 Duo with a Radeon 4830 is CPU bound in Left 4 Dead (I haven't tested L4D2). I get pretty much the same frame rates at 1280x720 as I do at 1920x1080 (both 16:9 resolutions). I made a timedemo and ran it a few times with varying AA levels, and the results were all within a few fps.
 

That sounds right, since they're both 16:9. As best I understand it, as long as the native resolution of your monitor is well within the capabilities of your graphics card (you're not running at 2560x1600 or something like that), performance at different resolutions of the same aspect ratio won't be substantially different, assuming all other settings (AA/AF/textures/lighting/shadows/etc.) remain the same.
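
If you want to eyeball figmentPez's "CPU bound" reading from your own timedemo numbers, here's a minimal sketch (the fps values are made up, and the 5% tolerance is just an arbitrary threshold I picked):

def looks_cpu_bound(fps_low_res, fps_high_res, tolerance=0.05):
    # If the frame rate barely moves when the pixel count jumps,
    # the GPU probably isn't the bottleneck for that scene.
    return abs(fps_low_res - fps_high_res) <= tolerance * fps_low_res

print(looks_cpu_bound(88, 85))  # True  -> likely CPU/engine limited (hypothetical numbers)
print(looks_cpu_bound(88, 55))  # False -> the GPU is doing the limiting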
 

Necronic

Staff member
I swear, you young rapscallions with your highest graphical settings. I've always thought it was a bit odd. Back in the day, playing Quake 1-3, we would build these really powerful systems, then crank the graphics back to the absolute minimum to get the best frame rates possible.

Low frame rates will completely fuck your ability to play FPSs at a competitive level. High screen res can be nice, though, as it often gives you a larger viewable area (especially in the old games with solid status bars). But it still blows my mind that people play these games at anything less than the 75-ish Hz limit of LCD screens/the human eye.
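
For what it's worth, the frame-time arithmetic behind that is simple enough to sketch (the 75 Hz figure is just the refresh limit mentioned above):

refresh_hz = 75
print(f"refresh interval: {1000 / refresh_hz:.1f} ms")  # ~13.3 ms per refresh

for fps in (30, 60, 75, 120):
    print(f"{fps} fps -> {1000 / fps:.1f} ms per frame")

# At 30 fps each frame sits on screen for ~33 ms, well over two refresh
# intervals, which is why low frame rates feel so sluggish in an FPS.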
 