
Ouch, Valve slams Nvidia!


General Zot


whooter, I agree that more research here is needed, but I'm not buying a new Nvidia card until their name is cleared either 8)

 

The part I found interesting was the vast gap between the two product lines; it was literally half the FPS for Nvidia! Even if Gabe Newell wanted to skew the numbers a bit, isn't this too huge a difference to be made up? Especially with the bad press Nvidia has gotten recently for doctoring benchmarks, it just doesn't instill confidence.

 

Don't matter, not buying anything until HL2 ships anyway. Always cheaper tomorrow...


I'll reserve judgement on this until someone who isn't having his pockets lined with ATi money says it's so...

 

Nvidia's own words in their press release reply, which you can read here:

 

Nvidia's response

 

Since we know that obtaining the best pixel shader performance from the GeForce FX GPUs currently requires some specialized work, our developer technology team works very closely with game developers.

 

I see that as being exactly the problem with DX9 applications on Nvidia cards, and the entire point Gabe Newell is trying to make. Smaller dev houses that do not have the money and resources to pour into creating these specialized paths may end up having major issues getting the best possible performance out of Nvidia's FX hardware.

 

Furthermore, several sites have been reporting on John Carmack's comments that the Nvidia FX series does indeed require these specializations:

 

Update 09/05: We emailed id Software guru John Carmack about his experience with NV3x hardware and Pixel Shading performance, and this was his reply:

 

GD: John, we've found that NVIDIA hardware seems to come to a crawl whenever Pixel Shaders are involved, namely PS 2.0...

 

Have you witnessed any of this while testing under the Doom3 environment?

 

"Yes. NV30 class hardware can run the ARB2 path that uses ARB_fragment_program, but it is very slow, which is why I have a separate NV30 back end that uses NV_fragment_program to specify most of the operations as 12 or 16 bit instead of 32 bit."

 

John Carmack

 

Gabe Newell's point touches more on the development issues that other dev houses may run into and less on the "thrashing" of Nvidia cards, and Carmack's comments seem to back these worries up. Competition is a good thing, and Nvidia is just as guilty of lining pockets as anyone else in the industry, but the point of this particular address is still poignant if true. That said, I would like to see actual benchmarks done by some reliable hardware sites as well. We'll have to see what improvements the Detonator 50s bring to the FX hardware.
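
 

Just to illustrate what Carmack is describing, here's a rough sketch in C (my own guess at the general shape of it, not Doom 3's actual code) of how an engine might pick between the generic ARB2 path and an NV30 back end at startup by checking the driver's OpenGL extension string:

#include <stdio.h>
#include <string.h>
#include <GL/gl.h>

typedef enum { PATH_ARB2, PATH_NV30 } fragment_path_t;

/* Crude substring check against the space-separated extension list.
 * (Real code should tokenize the string so an extension name that is
 * a prefix of another doesn't give a false positive.) */
static int has_extension(const char *name)
{
    const char *ext = (const char *)glGetString(GL_EXTENSIONS);
    return ext != NULL && strstr(ext, name) != NULL;
}

static fragment_path_t choose_fragment_path(void)
{
    /* Prefer the NV30 back end: NV_fragment_program lets the shader
     * math run at 12- or 16-bit precision instead of the full 32-bit
     * that the generic ARB_fragment_program path uses. */
    if (has_extension("GL_NV_fragment_program")) {
        printf("Using NV30 back end (NV_fragment_program)\n");
        return PATH_NV30;
    }
    printf("Using generic ARB2 back end (ARB_fragment_program)\n");
    return PATH_ARB2;
}

int main(void)
{
    /* In a real engine a GL context would exist before this call;
     * without one, glGetString returns NULL and we fall back to ARB2. */
    choose_fragment_path();
    return 0;
}

The check itself is trivial; the expensive part is that once you take the NV30 branch, somebody has to write and tune a second, reduced-precision version of every fragment program in the game. That's exactly the extra work the smaller dev houses may not be able to afford.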

 

When is their new chip due?

 

I've heard the release of the NV40, Nvidia's next major redesign, is sometime in the first quarter of next year. Around the same time, ATI will be releasing the R420 chipset, which is their next major hardware redesign as well.

 

ATI also has the 9800 XT due soon, and it seems the rumors of it being packaged along with Half-Life 2 may be true.


Good points, Romier. As an Nvidia fan and ATi skeptic, I find it disappointing to see.

 

I agree, it's disappointing from any perspective really. I want Nvidia and ATI to both succeed. The more competition the better; it keeps both companies on their toes. The Det 50s may bring about some revelations, however, so we'll see how that goes.


I want Nvidia and ATI to both succeed. The more competition the better.

 

Well, the seesaw kind of battle might be good. I thought for sure a couple years ago that ATI was dead in the water, but they really worked hard in the face of being obliterated. I suspect nVidia is working just as hard, and will come back strong...


I agree with Ed, the race is not yet over. If Nvidia needs to kick it up a notch for the next generation, so be it. Reminds me of the Intel/AMD battle of the last few years. We didn't get really fast and really cheap CPUs until AMD started kicking the bejesus out of Intel with a superior price/performance ratio. Now all CPUs are a lot more reasonable, and going Intel or AMD is strictly a personal preference, i.e., they will both play Quake 3 equally well for 95% of the population.

 

I'll probably pick up that Radeon/HL2 bundle when it hits shelves, more because it makes two must-have purchases a little cheaper than buying them separately. I don't think Nvidia HL2 players will be at any real disadvantage at that point (maybe at worst they turn off a little high-end eye candy -- big whoop).

 

It's like that old CIA game: let's have you and him fight, I'll wait here.

