
Nvidia working to fix Tomb Raider issues


Tomb Raider on the PC is one of the few blockbuster titles that features an AMD splash screen instead of the all-too-familiar “Nvidia: The way it’s meant to be played” one – so it’s fair to say that it’s been heavily optimised for AMD cards. In fact, right now, it doesn’t work very well with Nvidia hardware at all – but a fix is coming.

“We are aware of major performance and stability issues with GeForce GPUs running Tomb Raider with maximum settings,” Nvidia’s Andrew Burnes said in a comment on Nvidia’s website (via Joystiq).

“Unfortunately, NVIDIA didn’t receive final code until this past weekend which substantially decreased stability, image quality and performance over a build we were previously provided. We are working closely with Crystal Dynamics to address and resolve all game issues as quickly as possible. In the meantime, we would like to apologize to GeForce users that are not able to have a great experience playing Tomb Raider, as they have come to expect with all of their favorite PC games.”

The game is further optimised to make use of AMD’s system-intensive TressFX to give Lara some impressively lifelike hair – but even that stresses out some rather beefy AMD systems.
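
For a rough sense of why simulated hair is this demanding: TressFX-style systems integrate and constrain every vertex of tens of thousands of individual strands each frame, normally in a GPU compute shader. The sketch below is purely illustrative – the strand counts, constants and helper are assumptions, not Crystal Dynamics’ code – but it shows the Verlet-integration-plus-length-constraints loop at the heart of this kind of simulation.

```python
# Illustrative CPU/NumPy sketch of a TressFX-style strand update.
# Real TressFX does this per-vertex work in a GPU compute shader;
# all counts and constants here are made up for the example.
import numpy as np

NUM_STRANDS = 20_000       # assumed strand count; real assets vary
VERTS_PER_STRAND = 16
DT = 1.0 / 60.0
GRAVITY = np.array([0.0, -9.8, 0.0])
REST_LEN = 0.01            # spacing between adjacent strand vertices
ITERS = 4                  # constraint passes: more = stiffer, slower

# positions: (strand, vertex, xyz); prev holds last frame for Verlet
pos = np.zeros((NUM_STRANDS, VERTS_PER_STRAND, 3))
pos[..., 1] = -np.arange(VERTS_PER_STRAND) * REST_LEN  # hang downwards
prev = pos.copy()

def step(pos, prev):
    # Verlet integration: x' = x + (x - x_prev) + a * dt^2
    new = pos + (pos - prev) + GRAVITY * DT * DT
    new[:, 0] = pos[:, 0]              # roots stay pinned to the scalp
    for _ in range(ITERS):
        # Push adjacent vertices back to their rest spacing so the
        # strand doesn't stretch; this is the per-frame hot loop.
        seg = new[:, 1:] - new[:, :-1]
        length = np.linalg.norm(seg, axis=-1, keepdims=True)
        corr = seg * (1.0 - REST_LEN / np.maximum(length, 1e-9))
        new[:, 1:] -= 0.5 * corr
        new[:, :-1] += 0.5 * corr
        new[:, 0] = pos[:, 0]          # re-pin roots after each pass
    return new, pos

pos, prev = step(pos, prev)
print(pos.shape)  # (20000, 16, 3): 320,000 vertices updated per frame
```

Add per-strand collision against the character model and the shading of all those hair-thin segments, and it’s easy to see why even high-end cards feel it.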

Have any of you using Nvidia-based hardware had any issues running Tomb Raider?

Last Updated: March 7, 2013

69 Comments

  1. Sir Rants-a-Lot Llew

    March 7, 2013 at 07:54

    Well that sucks. See, this is again why I don’t buy day one 🙂 Always something.

    On a side note, about the TressFX that the game uses for hair: can nVidia cards also calculate and display these effects, or is it purely AMD?

    If so, that’s a pretty cool thing to have on AMD.


    • Angus Comrie

      March 7, 2013 at 08:20

      TressFX is available for both Nvidia and AMD cards. The current gen of Nvidia cards (excluding the Titan) has gone backwards in terms of compute power though, so it certainly runs better on AMD. Not necessarily a driver issue there so much as a design issue.


      • Sir Rants-a-Lot Llew

        March 7, 2013 at 08:23

        I’ve been noticing a trend that bothers me (I’m pro nVidia): developers are leaning more towards AMD optimization lately.

        I don’t think it’s due to computational power but rather game dev support channels. nVidia has seriously started lacking in this regard, so it seems game companies are preferring the ease of access that AMD provides.

        It worries me because I may have to swallow my pride at some stage and move over to AMD if nVidia doesn’t shape up and step up to the plate.


        • Admiral Chief Erwin

          March 7, 2013 at 09:00

          I tend to compare AMD to Android and NVidia to Apple.

          I loves me some AMD and Android, and prefer fish paste to Apple and/or NVidia. Although I dislike Apple even more.


        • FoxOneZA

          March 7, 2013 at 09:42

          Yeah, what you say is true. Reminds me of the 2003–2009 period when games had Creative Sound Blaster plastered all over them. On the flip side, I find AMD is lacking in the mid-range market. Yes, their high-end cards are much cheaper than the Nvidia equivalents, but the mid-range needs attention.

          Another factor that works against Nvidia is that AMD GPUs will be in the three major home consoles next gen. That’s a 3:1 ratio, making devs lean towards AMD.


          • Johan du Preez

            March 7, 2013 at 13:54

            Yeh, Creative used to be big news in the 90’s as well; I remember them having the best gfx cards at the time … XFX reminds me a lot of Creative back in the day.

            I notice they have a new range of sound cards out locally; I’ve been itching to try them out.

        • Uberutang

          March 7, 2013 at 10:49

          Arma 3 is pure Intel and Nvidia optimized 😛
          Viva la real sims.


          • Sir Rants-a-Lot Llew

            March 7, 2013 at 10:53

            What worries me is how it’s becoming less and less common.

            Crytek (Crysis 3), Irrational Games (BioShock Infinite), Ubisoft (Far Cry 3), Maxis (SimCity), Electronic Arts, Square Enix… all of these are leaning towards AMD optimization. No longer nVidia 🙁

            Much sadness

          • Uberutang

            March 7, 2013 at 10:55

            All cross-platform, fire-and-forget games. Nothing with meat on the bones 😛

          • Sir Rants-a-Lot Llew

            March 7, 2013 at 10:59

            lol

        • Angus Comrie

          March 11, 2013 at 09:59

          I’m pretty neutral when it comes to these cards, as I use an AMD card for gaming and an nVidia card for research, but it’s pretty clear that nVidia shifted their focus away from things like DirectCompute in the 600 series, while AMD shifted towards it. The thing that would worry me as a gamer on an nVidia system is the fact that all three new consoles will be AMD. I can’t imagine that helping optimization on the nVidia side.


  2. Shodan

    March 7, 2013 at 08:09

    Just did the benchmark test on ultimate settings @ 1080p and got max = 59, min = 28 and avg = 39.2 on my GTX 680, which in game feels really smooth, so nothing to complain about yet…


    • Sir Rants-a-Lot Llew

      March 7, 2013 at 08:45

      Not bad. But so low even on a 680? That’s pretty heavy. My poor 560Ti will probably cry 🙁


      • Tarisma

        March 7, 2013 at 09:58

        Ya my poor 650ti will have a heart attack! Good thing I have always valued gameplay over pretty pictures.


      • Shodan

        March 7, 2013 at 13:44

        Saw a vid last night with a dude running two GTX Titans…. avg 140 fps all max settings… ENVY!!!!


  3. Twakkie

    March 7, 2013 at 08:20

    Don’t let Erwin read this! He will be posting AMD slander all day long!


    • Sir Rants-a-Lot Llew

      March 7, 2013 at 08:24

      The problem is he may have some right to do it. Go read a bit on the net: devs are starting to move away from nVidia (and that is a bad thing, yet completely nVidia’s fault).


      • LordCaptainAwesomeness

        March 7, 2013 at 08:32

        AMD Powwwaaaaaaaaa!!


        • Sir Rants-a-Lot Llew

          March 7, 2013 at 08:41

          Yeah yeah. Here, you forgot your Ritalin this morning 😛 #trolol


          • LordCaptainAwesomeness

            March 7, 2013 at 08:44

            Ritalin? You mean the sugar in my tea?

          • Sir Rants-a-Lot Llew

            March 7, 2013 at 08:44

            Oh dear…..

          • Admiral Chief Erwin

            March 7, 2013 at 08:58

            Not in tea, dude; you must take it up the bum. Works faster, and your fart smells minty fresh.

          • LordCaptainAwesomeness

            March 7, 2013 at 09:09

            That’s what you said the last time, and I woke up 2 days later, in the bush, completely neked, hunting with some lions…. No my dear sir, not this time…

          • Admiral Chief Erwin

            March 7, 2013 at 09:23

            You never said you did not enjoy the experience. The nature, the wild animals, the freedom. What more do you want?

          • LordCaptainAwesomeness

            March 7, 2013 at 09:28

            Touché my good Sir

        • FoxOneZA

          March 7, 2013 at 09:34

          Info overload. Go get a room or a tent you two @@


    • Admiral Chief Erwin

      March 7, 2013 at 08:58

      READ IT AND SUCKIT BITCHES!!!!

      ROOOOOOOOOAAAAAAAAAAAAAR

      RAAAAAAAAAAWWWWWWWWWWWWWR

      Ummm, yeah, and stuff


      • Twakkie

        March 7, 2013 at 09:14

        Jip…. the beast has been unleashed lol.


        • Admiral Chief Erwin

          March 7, 2013 at 10:05

          Beast has been unleashed to feast over there in the east on some yeast at least


          • Alex Hicks

            March 7, 2013 at 12:28

            Thanks for that, Dr Seuss.

            Dammit, I have to get this game. Looks awesome, but my PC will likely struggle …

          • matthurstrsa

            March 8, 2013 at 10:57

            Because you have Nvidia?

  4. Folly

    March 7, 2013 at 08:20

    Played for 2 hours on high with Lara’s hair graphics set to normal and vsync off without any issues. I then changed the graphics to ultra, which enabled TressFX and vsync, and it crashed within about 10 mins.

    Switched it back to high and have played for hours since with no issues.

    So I don’t get to see her wonderful realistic hair blowing in the wind – big deal.


  5. Theo Steenekamp

    March 7, 2013 at 08:22

    Header win! I don’t have any problems with it crashing at high; got everything maxed out with TressFX, just using Nvidia’s adaptive vsync – about 4 hours of playtime without any crash.


  6. OVG

    March 7, 2013 at 08:31

    REALISTIC HAIR!!!!


    • Sir Rants-a-Lot Llew

      March 7, 2013 at 08:44

      Lol.

      What bugs me, though, is that even in the tech videos showcasing the hair it does seem a bit… odd… like ghostly, floating odd.


    • Sir Captain Rincethis

      March 7, 2013 at 08:51

      Brilliant OVG, win!


  7. Sir Captain Rincethis

    March 7, 2013 at 08:50

    Header win, Geoff – has got to be header of the week!


  8. Admiral Chief Erwin

    March 7, 2013 at 09:01

    Never been a big fan of hair; personally I like it shaven 😉


    • Slade Boender

      March 7, 2013 at 09:07

      died **,


    • Sir Rants-a-Lot Llew

      March 7, 2013 at 09:14

      I see what you did there….


  9. Slade Boender

    March 7, 2013 at 09:07

    RACISM!!!!


  10. Fnuik

    March 7, 2013 at 09:12

    And that is why I love consoles. I put in a game and I know it’s gonna work!


  11. Galbedir

    March 7, 2013 at 09:15

    Viva la AMD! 😛


  12. HellFire

    March 7, 2013 at 09:47

    Maybe developers are leaning towards AMD architecture to prepare for next gen?? Sony and Microsoft are both going to use AMD!!


    • Exalted Overlord Geoffrey Tim

      March 7, 2013 at 09:53

      That right there is some interesting, and likely true, conjecture.


    • Tbone187

      March 8, 2013 at 16:38

      Yup, strange how nVidia has dropped the ball like this…


  13. PaasHaas

    March 7, 2013 at 10:33

    Hmmmm

    Game is running great so far for me on a GTX 570.

    Was running at max at first (not with the fancy hair, though), but later I disabled tessellation because the framerate dropped too low in places.

    Haven’t had a crash yet.


  14. Uberutang

    March 7, 2013 at 10:47

    Running it maxed out on an Nvidia GeForce 670, with fancy hair on. No issues. The trick is to turn off tessellation until Nvidia releases an updated driver. Looks pretty damn awesome.


    • LordCaptainAwesomeness

      March 7, 2013 at 10:54

      Nooit Guy. Laaik the Tesselation is laaik what makes it Purdy… Go Go AMD Powa


      • Uberutang

        March 7, 2013 at 10:56

        Cannot play Arma 3 properly without PhysX… I am glued to Nvidia!


        • LordCaptainAwesomeness

          March 7, 2013 at 11:00

          So why not use both? Hybrid PhysX for the win!

          I’ve had the setup like that for the last year, and it was the best choice to date – well, that and planting speakers in Darren’s bedroom, convincing him he’s hearing voices…


          • Johan du Preez

            March 7, 2013 at 12:38

            Yeh, I just got myself a 7970 and am using my 660 Ti as my PhysX card, and it works like a dream.

          • Sir Rants-a-Lot Llew

            March 7, 2013 at 13:10

            I don’t like doing setups like that because it halves your PCIe lanes from 16 to 8. Wish they would go ahead and make mobos capable of running two cards at a full x16 each 🙁

          • Johan du Preez

            March 7, 2013 at 13:40

            If you get a very expensive mobo, both PCIe slots run at x16 🙂 I have a cheap one though, so I keep the PhysX card in the slower PCIe slot. I’m not sure what the bandwidth usage just for PhysX is, but it should be much lower than for the card running full 3D, so keeping that in mind I don’t think it gets affected as badly as a Crossfire/SLI setup would.

          • Sir Rants-a-Lot Llew

            March 7, 2013 at 14:39

            It gets affected the same regardless of what you do. The moment you plug in a second card it splits your channels evenly, whether you are using it in SLI or as two separate cards doing different jobs.

            8 PCIe lanes for the one card and 8 PCIe lanes for the second.

            Which sucks big time.

            I plugged a dedicated PhysX card into my system to see if I could run Arkham City better with a separate card doing the PhysX calculations, and actually got a decrease in performance because suddenly my primary card was having to run on only 8 lanes. (There’s a quick way to check the width your cards actually negotiated – see the sketch after this thread.)

      • Sir Rants-a-Lot Llew

        March 7, 2013 at 11:05

        Tessellation is a DX11 tech not linked to either nVidia or AMD. I think you meant TressFX, as nVidia cards handle tessellation better due to their CUDA core technology?


        • LordCaptainAwesomeness

          March 7, 2013 at 12:55

          No, I am aware that it is part of DX11. The fact that he had to turn tessellation off on the Nvidia is worrying.

          Now had he been running an AMD, then he would not have the issue 😉

          Mooaaaaarr Power


          • Sir Rants-a-Lot Llew

            March 7, 2013 at 13:09

            I reckon what’s happening is that the nVidia drivers are trying to tessellate each individual hair strand created by TressFX, which would destroy almost any card that isn’t optimized for TressFX.

            If nVidia releases an update that simply stops trying to tessellate the hair, then it should run perfectly fine. This is just speculation, though.
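
An aside on the PCIe lane-splitting discussion above: on Linux, the link width each card actually negotiated is exposed through sysfs, so the claim is easy to check on your own board. A minimal sketch, assuming the standard kernel attribute names – which devices turn up, and their addresses, will vary by system (on Windows, GPU-Z reports the same information):

```python
# Check the negotiated PCIe link width of each display adapter, e.g.
# whether adding a second card dropped the primary from x16 to x8.
from pathlib import Path

for dev in sorted(Path("/sys/bus/pci/devices").iterdir()):
    if not (dev / "class").read_text().startswith("0x03"):
        continue  # 0x03xxxx = display controllers (GPUs)
    try:
        width = (dev / "current_link_width").read_text().strip()
        max_w = (dev / "max_link_width").read_text().strip()
    except OSError:
        continue  # attributes missing on some devices
    print(f"{dev.name}: running x{width} of a possible x{max_w}")
```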

  15. Gareth Ludeman

    March 7, 2013 at 11:20

    Played for 5 hours straight last night with maxed out graphics and zero issues. My GTX580 has never let me down 🙂


  16. mily male

    March 7, 2013 at 15:30

    It’s just awesome, I just hope they fix the probs soon enough 😉


  17. Hotshot

    March 7, 2013 at 16:06

    Runs great on my Gainward Phantom GeForce GTX 670 (overclocked, 2048MB GDDR5).
    Max settings.


    • Syph1n

      March 7, 2013 at 22:55

      Mine too.


  18. ???????? ???????

    March 7, 2013 at 18:52

    Hair? TressFX? LOL! The real HairFX is found in Alice: Madness Returns with PhysX. No lag, no other problems, just… realistic hair behavior. Unlike TR, where in the first playable scene you can see how Lara’s super-hair, apart from the ponytail, is noticeably glued to her head.


  19. matthurstrsa

    March 8, 2013 at 10:55

    Does nothing work on launch day anymore?


  20. Luke Miller

    March 9, 2013 at 04:02

    Anyone having a problem with TressFX? Lara’s hair always seems to be flickering; I don’t know if it’s my GPU driver or if I just need a patch. Please respond.

