
GPU GeForce 20 series (RTX) discussion thread (E: 2070 review unembargo!)

QuantumBraced

Master of Cramming
Mar 9, 2017
507
358
Yeah... The insane prices are also due to lack of competition. Luckily, Intel is stepping into that space in 2020; let's hope they release a compelling product. My guess is Nvidia's next GPU will launch around that time, possibly just after Intel's. Likely, that cold war has already begun.

You saw what Intel did for 10 years without competition from AMD: $350 quad-cores for a decade with 5% generational improvements. A Core 2 Quad is still decent for 1080p gaming and general use. Let's hope this doesn't happen to GPUs. AMD made a killing last quarter, so maybe they'll devote more resources to their own GPU division.
 

wovie

Trash Compacter
Aug 18, 2015
48
12
They mentioned that the new reference cooler at full load makes only 20% of the noise of a 1080 Ti reference cooler. Could be very interesting, since I was leaning towards slapping an Accelero with NF-A12s on for an M1 build, but if it's as good as they say, I'd be saving a decent chunk of change.
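"20% of the noise" is vague marketing-speak, though, so here's a rough sketch of how it could translate into decibels under three different readings of the claim (my own interpretations, not anything Nvidia stated):

Code:
import math

# How much quieter is "20% of the noise"? Depends on what "noise" means.
ratio = 0.20

power_db = 10 * math.log10(ratio)     # 20% of the sound power    -> ~ -7 dB
pressure_db = 20 * math.log10(ratio)  # 20% of the sound pressure -> ~ -14 dB
# 20% of perceived loudness, using the rule of thumb that -10 dB ~ half as loud
loudness_db = 10 * math.log2(ratio)   #                           -> ~ -23 dB

print(f"power: {power_db:.1f} dB, pressure: {pressure_db:.1f} dB, "
      f"perceived loudness: {loudness_db:.1f} dB")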
 
Last edited:

Duality92

Airflow Optimizer
Apr 12, 2018
307
330
wovie said:
They mentioned that the new reference cooler at full load makes only 20% of the noise of a 1080 Ti reference cooler. Could be very interesting, since I was leaning towards slapping an Accelero with NF-A12s on for an M1 build, but if it's as good as they say, I'd be saving a decent chunk of change.

XFX called about their new FE cooler, they want their HD 7870 DD heatsinks back.
 

Reldey

Master of Cramming
Feb 14, 2017
387
405
Those are some chunky cards. Most appealing to me from an SFF perspective is the Asus Turbo blower... wonder how loud that thing can get. Waiting on benches... but uh, I don't really need to upgrade at the moment.
 

TheHig

King of Cable Management
Oct 13, 2016
951
1,171
So even the local Microcenter has a pickup date of 9/20 for store pre-orders.

Wonder when some reviewers will have them? More waiting it seems.
 

SegaCD

What's an ITX?
Aug 20, 2018
1
4
I don't know why everyone is crying doom & gloom here; I think people are over-analyzing the presentation and overlooking a few key points that weren't emphasized when they should have been.

The presentation held the ray-tracing hardware up as an analog to the programmable shaders of the 8x00 GTX series, when it really should have been compared to the hardware T&L unit of the GeForce 256. The 8x00 GTX had an immediate impact on the games of its day without needing much software support (although DX10 certainly improved performance further), but hardware T&L required engine support and industry adoption. Once adopted, T&L completely changed the way lighting was handled on GPUs and offloaded the CPU for many other tasks, accelerating 3D quality through the 2001-2005 period. Other GPU vendors resisted it or took competing approaches (see the Voodoo 5500), and most fell flat.
One of the few things movie production companies do that video games can't handle in real time is high-density ray tracing. Ray tracing is coming one way or another, whether it starts today with the RTX announcement or later. Nvidia is trying to grab hold of this next logical step that the market will eventually take by creating a ray-tracing standard before AMD/others can, just as they did in the early hardware T&L days. Hopefully, people who buy RTX GPUs now won't end up with useless RT units if the industry goes with another solution.

The increased die space costs more. It's not as easy as duplicating sections of the silicon, either (as with Threadripper). A price increase is logical, especially if they're maintaining the performance of the previous generation and then some. We also have low GDDR6 yields and higher import taxes almost across the board... and we still have competition from miners (even though it's subdued somewhat). You can tell Nvidia tried to de-emphasize the price with their tricky tactics. (We all saw through that slide with the picture of the 2080 Ti that said "Starting at $499", Nvidia...)

I'm assuming Nvidia Volta is a similar architecture to Turing. Let's look at the Titan V, which has a similar number of CUDA cores to the RTX 2080 Ti, just without the RT cores and with a few fewer optimizations here and there (e.g. hardware support for foveated rendering). There was at least a 30% increase in gaming performance over the Titan Xp across the board. Let's expect the performance of this card to match Volta at the very least.

The increased TDP is most likely from the much-overlooked 35 W that can be supplied through the USB-C/VirtualLink port. The TDP is likely calculated including this load, which is why it's exactly 35 W above the standard 250 W of most cards today. It is unlikely this will add much to the heat produced by the card even in use. This card will likely run much cooler than the current generation just from process improvements.

If they put more RAM into the RTX 2080 Ti, it would encroach on the market for their Quadros (which have fewer CUDA cores to begin with, but more RAM). You want space to do lots of data crunching? Get a Quadro. Want a lot of textures for gaming? Your 11 GB RTX 2080 Ti will have more than enough memory for a while.

(I was going to make an account here eventually since I'm building an SFF PC, but I figured this was a good time to start. ;) )
 
Last edited:

Biowarejak

Maker of Awesome | User 1615
Platinum Supporter
Mar 6, 2017
1,744
2,262
Yeah, I'm also very much looking forward to the 2050 Ti. Hopefully it'll have a proper half-height, ITX-length variant.
 

tinyitx

Shrink Ray Wielder
Jan 25, 2018
2,279
2,338
Prices listed on the website are higher than announced during the stream: $599 for the 2070, $799 for the 2080, and $1,199 for the 2080 Ti. The announced prices were $499 for the 2070, $699 for the 2080, and $999 for the 2080 Ti. He made a point that these were starting at under $500.
It seems there are two FE listings for the 2070. The standard one is $499 with the boost clock at the stock 1620 MHz, and then there is an overclocked (1710 MHz) one at $599. It is a little confusing, but this is how I read it. Check the full spec below.
https://www.nvidia.com/en-us/geforce/graphics-cards/rtx-2070/

Edit:
On further thought, I now reckon there is only one FE, which is overclocked to a 1710 MHz boost clock. The 1620 MHz figure is the standard spec that AIBs can use to make a 'standard' version. So I suppose an AIB could take this standard spec, use a traditional blower cooler, and sell at $499. But this is not going to happen much, as AIBs are likely to put forward overclocked versions with better cooling to sell at higher prices.
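For reference, the gap between the announced "starting at" prices and the FE prices actually listed works out like this (USD, using only the figures quoted above):

Code:
# FE premium over the announced "starting at" prices (USD, figures as quoted above)
announced = {"RTX 2070": 499, "RTX 2080": 699, "RTX 2080 Ti": 999}
listed_fe = {"RTX 2070": 599, "RTX 2080": 799, "RTX 2080 Ti": 1199}

for card, base in announced.items():
    premium = listed_fe[card] - base
    print(f"{card}: ${base} announced, ${listed_fe[card]} FE (+${premium})")
# RTX 2070: +$100, RTX 2080: +$100, RTX 2080 Ti: +$200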
 
Last edited:

EdZ

Virtual Realist
May 11, 2015
1,578
2,107
With how much the AIB cards are being sold for in practice (in the UK, all are at or above the FE price), I'm happy I went for the FE. The cooler looks nicer (though I'm sad to see the demise of the only decent rear-exhaust design in production), it's likely to perform effectively identically to all the others as with previous generations, and there's no PCB silliness to make watercooling awkward.
 

gunpalcyril

Airflow Optimizer
Aug 7, 2016
294
319
I took a close look at both the 2070 and the 2080. It seems the 2070 is about an inch shorter and has the power plugs at the end, like some Quadro and WX cards. The 2080 is physically an inch longer and has the pins in the traditional layout. Quite a peculiar design choice for the 2070.
 

tinyitx

Shrink Ray Wielder
Jan 25, 2018
2,279
2,338
EdZ said:
With how much the AIB cards are being sold for in practice (in the UK, all are at or above the FE price), I'm happy I went for the FE. The cooler looks nicer (though I'm sad to see the demise of the only decent rear-exhaust design in production), it's likely to perform effectively identically to all the others as with previous generations, and there's no PCB silliness to make watercooling awkward.

I see Asus and EVGA (at least) have traditional blower designs in their line-ups, including the 2080 Ti. I would be interested to see the noise and temperature numbers for this design, and to see whether Nvidia abandoned it for marketing reasons or not.
 

rfarmer

Spatial Philosopher
Jul 7, 2017
2,669
2,793
 

TheHig

King of Cable Management
Oct 13, 2016
951
1,171
Welcome to the party @SegaCD !

I get that ray tracing may be a watershed moment in rendering, the likes of which hasn't been seen in quite some time. However, I very much believe that with ray tracing disabled, the RTX GPUs will be an incremental improvement in today's titles. This is some sweet tech, and as an enthusiast I'm certainly interested, but a bit skeptical as well concerning this first wave. I'm guessing that if you aren't in an RT title, it's just a bit better than Pascal and uses a bit more power to get there.

This is a case of build the hardware and the software will come. With Nvidia's influence I very much believe it will happen, but maybe not until 2020 before it's really getting traction. Pre-orders are a month out from delivery, and we probably won't see reviews until then. How many people would drop $1,200 on the 2080 Ti if it's only 20% faster than the $699 1080 Ti in the DX12/11 stuff we have now? Very few.
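To put a rough number on that last question (treating the 20% uplift as an assumption, not a benchmark):

Code:
# Back-of-the-envelope perf per dollar, assuming the 2080 Ti is only ~20%
# faster than a 1080 Ti in current non-RT titles (an assumption, not a benchmark)
perf_1080ti, price_1080ti = 1.00, 699
perf_2080ti, price_2080ti = 1.20, 1200

relative_value = (perf_2080ti / price_2080ti) / (perf_1080ti / price_1080ti)
print(f"2080 Ti perf/$ relative to 1080 Ti: {relative_value:.2f}")
# ~0.70 -> roughly 30% less performance per dollar in today's games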
 

lhl

SFF Lingo Aficionado
Nov 16, 2015
121
143
SegaCD said:
...and we still have competition from miners (even though it's subdued somewhat). You can tell Nvidia tried to de-emphasize the price with their tricky tactics. (We all saw through that slide with the picture of the 2080 Ti that said "Starting at $499", Nvidia...)

Straight from Nvidia earnings - they were expecting $100M in crypto-mining chip sales last quarter and ended up selling $18M (and expect $0 cryptomining revenue moving forward) - you can bet they’re sitting on a metric ton of unsold 10-series inventory.

It’s all fine for Nvidia as w/o competition from AMD they’re able to price the high end how they want (selling out at almost double the launch price of the last gen isn’t too shabby - all w/o showing any non-RT benchmarks) to max out profits/spool out inventory.

Turing is super compelling for the DC (the SIGGRAPH keynote was great, really exciting) and the tech is awesome - longer term it will be as big as the move to shaders - but no one has to be excited about the fact that $/core and $/perf look like they've possibly even gone up vs Pascal (sure, bigger dies, blah blah, but remember this is on a super mature node and these are Quadro bins). Personally I'll wait for benchmarks, price drops, and honestly, likely 7nm and the return of competition (2019-2020) before upgrading. Lots of other exciting tech to spend money on, and my 1080 Ti is looking like a pretty good buy in hindsight.
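To illustrate the $/core point, a quick sketch using launch MSRPs and published CUDA core counts (rough figures only, and of course Turing and Pascal cores aren't directly comparable):

Code:
# Launch MSRP per CUDA core (FE pricing was higher still; rough figures only)
cards = {
    "GTX 1080 Ti": (699, 3584),
    "RTX 2080":    (699, 2944),
    "RTX 2080 Ti": (999, 4352),
}
for name, (usd, cores) in cards.items():
    print(f"{name}: ${usd / cores:.3f} per CUDA core")
# ~$0.195 for the 1080 Ti vs ~$0.237 / ~$0.230 for the 2080 / 2080 Ti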
 

QuantumBraced

Master of Cramming
Mar 9, 2017
507
358
SegaCD said:
I don't know why everyone is crying doom & gloom here; I think people are over-analyzing the presentation and overlooking a few key points that weren't emphasized when they should have been. ...


Great post, welcome. A few thoughts. I agree that RT is a game-changer. In games that avail of it, I think these cards will outperform Pascal considerably, because all lighting will be offloaded to the RT/AI cores and the shading cores will be freed up to do more shading, plus you have 500 extra CUDA cores, so performance will be markedly better. Not to mention, Pascal cards won't even support RT, so they'll look worse while also having significantly lower FPS.
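Rough intuition for the "freed-up shaders" argument; the offloaded fraction here is purely my assumption for illustration:

Code:
# Amdahl-style sketch: if some fraction of frame time is lighting work that the
# RT/Tensor cores take over entirely, the shader cores see this best-case speedup.
def speedup_if_offloaded(offloaded_fraction: float) -> float:
    return 1.0 / (1.0 - offloaded_fraction)

for frac in (0.1, 0.2, 0.3):
    print(f"{frac:.0%} of the frame offloaded -> {speedup_if_offloaded(frac):.2f}x")
# 10% -> 1.11x, 20% -> 1.25x, 30% -> 1.43x (assumes perfect offload, zero overhead)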

That being said, for games that don't support RT or don't use much of it, the difference won't be huge, most likely around a 15-20% improvement over the 1080 Ti. Again, I think the reason they launched the 2080 Ti alongside the 2080/2070 is that the 2080 won't outperform the 1080 Ti in current games and in future non-RT or low-RT games. In fact, the GV100/Titan V has almost 1,000 more CUDA cores, so it will handily outperform the 2080 Ti in those cases. Yes, that is a $3,000 GPU, but it's also 9 months old now.

So the frustration here is that after 27 months, which I believe is the longest Nvidia has ever waited to update their product line, we're getting a top-tier product that costs almost double what the 1080 cost at launch in May 2016, has a considerably higher TDP, and doesn't offer a solid improvement in non-RT workloads. They didn't even bump the memory capacity; I was hoping for 16 GB of GDDR6 on the top-tier card. I mean, it's $1,200. The one thing it's amazing for is compute, so at least give it adequate memory for those workloads. Which include mining, let's be real. Mining *is* essentially dead -- check NiceHash profitability: at $0.15/kWh you're making about $6/month with a 1080 Ti. This card may have literally 10x the compute power, so it may be worth it to defray some of the cost, but clearly not worth it to actually make money.
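For anyone who wants to run the break-even math themselves, a little sketch (the daily revenue figure is a placeholder chosen to line up with that ~$6/month number, not a real NiceHash quote):

Code:
# Mining break-even sketch. revenue_per_day is whatever NiceHash quotes for the
# card at the time; the value below is a placeholder, not a real quote.
def monthly_profit(revenue_per_day: float, power_watts: float,
                   usd_per_kwh: float = 0.15) -> float:
    electricity_per_day = (power_watts / 1000) * 24 * usd_per_kwh
    return (revenue_per_day - electricity_per_day) * 30

# Hypothetical 1080 Ti: $1.10/day of revenue at 250 W wall power
print(f"1080 Ti: ${monthly_profit(1.10, 250):.2f}/month")   # ~$6.00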

That being said, the RT and AI cores are revolutionary technology, and it may be that all games soon start using them and games without them look so much worse that pre-RTX cards become quickly obsolete, like pre-8000-series cards did. And I get that the lack of a direct performance increase is because Turing is only going from 16nm to 12nm, and the tariffs factor into the increased cost, but still... over two years, roughly 20% improvement in non-RT workloads, double the price tag... not amazing.
 

Thehack

Spatial Philosopher
Creator
Mar 6, 2016
2,813
3,670
J-hackcompany.com
SegaCD said:
I don't know why everyone is crying doom & gloom here; I think people are over-analyzing the presentation and overlooking a few key points that weren't emphasized when they should have been. ...

1. I'm on the side of caution regarding games implementing RT well. DX12/Vulkan have explicit multi-GPU and async compute, yet games hardly take advantage of these features despite them providing very strong benefits. I have a feeling RT will be like HairWorks and PhysX, which did little to improve game quality across the board. It'll take a few years before games implement it *well*. The issue is that 7nm is coming next year, so for a smart consumer this is definitely an early-adopter tax. 7nm is going to be a massive jump, so there's no point in buying this year's GPU for the same raster performance.

2. The Titan V also had about 30% more CUDA cores than the Titan Xp (rough numbers in the sketch below the list). I'd like to see some sourcing showing that performance per CUDA core increased in the Volta architecture; I don't recall seeing any uplift there. Yes, they're charging us more for the bigger chip due to the additional RT cores, but I don't think the first year of games will have good RT implementations. People turn off stuff like HairWorks because it tanks your FPS so hard compared to other eye-candy features.

3. I don't agree about the USB-C power counting toward the TDP. TDP = thermal design power, and it's generally a criterion for designing the heat-dissipation solution. The 35 W is "consumed" at the load device (the VR headset), so there is no reason to add it to the TDP of the card. The TDP increase matches up with the increased CUDA core count per card (see the sketch below). There is no new process node, so maybe a slight increase in performance per watt, but overall it's likely very similar to Pascal.

TL;DR: HOLD, 7nm next year.
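For points 2 and 3, here are the published core counts and the commonly listed board-power figures (treat both as approximate):

Code:
# Published CUDA core counts and commonly listed TDPs (approximate)
cards = {
    #              (cores, TDP W)
    "Titan Xp":    (3840, 250),
    "Titan V":     (5120, 250),
    "GTX 1080 Ti": (3584, 250),
    "RTX 2080 Ti": (4352, 260),  # Founders Edition figure
}

xp_cores, v_cores = cards["Titan Xp"][0], cards["Titan V"][0]
print(f"Titan V vs Titan Xp: {v_cores / xp_cores - 1:.0%} more CUDA cores")  # ~33%

old_cores, old_tdp = cards["GTX 1080 Ti"]
new_cores, new_tdp = cards["RTX 2080 Ti"]
print(f"2080 Ti vs 1080 Ti: {new_cores / old_cores - 1:.0%} more cores, "
      f"+{new_tdp - old_tdp} W TDP")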
 
Last edited:

tinyitx

Shrink Ray Wielder
Jan 25, 2018
2,279
2,338
The framerate improvements and technology innovations like RT don't concern me as much as the price increase. I think Nvidia is basing the price tag on a level elevated by the mining craze; that is the primary factor. Secondarily, the lack of serious competition from AMD is really disappointing.

Frankly, RT, just like PhysX, belongs in the eye-candy category. Having great visual effects is nice, but the story and how the plot is implemented in a game are much, much more important to me. Anyway, I think Nvidia stressed RT so much at the launch event because this is one area where AMD is completely lacking.
 

QuantumBraced

Master of Cramming
Mar 9, 2017
507
358
Hm, I wonder if they'll be able to leverage the RT and Tensor cores for shading tasks in games that don't support RT, use little RT, or don't support AI-assisted AA. It won't be efficient, but if you can get another 10% or so improvement by emulating CUDA tasks on those cores, it may be worth it. I agree, 7nm + RT maturity + competition from Intel will likely make the next generation amazing. But who has time to wait? Life is short. :p