Production FormD T1 Classic (V1.0 and V1.1) READ FIRST POST

More GPU Slot Options for T1?


  • Total voters
    267

thelaughingman

King of Cable Management
Jul 14, 2018
838
888
Is this radiator actually 26mm thick? https://www.highflow.nl/watercooling/radiatoren/2-x-120mm-240/ek-coolstream-se-240-slim-dual.html
Searching by EAN leads to the EKWB website, which says it's 28mm thick :confused:
It is 28mm - you'll need the SE Classic https://www.ekwb.com/shop/ek-coolstream-classic-se-240

Yeah, that's why I was curious. I got the 240-D RGB about 7 months ago in anticipation of eventually getting the T1, so fitting a full 25mm alongside the two slim fans is something I hadn't considered possible, given how thick the radiator is.
There's not a lot of room for tubing and it will take some force and some trial & error, but it's definitely doable if you go for it.
 
  • Like
Reactions: zacobin

nater.2003

Cable Smoosher
Sep 18, 2020
12
10
That's a bit too rich for my blood, but if it comes out before I can get my card, maybe it will adjust the pricing of the 3080!

I'm just concerned about the 10GB on the 3080... Is that actually problematic? It's the only thing really preventing me from being super excited about being back on the 3080 train.

The amount of video memory used for gaming really hasn't increased that quickly. You should be fine for at least two upgrade cycles, I'd say, if not longer.
 

anothernoob

Cable Smoosher
Oct 12, 2020
11
17
30mm thick rad + 15mm thick fans - see the bulge yet?
Which 30mm radiator are you using?
 

biopunk

SFF Lingo Aficionado
Bronze Supporter
Sep 24, 2020
136
185
17mm custom radiator
A high-FPI 17mm rad would be nice, but you'd still need 2x 240mm rads to cool high-end components, which is a real PITA right now, and component choices are extremely limited. I'd love it if the chassis were a few mm wider while staying under 10L, so that a 3080/3090 FE waterblock with a TX240 and 15mm fans could fit.
Not that I don't want to fit 25mm fans at the top as well.

As for the accessories, my wish list is 1) Freeflow with a fill-port, 2) front panel with USB-C only, 3) top/bottom panel in black to convert from dual-tone.
 

Nhilros

Chassis Packer
Jan 4, 2020
15
9
Seems ray tracing is still not that widespread in games, despite having been available for more than 2 years:

Overall I think the 6800 XT is better for high-refresh-rate gaming at 1440p due to Infinity Cache and SAM, but the 3080 is better for 4K@60Hz with RT.

This is also interesting: https://videocardz.com/newz/amd-radeon-rx-6800xt-breaks-hwbots-3dmark-fire-strike-world-record

Ah yeah... tech youtubers (I can barely tolerate more than 2~3 of them).
He looks only at games released in 2020, not counting previous years. And with consoles now sporting RT, it'll inevitably promote a lot more support for ray tracing on PC in upcoming games. I doubt we'll see many games going full path tracing à la Minecraft unless Nvidia partners with a dev, but it'll still be better than before. Also, DLSS is not needed for everything; I mean, you can only care about rasterization performance up to a point. Until you need a 360Hz monitor to run Doom Eternal? DLSS is needed (and very much so) in RT-intensive games, because it makes the difference between unplayable and playable.

So the narrative that keeps coming up about being ready for the new generation of console games is:
"10GB won't cut it, 16GB is safe", but also
"Look at previous years of RT support in games, it ain't worth it"?

Which... is kind of dumb. To begin with, consoles are moving away from brute-forcing VRAM usage like previous generations, thanks to hardware decompression/DirectStorage and SSDs. Rather than keeping something like 30 seconds' worth of idle data sitting around waiting to be used in the level, you can keep only the 1 or 2 seconds of assets required to stream what players actually see. As Mark Cerny showed in the PS5 slides, the SSD becomes a memory extension while VRAM acts almost like a buffer, holding only immediate assets and keeping idle data to a minimum.
The only way 10GB is a problem is if devs put a stupid amount of assets in memory idling without any IO logic, and that'll tank console performance anyway, so yeah...
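To put rough numbers on that 30-seconds-vs-2-seconds point, here's a quick back-of-envelope sketch. The 2 GB/s asset consumption rate is a made-up illustrative number, not a measured figure from any game, so treat the outputs as order-of-magnitude only:

```python
# Back-of-envelope: resident VRAM needed depends on how many seconds of
# upcoming assets you keep buffered. All rates here are assumptions
# chosen purely to illustrate the streaming argument.

def working_set_gb(consume_rate_gbps: float, seconds_buffered: float) -> float:
    """VRAM (in GB) needed to hold `seconds_buffered` worth of assets
    consumed at `consume_rate_gbps` gigabytes per second."""
    return consume_rate_gbps * seconds_buffered

# Old style: keep ~30 s of the level resident "just in case".
resident = working_set_gb(2.0, 30.0)
# Streaming style: keep ~2 s resident and refill continuously from the SSD.
streamed = working_set_gb(2.0, 2.0)

print(f"30s resident: {resident:.0f} GB vs 2s streamed: {streamed:.0f} GB")
# -> 30s resident: 60 GB vs 2s streamed: 4 GB
```

Obviously no game actually keeps 60 GB resident; the point is just that shrinking the buffered window by an order of magnitude shrinks the VRAM working set by the same factor, which is why fast IO changes what "enough VRAM" means.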

Also, TechPowerUp found that even at 1080p & 1440p, on average, the 3080 is still ahead.
So I'm here, scratching my head about all these claims that the 3080 is only for 4K...

There's also a big paradigm shift in consoles, probably even bigger than ray tracing, and that's machine learning. The thing is, as of now there's no way to truly benchmark these features until we start seeing DirectML features. Here's a good article on the subject.

"If I saw a competing card released tomorrow which heavily outperformed the GeForce 3080 in current-gen games, it would actually set my alarm bells ringing, because it would mean that the competing GPU’s silicon has been over-allocated to serving yesterday’s needs."

Nvidia went all-in on tensor cores for Ampere. They have a lower number of RT cores than Turing while tripling the Tensor OPs. They did not sacrifice that amount of silicon area just for DLSS 2.0. The question now is, what is Nvidia planning for tensor cores next year? AI texture upscaling? AI texture compression? AI physics? Or a brand new upscaler?
I mean, who knows. But all these youtubers are not participating in this discussion of what upcoming console games will require with RT and machine learning support, just VRAM and old-game rasterization performance...
 

biopunk

SFF Lingo Aficionado
Bronze Supporter
Sep 24, 2020
136
185
Nhilros said:
(quoted post above)
Steve from GN came to pretty much the same conclusion as me:

AMD has an advantage in general at 1080p and 1440p. So if you're really focused on 1080p high-refresh gaming and you can get a high-performance CPU to go with it, with Ryzen 5000 right now being the top gaming performer, then the 6800 XT might make the most sense for that type of scenario, whereas if you want ray-tracing performance or 4K performance specifically, you should probably be looking towards NVIDIA.
Watch here:


Btw, 1080p is still the most mainstream resolution according to Steam: 1080p or lower currently holds an 86% market share, while 4K and higher holds less than 5%:
 

threestripevida

Airflow Optimizer
Gold Supporter
Mar 28, 2017
252
434
My DC-LT 2600 pump died on me last night. Not sure what caused it, but I was planning on tearing down my loop anyway. If anyone knows a good place to get replacement thermal pads, please let me know. I'll need to replace some of the stock ones from when I took apart my 2080 Ti.
 

Nhilros

Chassis Packer
Jan 4, 2020
15
9
biopunk said:
(quoted post above)

Yeah, I see some sites come to different conclusions. It would be interesting to see why TechPowerUp landed on a different one; as far as I know, RT games were not included in their averages.

Those 1080p Steam users are also mostly on low-end cards or laptops; going by CPU market share, it's inundated with laptops.
When the most popular card is a 1060... I'd like to think that customers looking at spending ~$700 on a GPU are not going to keep their 1080p monitor, unless they're into e-sports at 240/360Hz. But OK, for sure there will be 1080p users (god, why). But then, any discussion about VRAM concerns (even for the 3070) at 1080p shouldn't even come up.
While we're debating high-end cards that hold maybe a 2~3% share of Steam hardware (sometimes even <1% for Turing/5700 XT), the market will gobble up the 3600/6700-class cards like always.

But hey, it's a GOOD thing we have an alternative to Ampere depending on user needs. Don't get me wrong.
But to simply put the RT & ML subjects in a drawer as if they don't exist, like Hardware Unboxed is doing... I expect better from tech reviewers. I'm practically building an entire new PC just for Cyberpunk 2077, as I guess many of us are. RT & DLSS are absolutely important to me. Reviewers are there to paint a portrait of all possible use cases, not to decide for us what they think shouldn't be important.

And in my case, so far there are no 2-slot 6800 XT cards. I intend to use my FormD T1 with the NH-L12S, in 2-slot mode. That also limits my choices on the Ampere side (basically only the FE, haha), but at least it doesn't fuck up the idea I have for the build. I know like 90% of you are on water cooling, so it's not the same problem for me.

Anyway, sorry for going so far off-topic; I know debate outside of case talk isn't appreciated much here, so I'll stop on the subject.

/still waiting on e-white T1
 
  • Like
Reactions: Madhawk1995

gregbiv

Caliper Novice
Apr 17, 2020
30
11
Hello there, slowly getting around to watercooling my case. Can someone tell me, is the M19 to 1/4 adapter that comes with the case safe to use with watercooling? What is it made of? Hope it's not aluminum.
 

threestripevida

Airflow Optimizer
Gold Supporter
Mar 28, 2017
252
434
I mentioned the other day that my pump had died and I would need to swap parts out in my build. I was going to use the LT Solo pump/block to watercool only my CPU and go back to aircooling my GPU. Unfortunately, the LT Solo was not going to fit on my board with the fittings I was using, so now I need to decide which AIO I want and whether I want to go with 120 or 240. I'm currently using the Noctua NH-L9i.

Tearing down the system was annoying, and I did a crappy job of cable management the first time. This time I was a lot more careful about where I routed cables and tried to do it all as I was building the system piece by piece. You might notice in my build pictures that the SSD is kind of slanted; that's because I wasn't going to be able to plug in the SATA power cable if I used both screws to attach it to the SSD bracket. It works fine, so I'm gonna leave it how it is.

Putting the stock cooler back onto my GPU was even more nerve-wracking for me than taking the card apart and putting the waterblock on. The good thing was that I took pictures as I took the card apart, so I would know how to put it back together. The bad thing was that I forgot to take one important picture, so I had to search the web for a teardown and pictures. The card is working fine though, and I didn't break anything.

Here are the pictures, though; if you want to see more, let me know.

T1 Aircooled build