
GPU Now this could be big: R9 Nano!

Phuncz

Lord of the Boards
SFFn Staff
May 9, 2015
5,948
4,953
Too bad they chose not to have the PCIe power connector on the side like they did with the other card. The traces you can see in the upper right corner indicate it's on top like most GPUs:

[Image: render of the R9 Nano, with the power connector traces visible in the upper-right corner]

Also worthy of note: Small Form Factor is used, not the "ITX" name. I like more focus on SFF and less on "ITX".
 

theGryphon

Airflow Optimizer
Original poster
Jun 15, 2015
299
237
Also worthy of note: Small Form Factor is used, not the "ITX" name. I like more focus on SFF and less on "ITX".

Agreed.

About the power connectors, the other card is tall, and I suspect that's why they put the connectors facing the front.
This one is of normal height, so no need to do that from a reference design point of view.
 

rawr

SFF Lingo Aficionado
Mar 1, 2015
137
10
Can't wait to see some benchmarks for this thing. The article says that it's going to be faster than the 290X; I wonder if it's faster than the GTX 970 too - if it is, then it will become my ITX card of choice.
 

MJVR1

SFF Lingo Aficionado
Jun 10, 2015
92
55
It has to be half the TDP of the 290X. That cooler is way too small to cool a Hawaii chip unless you want a Delta fan running at 100 percent. Either way, this card is looking very promising. Dolphin Emulator is gonna love these cards.
 

PlayfulPhoenix

Founder of SFF.N
SFFLAB
Chimera Industries
Gold Supporter
Feb 22, 2015
1,052
1,990
That would also mean, at 2x perf-per-watt, we should expect a ~27% performance boost alongside that 100W drop in power consumption.

That, with the size shrink, is a pretty darn good update all told. I'll be curious as to what noise and thermals look like.
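
For anyone wondering where the ~27% comes from, here's the back-of-the-envelope arithmetic. The ~275 W and ~175 W board power figures are my own assumptions, not official specs:

```python
# Back-of-the-envelope check of the "~27% faster" figure.
# Assumed numbers (not official specs): R9 290X ~275 W board power,
# R9 Nano ~175 W, and AMD's claimed 2x performance-per-watt for Fiji.

power_290x = 275.0   # W, assumed
power_nano = 175.0   # W, assumed
ppw_ratio  = 2.0     # Nano perf/W relative to the 290X, per AMD's claim

# perf = (perf/W) * W, so the relative performance works out to:
perf_ratio = ppw_ratio * (power_nano / power_290x)

print(f"Power saved: {power_290x - power_nano:.0f} W")                         # 100 W
print(f"Relative performance: {perf_ratio:.2f}x (~{(perf_ratio - 1) * 100:.0f}% faster)")  # ~1.27x
```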
 

Phuncz

Lord of the Boards
SFFn Staff
May 9, 2015
5,948
4,953
Yes indeed, impressive enough to go against the GTX 980 in such a small package.

During the presentation I believe I heard that the cooler was made of "premium" materials, though I can't find it mentioned anywhere except in relation to Project Quantum.
 

Vittra

Airflow Optimizer
May 11, 2015
359
90
Starting to get a bit concerned this card is getting hyped beyond reasonable expectations...
 

Phuncz

Lord of the Boards
SFFn Staff
May 9, 2015
5,948
4,953
We'll know in a week, I guess; most would be glad with a $650 card that beats the 980 Ti and Titan X. This Nano, which seems to sit between the 390X and the Fury, would be awesome just due to sheer size/performance.
 

PlayfulPhoenix

Founder of SFF.N
SFFLAB
Chimera Industries
Gold Supporter
Feb 22, 2015
1,052
1,990
We'll know in a week, I guess; most would be glad with a $650 card that beats the 980 Ti and Titan X...

And that isn't impractical, I might add. You can get the R9 295x2 for less nowadays, but at 500W, with water cooling as a requirement, and with CrossFire, it's not the simplest solution to live with :p
 

Phuncz

Lord of the Boards
SFFn Staff
May 9, 2015
5,948
4,953
Yes indeed. Although for the foreseeable future, Crossfire and SLI will become more important with VR headsets. While it won't be a problem to run 2000x1000 resolutions at a steady 75-90fps, I foresee the jump to 4000x2000 soon after VR hits. While that is still at least a year away, I'm not going to be sad to get another 290X for my mATX Kimera Habanero build, knowing it will at least run games @ 60fps on 4K displays reasonably well. Meaning full-bore with non-maxed-out details, or medium bore with stuff like Touchy-Feely Grass ©.
 

Vittra

Airflow Optimizer
May 11, 2015
359
90
Have you used Crossfire before? AMD gets slagged a lot, but the Crossfire discontent is actually one of the few valid criticisms I agree with. Nvidia's SLI is far from perfect, but their commitment to getting it working in a majority of games (and in a timely manner) is indisputably better than Crossfire.

Back around Witcher 2's release, I was primarily using AMD cards, albeit in a single-card config. I actually kept a download folder of about 3-4 drivers to switch between, based on the game I was playing. Witcher 2 had some nasty flickering / graphical glitches which were fixed in one driver, but later drivers reintroduced them. While single-card drivers are exceptionally good now for AMD, Crossfire still succumbs to this kind of stuff quite a bit.

I worry about Crossfire setups for VR, regardless of whether we see a switch to SFR modes in DX12.
 

Phuncz

Lord of the Boards
SFFn Staff
May 9, 2015
5,948
4,953
I've been aware of the issues with Crossfire, although I have never experienced them myself because I've run single GPU all my life (except for 3DFX Voodoo).
But in 2015 a lot of attention has gone to the Crossfire part of the driver, and I've read less and less of the crap it usually suffered from. I don't know how far it has come to this day, but considering AMD's LiquidVR seems to be all about immersion (and maintaining it), I'm hoping this will become a good solution:

Introducing AMD LiquidVR™

LiquidVR™ is an AMD initiative dedicated to making VR as comfortable and realistic as possible by creating and maintaining what’s known as “presence” — a state of immersive awareness where situations, objects, or characters within the virtual world seem “real.” Guided by close collaboration with key technology partners in the ecosystem, LiquidVR™ uses AMD’s GPU software and hardware sub-systems to tackle the common issues and pitfalls of achieving presence, such as reducing motion-to-photon latency to less than 10 milliseconds. This is a crucial step in addressing the common discomforts, such as motion sickness, that may occur when you turn your head in a virtual world and it takes even a few milliseconds too long for a new perspective to be shown.

Enable scalable rendering for more realistic experiences

Rendering near-photorealistic imagery in real-time at high resolutions in stereo at high refresh rates over 100 Hz is a challenge to even the most powerful GPUs and CPUs that are available today. With LiquidVR, users of AMD technology can build multi-GPU and multi-CPU systems with solutions available in the market today. AMD is providing powerful interfaces for developers to take productive advantage of all the CPUs and GPUs in the system for the best possible VR experience.
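
To put that sub-10 ms motion-to-photon target in perspective: at 90 Hz a single refresh already takes about 11 ms, so the whole sensor-to-screen pipeline has to finish in less than one refresh interval. A rough sketch, with per-stage timings that are purely my own guesses rather than AMD figures:

```python
# Rough motion-to-photon budget for a 90 Hz VR headset.
# All per-stage timings below are illustrative guesses, not AMD figures.

refresh_hz = 90.0
frame_time_ms = 1000.0 / refresh_hz      # ~11.1 ms per refresh
target_ms = 10.0                         # LiquidVR's stated goal

stages_ms = {
    "sensor read + pose prediction": 1.0,   # assumed
    "CPU submit":                    1.5,   # assumed
    "GPU render (both eyes)":        5.5,   # assumed
    "timewarp + scanout":            1.5,   # assumed
}

total_ms = sum(stages_ms.values())
print(f"Refresh interval at {refresh_hz:.0f} Hz: {frame_time_ms:.1f} ms")
print(f"Assumed pipeline total: {total_ms:.1f} ms (target: {target_ms:.0f} ms)")
print("Within budget" if total_ms <= target_ms else "Over budget")
```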

But if AMD fails, I might just as well go with Nvidia if it offers a better solution. I'm not brand-dependent, narrow-minded or insecure enough not to evaluate all the options and focus on the goal. But I have stepped away from Nvidia a few times in the past because of texture flickering and the issues you described, albeit with single-GPU Nvidia cards.
 

Vittra

Airflow Optimizer
May 11, 2015
359
90
Dual GPU can be a frustrating experience at times. My 690 treated me quite well throughout its life, I must admit, but any game where it didn't work hamstrung me, as I was essentially down to a single 680. I eventually let it go for a 780 Ti.

AMD partnering with Oculus is a good sign for LiquidVR. Adoption rates of their previous technologies were low because, in the past, they didn't actively seek out strong partnerships or collaborations to push that tech onto the market, or their different departments weren't synced up to release things in tandem (think Mantle vs. Bulldozer and how those releasing in parallel could have changed things).

Their LiquidVR push, I think, signifies a shift in focus, and a very positive one.

I actually liked how my 290X (I briefly had one, and a 290 as well) handled extended mode between my monitor and receiver/TV better than Nvidia did in Win 8.1. It also woke the monitor (Crossover 27Q LED-P) from sleep instantly, whereas the Nvidia cards took some time. I've mostly used Nvidia as of late due to the game-centric nature of my machine - I really like Nvidia Inspector's flexibility and control compared to RadeonPro (which is defunct anyway) - but I do feel a bit forced into G-Sync and GameWorks, which I don't particularly like. Power consumption differences weren't really a concern until more recently, as hydro rates continue to climb where I live.

I really don't mind switching to the R9 Nano if the performance is there, but if it isn't I'll wait, because Arctic Islands and Pascal should definitely provide what I'm looking for. Ever since the V1 iteration of the M1 I've been wanting a small, powerful GPU for the case. Initially that was due to wanting to get a WC res inside the case, but overall it just opens up a lot more options in an already incredibly versatile case.
 

EdZ

Virtual Realist
May 11, 2015
1,578
2,107
In terms of GPU-vendor-branded VR extensions (LiquidVR and GameWorks VR), both are effectively the exact same featureset, with that featureset being whatever Oculus (and Valve, in the case of late-latching) had already hacked in with situation-specific shims and wanted the GPU vendors to make easier to implement and more stable. E.g. 'VR Direct' is just Oculus' 'Direct Mode' without needing the extra stub driver. The same goes for the idea of using individual GPUs to render the image for each eye, which is common to both vendors and originated with a concept from Michael Abrash (I can't recall whether this was when he was with Valve or Oculus). I think one of Crytek's developers proposed using a pair of GPUs to render the frames and hand them off to an integrated GPU for compositing and warp (allowing the stages to be fully decoupled), but I can't find the blog post for that, and I don't think anyone has actually implemented it.

The one exception I have seen so far is Nvidia's multi-resolution technique using Maxwell's optimisation for multiple viewports to reduce render load for the periphery of the image by rendering a 'squished' view natively. That seems to have been an entirely independent development, though it's a technique that will support proper foveated rendering if anyone can produce an easily integrated eye-tracker with a sufficiently low latency to reliably beat saccade motion.
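
To make the pixel-count saving concrete, here's a toy sketch of that multi-viewport idea. The 3x3 split, the scale factors and the per-eye resolution are my own illustrative numbers, not Nvidia's actual parameters:

```python
# Toy model of multi-resolution rendering: split the eye buffer into a
# 3x3 grid of viewports, keep the centre at full resolution and render
# the outer columns/rows at a reduced scale. The band fractions, scale
# factors and per-eye resolution are illustrative assumptions only.

eye_w, eye_h = 1512, 1680            # assumed per-eye render target

bands = (0.25, 0.50, 0.25)           # width/height fraction of each column/row
col_scale = (0.5, 1.0, 0.5)          # horizontal render scale per column
row_scale = (0.5, 1.0, 0.5)          # vertical render scale per row

full_pixels = eye_w * eye_h
shaded_pixels = 0.0
for i, fx in enumerate(bands):               # columns
    for j, fy in enumerate(bands):           # rows
        shaded_pixels += (eye_w * fx * col_scale[i]) * (eye_h * fy * row_scale[j])

saving = 1.0 - shaded_pixels / full_pixels
print(f"Full-resolution pixels per eye: {full_pixels:,}")
print(f"Pixels shaded with multi-res:   {shaded_pixels:,.0f} (~{saving:.0%} fewer)")
```

With those assumed numbers the periphery only costs a quarter of its full-resolution pixel count, so the whole eye buffer shades roughly 44% fewer pixels.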
 
  • Like
Reactions: Phuncz

PlayfulPhoenix

Founder of SFF.N
SFFLAB
Chimera Industries
Gold Supporter
Feb 22, 2015
1,052
1,990
The one exception I have seen so far is Nvidia's multi-resolution technique using Maxwell's optimization for multiple viewports to reduce render load for the periphery of the image by rendering a 'squished' view natively. That seems to have been an entirely independent development, though it's a technique that will support proper foveated rendering if anyone can produce an easily integrated eye-tracker with a sufficiently low latency to reliably beat saccade motion.

And this, friends, is why I didn't go into graphics development. Yikes.

(I learned at least two words from that last sentence...)
 

Necere

Shrink Ray Wielder
NCASE
Feb 22, 2015
1,720
3,284
Too bad they chose not to have the PCIe power connector on the side like they did with the other card.
I prefer it this way, actually. As a case designer, the more things that are more-or-less standardized, the better. The vast majority of cards have the power connectors on the side, so I'm going to be designing for that anyway, and a single card on the market with connectors at the end doesn't do me any good. On the other hand, short, high-end cards like this are still rare and allow for some very compact, powerful builds, and it would be a shame to have to make a case bigger than it needs to be just for the sake of a single card with odd power connectors.
 
  • Like
Reactions: Phuncz

Phuncz

Lord of the Boards
SFFn Staff
May 9, 2015
5,948
4,953
And this, friends, is why I didn't go into graphics development. Yikes.

(I learned at least two words from that last sentence...)
You are not alone on this one; this thread is teaching me more about VR than anything or anyone ever did!

I prefer it this way, actually. As a case designer, the more things that are more-or-less standardized, the better. The vast majority of cards have the power connectors on the side, so I'm going to be designing for that anyway, and a single card on the market with connectors at the end doesn't do me any good. On the other hand, short, high-end cards like this are still rare and allow for some very compact, powerful builds, and it would be a shame to have to make a case bigger than it needs to be just for the sake of a single card with odd power connectors.
I can certainly agree with that point of view, although most SFF cases seem to have less restriction in depth than in width. But for it to be of general benefit, the market would indeed need many more cards that work this way. It will depend on case design, but I would guess that with SFF it's difficult to maintain compatibility without taking extra "buffer" space into account for the various types of cards, like the cards with taller PCBs.

I used to have a Lian-Li PC-V352, which was designed exactly according to PCIe full-height card spec. This meant there was not a single millimeter of space between the top of the card and the top of the segment the card would reside in:

[Images: the PC-V352's GPU compartment, showing zero clearance above a full-height card]

While this was an obvious design flaw, considering there were almost no cards in existence with the PCIe power connector on the side, it was marketed as being able to handle GPUs up to 280mm or 11in. The only card I could find at the time was the HIS Radeon HD 7770; otherwise I would have had to go with a card without PCIe power connectors at all. It was frustrating that I could fit a huge card like an HD 7970 and just not be able to connect it.
 

iFreilicht

FlexATX Authority
Feb 28, 2015
3,243
2,361
freilite.com
I prefer it this way, actually. As a case designer, the more things that are more-or-less standardized, the better. The vast majority of cards have the power connectors on the side, so I'm going to be designing for that anyway, and a single card on the market with connectors at the end doesn't do me any good. On the other hand, short, high-end cards like this are still rare and allow for some very compact, powerful builds, and it would be a shame to have to make a case bigger than it needs to be just for the sake of a single card with odd power connectors.

The thing is, practically no case will have to be made bigger for this card to fit, because it is even shorter than regular "ITX-sized" cards. If a case supports 175mm of GPU, then the R9 Nano will fit: at 6 inches it is about 23mm shorter, so the power connector and cables would still fit. It'd be a tight squeeze, but it would fit.
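
A quick sanity check of that, with the 20 mm connector allowance being my own rough guess:

```python
# Quick fit check: does a ~6 in card leave room in a 175 mm GPU bay?
# The 175 mm figure is the assumed case clearance from above; the
# connector allowance is a rough guess of my own.

card_len_mm  = 6 * 25.4       # ~152.4 mm for the R9 Nano
bay_len_mm   = 175.0          # assumed case GPU clearance
connector_mm = 20.0           # rough allowance for an end-mounted plug + cable bend

margin_mm = bay_len_mm - card_len_mm
print(f"Card length: {card_len_mm:.1f} mm, spare room: {margin_mm:.1f} mm")
print("Fits, even with an end connector" if margin_mm >= connector_mm else "Too tight")
```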
And there are certain cases that weren't designed with GPUs in mind and can't take the PEG connector on the side, but would fit one quite nicely with a small amount of modding. That's another use case where the Nano could prove to be very useful.