SFF.Network CES 2019: AMD

annasoh323

Master of Cramming
Apr 4, 2018
424
314
I was able to catch some of the live stream. You guys had awesome seats!

After all of the hullabaloo about the leaks and how specific they were, the lack of specifics during the keynote felt... underwhelming. However, now that I've had some time to digest it, I realize I mustn't let the disappointment of what we didn't see overshadow what we did see. PCIe 4.0! Ryzen matching/beating a 9900K core for core! Crazy, crazy compute power at the consumer level from Rad VII!

May the speculation games continue. AnandTech seems to think there's room for another chiplet on the Ryzen 3000 package. Rad VII is a single card; lots of room for a full lineup. Better strap down my wallet - this could be a rough year for it.
 
  • Like
Reactions: Biowarejak

Biowarejak

Maker of Awesome | User 1615
Silver Supporter
Mar 6, 2017
1,731
2,219
Very excited about it being single slot IO :) might never get my hands on it but man it's exciting
 

MJVR1

SFF Lingo Aficionado
Jun 10, 2015
92
55
Anyone measure the Radeon VII? Can’t seem to find any info on that card. Nice coverage of CES, keep it coming!
 

Therandomness

Cable-Tie Ninja
Nov 9, 2016
226
268
MJVR1 said: Anyone measure the Radeon VII? Can’t seem to find any info on that card.
Watch it be almost exactly the same PCB as Vega 56/64, with the e-peen extender on the end of it.
Yes, second chiplet for 16 core / 32 thread CPU...!

Or, Navi chiplet for Zen 2 APU...!
 

Boil

SFF Guru
Nov 11, 2015
1,245
1,090
No engineers were available to answer my inquiry about it fitting in a Nano. The card is about Vega 64 length; my calipers weren't long enough for it, so it's automatically disqualified, haha

It looks like the PCB is the full size of the housing, so long & tall...
 
  • Like
Reactions: Biowarejak

ChinStrap

Cable-Tie Ninja
Sep 13, 2017
169
146
Radeon VII – on the surface, why? Only 25% more performance than Vega 64, for 43%* more money?

Does it overclock like a beast and they are making you pay for it w/o saying it?

*V64 can be had for $400 on Newegg right now.
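For what it's worth, here's the value math from the numbers in this thread (a rough sketch: $700 is the Radeon VII launch price, $400 the Newegg street price for Vega 64 cited above, and ~25% the claimed uplift; against the street price the premium actually works out closer to 75%):

```python
# Back-of-the-envelope value comparison from the figures in this thread.
# Prices and the ~25% uplift are the numbers quoted above, not benchmarks.
vega64_price = 400   # Newegg street price cited above
radeon7_price = 700  # Radeon VII launch price
perf_uplift = 1.25   # Radeon VII relative to Vega 64

premium = radeon7_price / vega64_price - 1
perf_per_dollar = (perf_uplift / radeon7_price) / (1.0 / vega64_price)

print(f"Price premium over street-price Vega 64: {premium:.0%}")   # 75%
print(f"Relative performance per dollar: {perf_per_dollar:.2f}x")  # 0.71x
```

So on street pricing it delivers less performance per dollar than Vega 64, which makes the overclocking question above the interesting one.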
 

teodoro

SFF Lingo Aficionado
Oct 8, 2018
108
76
The 2080ti is 30-70% more expensive than the 2080 for 30% more performance. The highest tier cards will always be suboptimal value. If Nvidia didn’t just reveal software freesync support, the VII would have stolen some 2080 sales and it may well be superior in certain workloads (professional or otherwise). Lots of people, largely outside of sff, don’t care that much about power draw/heat/noise. It’s not the high-end parity we would prefer, but at least the card is relevant and will help to keep nvidia card pricing in check.
 

huddy

Caliper Novice
Aug 22, 2018
30
29
I was hoping for an earlier release of Ryzen 2, but will patiently wait to finish my build until the 8C/16T CPU's arrive!
 

annasoh323

Master of Cramming
Apr 4, 2018
424
314

I wonder if there's any downside to having that many cores/threads on a mainstream platform (other than a bit of cannibalization of the low-end TR parts). RAM? PCIe lanes? I need to do more research and brainstorming. 16 cores and 32 threads is 4x and 8x more cores and threads respectively than my i5-7600K. Mind-boggling. Even just this 8 core 16 thread part would leave the i5 in a cloud of unrendered dust and leave me crying in a pool of non-ray traced tears.

teodoro said: The 2080ti is 30-70% more expensive than the 2080 for 30% more performance. The highest tier cards will always be suboptimal value. If Nvidia didn’t just reveal software freesync support, the VII would have stolen some 2080 sales and it may well be superior in certain workloads (professional or otherwise). Lots of people, largely outside of sff, don’t care that much about power draw/heat/noise. It’s not the high-end parity we would prefer, but at least the card is relevant and will help to keep nvidia card pricing in check.

Now, if we could only go the opposite direction, where an AMD card could activate adaptive sync on a G-Sync monitor... (has anyone done this?) I'll just have to see what the future holds, I guess. Here's hoping that '19 is even more of a shakeup year than '17 was with the launch of the original Ryzen (that was the CPU side; here's hoping for some actual shakeup on the GPU side too).
 

BirdofPrey

Standards Guru
Sep 3, 2015
797
493
I can imagine that many cores starting to strain the memory subsystem in certain workloads.
As for a 16 core Ryzen vs. Threadripper, there's certainly room. They already differentiate the high core count Threadrippers from EPYC by the IO. If you need a bunch of IO and memory channels, and/or lower memory latency, you need an EPYC, but if you just want more cores, Threadripper is the way to go. I can see them saying the same thing for Ryzen vs. Threadripper: if you need a bunch of cores but are fine with just 2 memory channels and 20 PCIe lanes, then Ryzen is good enough, but if you plan on loading up on GPUs or gobs of memory as well, then Threadripper is the choice.
 

confusis

John Morrison. Founder and Head writer SFF.N
Original poster
SFF Workshop
Editorial Staff
Moderator
Jun 19, 2015
3,580
6,171
sff.network
I'd posit that we still are limited by software developers not multithreading tasks that should be, as well as Windows' sub-standard thread scheduler. Alas, we still live in an ecosystem where more threads isn't the full answer.
 
  • Like
Reactions: Soul_Est

BirdofPrey

Standards Guru
Sep 3, 2015
797
493
Fair enough, though designing software to use more threads is non-trivial, and most tasks suffer diminishing returns from increased parallelism.
At the very least, with Intel now trying to answer AMD and something of a core count war brewing, there's impetus for more software to be written with more threads in mind.

More cores does mean less sharing of resources between separate programs, though, which is why, even if increased core count hasn't done anything for gamers, it's been of benefit to streamers.
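Those diminishing returns are captured by Amdahl's law: if only a fraction p of a task can run in parallel, the speedup on n cores is capped at 1/((1-p) + p/n). A quick sketch (the 90% figure is just an illustrative assumption):

```python
def amdahl_speedup(parallel_fraction: float, cores: int) -> float:
    """Theoretical speedup on `cores` cores when only
    `parallel_fraction` of the work can run in parallel."""
    return 1.0 / ((1.0 - parallel_fraction) + parallel_fraction / cores)

# Even a 90%-parallel workload tops out well below the core count:
for n in (4, 8, 16, 32):
    print(f"{n:2d} cores -> {amdahl_speedup(0.9, n):.2f}x")
```

At 90% parallel, 32 cores manage under 8x, and the limit is 10x no matter how many cores you add, which is why the extra cores help streamers (separate programs) more than a single game.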
 
  • Like
Reactions: Biowarejak

Boil

SFF Guru
Nov 11, 2015
1,245
1,090
BirdofPrey said: More cores does mean less sharing of resources between separate programs, though, which is why, even if increased core count hasn't done anything for gamers, it's been of benefit to streamers.

Gaming & Streaming (at the same time, on the same machine) is a definite target for high core count CPUs...

But it is also a great thing for content creators with multiple programs open at the same time...

From a SFF slant on things, the only thing that needs to happen now is support for dual-channel 32GB DIMMs, for a total of 64GB of RAM from two slots...

ITX 4 Lyfe...?!?
 
