Where do U think it goes from here

LooseNeutral

Trash Compacter
Original poster
LOSIAS
Jul 3, 2015
46
23
FLASH! Intel is done with tick-tock. Maybe we are seeing some war between AMD and Nvidia over who's got the top of this, pardon me, shit. Sorry. But honestly, whatever any of them are doing is just squeaks away from each other; we had our best fun around Sandy Bridge time. Since then the slow derp trains roll on, tossing clowns out the door at every dumbass show, and (sorry) those shows don't mean shit anymore. I like to call them Lie Fests. Or BS fish farms, needing anyone out there to write anything. YES, they leak.

Which brings me to this. Intel/AMD/Apple/MS... it's all about a show most of the world doesn't give a shit about.

They want shiny new toys.

Us? We want more on the back end. We built these companies. Don't be mad they're leaving us now. I'll ask you this: where will they go if/when their dumbass ideas fail? Back to us.
 

jØrd

S̳C̳S̳I̳ ̳f̳o̳r̳ ̳l̳i̳f̳e̳
sudocide.dev
SFFn Staff
Gold Supporter
LOSIAS
Jul 19, 2015
818
1,359
Apple proved the comeback could be done. For the longest time they weren't the juggernaut they are now. They were borderline bankrupt, kept afloat because lots of design firms were Apple loyalists way back when. Then Jobs returned.

Nokia proved the comeback couldn't be done, even when they had MeeGo/Maemo/Moblin, the N900 and a lot of hype. Then they went Windows Phone, then they stopped selling many phones, then Microsoft gutted what was left.

Where it goes from here, who knows. One thing is for sure: nothing stays the same for long.
 

Phuncz

Lord of the Boards
SFFn Staff
May 9, 2015
5,938
4,951
There is much we in general don't know about what these companies can and can't handle. I'm sure many won't believe AMD is still able to survive, and others believe Intel can never be taken down.

But companies like Nvidia that only produce GPUs (albeit GPGPU is also a thing) are desperately looking for new markets (the auto industry, mobile devices) that they can grab on to, because if hypothetically one day GPUs aren't needed anymore (compare sound cards now and 20 years ago), they will go out of business pretty fast. From what I've come to understand, their GPU division (high volume) makes the GPGPU division (low volume) possible financially.

Intel is also battling to get a grip on the mobile market, because the decline in PCs sold each quarter hits them hard, too. The reason Intel hasn't been giving us serious performance improvements each CPU generation is not because they can't or won't, but because this way they can keep margins high.

At the moment, the only companies that seem to have nowhere to go but up are the likes of Apple and Samsung. With so many markets they both serve, and the talent and momentum to be a market leader in just a few years, they seem unstoppable. It's no coincidence that Toshiba, Sony, HP, Dell, Samsung and many others have chosen very different directions, major cutbacks or even complete fire sales for their PC segments.

But I don't fear that the gaming or enthusiast PC is in danger; PC gaming is better than ever and has been growing a lot in the past few years.
 
Last edited:

LooseNeutral

Trash Compacter
Original poster
LOSIAS
Jul 3, 2015
46
23
There is much we in general don't know about what these companies can and can't handle. I'm sure many won't believe AMD is still able to survive, and others believe Intel can never be taken down.

But companies like Nvidia that only produce GPUs (albeit GPGPU is also a thing) are desperately looking for new markets (the auto industry, mobile devices) that they can grab on to, because if hypothetically one day GPUs aren't needed anymore (compare sound cards now and 20 years ago), they will go out of business pretty fast. From what I've come to understand, their GPU division (high volume) makes the GPGPU division (low volume) possible financially.

Intel is also battling to get a grip on the mobile market, because the decline in PCs sold each quarter hits them hard, too. The reason Intel hasn't been giving us serious performance improvements each CPU generation is not because they can't or won't, but because this way they can keep margins high.

At the moment, the only companies that seem to have nowhere to go but up are the likes of Apple and Samsung. With so many markets they both serve, and the talent and momentum to be a market leader in just a few years, they seem unstoppable. It's no coincidence that Toshiba, Sony, HP, Dell, Samsung and many others have chosen very different directions, major cutbacks or even complete fire sales for their PC segments.

But I don't fear that the gaming or enthusiast PC is in danger; PC gaming is better than ever and has been growing a lot in the past few years.
I'm an old hunter. I can tell you this, which most all sites miss: Team Green fears Team Red now. Team Red did some great shizz under the radar; Team Green focused on putting them down at every chance, every opening they could. The DX12 stats tell me the older Team Red cards are still keeping up with the best Team Green can offer. Why is this? How can this be? So Team Green, who makes more money, started paying off game devs (err, investing in them) to use GameWorks. This has worked in Team Green's favor for what, folks, two years? Then Team Green tried to corner the market with G-Sync. How's that working out for them? They have the cash, but monitor manufacturers haven't given them exclusive domain; that shiz costs the monitor manufacturers too. Which is why you still see movement towards Team Red: open source. And why I see fear from Team Green now on the GPU front. Team Red got way ahead with HBM. And if you dig deeper into it, Team Red owns patents. Google vs. MS with Android comes to mind.

Computing is shifting. As an enthusiast dude, I love it. ARM chips are challenging Intel. Intel just said no more tick-tock for the desktop. Apple's chips are scaring the F out of all of them. Yes, I said Apple. BTW, they have a RAM company as well and are working on tighter integration between CPU, APU, RAM and GPU... the bus is key. Think about it. Look back a few years ago with USB 3.1: Apple had more engineers on that than anyone else, save, yup, HP. Apple was key to Thunderbolt. Latest gen 3: all engineers, dominating with Intel, Dell/HP right there.

Team Red lags behind in the bus zone, as far as TB/USB 3.1 and HDMI 2.0 go, but that's just what we see and are told about. Team Green's last big announcements were all about being proprietary. And me, a layman? Each team is going with where they think any of this madness is heading.

Me, I'm giving points to Team Red for looking at the end user, and for async compute. That shit will change the way apps are developed.
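For a rough feel of why async compute is a big deal, here's a toy timing model in Python (all numbers invented, not real GPU measurements): an independent compute pass can run on a second queue alongside the graphics pass instead of waiting behind it.

```python
# Toy timing model (numbers invented) of async compute: run an
# independent compute pass alongside a graphics pass on a second queue
# instead of serialising them on one queue.

graphics_ms = 10.0   # a frame's graphics work
compute_ms = 4.0     # independent compute work (e.g. post-processing)

serial = graphics_ms + compute_ms          # one queue, strict order
overlapped = max(graphics_ms, compute_ms)  # second queue fills idle units

print(f"serial: {serial} ms, overlapped: {overlapped} ms")
```

In this idealised model the frame time drops from 14 ms to 10 ms; real gains depend on how much idle hardware the compute work can actually soak up.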

BTW, I'm such a geek now thanks to some folks here on your site. Love to Confusis and jØrd, and all the old players @LOSIAS. Yup, I'm still nuts. Not your fault. Loves ya. Loves this place too!
 
  • Like
Reactions: michaelmitchell

PlayfulPhoenix

Founder of SFF.N
SFFLAB
Chimera Industries
Gold Supporter
Feb 22, 2015
1,052
1,990
Apple proved the comeback could be done. For the longest time they weren't the juggernaut they are now. They were borderline bankrupt, kept afloat because lots of design firms were Apple loyalists way back when. Then Jobs returned.

At the moment, the only companies that seem to have nowhere to go but up are the likes of Apple and Samsung. With so many markets they both serve, and the talent and momentum to be a market leader in just a few years, they seem unstoppable.

Computing is shifting. As an enthusiast dude, I love it. ARM chips are challenging Intel. Intel just said no more tick-tock for the desktop. Apple's chips are scaring the F out of all of them. Yes, I said Apple. BTW, they have a RAM company as well and are working on tighter integration between CPU, APU, RAM and GPU... the bus is key. Think about it. Look back a few years ago with USB 3.1: Apple had more engineers on that than anyone else, save, yup, HP. Apple was key to Thunderbolt. Latest gen 3: all engineers, dominating with Intel.

I've been frustrated for years by how PC enthusiasts haven't noticed Apple's semiconductor work. Especially in the past few years, Apple has not only been soundly kicking the asses of every other ARM manufacturer/designer in the business - Qualcomm and Samsung included - but they've undeniably been creating some of the most exciting processors, designs, and integrations in the industry, by a long shot. Intel remains the best fabricator and manufacturer, but a persuasive argument can be made that Apple's SoC design team is the best in the world.

A good example of this: the current-generation iPhone 6s can run practically unthrottled (which is itself a big achievement in the mobile phone form factor), while having single-core performance that exceeds that of the MacBook - a desktop processor running a desktop OS. And multi-core performance that's nipping at Intel's heels, to boot. In a device that costs half as much.

Another: even with fewer cores, fewer transistors, and less memory, on-device benchmarks show Apple consistently meeting or beating the performance of comparable Android flagships. Because they aren't re-purposing a generic design made by a fab that wants broad customer appeal, the efficiency of the processor improves considerably, and this level of customization means that Apple can spend die space on superior post-image processing (for the camera), sensor co-processing, and a wider architecture - even with lower power consumption.

One more: Apple's graphics performance is even better (on a relative basis) when compared to other ARM chips, with Apple's last-generation devices consistently beating Samsung's current-generation devices in on-device benchmarks. Which is to say, Apple's lead is a whole year at a minimum.

Just imagine if, in 2009, someone speculated that Apple would be in this position in five years. They would have been excoriated. And yet, Jobs's insistence that they have direct control over SoC design has translated to Apple's signature hardware advantage in the mobile electronics space. And now that they're pushing to design storage controllers, memory controllers, display controllers, modems... this advantage will only be cemented further.

I'm an old hunter. I can tell you this, which most all sites miss: Team Green fears Team Red now. Team Red did some great shizz under the radar; Team Green focused on putting them down at every chance, every opening they could. The DX12 stats tell me the older Team Red cards are still keeping up with the best Team Green can offer. Why is this? How can this be? So Team Green, who makes more money, started paying off game devs (err, investing in them) to use GameWorks. This has worked in Team Green's favor for what, folks, two years? Then Team Green tried to corner the market with G-Sync. How's that working out for them? They have the cash, but monitor manufacturers haven't given them exclusive domain; that shiz costs the monitor manufacturers too. Which is why you still see movement towards Team Red: open source. And why I see fear from Team Green now on the GPU front. Team Red got way ahead with HBM. And if you dig deeper into it, Team Red owns patents. Google vs. MS with Android comes to mind.

I'd have to disagree with this sentiment. If I had to invest $1 million in one of these companies, I would put it in nVidia without a second thought.

HBM is interesting and exciting, but AMD's current product lineup is a highly compromised collection of parts that they're only going to be able to continue selling by heavily discounting once nVidia responds, which will kill their margins. I know that a lot of people who consider themselves AMD fans like to argue that AMD always stays at least a little ahead of nVidia in price/performance this way, but the practical hardware design compromises needed to get there with the current generation (laughable efficiency and a reliance on old parts at the low to mid range, and low memory and a mandatory AIO at the high end) are becoming ridiculous. Plus, slashing prices is simply not a sustainable strategy for a company that actually wants to make money.

(AMD's been bleeding money, even from the division responsible for graphics, for some time now).

Comparatively, nVidia's current stuff has stunning efficiency and overclocks like a champion. Their unified architecture aligns well with a growing presence in mobile. They're diversifying heavily, and even if half of those initiatives fail, they get entry to billion-dollar businesses with the other half. Oh, and nVidia's been seeing record revenues alongside consistent profits, which are helpful when the R&D costs of the business are escalating as fast as they are.

...I don't say this to say that AMD sucks, or to say that I want them to fail (I sure as hell don't), but I don't agree with the sentiment that the market advantage is with team red currently, let alone that they're in a dead heat. nVidia is (mostly) cleaning AMD's clock right now. AMD's basically betting that the stuff we have yet to see will leapfrog them into the lead.
 
  • Like
Reactions: LooseNeutral

michaelmitchell

SFF Lingo Aficionado
Mar 12, 2016
117
73
...I don't say this to say that AMD sucks, or to say that I want them to fail (I sure as hell don't), but I don't agree with the sentiment that the market advantage is with team red currently, let alone that they're in a dead heat. nVidia is (mostly) cleaning AMD's clock right now. AMD's basically betting that the stuff we have yet to see will leapfrog them into the lead.

There is very little interest in red cards for VR at the moment; most people are buying green cards and even switching camps to use the current headsets. I am not one to stick with a particular brand in computing, I just choose the best tool for the job, and nVidia has the driver support for early VR.

But honestly, all of this will end up on mobile; phones are already stupid fast and we don't really make use of the hardware in them yet. I wish I could dump a proper Linux install or even something else onto something like a Note 5 and plug it into a screen as a full-time PC.
 

jØrd

S̳C̳S̳I̳ ̳f̳o̳r̳ ̳l̳i̳f̳e̳
sudocide.dev
SFFn Staff
Gold Supporter
LOSIAS
Jul 19, 2015
818
1,359
I wish I could dump a proper Linux install or even something else onto something like a Note 5 and plug it into a screen as a full-time PC.

IIRC the Ubuntu phones and tablets from BQ support their Convergence feature to enable, more or less, just that.
 
  • Like
Reactions: michaelmitchell

EdZ

Virtual Realist
May 11, 2015
1,578
2,107
It's far too early to make sweeping pronouncements about DX12 performance yet. There is one game built for DX12 from the start rather than retrofitted (Ashes of the Singularity), and that was built for Mantle before Mantle was discontinued. Other than that there are a handful of console ports (most of which are constrained by being designed for a unified memory architecture), and one or two games built for DX11 with preliminary let's-see-if-it-works DX12 implementations (e.g. The Talos Principle).
Plus, DX12 is not a magical go-faster API. It's a low-level API, which, while giving developers more flexibility in how they write code, also moves the burden of writing and optimising that code onto the developer. And being low-level, that means architecture-specific optimisation is also now in the hands of developers rather than GPU vendors.
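To make the "burden moves to the developer" point concrete, here's a toy Python sketch (all class and method names invented, not the real D3D11/D3D12 API) contrasting an immediate-mode driver that validates every call with an explicit command-list model that validates once per submit:

```python
# Toy model (invented names, not the real D3D API) of why a low-level
# API shifts work from the driver to the developer.

class ImmediateContext:
    """DX11-style: the driver validates and hazard-tracks every call."""
    def __init__(self):
        self.driver_validations = 0
        self.gpu_work = []

    def draw(self, mesh):
        self.driver_validations += 1      # per-call driver overhead
        self.gpu_work.append(("draw", mesh))

class CommandList:
    """DX12-style: the app records commands; the driver trusts them."""
    def __init__(self):
        self.commands = []

    def draw(self, mesh):
        self.commands.append(("draw", mesh))  # no driver work here

class ExplicitQueue:
    def __init__(self):
        self.driver_validations = 0
        self.gpu_work = []

    def submit(self, cmd_list):
        self.driver_validations += 1      # one validation per batch
        self.gpu_work.extend(cmd_list.commands)

meshes = [f"mesh{i}" for i in range(1000)]

ctx = ImmediateContext()
for m in meshes:
    ctx.draw(m)

queue, cl = ExplicitQueue(), CommandList()
for m in meshes:
    cl.draw(m)
queue.submit(cl)

# Identical GPU work, 1000x less driver bookkeeping -- but barriers,
# resource lifetime and ordering are now the application's problem.
assert ctx.gpu_work == queue.gpu_work
```

The low-level path only wins if the application gets its own synchronisation right, which is exactly the optimisation burden described above.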
In regards to HBM: The HBM Task Group at JEDEC counts among its members both AMD and Nvidia, along with many others. Both have access to (and input on) the development process for HBM and HBM2.


I wish I still had my A1200. Over a decade ago I had the whole shebang: external SCSI CD writer (a hand-me-down), joystick, modem, a proper monitor, etc. All went to a car boot sale at some point to fund an upgrade to a PIII machine (a heavily modified Dell something-or-other using a Slocket adapter).
 

LooseNeutral

Trash Compacter
Original poster
LOSIAS
Jul 3, 2015
46
23
YOU people are awesome! I'm on other forums, and I expect to get NUKED in response to my ravings. I took a chance here, figured a few members here know I'm nuts anyhow. But this? You people are awesome. Great feedback, not just the shit I'd get back from other sites. I love you for it.

I'm home
 

PNP

Airflow Optimizer
Oct 10, 2015
285
257
I've been frustrated for years how PC enthusiasts haven't noticed Apple's semiconductor work.
I think a lot of PC enthusiasts know about Apple's work (because I'm willing to bet a large number of them also own smartphones), but being PC enthusiasts, they don't want to deal with the limitations Apple imposes on its products. Enthusiasts want to poke and prod, to disobey the manufacturer guidelines and the moment Apple lets them, the integration disappears and the performance with it.

(Also, Apple doesn't make its SoCs...or much of anything. They drive a harder bargain than most design houses and they like surprise inspections, but in the end someone else is making the product real.)

One more: Apple's graphics performance is even better (on a relative basis) when compared to other ARM chips, with Apple's last-generation devices consistently beating Samsung's current-generation devices in on-device benchmarks. Which is to say, Apple's lead is a whole year at a minimum.
I've yet to see a graphics benchmark where an Ax SoC has to run at the same resolution as its competition. Yes, I can make a TNT2 seem faster than a Fury X, but if the former is driving a QVGA display and the latter an 8K one, that doesn't really seem fair, does it?



Back on topic, I don't really blame Intel for wanting to get away from increasing performance solely through process and architecture. There's a looming apocalypse in the semiconductor industry that will be very expensive to survive. Getting to sub-micron minimum feature size was hard. Getting High-K to sit on a wafer correctly was hard. Building FinFETs was freaking hard. Following the ITRS roadmap to 5nm in 2021? Madness. The fleet of lasers we used to get to 14nm is worth about as much as a gently used fighter jet, and in a year they'll be very expensive paperweights someone decided to weld to a giant concrete slab.

Truth is, everybody's been drawing from the same bag of tricks since about 2000, ever since the nasty surprise of NetBurst showed that deep pipelines don't work very well when branch prediction can't keep up, which in turn put a bit of a ceiling on clock speed. And now that we're running out of tricks, we need to find a different bag.
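The NetBurst lesson can be put in back-of-envelope numbers. This sketch uses illustrative figures (not measured chip data): a mispredicted branch flushes the pipeline, costing roughly its depth in cycles.

```python
# Back-of-envelope model (illustrative numbers, not measured chip data)
# of why deep pipelines suffer when branch prediction can't keep up:
# each mispredicted branch flushes the pipeline, costing roughly its
# depth in cycles.

def effective_cpi(base_cpi, pipeline_depth, branch_freq, mispredict_rate):
    """Average cycles per instruction including flush penalties."""
    flush_penalty = pipeline_depth
    return base_cpi + branch_freq * mispredict_rate * flush_penalty

# Assume ~20% of instructions are branches, 5% of those mispredicted.
shallow = effective_cpi(1.0, pipeline_depth=10, branch_freq=0.2,
                        mispredict_rate=0.05)
deep = effective_cpi(1.0, pipeline_depth=31, branch_freq=0.2,
                     mispredict_rate=0.05)

# Even a 50% clock-speed advantage barely pays for the deeper pipeline:
relative_throughput = 1.5 * shallow / deep
print(f"10-stage: {shallow:.2f} CPI, 31-stage: {deep:.2f} CPI, "
      f"deep design at +50% clock: {relative_throughput:.2f}x throughput")
```

Under these assumptions the 31-stage design needs its entire clock advantage just to come out about a quarter ahead, and the gap closes further as misprediction rates rise.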

We have different bags in parallelism and exotic materials, but both are sealed tight by 40 years of inertia. How can you undo layers upon layers of accepted convention? No one wants to re-invent software to use all those cores, and no one wants to spend the money to make wafers out of those exotic materials at a reasonable cost.
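The "use all those cores" problem has a classic quantitative form, Amdahl's law: no matter how many cores you add, speedup is capped by the fraction of the program that stays serial.

```python
# Amdahl's law: overall speedup is limited by the serial fraction of
# the program, no matter how many cores the parallel part gets.

def amdahl_speedup(parallel_fraction, cores):
    serial = 1.0 - parallel_fraction
    return 1.0 / (serial + parallel_fraction / cores)

for p in (0.5, 0.9, 0.99):
    print(f"{p:.0%} parallel on 64 cores: {amdahl_speedup(p, 64):.1f}x")
```

Even a program that is 90% parallel gets under 9x from 64 cores, which is why "re-invent the software" is the hard part, not adding cores.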

But, uh, don't just take my word for it: http://scholar.harvard.edu/files/mickens/files/theslowwinter.pdf

TL;DR: a 40-year-old way of thinking is dying, and software overhead is finally catching up with how quickly transistors can be jammed onto a die.

(Of course, this is not a common view. Very few consumers also have a hand in producing what they consume. Grown men shed tears to get them their shiny toys, yo.)
 
Last edited:
  • Like
Reactions: iFreilicht

Phuncz

Lord of the Boards
SFFn Staff
May 9, 2015
5,938
4,951
I think a lot of PC enthusiasts know about Apple's work (because I'm willing to bet a large number of them also own smartphones), but being PC enthusiasts, they don't want to deal with the limitations Apple imposes on its products. Enthusiasts want to poke and prod, to disobey the manufacturer guidelines and the moment Apple lets them, the integration disappears and the performance with it.
PC enthusiasts are still a minority in the PC world, though most plain users are shifting towards mobile (phone, tablet) products, which impose even more limitations (software and hardware). Apple PCs were never targeted at enthusiasts who want to fiddle and tweak, just like a Rolls-Royce isn't meant to be driven like a sports car such as an Aston Martin. In the same sense I don't see many enthusiasts buying a Mad Catz R.A.T., a very customizable mouse, so it's clear it depends where the needs lie.

I don't think it was @PlayfulPhoenix 's intention to have PC enthusiasts shocked in awe of what Apple might cook up, but to show how, as a mislabeled "all show, no go" brand, they've realised multiple class-leading mobile SoCs while it's not even their core business. They could have used Samsung's Exynos or Qualcomm's Snapdragon and been done with it, relying on companies that specialise in that sort of thing.

(Also, Apple doesn't make its SoCs...or much of anything. They drive a harder bargain than most design houses and they like surprise inspections, but in the end someone else is making the product real.)

As far as I know, only Samsung and Intel both design and manufacture their own SoCs. Apple's A-series SoCs are designed in-house and manufactured by a fab like TSMC or Samsung. Qualcomm, AMD and Nvidia, for instance, also don't manufacture their own SoCs but do design them.

No company that I know of manufactures a complete PC without relying on an Asian manufacturing facility. Foxconn, for example, which Apple relies on: https://en.wikipedia.org/wiki/Foxconn#Major_customers
In the same sense, a lot of us aren't "making" our own cases or PCs; we often outsource our designs to other companies to fabricate, or at least use the materials and machines they make to achieve this.
 

PNP

Airflow Optimizer
Oct 10, 2015
285
257
They could have used Samsung's Exynos or Qualcomm's Snapdragon and been done with it, relying on companies that specialise in that sort of thing.
They could, but that reduces integration and you can't bend the design towards your own requirements. Confidentiality would require at least a few black boxes, for which you would have to take the vendor's word as to how they do what they do.
 

iFreilicht

FlexATX Authority
Feb 28, 2015
3,243
2,361
freilite.com
In the same sense, a lot of us aren't "making" our own cases or PCs; we often outsource our designs to other companies to fabricate, or at least use the materials and machines they make to achieve this.

It's often ignored or forgotten that there is a limit to how much you can do yourself. Even the greatest DIY gods will not mine the aluminium for the plates they use in their projects.
There's always something you will have to outsource to a specialist; that's the sole reason we can even do this sort of stuff in the first place.
 

Phuncz

Lord of the Boards
SFFn Staff
May 9, 2015
5,938
4,951
Indeed, and it wasn't meant as critique of designers and modders, but more as a reminder that nobody does it all. That's why we have a civilization: so each person doesn't have to do everything alone.
 

Soul_Est

SFF Guru
SFFn Staff
Feb 12, 2016
1,536
1,928
To be honest, I was hoping for a re-think of the hardware and software that we use, especially the operating system. Using ARM cores for general tasks (desktop UI, peripheral I/O, block I/O, background services, etc.), x86 cores for certain computationally intensive tasks (CAD, 3D modelling, gaming, web browsing, etc.), and the GPU for other computationally intensive tasks (rendering, gaming, video encoding and decoding, positional audio, OpenCL, etc.) would help with creating a more energy-efficient system. The latter two have already been realized. The use of ARM cores for general tasks, with the x86 cores used for heavier tasks (and suspended the rest of the time), could net a much lower use of electricity for either better battery run time or a lower energy bill.
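The energy argument here can be sketched with a toy big.LITTLE-style scheduler (all numbers invented, nothing measured): send light background tasks to low-power cores and wake the power-hungry cores only for heavy work.

```python
# Toy sketch (all numbers invented) of heterogeneous scheduling: light
# background tasks go to low-power "little" cores; big cores wake only
# for heavy work. Compare total energy against big-cores-only.

LITTLE_POWER, BIG_POWER = 1.0, 10.0    # energy units per second
LITTLE_SPEED, BIG_SPEED = 1.0, 4.0     # relative throughput

def total_energy(tasks, threshold):
    """Energy used when work above `threshold` goes to a big core."""
    energy = 0.0
    for work in tasks:
        if work > threshold:
            energy += (work / BIG_SPEED) * BIG_POWER
        else:
            energy += (work / LITTLE_SPEED) * LITTLE_POWER
    return energy

tasks = [0.1] * 50 + [8.0] * 2         # many light tasks, two heavy ones

hetero = total_energy(tasks, threshold=1.0)    # little cores take light work
big_only = total_energy(tasks, threshold=0.0)  # everything on big cores

print(f"heterogeneous: {hetero:.1f}, big-only: {big_only:.1f}")
```

With these made-up figures the savings come entirely from the long tail of light tasks; the heavier the background load relative to the bursts, the bigger the win.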
 

PNP

Airflow Optimizer
Oct 10, 2015
285
257
The use of ARM cores for general tasks with the x86 cores used for heavier tasks (and suspended for the rest of the time), could net a much lower use of electricity for either better battery run time or a lower energy bill.

The enormous overhead associated with this...

ARM in the post-Cortex era is a Harvard design like most x86 CPUs, but ARM is RISC; it does not play well with x86, which is CISC. You would need an additional emulation layer (something like VISC) to bridge the two. We could end up with a machine-language version of Java, and Java is sloooow.
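The "machine-language Java" overhead is easy to see in miniature. Here's a tiny toy interpreter for a made-up guest ISA (everything here is invented for illustration): each guest instruction pays a decode/dispatch cost on the host, which is the tax an emulation layer would impose on every operation.

```python
# Tiny sketch of emulation-layer overhead: a toy interpreter for a
# made-up guest ISA. Every guest instruction pays a decode/dispatch
# cost on the host -- the "machine-language Java" effect.

def interpret(program, env):
    """Execute guest instructions, counting host dispatch steps."""
    steps = 0
    for op, *args in program:
        steps += 1                       # dispatch overhead per guest op
        if op == "load":
            reg, value = args
            env[reg] = value
        elif op == "add":
            dst, a, b = args
            env[dst] = env[a] + env[b]
        elif op == "mul":
            dst, a, b = args
            env[dst] = env[a] * env[b]
        else:
            raise ValueError(f"unknown op: {op}")
    return steps

# Guest program computing r2 = (3 + 4) * 10; natively this is one
# expression, but the guest pays five dispatches for it.
program = [
    ("load", "r0", 3),
    ("load", "r1", 4),
    ("add", "r2", "r0", "r1"),
    ("load", "r3", 10),
    ("mul", "r2", "r2", "r3"),
]
env = {}
steps = interpret(program, env)
print(env["r2"], steps)
```

Real translation layers amortise this with caching and JIT compilation, but the dispatch tax is why a naive bridge between ISAs is slow.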

I mean, I guess to speed things up we could make this new VISC machine the main processor and make the x86/ARM cores co-processors, but for all the crap we give Intel for including a GPU "which no one ever uses ever ever ever ever", would you really be willing to give up more die space for this new machine?

Furthermore, we'd need three generations of programmers who started on this philosophy, because all the old fogeys will just code what they know (like how game developers tended to use the Motorola 68K on the Atari Jaguar for processing tasks, even though the purpose-built Tom and Jerry chips on the same board were much more powerful but also very different).
 

iFreilicht

FlexATX Authority
Feb 28, 2015
3,243
2,361
freilite.com
something like VISC

Man, that would be a revival of a dead concept. The main problem is that, as you say, this would have to be implemented for a long time to migrate from the old style of processors to the new one, just because programming would have to change so much.

The question is, do we really need more processing power in the personal computer market? Sure, HPC applications always need more, but they are highly specialised, so maybe a potential new architecture could take off there much faster, if the performance gains are high enough.
Especially when we look at gaming, the most common use of such processing power, it seems like more power actually harms the games being made. More and more budget is spent on graphical fidelity and assets while underlying systems and gameplay mechanics get reused and recycled.
 

PNP

Airflow Optimizer
Oct 10, 2015
285
257
Man that would be a revival of a dead concept.

I dunno, the way VISC was/is being sold, it seemed like implementation would single-handedly take us to the singularity :p.

Especially when we look at gaming, which is the most common thing to use such processing power, it seems like more power actually harms the games that are being made. More and more budget is being spent on the graphical fidelity and the assets while underlying systems and gameplay mechanics get reused and recycled.

I always felt that programmers ought to start out on something extremely primitive, so that sloppy coding actually has consequences. That simple amortization program? It has to run on a PDP-11 that someone threw together on a breadboard; start counting your registers.

Though the way developers are leaving the triple-A scene to make indie titles makes me think that maybe the ship is starting to sink.
 

IntoxicatedPuma

Customizer of Titles
SFFn Staff
Feb 26, 2016
992
1,272
I'm not a programmer, but... I'm trying to avoid AAA because I don't want some guy telling me what to do without any ability to give feedback. The money can be good on those, but they're long projects where the work you do doesn't get recognized, unless you're one who likes to take credit for everything and tell everyone about it.