News NVIDIA GeForce GTX 1650 Launching April 2019

chromov113

Trash Compacter
Jan 14, 2020
34
14
By the way, regarding the power limit... I was recently thinking about the power delivery system of a GPU. The GPU driver has algorithms to calculate its clock based on the available power limit and temperature, and the user can control the power limit with software, right? That means the GPU measures its current draw somewhere and takes it into account when calculating the available clock.

Imagine you design a power delivery system with two power sources (from the GPU's point of view) - the PCIe slot and the PCIe connector from the PSU. By adding individual current sensors on both channels, you could not only measure the current draw per source, but also detect whether the PSU connector is attached at all - thus potentially allowing a GPU that works with or without the PSU connector; it just wouldn't boost the clock to its full potential without PSU power. I don't see any technical restrictions here, only cost and meaningfulness.
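The idea could be sketched in a few lines. This is purely a toy illustration of the logic described above - the constants and function names are invented, not taken from any real GPU firmware:

```python
# Toy sketch of the dual-rail sensing idea; names and numbers are illustrative,
# not real GPU firmware behaviour.

SLOT_LIMIT_W = 75.0  # max a PCIe x16 slot may supply per the CEM spec
AUX_LIMIT_W = 75.0   # e.g. one 6-pin PCIe power connector

def available_power_limit(aux_connected: bool) -> float:
    """Pick the board power limit based on whether the PSU connector is sensed."""
    if aux_connected:
        return SLOT_LIMIT_W + AUX_LIMIT_W  # full boost headroom
    return SLOT_LIMIT_W  # card still works, just with a lower boost ceiling

print(available_power_limit(True), available_power_limit(False))  # 150.0 75.0
```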

I'm just dreaming though.
 
Last edited:

chromov113

Trash Compacter
Jan 14, 2020
34
14
So I bought and installed Gigabyte low-profile 1650 GDDR6 (GV-N1656OC-4GL), and I must say - that's a win!

I tested both the old Zotac 1050 Ti and the 1650 with Unigine Superposition (1080p High), both without overclocking and with stock fan curves, and the 1650 showed about a 35% increase in points. The only issue is that it heated up to 83°C - too much, I suppose. By limiting power to 80% (via MSI Afterburner) I brought the temperature down to 76°C, and the 1650 still showed about a 22% increase in points over the 1050 Ti.
A nice surprise was how quiet this card is compared to the Zotac. Despite the same 55 mm fan diameter, subjectively the noise level of the Gigabyte 1650 at full load matches the Zotac 1050 Ti at idle!


Sorry, I didn't run any real gaming tests. In fact, the only game I played on both cards was Mad Max, and that was with G-Sync enabled. I have a budget FreeSync 75 Hz monitor, and the synchronization worked okay at a 60 Hz refresh rate, although there was some tearing in dynamic scenes, like spinning the camera around the protagonist very quickly. With the 1650, I'm sure tearing occurs less often.
 
Last edited:
  • Like
Reactions: Valantar and BaK

Valantar

Shrink Ray Wielder
Jan 20, 2018
2,201
2,225
So I bought and installed Gigabyte low-profile 1650 GDDR6 (GV-N1656OC-4GL), and I must say - that's a win!

I tested both the old Zotac 1050 Ti and the 1650 with Unigine Superposition (1080p High), both without overclocking and with stock fan curves, and the 1650 showed about a 35% increase in points. The only issue is that it heated up to 83°C - too much, I suppose. By limiting power to 80% (via MSI Afterburner) I brought the temperature down to 76°C, and the 1650 still showed about a 22% increase in points over the 1050 Ti.
A nice surprise was how quiet this card is compared to the Zotac. Despite the same 55 mm fan diameter, subjectively the noise level of the Gigabyte 1650 at full load matches the Zotac 1050 Ti at idle!


Sorry, I didn't run any real gaming tests. In fact, the only game I played on both cards was Mad Max, and that was with G-Sync enabled. I have a budget FreeSync 75 Hz monitor, and the synchronization worked okay at a 60 Hz refresh rate, although there was some tearing in dynamic scenes, like spinning the camera around the protagonist very quickly. With the 1650, I'm sure tearing occurs less often.
Are you sure you actually have FreeSync activated? Using an FS display with a GeForce card you'll need to force it on in your GPU driver settings (unless you have one of the relatively few certified displays out there). Tearing should be entirely eliminated by using any form of adaptive sync, which makes me think it's not actually activated for you.

Beyond that, I really wouldn't worry about the GPU running at 83 degrees. Unless it's throttling, leave it be - that is a perfectly fine temperature for GPU silicon.
 

Legion

Airflow Optimizer
Original poster
Nov 22, 2017
357
386

chromov113

Trash Compacter
Jan 14, 2020
34
14
Are you sure you actually have FreeSync activated? Using an FS display with a GeForce card you'll need to force it on in your GPU driver settings
That's exactly what I did. I've experimented with Nvidia Pendulum Demo with 1050 Ti:
G-Sync | Refresh rate, Hz | Result
OFF    | 75 / 60          | Frequent tearing
ON     | 75               | Occasional tearing
ON     | 60               | No tearing
However, there was still occasional screen tearing while gaming (Mad Max). I suppose that FreeSync, being a purely software-driven feature, is less effective than the hardware-plus-software G-Sync. And my monitor is not on Nvidia's supported list, by the way.
I want to try Pendulum with my new GPU sometime soon as well.

Beyond that, I really wouldn't worry about the GPU running at 83 degrees. Unless it's throttling, leave it be - that is a perfectly fine temperature for GPU silicon.
Okay, got it. Yes, it had throttled at 83°C.
 
  • Like
Reactions: Valantar

Valantar

Shrink Ray Wielder
Jan 20, 2018
2,201
2,225

Looking at that, it's pretty unlikely we'll see a 75 W low-profile card with Ampere. The 3050 is a 90 W TGP SKU (which translates to cards drawing approx. 120 W once the board, VRM and VRAM are factored in)
If the rumors of the GTX series being retired entirely are true, we can't expect successors in the RTX series to keep the same naming tiers. Remember, the GTX 1660 (and Ti) were a clear step below the RTX 2060, despite both being xx60-tier cards. Ampere also generally seems to increase power draw at each naming level, at least with current SKUs. So if there's an RTX 3050, it's far more likely meant to succeed the GTX 1660 (or Ti) than the 1650 (Ti/Super), and as such a 120 W TBP sounds likely. For a 1650 (Ti) successor, I guess we'll see if Nvidia brings back the xx40 naming tier this go-around - if not, they'll have a tough time fitting in all the performance and power-draw tiers given how wide their GPU stack is looking. Or maybe they'll actually update the xx30 tier this time? IMO there'd be room for both if they actually made a low-end Ampere die, say at $100 and $150, and 50 W and 75 W.
That's exactly what I did. I've experimented with Nvidia Pendulum Demo with 1050 Ti:
G-Sync | Refresh rate, Hz | Result
OFF    | 75 / 60          | Frequent tearing
ON     | 75               | Occasional tearing
ON     | 60               | No tearing
However, there was still occasional screen tearing while gaming (Mad Max). I suppose that FreeSync, being a purely software-driven feature, is less effective than the hardware-plus-software G-Sync. And my monitor is not on Nvidia's supported list, by the way.
I want to try Pendulum with my new GPU sometime soon as well.
That sounds like an actual compatibility issue between the GPU and the monitor. Not unheard of when pairing a FreeSync monitor with a GeForce GPU, but more a consequence of Nvidia really not wanting to spend engineering resources on monitors they aren't getting a cut from. Still a shame, though, as the smoothness of 75 Hz is definitely a noticeable improvement over 60 Hz.
Okay, got it. Yes, it had throttled at 83°C.
By how much? You still lost ~13 percentage points of your performance increase by power limiting it, so it can't have been throttling that hard. I guess it depends on what you're after (if the lower temperature lets you run your fans slower, for example), but if performance is the goal I would let the card regulate itself. 83°C, even with throttling, isn't damaging anything, and you clearly aren't losing performance from it.
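To make those numbers concrete, here is a quick back-of-the-envelope check, with the Superposition scores quoted earlier normalized so the 1050 Ti = 100:

```python
# Normalized scores from the Superposition results quoted above (1050 Ti = 100).
stock = 135.0          # ~35% faster at stock, peaking at 83°C
power_limited = 122.0  # ~22% faster at the 80% power limit

gain_lost_points = stock - power_limited           # percentage points of the gain lost
relative_loss = (1 - power_limited / stock) * 100  # drop in absolute performance
print(gain_lost_points, round(relative_loss, 1))   # 13.0 9.6
```

So the power limit costs 13 points of the gain over the 1050 Ti, or roughly 9.6% of the card's own performance.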
 

Analogue Blacksheep

King of Cable Management
Dec 2, 2018
831
688

Looking at that, it's pretty unlikely we'll see a 75 W low-profile card with Ampere. The 3050 is a 90 W TGP SKU (which translates to cards drawing approx. 120 W once the board, VRM and VRAM are factored in)

I think low-profile cards are going to need a rethink (and a company willing to risk it). @Necere posted some brilliant concepts in this thread last year showing how the LP form factor could evolve.

There was also an LP GPU years ago that had a power connector on it, the Afox 7850 low profile.
 
Last edited:

smitty2k1

King of Cable Management
Dec 3, 2016
967
492
If the rumors of the GTX series being retired entirely are true, we can't expect successors in the RTX series to keep the same naming tiers. Remember, the GTX 1660 (and Ti) were a clear step below the RTX 2060, despite both being xx60-tier cards. Ampere also generally seems to increase power draw at each naming level, at least with current SKUs. So if there's an RTX 3050, it's far more likely meant to succeed the GTX 1660 (or Ti) than the 1650 (Ti/Super), and as such a 120 W TBP sounds likely. For a 1650 (Ti) successor, I guess we'll see if Nvidia brings back the xx40 naming tier this go-around - if not, they'll have a tough time fitting in all the performance and power-draw tiers given how wide their GPU stack is looking. Or maybe they'll actually update the xx30 tier this time? IMO there'd be room for both if they actually made a low-end Ampere die, say at $100 and $150, and 50 W and 75 W.

That sounds like an actual compatibility issue between the GPU and the monitor. Not unheard of when pairing a FreeSync monitor with a GeForce GPU, but more a consequence of Nvidia really not wanting to spend engineering resources on monitors they aren't getting a cut from. Still a shame, though, as the smoothness of 75 Hz is definitely a noticeable improvement over 60 Hz.

By how much? You still lost ~13 percentage points of your performance increase by power limiting it, so it can't have been throttling that hard. I guess it depends on what you're after (if the lower temperature lets you run your fans slower, for example), but if performance is the goal I would let the card regulate itself. 83°C, even with throttling, isn't damaging anything, and you clearly aren't losing performance from it.
Also, he said the fans were very quiet. I'd try bumping them up to 100% speed to see if that cools the chip down.
 
  • Like
Reactions: Valantar

chromov113

Trash Compacter
Jan 14, 2020
34
14
By how much? You still lost ~13% of your performance increase by power limiting it, so it can't have been throttling that hard. I guess it depends what you're after (if the lower temperature lets you run your fans slower, for example), but if performance is a goal I would let the card regulate itself. 83C even if throttling isn't damaging anything, and you clearly aren't losing performance from it.
At a 100% power limit and before throttling I saw 1800 MHz; at 83°C, 1650-1700 MHz; at an 80% power limit and 76°C, 1625-1675 MHz.

Also, he said the fans were very quiet. I'd try bumping them up to 100% speed to see if that cools the chip down.
I shall try that too. I've never seen fan demand go over 74%.

Guys, what do you think about Aorus Engine? I've got a Gigabyte card, and that's their proprietary software :)
 

Valantar

Shrink Ray Wielder
Jan 20, 2018
2,201
2,225
At a 100% power limit and before throttling I saw 1800 MHz; at 83°C, 1650-1700 MHz; at an 80% power limit and 76°C, 1625-1675 MHz.


I shall try that too. I've never seen fan demand go over 74%.

Guys, what do you think about Aorus Engine? I've got a Gigabyte card, and that's their proprietary software :)
What were the temperatures like when you hit 1800 MHz? Modern Nvidia cards drop boost clocks every few degrees above... is it 65°C? So going from an idle card to full load you'll always see it boost high initially before it heats up, but stable clocks will always be lower than peak clocks unless you have a massively overbuilt cooler. That's just how these GPUs work (and part of why they have much lower boost clock specs than real-world clocks). It also sounds rather odd that your stock 83°C results beat the 1050 Ti by ~35% while your new power-limited results only beat it by 22%, if clocks are that similar in use. Is the benchmark you're running long enough for the card to reach steady-state temperatures?
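For a rough feel of that behaviour, here is a toy model of temperature-stepped boost bins. The threshold, step, and bin size below are commonly cited ballpark figures for GPU Boost, not official Nvidia spec values:

```python
# Toy model of GPU Boost stepping clocks down as temperature rises.
# All constants are rough public estimates, not Nvidia specifications.

BIN_MHZ = 13       # GPU Boost moves clocks in ~13 MHz bins
STEP_DEG = 5       # drop roughly one bin every few degrees...
THRESHOLD_C = 65   # ...once the core passes about this temperature

def boost_clock(peak_mhz: int, temp_c: float) -> int:
    """Estimate the sustained boost clock at a given core temperature."""
    if temp_c <= THRESHOLD_C:
        return peak_mhz
    bins_dropped = int((temp_c - THRESHOLD_C) // STEP_DEG) + 1
    return peak_mhz - bins_dropped * BIN_MHZ

print(boost_clock(1800, 60), boost_clock(1800, 83))  # 1800 1748
```

With these invented constants the model predicts a milder drop at 83°C than the 1650-1700 MHz observed, which is consistent with power limits and voltage also playing a role.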

It's also worth noting that power limiting the GPU can harm 1%/0.1% FPS numbers more than averages, as power throttling typically causes more clock speed fluctuations than thermal limits do. So while your averages might be similar, it's likely that the overall experience will be worse.

As for Aorus Engine, I don't really know anything about it, but I would much rather trust known-good OC software (MSI Afterburner, EVGA Precision) than an unknown one, as those can exhibit some really weird behaviour. Unless the Gigabyte software lets you adjust something the others don't, I would stay away.
 

chromov113

Trash Compacter
Jan 14, 2020
34
14
That's just how these GPUs work
I know this.

It's also worth noting that power limiting the GPU can harm 1%
And this I didn't know; it seems logical to me, though. But it's impossible to limit the temperature without touching the power limit, isn't it? I tried - MSI Afterburner doesn't let me.

Aorus Engine is similar to MSI Afterburner, but doesn't include RivaTuner Statistics Server, so - no in-game overlay, no FPS counter and so on. It does, however, let you check for VBIOS updates and apply them.
Oh, and the user interface is terrible.
 
Last edited:

Valantar

Shrink Ray Wielder
Jan 20, 2018
2,201
2,225
And this I didn't know; it seems logical to me, though. But it's impossible to limit the temperature without touching the power limit, isn't it? I tried - MSI Afterburner doesn't let me.
Hm, Afterburner lets me adjust the target temperature on my AMD cards, but I don't know about Nvidia ones. I would have expected that to be an option at least.
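Conceptually, a temperature target is just a small control loop layered on top of the clock/power management. A minimal sketch of the idea (step size and hysteresis band are invented for illustration, not Afterburner's actual algorithm):

```python
# Minimal temperature-target loop sketch; the step size and hysteresis band
# are invented values, not MSI Afterburner's real algorithm.

def adjust_clock_offset(offset_mhz: int, temp_c: float, target_c: float,
                        step_mhz: int = 15, max_offset_mhz: int = 0) -> int:
    """Each tick: trim the clock offset when over target, restore it when under."""
    if temp_c > target_c:
        return offset_mhz - step_mhz  # back off to cool down
    if temp_c < target_c - 3:  # small hysteresis band to avoid oscillating
        return min(max_offset_mhz, offset_mhz + step_mhz)
    return offset_mhz  # within the band: hold steady

print(adjust_clock_offset(0, 85, 80), adjust_clock_offset(-30, 70, 80))  # -15 -15
```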
 

owliwar

Master of Cramming
Lazer3D
Apr 7, 2017
586
1,082
I have the Gigabyte LP GTX 1650 and it doesn't have a zero-RPM mode, but it's mostly inaudible over the other components in the system. At load it does get loud, but only if you're sitting close to it, as I am. Either way, it's 'ok'.
I de-shrouded the GPU and put a single 92 mm slim Noctua fan in there, and it's much better in terms of noise; temps are a couple °C better as well.
 
  • Like
Reactions: K888D