https://www.nvidia.com/en-us/geforce/graphics-cards/40-series/
 No doubt a day one purchase for well-off peeps with high-end racing/flightsim setups at 4k. Whether the cheaper $900/$1200 4080
versions would be tempting upgrades for present 3080 owners is an
interesting question. My 2yr-old $800 3080 is likely worth little on
ebay, and since I've been happy at 1440p and mostly play less-demanding games, the yearning to upgrade just isn't there for now, and I'll wait
until a 4k OLED monitor in just the right form factor appears before considering one of these next-gen cards.
Also, AMD will be launching their RDNA3 lineup in November.
On 20/09/2022 18:23, rms wrote:
Well that's cheap!
As I've probably said before, my days of buying higher-end graphics cards
are long behind me, and even back when I did go for the more expensive ones I wouldn't go for the most expensive but would try to hit that sweet spot. My
last upgrade was from an ageing GTX 570 HD* to a budget end GTX 1650 OC.
I did think about moving into the mid-range but I just couldn't justify
it when I really don't have, or want, any games that the 1650 isn't more than capable of running at what I find an acceptable level. I'm sure
I'd find what a card like the 4090 could do is impressive but ultimately
I know that for me the novelty quickly wears off.
*Even that upgrade was kinda forced on me as I thought it was failing
due to the appearance of graphical artefacts but in reality it was the
PSU that was most likely the core problem.
On Tue, 20 Sep 2022 11:23:52 -0600, "rms"
<rsquiresMOO@MOOflashMOO.net> wrote:
nVidia's CEO also just recently announced, essentially, that Moore's
Law is dead and GPUs have no option but to go up in price. Well, he
controls a company that has a virtual stranglehold on GPUs, so I'm
sure he's trustworthy in this regard, right?
My days of paying excessively high amounts for a GPU are long past, I
think. Back in the day, I'd happily sling $200-400 USD for a video
card (and might even go through several per year at that price) but
I'm not sure I see the value in it anymore.
Graphics are more-and-more becoming 'good enough' and a lot of
developers - especially the Indies - are relying more on clever
artistry than pushing polygons to differentiate themselves. The only
ones who can really afford to develop the super high-end visuals are
the triple-A publishers, and their products are becoming less and less
interesting as far as gameplay and value are concerned. So if I'm
running a GPU two or three generations old? That's fine as far as I'm
concerned.
I don't know why, but my EVGA GeForce RTX 2070 Super is still doing a
great job. I play everything at maximum resolution and quality without
any signs of a slowdown.
Even monopolists have to set prices based on demand if they want to
maximize profits. I think Nvidia may have an unrealistic expectation
about what the demand for their cards is likely to end up being.
Their pricing seems to be based on the same logic they used when
they increased prices for their last couple of generations of GPUs:
consumer demand was growing, while demand from crypto miners was
exploding. I guess they have not yet gotten that the end of the
mining boom is for good.
"But perhaps the most damning indictment of what Nvidia is doing comes in
the shape of value for money. At $1,600, the new RTX 4090 looks expensive
enough to be irrelevant to the vast majority of gamers. But the fact that it
looks like good value compared to the RTX 4080 12GB is completely crazy"
https://www.pcgamer.com/nvidia-rtx-40-series-let-down
rms <rsquiresMOO@MOOflashMOO.net> wrote:
Yah, a few sites have noticed that the RTX 4090 gives better performance
per dollar on just about every metric than both the RTX 4080 12GB and 16GB.
Another interesting point that article makes is that Nvidia probably
isn't planning on selling many of these GPUs anyways:
Expect Nvidia to keep production numbers low for this
generation. It knows the whole Ada Lovelace series is onto a loser
in terms of market conditions. That can't be helped. It's going to
be tough for the next 18 months to two years, at least, whatever
Nvidia does. So, it will likely keep volumes uncharacteristically
low and try to maintain higher margins courtesy of short supply,
the idea being to keep very high prices for the longer term
and for future generations when the market has picked up again,
even if sales of Ada Lovelace suffer. That makes sense given Ada
Lovelace has the potential to struggle, whatever, thanks to all
those external factors.
The only problem is that Nvidia is going to need to sell a lot of them
at some point if they're going to recoup the massive investment they
made into designing these GPUs. Waiting a couple of years for sales
to recover while keeping prices unchanged won't be easy. AMD is about
to announce a new generation of chiplet-based GPUs that will likely be
significantly cheaper to make than Nvidia's new chips, and Intel has
their new discrete graphics cards coming soon.
Still, Nvidia may not have a lot of options; apparently they have a lot
of 3000-series GPUs in their warehouses they need to get rid of. They
don't want their new 4000-series GPUs undercutting them. But with crypto
miners also having a lot of 3000-series cards to get rid of, it will
also be hard for Nvidia not to lower the prices on their
previous-generation GPUs.
> I don't know why, but my EVGA GeForce RTX 2070 Super is still doing a
> great job. I play everything at maximum resolution and quality without
> any signs of a slowdown.
You're playing Elden Ring with this? I guess at 1080p it would manage.
You're likely not getting HDR with that setup either.
rms
But you have to wonder if this pressure isn't one of the reasons EVGA
bailed on nvidia. nvidia's margins have been relatively good (although
dropping since a high in 2018, they're still around the 20% range),
EVGA has been squeezed tighter (its margin is about 5%), and it could
be the competition from used GPUs - on top of the increased costs of
nvidia chips - was the feather that broke the camel's back.
Spalls Hurgenson <spallshurgenson@gmail.com> wrote:
Apparently EVGA told Nvidia that they were ending their relationship back
in April. Back then it was less obvious that a recession was coming,
but the PC market was already slowing and the Ethereum merge was only a
matter of time. So EVGA would've known the next generation of Nvidia
GPUs weren't likely to match previous generations in terms of sales.
I can imagine EVGA lowering their internal sales forecasts until it got
to the point where they decided they might as well just lower them to zero.
More generally, I think Nvidia's board partners could bear a lot of
the brunt of Nvidia's pricing decisions. If market conditions end
up being as bad as they look, they're the ones that will be forced to
lower prices first. They have their own costs of development they need
to recoup, and while their margins are better than EVGA's, because they
make their own boards, they're not as big as Nvidia's.
EVGA probably would've got clobbered if they chose to continue working
with Nvidia. Whatever their reasons, it's very much looking like they
made the right choice.
Seeing as how I just bought a 3060 Ti to play Elden Ring, not going to
happen here either. I'll probably upgrade my CPU next, which of course
means MB/CPU & Ram. I was thinking about doing it soon as I'm on
Win 10 and it's complaining my computer doesn't meet requirements
for 11. On the other hand I don't see any reason to, I've got 11 on my
work computers, and it's just more of a pain in the ass to get to settings
I want. I did see Vampire Slayers was bottlenecking pretty hard on
the CPU, but performance mode works to fix that.
On 22/09/2022 03:01, Justisaur wrote:
My PC is relatively new and even it says that it can't run Win 11.
That's sorta true, but only because I need to change the BIOS settings
to enable some security features.
On Tuesday, September 27, 2022 at 3:16:27 AM UTC-7, JAB wrote:
Ah that could be it. I don't really feel like bothering with trying to get it
updated at this point, though it's supposed to have better memory management, which should make a small improvement to overall speed.
I'm still running a GTX 1080 that I paid $600 for. I'm not going into
RTX until it is fully mature. The 2000 series was definitely not, I
mean, they put ray-tracing into Minecraft and Quake 2, ffs.
Shame; I rather liked EVGA cards. And nvidia cards too, for that
matter. But they're pricing themselves... well, not out of my reach,
but out of where I feel I'm getting value for money.
On 23.09.22 at 21:23, Spalls Hurgenson wrote:
Amen to that, my 2080 will run another bunch of years but then I will
have a serious look towards AMD, just the same way I switched over from
Intel when the Ryzens came out.
The few features I lose are less pain than paying the nvidia tax. AMD
now has a real chance to regain market share from Nvidia if they play
their cards right.