• RTX 4090 available 8/12, only $1600!

    From rms@21:1/5 to All on Tue Sep 20 11:23:52 2022
    https://www.nvidia.com/en-us/geforce/graphics-cards/40-series/
    No doubt a day-one purchase for well-off peeps with high-end
    racing/flightsim setups at 4k. Whether the cheaper $900/$1200 4080
    versions would be tempting upgrades for present 3080 owners is an
    interesting question. My 2yr-old $800 3080 is likely worth little on
    ebay, and since I've been happy at 1440p and mostly play less-demanding
    games, the yearning to upgrade just isn't there for now, and I'll wait
    until a 4k OLED monitor in just the right form factor appears before
    considering one of these next-gen cards.

    Also, AMD will be launching their RDNA3 lineup in November.

    rms

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From JAB@21:1/5 to rms on Wed Sep 21 09:26:51 2022
    On 20/09/2022 18:23, rms wrote:
    https://www.nvidia.com/en-us/geforce/graphics-cards/40-series/
    No doubt a day-one purchase for well-off peeps with high-end
    racing/flightsim setups at 4k. Whether the cheaper $900/$1200 4080
    versions would be tempting upgrades for present 3080 owners is an
    interesting question. My 2yr-old $800 3080 is likely worth little on
    ebay, and since I've been happy at 1440p and mostly play less-demanding
    games, the yearning to upgrade just isn't there for now, and I'll wait
    until a 4k OLED monitor in just the right form factor appears before
    considering one of these next-gen cards.

    Also, AMD will be launching their RDNA3 lineup in November.


    Well that's cheap!

    As I've probably said before, my days of buying higher-end graphics
    cards are long behind me, and even then, when I went for the more
    expensive ones, I wouldn't go for the most expensive but tried to hit
    that sweet spot. My last upgrade was from an ageing GTX 570 HD* to a
    budget-end GTX 1650 OC. I did think about moving into the mid-range,
    but I just couldn't justify it when I really don't have, or want, any
    games that the 1650 isn't more than capable of running at what I find
    is an acceptable level. I'm sure I'd find what a card like the 4090
    could do impressive, but ultimately I know that for me the novelty
    quickly wears off.

    *Even that upgrade was kinda forced on me, as I thought the card was
    failing due to the appearance of graphical artefacts, but in reality
    the PSU was most likely the core problem.

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Justisaur@21:1/5 to JAB on Wed Sep 21 19:01:57 2022
    On Wednesday, September 21, 2022 at 1:26:53 AM UTC-7, JAB wrote:
    On 20/09/2022 18:23, rms wrote:
    https://www.nvidia.com/en-us/geforce/graphics-cards/40-series/
    No doubt a day-one purchase for well-off peeps with high-end
    racing/flightsim setups at 4k. Whether the cheaper $900/$1200 4080
    versions would be tempting upgrades for present 3080 owners is an
    interesting question. My 2yr-old $800 3080 is likely worth little on
    ebay, and since I've been happy at 1440p and mostly play less-demanding
    games, the yearning to upgrade just isn't there for now, and I'll wait
    until a 4k OLED monitor in just the right form factor appears before
    considering one of these next-gen cards.

    Also, AMD will be launching their RDNA3 lineup in November.

    Well that's cheap!

    As I've probably said before, my days of buying higher-end graphics
    cards are long behind me, and even then, when I went for the more
    expensive ones, I wouldn't go for the most expensive but tried to hit
    that sweet spot. My last upgrade was from an ageing GTX 570 HD* to a
    budget-end GTX 1650 OC. I did think about moving into the mid-range,
    but I just couldn't justify it when I really don't have, or want, any
    games that the 1650 isn't more than capable of running at what I find
    is an acceptable level. I'm sure I'd find what a card like the 4090
    could do impressive, but ultimately I know that for me the novelty
    quickly wears off.

    *Even that upgrade was kinda forced on me as I thought it was failing
    due to the appearance of graphical artefacts but in reality it was the
    PSU that was most likely the core problem.

    Seeing as how I just bought a 3060 Ti to play Elden Ring, not going to
    happen here either. I'll probably upgrade my CPU next, which of course
    means MB/CPU & RAM. I was thinking about doing it soon, as I'm on
    Win 10 and it's complaining my computer doesn't meet the requirements
    for 11. On the other hand, I don't see any reason to; I've got 11 on my
    work computers, and it's just more of a pain in the ass to get to the
    settings I want. I did see Vampire Slayers was bottlenecking pretty
    hard on the CPU, but performance mode works to fix that.

    I usually don't upgrade unless there's something I really want to play,
    or several somethings that my GPU really can't handle even
    with minimum graphics.

    - Justisaur
    ø-ø
    (\_/)\
    `-'\ `--.___,
    ¶¬'\( ,_.-'
    \\
    ^'

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From PW@21:1/5 to JAB on Wed Sep 21 21:38:19 2022
    On Wed, 21 Sep 2022 09:26:51 +0100, JAB <noway@nochance.com> wrote:

    On 20/09/2022 18:23, rms wrote:
    https://www.nvidia.com/en-us/geforce/graphics-cards/40-series/
    No doubt a day-one purchase for well-off peeps with high-end
    racing/flightsim setups at 4k. Whether the cheaper $900/$1200 4080
    versions would be tempting upgrades for present 3080 owners is an
    interesting question. My 2yr-old $800 3080 is likely worth little on
    ebay, and since I've been happy at 1440p and mostly play less-demanding
    games, the yearning to upgrade just isn't there for now, and I'll wait
    until a 4k OLED monitor in just the right form factor appears before
    considering one of these next-gen cards.

    Also, AMD will be launching their RDNA3 lineup in November.


    Well that's cheap!

    As I've probably said before, my days of buying higher-end graphics
    cards are long behind me, and even then, when I went for the more
    expensive ones, I wouldn't go for the most expensive but tried to hit
    that sweet spot. My last upgrade was from an ageing GTX 570 HD* to a
    budget-end GTX 1650 OC. I did think about moving into the mid-range,
    but I just couldn't justify it when I really don't have, or want, any
    games that the 1650 isn't more than capable of running at what I find
    is an acceptable level. I'm sure I'd find what a card like the 4090
    could do impressive, but ultimately I know that for me the novelty
    quickly wears off.

    *Even that upgrade was kinda forced on me, as I thought the card was
    failing due to the appearance of graphical artefacts, but in reality
    the PSU was most likely the core problem.


    I don't know why, but my EVGA GeForce RTX 2070 Super is still doing a
    great job. I play everything at maximum resolution and quality without
    any signs of a slowdown.

    If anything, if I were to buy something, it would probably be a monitor
    with higher resolution and more features than my current Dell gaming
    monitors (I have two 27-inch ones).

    I don't know what the fancy terms mean for the new monitors anyway.

    -pw

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Spalls Hurgenson@21:1/5 to rsquiresMOO@MOOflashMOO.net on Thu Sep 22 13:50:17 2022
    On Tue, 20 Sep 2022 11:23:52 -0600, "rms"
    <rsquiresMOO@MOOflashMOO.net> wrote:

    https://www.nvidia.com/en-us/geforce/graphics-cards/40-series/
    No doubt a day-one purchase for well-off peeps with high-end
    racing/flightsim setups at 4k. Whether the cheaper $900/$1200 4080
    versions would be tempting upgrades for present 3080 owners is an
    interesting question. My 2yr-old $800 3080 is likely worth little on
    ebay, and since I've been happy at 1440p and mostly play less-demanding
    games, the yearning to upgrade just isn't there for now, and I'll wait
    until a 4k OLED monitor in just the right form factor appears before
    considering one of these next-gen cards.

    nVidia's CEO also just recently announced, essentially, that Moore's
    Law is dead and GPUs have no option but to go up in price. Well, he
    controls a company that has a virtual stranglehold on GPUs, so I'm
    sure he's trustworthy in this regard, right?

    My days of paying excessively high amounts for a GPU are long past, I
    think. Back in the day, I'd happily sling $200-400 USD for a video
    card (and might even go through several per year at that price) but
    I'm not sure I see the value in it anymore.

    Graphics are more and more becoming 'good enough', and a lot of
    developers - especially the indies - are relying more on clever
    artistry than pushing polygons to differentiate themselves. The only
    ones who can really afford to develop the super high-end visuals are
    the triple-A publishers, and their products are becoming less and less
    interesting as far as gameplay and value are concerned. So if I'm
    running a GPU two or three generations old? That's fine as far as I'm
    concerned.

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Zaghadka@21:1/5 to Spalls Hurgenson on Thu Sep 22 17:35:27 2022
    On Thu, 22 Sep 2022 13:50:17 -0400, in comp.sys.ibm.pc.games.action,
    Spalls Hurgenson wrote:

    On Tue, 20 Sep 2022 11:23:52 -0600, "rms"
    <rsquiresMOO@MOOflashMOO.net> wrote:

    https://www.nvidia.com/en-us/geforce/graphics-cards/40-series/
    No doubt a day-one purchase for well-off peeps with high-end
    racing/flightsim setups at 4k. Whether the cheaper $900/$1200 4080
    versions would be tempting upgrades for present 3080 owners is an
    interesting question. My 2yr-old $800 3080 is likely worth little on
    ebay, and since I've been happy at 1440p and mostly play less-demanding
    games, the yearning to upgrade just isn't there for now, and I'll wait
    until a 4k OLED monitor in just the right form factor appears before
    considering one of these next-gen cards.

    nVidia's CEO also just recently announced, essentially, that Moore's
    Law is dead and GPUs have no option to go up in price. Well, he
    controls a company that has a virtual stranglehold on GPUs, so I'm
    sure he's trustworthy in this regard, right?

    My days of paying excessively high amounts for a GPU are long past, I
    think. Back in the day, I'd happily sling $200-400 USD for a video
    card (and might even go through several per year at that price) but
    I'm not sure I see the value in it anymore.

    Graphics are more and more becoming 'good enough', and a lot of
    developers - especially the indies - are relying more on clever
    artistry than pushing polygons to differentiate themselves. The only
    ones who can really afford to develop the super high-end visuals are
    the triple-A publishers, and their products are becoming less and less
    interesting as far as gameplay and value are concerned. So if I'm
    running a GPU two or three generations old? That's fine as far as I'm
    concerned.

    I'm still running a GTX 1080 that I paid $600 for. I'm not going into
    RTX until it is fully mature.

    2000 series was definitely not, I mean, they put ray-tracing into
    Minecraft and Quake 2, ffs.

    3000 series was the first serious attempt at it, but prices went nutso,
    and games showed a serious lack of detail to accommodate the
    ray-tracing overhead (I'm looking at you, Cyberpunk).

    4000 series is the first one I'll evaluate for a purchase, and probably a
    4060, because... nutso prices.

    But rn, my 1080 does just fine, and I have a ridiculous backlog of
    games that will run on it perfectly. It's near top-of-the-line pre-ray
    tracing. I'm not convinced at all that ray tracing is worth the ransom
    Nvidia is charging for it. That, and the cards are getting too long to
    even fit in my case! I'm very resistant to recasing my computer.

    --
    Zag

    No one ever said on their deathbed, 'Gee, I wish I had
    spent more time alone with my computer.' ~Dan(i) Bunten

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Ross Ridge@21:1/5 to spallshurgenson@gmail.com on Fri Sep 23 01:19:38 2022
    Spalls Hurgenson <spallshurgenson@gmail.com> wrote:
    nVidia's CEO also just recently announced, essentially, that Moore's
    Law is dead and GPUs have no option but to go up in price. Well, he
    controls a company that has a virtual stranglehold on GPUs, so I'm
    sure he's trustworthy in this regard, right?

    Even monopolists have to set prices based on demand if they want to
    maximize profits. I think Nvidia may have an unrealistic expectation
    about what the demand for their cards is likely to end up being.
    Their pricing seems to be based on the same logic they used when
    they increased prices for their last couple of generations of GPUs:
    consumer demand was growing, while demand from crypto miners was
    exploding.

    This time though, consumer spending on PCs is way down, and is only
    likely to fall even further. Demand from crypto miners has already
    collapsed and may never recover now that Ethereum has moved away from
    a proof-of-work model.
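
    A minimal sketch of that pricing logic in Python, assuming a textbook
    linear demand curve; every number below is invented for illustration.
    The point is just that a profit-maximizing monopolist's best price
    falls when demand falls:

        # Monopolist pricing: profit = (price - unit_cost) * demand(price).
        # With linear demand D(p) = a - b*p, profit is a downward parabola
        # in price, maximized at p* = (a/b + unit_cost) / 2.
        def best_price(a, b, unit_cost):
            return (a / b + unit_cost) / 2

        # Hypothetical numbers: mining boom vs. post-merge bust.
        boom = best_price(a=2000, b=1.0, unit_cost=400)  # demand high
        bust = best_price(a=1200, b=1.0, unit_cost=400)  # miners gone
        print(f"optimal price in boom: ${boom:.0f}")  # $1200
        print(f"optimal price in bust: ${bust:.0f}")  # $800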

    Moore's Law is dead, but Nvidia has relatively large profit margins for
    an electronics company, so they have room to keep prices stable, even
    lower them, despite their costs increasing. Mind you, if the demand
    Nvidia is expecting doesn't materialize, it'll be their board partners
    that will be forced to lower prices and then go begging to Nvidia to
    lower chip prices.

    --
    l/ // Ross Ridge -- The Great HTMU
    [oo][oo] rridge@csclub.uwaterloo.ca
    -()-/()/ http://www.csclub.uwaterloo.ca:11068/
    db //

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From rms@21:1/5 to All on Thu Sep 22 21:30:50 2022
    I don't know why, but my EVGA GeForce RTX 2070 Super is still doing a
    great job. I play everything at maximum resolution and quality without
    any signs of a slowdown.

    You're playing Elden Ring with this? I guess at 1080p it would manage.
    You're likely not getting HDR with that setup either.

    rms

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Werner P.@21:1/5 to All on Fri Sep 23 10:28:10 2022
    On 23.09.22 at 03:19, Ross Ridge wrote:
    Even monopolists have to set prices based on demand if they want to
    maximize profits. I think Nvidia may have an unrealistic expectation
    about what the demand for their cards is likely to end up being.
    Their pricing seems to be based on the same logic they used when
    they increased prices for their last couple of generations of GPUs:
    consumer demand was growing, while demand from crypto miners was
    exploding.
    I guess the end of the mining boom hasn't fully registered on their
    radar yet. Ethereum, which was the only driving force behind GPU
    mining, shifted to proof of stake instead of proof of work last week,
    no fork is on its way, and newer coins will not touch GPU mining. So
    this market is gone for good, but Nvidia probably realized it too late
    to change prices and strategy.

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Werner P.@21:1/5 to All on Fri Sep 23 10:26:37 2022
    On 22.09.22 at 19:50, Spalls Hurgenson wrote:
    Graphics are more and more becoming 'good enough', and a lot of
    developers - especially the indies - are relying more on clever
    artistry than pushing polygons to differentiate themselves. The only
    ones who can really afford to develop the super high-end visuals are
    the triple-A publishers, and their products are becoming less and less
    interesting as far as gameplay and value are concerned. So if I'm
    running a GPU two or three generations old? That's fine as far as I'm
    concerned.

    Same here, I am happily playing 90% of the games on my Steam Deck
    nowadays, and that thing only has a measly APU. But it is also not
    pushing resolution boundaries, which is perfectly fine for a mobile
    device.
    I can hold off on a new GPU purchase for several years, even with VR
    in mind. Foveated rendering will offload a ton of rendering load from
    the GPUs in the next gen.
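
    A back-of-the-envelope Python sketch of why foveated rendering saves
    so much; all fractions are invented for illustration. The idea: shade
    a small foveal region at full rate and the periphery at a reduced
    rate:

        # Shading cost with fixed foveated rendering (invented numbers).
        total_pixels   = 3840 * 2160   # hypothetical per-eye resolution
        fovea_fraction = 0.10          # 10% of the image at full rate
        periphery_rate = 0.25          # periphery shaded at quarter rate

        cost = total_pixels * (fovea_fraction * 1.0 +
                               (1 - fovea_fraction) * periphery_rate)
        print(f"shading cost vs. naive full-res: {cost / total_pixels:.0%}")
        # Roughly a third of the naive cost with these numbers.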

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From rms@21:1/5 to All on Fri Sep 23 11:37:35 2022
    "But perhaps the most damning indictment of what Nvidia is doing comes in
    the shape of value for money. At $1,600, the new RTX 4090 looks expensive enough to be irrelevant to the vast majority of gamers. But the fact that it looks like good value compared to the RTX 4080 12GB is completely crazy" https://www.pcgamer.com/nvidia-rtx-40-series-let-down

    https://www.reddit.com/r/pcgaming/comments/xlms6v/comment/ipksxew/

    rms

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Ross Ridge@21:1/5 to rms on Fri Sep 23 18:39:12 2022
    rms <rsquiresMOO@MOOflashMOO.net> wrote:
    "But perhaps the most damning indictment of what Nvidia is doing comes in
    the shape of value for money. At $1,600, the new RTX 4090 looks expensive >enough to be irrelevant to the vast majority of gamers. But the fact that it >looks like good value compared to the RTX 4080 12GB is completely crazy" >https://www.pcgamer.com/nvidia-rtx-40-series-let-down

    Yah, a few sites have noticed that the RTX 4090 gives better
    performance per dollar on just about every metric than both the
    RTX 4080 12GB and 16GB.
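
    To make "performance per dollar" concrete, a toy Python comparison;
    the MSRPs are Nvidia's announced prices, but the FPS figures are
    invented placeholders, not measured benchmarks:

        # Performance per dollar = average benchmark FPS / MSRP.
        # FPS values below are hypothetical, for illustration only.
        cards = {
            "RTX 4090":      {"msrp": 1599, "avg_fps": 140},
            "RTX 4080 16GB": {"msrp": 1199, "avg_fps": 95},
            "RTX 4080 12GB": {"msrp": 899,  "avg_fps": 70},
        }
        for name, spec in cards.items():
            ppd = spec["avg_fps"] / spec["msrp"] * 100
            print(f"{name}: {ppd:.2f} FPS per $100")
        # With these placeholders the 4090 leads (~8.8 vs ~7.9 and ~7.8),
        # which is the reviewers' point.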

    Another interesting point that article makes is that Nvidia probably
    isn't planning on selling many of these GPUs anyway:

    Expect Nvidia to keep production numbers low for this
    generation. It knows the whole Ada Lovelace series is onto a loser
    in terms of market conditions. That can't be helped. It's going to
    be tough for the next 18 months to two years, at least, whatever
    Nvidia does. So, it will likely keep volumes uncharacteristically
    low and try to maintain higher margins courtesy of short supply,
    the idea being to keep very high prices for the longer term
    and for future generations when the market has picked up again,
    even if sales of Ada Lovelace suffer. That makes sense given Ada
    Lovelace has the potential to struggle, whatever, thanks to all
    those external factors.

    The only problem is that Nvidia is going to need to sell a lot of them
    at some point if they're going to recoup the massive investment they
    made into designing these GPUs. Waiting a couple of years for sales
    to recover while keeping prices unchanged won't be easy. AMD is about
    to announce a new generation of chiplet-based GPUs that will likely be
    significantly cheaper to make than Nvidia's new chips, and Intel has
    their new discrete graphics cards coming soon.

    Still, Nvidia may not have a lot of options; apparently they have a lot
    of 3000 series GPUs in their warehouses they need to get rid of. They
    don't want their new 4000 GPUs undercutting them. But with crypto
    miners also having a lot of 3000 series cards to get rid of, it'll also
    be hard for Nvidia not to lower the prices on their previous generation
    GPUs.

    --
    l/ // Ross Ridge -- The Great HTMU
    [oo][oo] rridge@csclub.uwaterloo.ca
    -()-/()/ http://www.csclub.uwaterloo.ca:11068/
    db //

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Spalls Hurgenson@21:1/5 to Ross Ridge on Fri Sep 23 15:23:48 2022
    On Fri, 23 Sep 2022 18:39:12 -0000 (UTC), rridge@csclub.uwaterloo.ca
    (Ross Ridge) wrote:

    rms <rsquiresMOO@MOOflashMOO.net> wrote:
    "But perhaps the most damning indictment of what Nvidia is doing comes
    in the shape of value for money. At $1,600, the new RTX 4090 looks
    expensive enough to be irrelevant to the vast majority of gamers. But
    the fact that it looks like good value compared to the RTX 4080 12GB
    is completely crazy."
    https://www.pcgamer.com/nvidia-rtx-40-series-let-down

    Yah, a few sites have noticed that the RTX 4090 gives better
    performance per dollar on just about every metric than both the
    RTX 4080 12GB and 16GB.

    Another interesting point that article makes is that Nvidia probably
    isn't planning on selling many of these GPUs anyway:

    Expect Nvidia to keep production numbers low for this
    generation. It knows the whole Ada Lovelace series is onto a loser
    in terms of market conditions. That can't be helped. It's going to
    be tough for the next 18 months to two years, at least, whatever
    Nvidia does. So, it will likely keep volumes uncharacteristically
    low and try to maintain higher margins courtesy of short supply,
    the idea being to keep very high prices for the longer term
    and for future generations when the market has picked up again,
    even if sales of Ada Lovelace suffer. That makes sense given Ada
    Lovelace has the potential to struggle, whatever, thanks to all
    those external factors.

    The only problem is that Nvidia is going to need to sell a lot of them
    at some point if they're going to recoup the massive investment they
    made into designing these GPUs. Waiting a couple of years for sales
    to recover while keeping prices unchanged won't be easy. AMD is about
    to announce a new generation of chiplet-based GPUs that will likely be
    significantly cheaper to make than Nvidia's new chips, and Intel has
    their new discrete graphics cards coming soon.

    Still, Nvidia may not have a lot of options; apparently they have a lot
    of 3000 series GPUs in their warehouses they need to get rid of. They
    don't want their new 4000 GPUs undercutting them. But with crypto
    miners also having a lot of 3000 series cards to get rid of, it'll also
    be hard for Nvidia not to lower the prices on their previous generation
    GPUs.

    Although I'd be wary of buying a used GPU pre-owned by a cryptocrank.
    Constant use - and heat - probably haven't done the capacitors and
    microelectronics on those boards much good. I'm sure lots of people
    will buy them - whether they're aware of the device's history or not -
    but I wouldn't be surprised if many of these frugal buyers face
    sporadic and difficult-to-troubleshoot issues from these devices,
    which could tarnish nvidia's reputation for years to come.

    But you have to wonder if this pressure isn't one of the reasons EVGA
    bailed on nvidia. nvidia's margins have been relatively good (although
    dropping since a high in 2018, they're still around the 20% range),
    EVGA has been squeezed tighter (its margin is about 5%), and it could
    be the competition from used GPUs - on top of the increased costs of
    nvidia chips - was the feather that broke the camel's back.
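
    Rough arithmetic on that squeeze, using the ~20% and ~5% margins
    above; the $1000 card price and the size of the cut are invented for
    illustration:

        # Why a thin-margin board partner feels price pressure first.
        card_price  = 1000.0                # invented example card
        evga_profit = card_price * 0.05     # ~5% margin: $50 per card

        price_cut = 30.0                    # hypothetical street-price drop
        print(f"partner profit before cut: ${evga_profit:.0f}")
        print(f"partner profit after cut:  ${evga_profit - price_cut:.0f}")
        # A 3% price move erases 60% of the partner's per-card profit,
        # while the chip maker's ~20% margin is untouched unless it cuts
        # chip prices too.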

    Shame; I rather liked EVGA cards. And nvidia cards too, for that
    matter. But they're pricing themselves... well, not out of my reach,
    but out of where I feel I'm getting value for money. And in an era
    where the graphics capabilities of its competitors are often 'good
    enough' and there's increasingly less demand for such raw power, it's
    a risky move on nvidia's part.

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From PW@21:1/5 to rsquiresMOO@MOOflashMOO.net on Sat Sep 24 21:42:22 2022
    On Thu, 22 Sep 2022 21:30:50 -0600, "rms"
    <rsquiresMOO@MOOflashMOO.net> wrote:

    I don't know why, but my EVGA GeForce RTX 2070 Super is still doing a
    great job. I play everything at maximum resolution and quality without
    any signs of a slowdown.

    You're playing Elden Ring with this? I guess at 1080p it would manage.
    You're likely not getting HDR with that setup either.

    rms

    Yes, and a lot of others!

    ER: 2560x1440, Auto-Detect Best Rendering Settings=ON, Quality=High,
    All game settings on High

    144Hz refresh rate

    Dell S2719DGF (I am using two of them, but not with ER).

    I don't know what 1080p and HDR are.

    Don't know what my framerates are either.

    -pw

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Ross Ridge@21:1/5 to spallshurgenson@gmail.com on Sun Sep 25 22:11:24 2022
    Spalls Hurgenson <spallshurgenson@gmail.com> wrote:
    But you have to wonder if this pressure isn't one of the reasons EVGA
    bailed on nvidia. nvidia's margins have been relatively good (although
    dropping since a high in 2018, they're still around the 20% range),
    EVGA has been squeezed tighter (its margin is about 5%), and it could
    be the competition from used GPUs - on top of the increased costs of
    nvidia chips - was the feather that broke the camel's back.

    Apparently EVGA told Nvidia that they were ending their relationship
    back in April. Back then it was less obvious that a recession was
    coming, but the PC market was already slowing and the Ethereum merge
    was only a matter of time. So EVGA would've known the next generation
    of Nvidia GPUs weren't likely to match previous generations in terms of
    sales. I can imagine EVGA lowering their internal sales forecasts until
    it got to the point where they decided they might as well just lower
    them to zero.

    More generally, I think Nvidia's board partners could bear a lot of
    the brunt of Nvidia's pricing decisions. If market conditions end
    up being as bad as they look, they're the ones that will be forced to
    lower prices first. They have their own costs of development they need
    to recoup and while their margins are better than EVGA's, because they
    make their own boards, they're not as big as Nvidia's.

    EVGA probably would've got clobbered if they chose to continue working
    with Nvidia. Whatever their reasons, it's very much looking like they
    made the right choice.

    --
    l/ // Ross Ridge -- The Great HTMU
    [oo][oo] rridge@csclub.uwaterloo.ca
    -()-/()/ http://www.csclub.uwaterloo.ca:11068/
    db //

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Spalls Hurgenson@21:1/5 to Ross Ridge on Sun Sep 25 22:04:45 2022
    On Sun, 25 Sep 2022 22:11:24 -0000 (UTC), rridge@csclub.uwaterloo.ca
    (Ross Ridge) wrote:

    Spalls Hurgenson <spallshurgenson@gmail.com> wrote:
    But you have to wonder if this pressure isn't one of the reasons EVGA
    bailed on nvidia. nvidia's margins have been relatively good (although
    dropping since a high in 2018, they're still around the 20% range),
    EVGA has been squeezed tighter (its margin is about 5%), and it could
    be the competition from used GPUs - on top of the increased costs of
    nvidia chips - was the feather that broke the camel's back.

    Apparently EVGA told Nvidia that they were ending their relationship
    back in April. Back then it was less obvious that a recession was
    coming, but the PC market was already slowing and the Ethereum merge
    was only a matter of time. So EVGA would've known the next generation
    of Nvidia GPUs weren't likely to match previous generations in terms of
    sales. I can imagine EVGA lowering their internal sales forecasts until
    it got to the point where they decided they might as well just lower
    them to zero.

    More generally, I think Nvidia's board partners could bear a lot of
    the brunt of Nvidia's pricing decisions. If market conditions end
    up being as bad as they look, they're the ones that will be forced to
    lower prices first. They have their own costs of development they need
    to recoup and while their margins are better than EVGA's, because they
    make their own boards, they're not as big as Nvidia's.

    EVGA probably would've got clobbered if they chose to continue working
    with Nvidia. Whatever their reasons, it's very much looking like they
    made the right choice.


    I hadn't known that about the April announcement; thanks for the
    update.

    EVGA also pointed to the competition from nvidia's own "Founders
    Edition" cards, since those tended to get first choice of the best
    components, leaving aftermarket brands to struggle for what's left
    (which was especially a problem during the crypto boom and during the
    pandemic's shipping crisis). Founders Editions often came out before
    the aftermarket brands too, and so grabbed the choicest customers.
    Apparently that practice was taking a big bite out of EVGA's sales.

    This transition away from nvidia cards will likely kill EVGA, unless
    they have some ace hidden up their sleeves. Their other product lines
    just don't seem robust enough to support the company (and certainly
    don't command the robust premiums that video cards earn; as important
    as it is to buy a quality power supply, 9 out of 10 customers will buy
    the cheaper one, and not EVGA's).

    Still, I think in the long run this will hurt nvidia even more, at
    least in the discrete GPU market. I can't help but remember my
    experiment a few years back when their GeForce Now streaming service
    was introduced, and my lowly Atom-powered netbook gave reasonable (if
    not entirely playable) performance with games like "Kingdom Come:
    Deliverance". More powerful - but still very dated - PCs with
    integrated video ran far better. Nvidia's offerings are becoming
    increasingly unimportant except to the '1337' gamer, and even though
    nvidia could still sell its GPUs to streaming services, I doubt that
    would compare to the sales they'd make directly to individual
    customers or through OEMs.

    Moore's Law may be dead... but nvidia's strategy can't be to overload
    its boards with more GPU chips at an ever-increasing cost; it's
    unsustainable and will likely leave them vulnerable to an up-and-coming
    tech company that better leverages existing tech rather than relying
    on brute force.

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Idaho Homo Joe@21:1/5 to All on Sun Sep 25 21:41:21 2022
    Try using FRAPS. It has very regular software updates.
    It's the best program out there! Do it, douchebag!!!

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From JAB@21:1/5 to Justisaur on Tue Sep 27 11:16:25 2022
    On 22/09/2022 03:01, Justisaur wrote:

    Seeing as how I just bought a 3060 Ti to play Elden Ring, not going to
    happen here either. I'll probably upgrade my CPU next, which of course
    means MB/CPU & Ram. I was thinking about doing it soon as I'm on
    Win 10 and it's complaining my computer doesn't meet requirements
    for 11. On the other hand I don't see any reason to, I've got 11 on my
    work computers, and it's just more of a pain in the ass to get to settings
    I want. I did see Vampire Slayers was bottlenecking pretty hard on
    the CPU, but performance mode works to fix that.


    My PC is relatively new and even it says that it can't run Win 11.
    That's sorta true, but only because I need to change the BIOS settings
    to enable some security features.
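
    For anyone wondering which requirement is tripping the installer, a
    minimal Python sketch; it assumes the usual suspects, TPM 2.0 and
    Secure Boot, and shells out to the stock PowerShell cmdlets Get-Tpm
    and Confirm-SecureBootUEFI (needs an elevated prompt on a UEFI
    system):

        # Probe the two Win 11 "security features" usually gated behind
        # BIOS/UEFI settings: TPM and Secure Boot.
        import subprocess

        def ps(cmd):
            out = subprocess.run(
                ["powershell", "-NoProfile", "-Command", cmd],
                capture_output=True, text=True)
            return out.stdout.strip()

        print("TPM present/ready:",
              ps("(Get-Tpm).TpmPresent; (Get-Tpm).TpmReady"))
        print("Secure Boot on:", ps("Confirm-SecureBootUEFI"))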

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Dimensional Traveler@21:1/5 to JAB on Tue Sep 27 08:03:52 2022
    On 9/27/2022 3:16 AM, JAB wrote:
    On 22/09/2022 03:01, Justisaur wrote:

    Seeing as how I just bought a 3060 Ti to play Elden Ring, not going to
    happen here either.  I'll probably upgrade my CPU next, which of course
    means MB/CPU & Ram.  I was thinking about doing it soon as I'm on
    Win 10 and it's complaining my computer doesn't meet requirements
    for 11.  On the other hand I don't see any reason to, I've got 11 on my
    work computers, and it's just more of a pain in the ass to get to
    settings
    I want.  I did see Vampire Slayers was bottlenecking pretty hard on
    the CPU, but performance mode works to fix that.


    My PC is relatively new and even it says that it can't run Win 11.
    That's sorta true, but only because I need to change the BIOS settings
    to enable some security features.

    Shirley you meant "change the BIOS settings to disable some security
    features"!


    --
    I've done good in this world. Now I'm tired and just want to be a cranky
    dirty old man.

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Justisaur@21:1/5 to JAB on Tue Sep 27 07:42:11 2022
    On Tuesday, September 27, 2022 at 3:16:27 AM UTC-7, JAB wrote:
    On 22/09/2022 03:01, Justisaur wrote:

    Seeing as how I just bought a 3060 Ti to play Elden Ring, not going to
    happen here either. I'll probably upgrade my CPU next, which of course
    means MB/CPU & RAM. I was thinking about doing it soon, as I'm on
    Win 10 and it's complaining my computer doesn't meet the requirements
    for 11. On the other hand, I don't see any reason to; I've got 11 on my
    work computers, and it's just more of a pain in the ass to get to the
    settings I want. I did see Vampire Slayers was bottlenecking pretty
    hard on the CPU, but performance mode works to fix that.

    My PC is relatively new and even it says that it can't run Win 11.
    That's sorta true, but only because I need to change the BIOS settings
    to enable some security features.

    Ah, that could be it. I don't really feel like bothering with trying to
    get it updated at this point, though it's supposed to have better
    memory management, which should make a small improvement to overall
    speed.

    - Justisaur

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From JAB@21:1/5 to Justisaur on Wed Sep 28 08:33:46 2022
    On 27/09/2022 15:42, Justisaur wrote:
    On Tuesday, September 27, 2022 at 3:16:27 AM UTC-7, JAB wrote:
    On 22/09/2022 03:01, Justisaur wrote:

    Seeing as how I just bought a 3060 Ti to play Elden Ring, not going to
    happen here either. I'll probably upgrade my CPU next, which of course
    means MB/CPU & RAM. I was thinking about doing it soon, as I'm on
    Win 10 and it's complaining my computer doesn't meet the requirements
    for 11. On the other hand, I don't see any reason to; I've got 11 on my
    work computers, and it's just more of a pain in the ass to get to the
    settings I want. I did see Vampire Slayers was bottlenecking pretty
    hard on the CPU, but performance mode works to fix that.

    My PC is relatively new and even it says that it can't run Win 11.
    That's sorta true, but only because I need to change the BIOS settings
    to enable some security features.

    Ah, that could be it. I don't really feel like bothering with trying to
    get it updated at this point, though it's supposed to have better
    memory management, which should make a small improvement to overall
    speed.


    I'm still on Win 10, as I also just don't see the point of going to
    Win 11 at the moment. My last upgrade was from Win 7, and that was
    because support was being withdrawn. I did have a bit of a panic when
    it seemed to say the upgrade was no longer free, but a quick bit of
    google-fu found that it's fairly simple to get the free upgrade
    legitimately.

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Werner P.@21:1/5 to All on Thu Sep 29 11:07:32 2022
    On 23.09.22 at 00:35, Zaghadka wrote:
    I'm still running a GTX 1080 that I paid $600 for. I'm not going into
    RTX until it is fully mature.

    2000 series was definitely not, I mean, they put ray-tracing into
    Minecraft and Quake 2, ffs.
    I have a 2080 FE, which will very likely serve me another bunch of
    years; the card is fine. The price was OK, due to the fact that I got
    it at FE MSRP back then, when Nvidia actually cared about its gaming
    customers.

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Werner P.@21:1/5 to All on Thu Sep 29 11:09:34 2022
    On 23.09.22 at 21:23, Spalls Hurgenson wrote:
    Shame; I rather liked EVGA cards. And nvidia cards too, for that
    matter. But they're pricing themselves... well, not out of my reach,
    but out of where I feel I'm getting value for money.
    Amen to that. My 2080 will run for another bunch of years, but then I
    will have a serious look at AMD, just the same way I switched over from
    Intel when the Ryzens came out.
    The few features I'd lose are less painful than paying the Nvidia tax.
    AMD now has a real chance to regain market share from Nvidia if they
    play their cards right.

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From JAB@21:1/5 to Werner P. on Fri Sep 30 10:11:05 2022
    On 29/09/2022 10:09, Werner P. wrote:
    On 23.09.22 at 21:23, Spalls Hurgenson wrote:
    Shame; I rather liked EVGA cards. And nvidia cards too, for that
    matter. But they're pricing themselves... well, not out of my reach,
    but out of where I feel I'm getting value for money.
    Amen to that. My 2080 will run for another bunch of years, but then I
    will have a serious look at AMD, just the same way I switched over from
    Intel when the Ryzens came out.
    The few features I'd lose are less painful than paying the Nvidia tax.
    AMD now has a real chance to regain market share from Nvidia if they
    play their cards right.


    I've stuck with nVidia not because I like what they offer as such, but
    because when I do an upgrade, my choices of what to get are very much
    driven by what I hope are relatively independent recommendations of
    what will meet my needs. It's also the reason that for my last upgrade
    I switched from Intel to a Ryzen, as that seemed to be the best value
    for money in the budget sector.

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Anssi Saari@21:1/5 to rms on Fri Sep 30 15:52:54 2022
    "rms" <rsquiresMOO@MOOflashMOO.net> writes:

    Also, AMD will be launching their RDNA3 lineup in November.

    Even Intel is apparently intending to get their video cards out, in
    October. Good to have competition, but it'll definitely be interesting
    to see how, or if, Intel performs.

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)