
News PC gamers appear to be rejecting 8GB graphics cards — now available for a discount

Is it weird that I bought a 12GB 5070 in late Nov 2025 with the thought in my head that it's good enough right now for 1440p gaming, and who cares about two years from now, since my GPU only having 12GB of VRAM is likely to be the least of my problems going into 2028 as the nation falls apart? Got a decent price on it ($440) through a combination of BF pricing plus a friend being able to use an employee discount. Sounds like the Supers are either going to be dead or crazy overpriced, so screw it, pulled the trigger now before RAM-apocalypse prices get baked in.
 
You can just store the games on USB or microSD storage and swap them in and out of the NVMe. Most people already have those, and it's probably going to be a much cheaper solution than the markup on the Gabecube.

Putting on my flame suit for this one: the 5050 is not a bad value at $250. As I've argued repeatedly, you have to adjust for inflation, which has been really bad the last few years. Most of the complaints about pricing are "old man complaining about a loaf of bread costing a dime when he was young" energy.
IDK, that's not like normal inflation numbers; that's more like super price-gouged inflation, e.g., McDonald's prices rising 100% from 2014-2024 while the general inflation rate over that period was 31%.
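For the record, here's the back-of-the-envelope arithmetic behind the inflation argument, as a minimal Python sketch. The 31% cumulative figure is the one quoted above; the $199 base price is a hypothetical x50-class launch price, not a number from this thread:

```python
# Inflation adjustment sketch. The 31% cumulative CPI figure comes from the
# post above; the base price is hypothetical, purely for illustration.
CUMULATIVE_CPI_2014_2024 = 0.31

def adjust_for_inflation(price_then: float, cumulative_rate: float) -> float:
    """Express an old price in end-of-period dollars."""
    return price_then * (1 + cumulative_rate)

price_2014 = 199.0  # hypothetical x50-class launch price, 2014 dollars
price_2024 = adjust_for_inflation(price_2014, CUMULATIVE_CPI_2014_2024)
print(f"${price_2014:.0f} in 2014 ~= ${price_2024:.0f} in 2024 dollars")
# -> $199 in 2014 ~= $261 in 2024 dollars, which is the sense in which a
#    $250 card today can be argued to be roughly flat in real terms.
```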
 
12GB puts you comfortably into the "equivalent to a PS5" zone; you'll be okay for a few years, until the collapse of America.
 
So maybe I'm just slow, but if I can play most games on Ultra @ 1440p, and I have zero plans to upgrade to a 4K monitor, why the hell would I ever need to upgrade? I know Redditors will tell me I should get a 240Hz monitor to game on, but I really don't need that shit lol. So wouldn't this card last me as long as I have it? I think my monitor does 120Hz. I don't know for sure, but games are running beautifully, so I don't care about numbers. 1440p gaming isn't going to magically change in 4 years to where we'll need some uber card with 24GB of VRAM.
 
I guess it kind of depends on what next-gen consoles are like. I'm not expecting too big a generational leap, now that Sony is seeing how poorly the PS5 Pro is selling at $700+ despite being a pretty significant GPU leap over the base PS5.
 

The Pro selling well doesn't help Sony that much unless it has far higher margins. The only extra sales they get are from people who would not buy the base model but will buy a Pro. Otherwise they're selling a different model to someone who would have bought a base model, possibly for no difference in profit. And if they sell to an existing PS5 owner, that's a used console hitting the market that replaces the sale of a new one for someone else, unless it gets gifted to someone who would never buy one themselves.
 
Disagree; there's lots of profit in the Pro, either from forcing you to pay $100 for a lousy Blu-ray drive or from forcing you to pay monopoly prices on all your games by leaving you stuck buying from PSN instead of taking advantage of competition on physical sales between Amazon, Best Buy, Walmart, Target, GameStop, etc. Plus the $30 for a vertical stand is pure profit, just like the drive. Also, used consoles hitting the market is great for Sony, since that means more game sales from people who might not have been able to afford a new one.
 
That's a great point. The hardware has traditionally been a financial loss, at least initially, since it's merely the delivery system for the software and services. Getting the BoM to break-even or profitability was always ancillary.
 
This 5070 is pretty cool. I was playing Spider-Man: Miles Morales with most of the settings on very high at 4K with DLSS Quality (so a 1440p render), maxing out my frame limiter in a very busy and graphically intense scene where you're walking through a street market with heavy snow falling, thick with NPCs, lights, and such. I have a frame limiter set in the Nvidia app to 57fps, since I use a 60Hz 4K screen and want to stay in FreeSync range without any tearing; anything higher risks the framerate slipping slightly above 60 and causing tearing. So I thought, cool, this looks 4K, but what if I turn off the upscale and play this scene at native 4K with DLAA instead of DLSS? It dropped to like 45fps and I thought, OK, guess I was asking too much, but within a couple of seconds it was back to maxing out my frame limiter and stayed there the entire rest of the time. The only setting I didn't turn up was RT shadows, which I disabled because they look horrible in this game (they don't animate while the regular shadows do, so they look terrible with, say, a tree that should be swaying in the wind).
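The 57fps cap described above is an instance of a common VRR rule of thumb: cap a few fps below the display's maximum refresh so the framerate never leaves the adaptive-sync window and reintroduces tearing. A minimal sketch, assuming the popular ~3fps margin (a community convention, not a figure from this thread):

```python
def vrr_frame_cap(max_refresh_hz: float, margin_fps: float = 3.0) -> float:
    """Frame cap that keeps the framerate inside the adaptive-sync window.

    Capping a few fps under the maximum refresh prevents the renderer from
    overshooting the VRR range, which would bring back tearing (or V-sync
    queuing latency). The 3 fps margin is a common rule of thumb.
    """
    return max_refresh_hz - margin_fps

print(vrr_frame_cap(60))   # 57.0 -- the cap used in the post above
print(vrr_frame_cap(144))  # 141.0
```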
 

I know my monitor's only 2K, but I was playing Cyberpunk with, I think, everything maxed except ray tracing; I ran the auto-detect and it selected low for RT. And I'm getting 120+ FPS. Considering the price difference between the 70 and the 80, this card could almost be called a good deal, for Nvidia pricing. Just for shits and giggles, when I added this to my cart I checked the prices for 5090s and almost soiled myself. I'm sure Cyberpunk would look much better with RT on ultra, but I'm fine with it on low lol. I should put it on ultra just to see how it affects my FPS.
 
From this bench it seems like path tracing is doable if you upscale from 720p (e.g. DLSS Balanced at 1440p) without having to use any crap frame gen. Alternatively, so is native 4K without upscaling, if you turn off RT.
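For reference, here's what those DLSS labels mean in internal render resolution, using the commonly published per-axis scale factors (approximate, and individual titles can override them). Note that at 1440p output it's Performance mode that renders from 720p, while Balanced lands around 835p:

```python
# Commonly cited DLSS render-scale factors (approximate, per axis).
DLSS_SCALE = {
    "Quality": 0.667,
    "Balanced": 0.58,
    "Performance": 0.50,
    "Ultra Performance": 0.333,
}

def internal_resolution(out_w: int, out_h: int, mode: str) -> tuple[int, int]:
    """Approximate internal render resolution for a given DLSS mode."""
    s = DLSS_SCALE[mode]
    return round(out_w * s), round(out_h * s)

print(internal_resolution(2560, 1440, "Performance"))  # (1280, 720)
print(internal_resolution(2560, 1440, "Balanced"))     # (1485, 835)
print(internal_resolution(3840, 2160, "Quality"))      # (2561, 1441), ~1440p
```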

 
I've been playing Indiana Jones and it often exceeds 20GB of VRAM process usage at 4K native with 2x frame gen and RT. That's more than even a lot of VR games. It's supposed to use huge textures, but honestly they don't look that different from other games.

Depending on the game, I use either native 4K with 2-3x frame gen or Quality DLSS without frame gen.
 
Tim just revisited budget cards in anticipation of pricing going bonkers soon. As he points out, with 8GB you are hard-limited. The "correct settings" argument was always trolling and gaslighting; it's forced settings. E.g., Star Wars Outlaws forces down texture draw distance and quality, and there's nothing you can do about it.

 
HUB's latest look at RAM demonstrates how a lack of VRAM is exacerbated by a lack of system RAM. Not new info, but with RAMageddon here, it sucks even more for low-budget builders.

It's why I never liked the compute-power argument as the excuse for why low VRAM is fine. I'd rather have lower fps with smooth frame pacing, all day every day, while keeping the textures turned up.
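To put a rough number on what "textures turned up" costs in VRAM, here's a sketch assuming standard BC7 block compression (1 byte per texel) and the usual ~1/3 overhead for a full mip chain. The texture count is a made-up illustration, not a figure from HUB's video:

```python
def texture_vram_mb(width: int, height: int, bytes_per_texel: float = 1.0,
                    mip_overhead: float = 1.0 / 3.0) -> float:
    """Approximate VRAM for one texture with a full mip chain.

    BC7-compressed color data is 1 byte/texel (16 bytes per 4x4 block),
    and a complete mip chain adds roughly a third over the base level.
    """
    base_bytes = width * height * bytes_per_texel
    return base_bytes * (1 + mip_overhead) / (1024 ** 2)

# Hypothetical scene with 500 resident textures (illustrative count only).
for res in (2048, 4096):
    total_gb = 500 * texture_vram_mb(res, res) / 1024
    print(f"500 textures at {res}x{res}: ~{total_gb:.1f} GB")
# -> ~2.6 GB at 2048x2048 vs ~10.4 GB at 4096x4096. One quality step
#    quadruples the footprint, which is why 8GB cards get forced down.
```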

 
Confirmations are rolling in, as Tim discusses in the latest video. Buy or cry. Probably the last call for cards like the $400 9060 XT 16GB as well.

8GB cards + 16GB DDR4 + Zen 3 + 512GB SSD for mainstream gaming builds.

 
I'm torn. In principle I don't need the 9070 XT, and I have other things to buy and play with, but my hunch tells me that two months from now I could sell the 6800 XT for a stupid price, making the 9070 XT a very inexpensive purchase.

So freaking annoying. I had decided to ignore this crap and maybe get a 9060 XT if the old card fails... but there's no way to ignore this stupid AI craze.
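The arithmetic behind "a very inexpensive purchase" is just net upgrade cost; a trivial sketch with hypothetical prices (neither number comes from the thread):

```python
def net_upgrade_cost(new_price: float, expected_resale: float) -> float:
    """Out-of-pocket cost of an upgrade funded by selling the old card."""
    return new_price - expected_resale

# Hypothetical: a $650 9070 XT minus a $400 shortage-inflated used 6800 XT.
print(net_upgrade_cost(650, 400))  # 250
```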
 
Not a craze, an obsession. Leads to different resolutions.
 
I hate that DIY PC hardware requires speculation in the 2020s. But here we are.

I'd like to take this opportunity to apologize to all of you. This is all my fault. I always quip that I don't buy hardware, I rent it. They said, "Great idea! Let's cut out the middle man!"
 
Not a craze, an obsession. Leads to different resolutions.
Your observation that they are chasing immortality hit hard. I love sci-fi, but I'd prefer they left the dystopian stuff on the written page, thank you very much.
 
PC stuff containing silicon might be the focus here, but this will have huge additional impacts. Calls for reduced regulatory "burden" on approving SMR projects, all experimental by the way, with all the associated dangers, are a case in point.

Your sci-fi reference is quite accurate. The discussion is whether we want a Star Trek future or a Star Wars/Expanse one, and how to prevent a Terminator one. They mostly agree that the Singularity has begun.

Crazy stuff indeed and a wild ride surely awaits us all.


Edit:
This would be a truly epic topic for P&N, but I just can't see it happening there.
 