PC Building Guide and Discussion #14

Are you sure about the "smaller GPU cooler" pro for the 4070 Ti?

7900XTvs4070Ti.jpg
:biglaugh:

Granted, AIB versions of the 7900 XT may be bigger than AMD's model, on the left, but the 4070 Ti (at least Zotac's, on the right) is still massive.

Between those two, I'd probably lean towards the 7900 XT because of the VRAM (20GB is a lot more futureproof than 12GB) and the slightly better raster performance. Even the ray tracing performance isn't that bad: only 7% slower than the 4070 Ti, on average.

In fact, in the new Unreal Engine 5 version of Fortnite, with ray tracing ON, the 7900 XT comes out slightly ahead:

7900XTvs4070Ti2.jpg

That's really good news for AMD because a lot of games in the next 5 years will be using UE5.
 
Question for you guys. I'm building a computer for my girlfriend, her first desktop, as she's only ever had gaming laptops. For my price range, the 4070 Ti or 7900 XT makes the most sense, especially since you can get either basically at MSRP, which you really can't with the 7900 XTX or 4080.

So, my question is, which one would you go for, 4070 Ti or 7900 XT?

Pros for the 4070 Ti:
Lower power usage.
Smaller GPU cooler.
Better ray tracing performance (not by a ton, but better).

Pros for the 7900 XT:
Better raster performance.
More VRAM, which works better at higher resolutions.
Partner doesn't really use ray tracing (she currently has a laptop 1060), but would benefit from the raster performance.

Thoughts?

I would usually have the answer, but I'm so torn, considering prices can be a dick. I don't want a 3000-series or 6000-series card for this build because I want her to game comfortably at 1440p for the next 5 years.
The 3080 is just a little under the 4070 Ti in most 1440p benchmarks, and the 3090 Ti outperforms the 4070 Ti, so I wouldn't write off the 3000 series if you're worried about prices; they'll handle 1440p for the next 5 years.
 
The 4070 Ti is a dumb card. You may want to buy it for its superior ray tracing performance, but ray tracing uses more VRAM, and that card only has 12GB. The 7900 XT is bad value too, but at least they didn't cripple that card. I wouldn't buy either of them.
 
The 3080 is just a little under the 4070 Ti in most 1440p benchmarks, and the 3090 Ti outperforms the 4070 Ti, so I wouldn't write off the 3000 series if you're worried about prices; they'll handle 1440p for the next 5 years.

I currently have a Gigabyte Aorus Master RTX 3080 and I will say, the thing is massive. And yeah, it's been great at 1440p. But I think I'm trying to do a bit more future proofing for her as she will not upgrade things as soon as I may like (I may get a new card every 3 years or something, she would want 5 years).

3090 Ti's price is way out of reach as it's the price of the whole f***ing machine I have planned :laugh:. As I said before, she doesn't overly care about RTX, but she would like something really good for the money as she wants it to last. Would the 7900 XT be a better choice than the 4070 Ti or do you think I should still gun for a 3080?

Are you sure about the "smaller GPU cooler" pro for the 4070 Ti?

View attachment 643751
:biglaugh:

Granted, AIB versions of the 7900 XT may be bigger than AMD's model, on the left, but the 4070 Ti (at least Zotac's, on the right) is still massive.

Between those two, I'd probably lean towards the 7900 XT because of the VRAM (20GB is a lot more futureproof than 12GB) and the slightly better raster performance. Even the ray tracing performance isn't that bad: only 7% slower than the 4070 Ti, on average.

In fact, in the new Unreal Engine 5 version of Fortnite, with ray tracing ON, the 7900 XT comes out slightly ahead:

View attachment 643758

That's really good news for AMD because a lot of games in the next 5 years will be using UE5.

I thought RT performance depended on the game, and not just the engine? Or do you expect this kind of RT performance across all upcoming UE5 games?

Also, aren't the 4070 Ti Founders and even the ASUS TUF model still smaller (just not next to the AMD reference card)? That said, I don't want the AMD reference model because of the vapor chamber cooling issue that can overheat part of the card to well over 100C, so it would have to be an AIB partner card.

Also, it has to fit into a Meshlicious case, so the size can be decent but not insane.

Basically boils down to, which do you think is the better value, the 7900 XT or the 4070 Ti?

The 4070 Ti is a dumb card. You may want to buy it for its superior ray tracing performance, but ray tracing uses more VRAM, and that card only has 12GB. The 7900 XT is bad value too, but at least they didn't cripple that card. I wouldn't buy either of them.

The issue boils down to the XTX not really being in stock at MSRP, and I don't want the stock cooler since it has issues with parts of the card overheating. Got a recommended XTX that would be $1,100 at most (hopefully heavily OC'd at $1,100)?

EDIT: Looks like the PowerColor Hellhound 7900 XTX would fit easily in that case, is $1,000 MSRP, and has an AIB cooler rather than the reference one.
 
I thought RT performance depended on the game, and not just the engine? Or do you expect this kind of RT performance across all upcoming UE5 games?
Well, I expect AMD to perform relatively well in all UE5 games, maybe not always equal or faster, but not 33% slower like in Cyberpunk 2077. 18 months ago, the UE5 demo ran faster on AMD than Nvidia and, now, the first UE5 game does, as well, so UE5 just seems well optimized for AMD. Perhaps future games will take more advantage of Nvidia, but I imagine that it'll still be close, especially with less VRAM on the 4070 Ti.
Also, aren't the 4070 Ti Founders and even the ASUS TUF model still smaller (just not next to the AMD reference card)? That said, I don't want the AMD reference model because of the vapor chamber cooling issue that can overheat part of the card to well over 100C, so it would have to be an AIB partner card.
It's only the 7900 XTX that has the faulty vapor chamber and only a "small batch" (according to AMD).
Basically boils down to, which do you think is the better value, the 7900 XT or the 4070 Ti?
I don't know. They seem about equal value now, but the 7900 XT may hold its value better. I was just watching today's Hardware Unboxed video, in which one of the guys noted that they recommended the 3080 over the 6800 XT 2 years ago, but that the 6800 XT has aged better, largely because of its extra VRAM.
 
Well, I expect AMD to perform relatively well in all UE5 games, maybe not always equal or faster, but not 33% slower like in Cyberpunk 2077. 18 months ago, the UE5 demo ran faster on AMD than Nvidia and, now, the first UE5 game does, as well, so UE5 just seems well optimized for AMD. Perhaps future games will take more advantage of Nvidia, but I imagine that it'll still be close, especially with less VRAM on the 4070 Ti.

It's only the 7900 XTX that has the faulty vapor chamber and only a "small batch" (according to AMD).

I don't know. They seem about equal value now, but the 7900 XT may hold its value better. I was just watching today's Hardware Unboxed video, in which one of the guys noted that they recommended the 3080 over the 6800 XT 2 years ago, but that the 6800 XT has aged better, largely because of its extra VRAM.


It did seem a bit iffy at the time for those cards to have so much more VRAM than the 3080, but it does seem to have helped them age better as resolutions have climbed. Also, the 6800 XT has gotten significantly cheaper than the 3080 and still delivers that performance.

I thought it was the 7900 XTX reference cards in general that had the vapor chamber issue, not just a small batch? If I were to get one of the reference coolers now, should I be fine?

EDIT:

relative-performance-rt_2560-1440.png


From TechPowerUp, the XTX is 17% slower in relative RT performance at 1440p. But at its $1,000 MSRP, compared to the $1,200 4080, the price difference is also 17%. So saying it has worse performance than the 4080 is correct, but it scales almost exactly with price, which doesn't make the XTX weak at its price.
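The price-scaling argument can be sanity-checked in a few lines. A minimal sketch in Python, purely illustrative: the 0.83 relative RT figure is read off the TechPowerUp chart above, and prices are launch MSRPs.

```python
# Perf-per-dollar check for the chart above. The RTX 4080 is the 100%
# baseline in TechPowerUp's relative RT chart; the XTX sits ~17% below it.
cards = {
    "RTX 4080":    {"rt_perf": 1.00, "price": 1200},
    "RX 7900 XTX": {"rt_perf": 0.83, "price": 1000},
}

for name, c in cards.items():
    # Relative RT performance bought per $1,000 spent.
    ppd = c["rt_perf"] / c["price"] * 1000
    print(f"{name}: {ppd:.3f} relative RT perf per $1000")
```

Both cards land at roughly 0.83 per $1,000, which is the point: the XTX's RT deficit is almost exactly offset by its lower price.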
 
I thought it was the 7900 XTX reference cards in general that had the vapor chamber issue, not just a small batch? If I were to get one of the reference coolers now, should I be fine?
If you get an XT, you won't have to worry about it. If you get an XTX, there's a small chance that you'll get an affected one, but you can find some videos on how to tell and AMD will replace it for you in that case. I, personally, wouldn't have any hesitation.
 
If you get an XT, you won't have to worry about it. If you get an XTX, there's a small chance that you'll get an affected one, but you can find some videos on how to tell and AMD will replace it for you in that case. I, personally, wouldn't have any hesitation.

Good to f***ing know. Thank you man! Yeah, been going back and forth for all of this shit.
 
SolidSnakeUS said:
From TechPowerUp, the XTX is 17% slower in relative RT performance at 1440p. But at its $1,000 MSRP, compared to the $1,200 4080, the price difference is also 17%. So saying it has worse performance than the 4080 is correct, but it scales almost exactly with price, which doesn't make the XTX weak at its price.
That's the 7900 XTX vs the 4080. Here's the 7900 XT and XTX vs the Gigabyte 4070 Ti:

relative-performance-rt_2560-1440.png
 
That's the 7900 XTX vs the 4080. Here's the 7900 XT and XTX vs the Gigabyte 4070 Ti:

relative-performance-rt_2560-1440.png


The 4070 Ti ties the XTX and is only 8% faster than the XT ("only," but 8% is still quite a bit less than 17%).

Yeah, I know it's for those two, but the XT still seems to be the better value anyway, even with RT. The XTX really is f***ing enticing, though, and I may gun for a reference XTX. She has a price limit (about $2k before monitor, OS and keyboard), but I may help pay for it. Either way, the XTX seems like the more future-proof card. And with more studios moving to UE5, AMD won't be as weak at RT as it seems. FSR is getting better with each version, too.
 
Yeah, I know it's for those two, but the XT still seems to be the better value anyway, even with RT. The XTX really is f***ing enticing, though, and I may gun for a reference XTX. She has a price limit (about $2k before monitor, OS and keyboard), but I may help pay for it. Either way, the XTX seems like the more future-proof card. And with more studios moving to UE5, AMD won't be as weak at RT as it seems. FSR is getting better with each version, too.
I, personally, would splurge for the XTX over the XT... assuming that I could get it for only $100 more. It has 12% better raster and 14% better RT performance at 1440p (even more at 4K) for only an 11% increase in price and the extra bit of VRAM is a bonus.
 
I, personally, would splurge for the XTX if I were considering the XT... assuming that I could get it for only $100 more. It's faster, has 4GB more VRAM (not that 24 vs 20 is liable to ever make much difference) and you're getting 12% better raster and 14% better RT performance at 1440p (even more at 4K) for only an 11% increase in price.

Yeah, I'll try to grab the XTX, hopefully at MSRP. And seeing as it was only a small batch of XTXs that went bad, I'll still try to get the reference card.

EDIT: Funny thing, I bought 2 Meshlicious cases off of Micro Center online to have them shipped, chose No Rush shipping (7+ days), ordered on Sunday and they will be here Wednesday :laugh:.
 
Bought an Asus ROG B650E-I ITX board to pair with that Meshlicious. Should be real good to use :).

EDIT: This is what my end build would look like:

CPU: 7800X3D/7900X3D (whichever one is better for my needs)
GPU: Gigabyte Aorus Master RTX 3080
CPU Cooler: EK AIO 280 Elite
Case: SSUPD Meshlicious mITX
Motherboard: Asus ROG B650E-I ITX
RAM: G.Skill Ripjaws S5 64 GB (2 x 32 GB) DDR5-6000 CL32
1st NVME: Sabrent Rocket 4 Plus 2TB
2nd NVME: Silicon Power XS70 4TB
PSU: Lian Li SP850 80+ Gold (supports ATX 3.0)

So yeah, should be a powerhouse in this kind of form factor.
 
That'll be a really nice build. Are you sure about the 3080, though? A quick check of that model shows it costing just over $1000. That's really bad value. You'd be much better off with the 4070 Ti or 7900 XT/XTX, like you were originally considering.
 
That'll be a really nice build. Are you sure about the 3080, though? A quick check of that model shows it costing just over $1000. That's really bad value. You'd be much better off with the 4070 Ti or 7900 XT/XTX, like you were originally considering.

I currently own a 3080, so this is a case that can actually hold this f***ing beast. Got it back in late Nov/early Dec 2020.

I also currently own the CPU cooler and the 1st NVMe.

EDIT: I was able to grab the reference board 7900 XTX for my partner at MSRP. Hell yeah!
 
I think this was true for a while. These newer games we're talking about cite 720p/30 at Low as the minimum....

While I appreciate them giving specifics....that's not exactly a fantastic experience :laugh:
It just occurred to me that they may be starting to list 720p as the minimum because the Steam Deck has a 1280x800 display. 720p/30 on a 7" screen probably isn't so bad. They may not expect anyone to actually play at 720p on a desktop/laptop.

Another possibility is that it's an alternative to saying 1080p with DLSS/FSR on, since I think either one set to Quality mode renders the game at 720p (two-thirds of 1080p per axis) before upscaling.
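For what it's worth, this is easy to check with the per-axis scale factors commonly published for DLSS 2 and FSR 2 (the factors below are those public figures, not something from this thread):

```python
# Internal render resolution for the commonly quoted DLSS 2 / FSR 2
# per-axis scale factors. Quality at a 1080p output lands exactly on 720p.
SCALE = {
    "Quality": 2 / 3,      # ~0.667
    "Balanced": 0.58,      # FSR 2 quotes ~0.59 here
    "Performance": 0.50,
}

def render_res(out_w, out_h, mode):
    """Round the output resolution down/up to the internal render size."""
    s = SCALE[mode]
    return round(out_w * s), round(out_h * s)

for mode in SCALE:
    w, h = render_res(1920, 1080, mode)
    print(f"1080p {mode}: renders at {w}x{h}")
```

So it's Quality mode that renders at exactly 1280x720 for a 1080p output; Balanced comes out around 1114x626.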
 
Bought everything for myself, and my girlfriend bought everything for herself. I got the CPU now, as I hated how long it was going to take for the 3D CPUs to come out, and they're a bit too pricey anyway. So I went with the 7700X, which is the better gaming chip.

Mine:

CPU: AMD Ryzen 7700X
GPU: Gigabyte Aorus Master RTX 3080 (already own)
CPU Cooler: EK AIO 280 Elite
Case: SSUPD Meshlicious mITX
Motherboard: Asus ROG B650E-I ITX
RAM: G.Skill Flare S5 64 GB (2 x 32 GB) DDR5-6000 CL32
1st NVME: Sabrent Rocket 4 Plus 2TB (already own)
2nd NVME: Silicon Power XS70 4TB
PSU: Lian Li SP850 80+ Gold (supports ATX 3.0)

Hers:

CPU: AMD Ryzen 7700
GPU: Asus 7900 XTX (Reference)
CPU Cooler: be quiet! Pure Loop 280
Case: SSUPD Meshlicious mITX
Motherboard: Gigabyte B650I AORUS ULTRA Mini ITX
RAM: G.Skill Flare S5 64 GB (2 x 32 GB) DDR5-6000 CL32
1st NVME: Patriot Viper VP4300 2 TB
PSU: Lian Li SP850 80+ Gold (supports ATX 3.0)
 
AMD has revealed the prices and release dates of their newest X3D CPUs:

7950X3D (16c/32t): $699 Feb 28th
7900X3D (12c/24t): $599 Feb 28th
7800X3D (8c/16t): $449 April 6th



The 7800X3D is only $50 more than the 7700X's MSRP, the 7900X3D is only $50 more than the 7900X, and the 7950X3D is the same price as the 7950X. Of course, the non-3D chips are selling for well under MSRP (e.g., the 7700X is only $336 right now on Amazon), so the actual difference will likely be over $100, but that's no worse than what the 5800X3D cost over the 5800X, and most people seem to agree that was worth it.
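The price gaps above work out as follows; a quick sketch using the X3D launch prices and the non-3D launch MSRPs ($699/$549/$399), plus the $336 street price from the post:

```python
# X3D launch prices vs the non-3D parts' launch MSRPs, plus the street-price
# gap using the $336 Amazon price for the 7700X mentioned above.
x3d   = {"7950X3D": 699, "7900X3D": 599, "7800X3D": 449}
non3d = {"7950X": 699, "7900X": 549, "7700X": 399}

print("7950X3D over 7950X MSRP:", x3d["7950X3D"] - non3d["7950X"])   # $0
print("7900X3D over 7900X MSRP:", x3d["7900X3D"] - non3d["7900X"])   # $50
print("7800X3D over 7700X MSRP:", x3d["7800X3D"] - non3d["7700X"])   # $50
print("7800X3D over 7700X street:", x3d["7800X3D"] - 336)            # $113
```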
 
I was able to grab my 7700X for under $300 the other day.
That's a fantastic deal. Also, you would've had to wait a full two months for the 7800X3D. It's a little disappointing that it's not releasing in February with the other two, but not surprising.
 
That's a fantastic deal. Also, you would've had to wait a full two months for the 7800X3D. It's a little disappointing that it's not releasing in February with the other two, but not surprising.

Hell, it's even right at the end of February, not even mid-month.
 
It just occurred to me that they may be starting to list 720p as the minimum because the Steam Deck has a 1280x800 display. 720p/30 on a 7" screen probably isn't so bad. They may not expect anyone to actually play at 720p on a desktop/laptop.

Another possibility is that it's an alternative to saying 1080p with DLSS/FSR on, since I think either one set to Quality mode renders the game at 720p (two-thirds of 1080p per axis) before upscaling.

That would actually seem to raise more questions than it answers if the 720p/30 Low target (the theoretical Steam Deck target) requires hardware that should easily outpace the Deck's APU. :laugh:

Particularly when it is Deck Verified
 
That would actually seem to raise more questions than it answers if the 720p/30 Low target (the theoretical Steam Deck target) requires hardware that should easily outpace the Deck's APU. :laugh:

Particularly when it is Deck Verified
Similar to consoles, the Steam Deck uses customized hardware to outperform what it is on paper. Games are nowhere near as optimized for it as major releases are for Sony, Nintendo, or Microsoft consoles, but while its GPU on paper is probably comparable to a GTX 1050 at best, it offers full RDNA 2 support and FSR. That's the benefit of highly customized hardware designed to do essentially one thing (play video games).

Games will never be as optimized as Nintendo Switch games, though. Compare the Switch's specs to PCs built in the past 8 years, to 9th-gen consoles, and even to its 8th-gen competitors: it's astonishing that the Switch can natively run games like The Witcher 3, Doom Eternal, and Sonic Frontiers, even as stripped-down but still playable ports. That's one of the main reasons it's almost impossible to compare what highly customized console hardware can get out of a game to its PC counterparts. High-end GPU/CPU combos will brute-force better performance, but there are tons of tricks to get a game running in a playable state if you're targeting one platform's exact specifications. My guess, since I believe the game is AMD-sponsored and Sony uses AMD as well, is that there's something about the PS5's RDNA-derived architecture (often described as "RDNA 1.5"), the Steam Deck's RDNA 2, and FSR that Square Enix leaned into to make the game perform above its specs.
 
Similar to consoles, the Steam Deck uses customized hardware to outperform what it is on paper. Games are nowhere near as optimized for it as major releases are for Sony, Nintendo, or Microsoft consoles, but while its GPU on paper is probably comparable to a GTX 1050 at best, it offers full RDNA 2 support and FSR. That's the benefit of highly customized hardware designed to do essentially one thing (play video games).

Games will never be as optimized as Nintendo Switch games, though. Compare the Switch's specs to PCs built in the past 8 years, to 9th-gen consoles, and even to its 8th-gen competitors: it's astonishing that the Switch can natively run games like The Witcher 3, Doom Eternal, and Sonic Frontiers, even as stripped-down but still playable ports. That's one of the main reasons it's almost impossible to compare what highly customized console hardware can get out of a game to its PC counterparts. High-end GPU/CPU combos will brute-force better performance, but there are tons of tricks to get a game running in a playable state if you're targeting one platform's exact specifications. My guess, since I believe the game is AMD-sponsored and Sony uses AMD as well, is that there's something about the PS5's RDNA-derived architecture (often described as "RDNA 1.5"), the Steam Deck's RDNA 2, and FSR that Square Enix leaned into to make the game perform above its specs.
From Steam's most recent hardware survey, about 22% of Linux users on Steam are using SteamOS. Linux users make up 1.38% of all users, so assuming all SteamOS users are on Steam Deck, Deck users made up about 0.3% of users in January. Given that most of those users are probably not even using the Deck to play AAA games, I highly doubt any developers are putting in discrete optimization for the Deck beyond making sure the game runs.
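The arithmetic behind that estimate is just the two survey shares multiplied together:

```python
# Estimated Steam Deck share of all Steam users, from the survey figures
# cited above: 1.38% of users on Linux, ~22% of those on SteamOS.
linux_share   = 0.0138
steamos_share = 0.22

deck_share = linux_share * steamos_share
print(f"Estimated Steam Deck share: {deck_share:.2%}")  # prints 0.30%
```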
 
