PC Building Guide and Discussion #14

Kupo

MAFIA, MOUNT UP!
Sponsor
Oct 31, 2017
11,724
24,960
Stamford CT
Hey nerds ✌🏼

My brother passed away a few years ago and I'm trying to figure out what to do with his gaming PC. I know very little about PCs. I know it's massive and he was a passionate WoW player. It doesn't turn on, though; I vaguely recall him saying something about the motherboard or hard drive.

What can I do with this? I don’t have any plans to fix it or use it.
 

Attachments: 7 photos of the PC (images not shown)

PeteWorrell

[...]
Aug 31, 2006
5,095
2,216
Nothing much you can do with it at this point other than send it to a recycling center. It's incredibly outdated.
 
  • Like
Reactions: Osprey

aleshemsky83

Registered User
Apr 8, 2008
17,918
464
If you're asking whether you can get money for it, lol, no, you can't. But someone might take the case for free.
 

Osprey

Registered User
Feb 18, 2005
27,922
10,805
To give you an idea of how outdated that PC is, the motherboard is 13 years old. You could sell the parts on eBay for a few bucks, but you'd have to know how to disassemble it and list the parts, and then be willing to ship them. It's probably not worth the trouble unless you're really strapped for cash. The easiest thing would be to just take it to a recycling center, as PeteWorrell suggested.
 
Last edited:

GreytWun

Registered User
Sep 29, 2017
1,858
1,970
Ontario
Throw a 13900K in there and you're good… just kidding. As others have said, it's not worth much of anything.

Make sure you either wipe the hard drive or destroy it. It might have some personal data on there.
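If you go the wipe route yourself before recycling, the rough idea is below: pull the drive, hook it up to a working machine, and overwrite it end to end. This is only a sketch in Python with a placeholder device path, not a recommendation over dedicated tools (or a hammer); /dev/sdX is hypothetical, and pointing this at the wrong device destroys that drive, so double-check with something like lsblk first.

```python
# Sketch: overwrite an old drive with zeros before recycling it.
# ASSUMPTIONS: the drive is hooked up to a Linux box and shows up as
# /dev/sdX (placeholder -- check with lsblk), you run this as root,
# and you accept that everything on that drive is gone afterwards.
import os

DEVICE = "/dev/sdX"          # placeholder; point this at the OLD drive only
CHUNK = 4 * 1024 * 1024      # write 4 MiB at a time

def wipe(device: str) -> None:
    zeros = bytes(CHUNK)
    written = 0
    fd = os.open(device, os.O_WRONLY)   # raw block device, no filesystem
    try:
        while True:
            try:
                n = os.write(fd, zeros)
            except OSError:
                break                   # end of device or I/O error
            if n == 0:
                break                   # end of device
            written += n
        os.fsync(fd)
    finally:
        os.close(fd)
    print(f"Wrote {written / 1024**3:.1f} GiB of zeros to {device}")

if __name__ == "__main__":
    wipe(DEVICE)
```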
 

aleshemsky83

Registered User
Apr 8, 2008
17,918
464
I didn't realize AMD GPUs went to a dual-GPU/chiplet design like their CPUs. I'd like to see Nvidia's take on that.
 

PeteWorrell

[...]
Aug 31, 2006
5,095
2,216
People interested in buying an Nvidia GeForce RTX 3060 should be aware that Nvidia has released an 8GB model at the same price as the 12GB model, except the performance is quite a bit worse. Nvidia sneakily released this after the 4080 backlash.

 
  • Wow
Reactions: Osprey

GreytWun

Registered User
Sep 29, 2017
1,858
1,970
Ontario
Nvidia has shown some poor business decisions this past generation. I think AMD's cards getting better and closing the gap is getting to them.
 

SolidSnakeUS

HFBoards Sponsor
Sponsor
Aug 13, 2009
49,504
13,349
Baldwinsville, NY
Nvidia has shown some poor business decisions this past generation. I think AMD's cards getting better and closing the gap is getting to them.

The only thing they need to do is get better at ray tracing. Outside of that, AMD really is trying to be competitive in a good way.
 

aleshemsky83

Registered User
Apr 8, 2008
17,918
464
People interested in buying an Nvidia GeForce RTX 3060 should be aware that Nvidia has released an 8GB model at the same price as the 12GB model, except the performance is quite a bit worse. Nvidia sneakily released this after the 4080 backlash.


The reviewer mentioned the GT 1030; this really all started when reviewers defended the GTX 1060 3GB as being just as good as the 6GB.
 

SniperHF

Rejecting Reports
Mar 9, 2007
42,821
22,199
Phoenix
The reviewer mentioned the GT 1030; this really all started when reviewers defended the GTX 1060 3GB as being just as good as the 6GB.

Nvidia and ATI used to routinely sell cards with DirectX 7 and 8.1 feature sets as DX9 cards because back then backward compatibility was actually a thing, so they could get away with it as the games would still run. That, and a ton of people still ran 800x600 CRTs, so they wouldn't notice anyway.

The GeForce4 MX line was infamous for exactly this kind of pitfall, trapping new PC builders. Probably scared a lot of people out of the hobby :laugh:
 
  • Like
Reactions: aleshemsky83

Osprey

Registered User
Feb 18, 2005
27,922
10,805
The only thing they need to do is get better at ray tracing. Outside of that, AMD really is trying to be competitive in a good way.
They could get better at innovating, as well. They come out with a good feature only after Nvidia did it first. Their engineers are good at catching up, but it'd be nice if they'd develop the next killer feature and force Nvidia to catch up and copy them for a change.

Other than that, I agree. AMD is catching up in hardware and has already caught up in drivers and software. In fact, AMD's software is a lot better than Nvidia's. I switched back to Nvidia a year ago and was surprised to discover that the Nvidia Control Panel looks no different than when I used it over 10 years ago. It looks like something from the XP era.
 

aleshemsky83

Registered User
Apr 8, 2008
17,918
464
They could get better at innovating, as well. They come out with a good feature only after Nvidia did it first. Their engineers are good at catching up, but it'd be nice if they'd develop the next killer feature and force Nvidia to catch up and copy them for a change.

Other than that, I agree. AMD is catching up in hardware and has already caught up in drivers and software. In fact, AMD's software is a lot better than Nvidia's. I switched back to Nvidia a year ago and was surprised to discover that the Nvidia Control Panel looks no different than when I used it over 10 years ago. It looks like something from the XP era.
The control panel is clunky, but it's super straightforward. Straightforward G-Sync controls, better downsampling and control over sharpness/smoothness when downsampling, and the ability to force AA, texture filtering, V-sync, framerate caps, ambient occlusion, etc. in older games that don't have it. It's very powerful.

Even GeForce Experience, as much flak as it gets, is pretty decent.

AMD really just has a bunch of "features" that do nothing. AMD Chill, AMD Rage Mode, etc. Nobody even knows what they do.
 

PeteWorrell

[...]
Aug 31, 2006
5,095
2,216
AMD simply has far more resources dedicated to the CPU side of the business than the GPU side. We can't forget that they had to fight Intel's dominant position, which almost bankrupted them. Their overall turnaround as a company has been very impressive considering their precarious position after the Bulldozer disaster 10 years ago.
 

SolidSnakeUS

HFBoards Sponsor
Sponsor
Aug 13, 2009
49,504
13,349
Baldwinsville, NY
They could get better at innovating, as well. They come out with a good feature only after Nvidia did it first. Their engineers are good at catching up, but it'd be nice if they'd develop the next killer feature and force Nvidia to catch up and copy them for a change.

Other than that, I agree. AMD is catching up in hardware and has already caught up in drivers and software. In fact, AMD's software is a lot better than Nvidia's. I switched back to Nvidia a year ago and was surprised to discover that the Nvidia Control Panel looks no different than when I used it over 10 years ago. It looks like something from the XP era.

AMD introduced the chiplet design with Ryzen years ago and is now bringing it to GPUs with RDNA 3. As with Ryzen, it seems to be the way forward for the foreseeable future.

AMD also provides a USB-C port on the back of the RDNA 3 reference cards and supports DP 2.1, which the RTX 4000 series does not.

Nvidia, in general, puts all of its money into graphics and AI work, and most of the hardware side goes toward graphics in some way. Yes, they make SoCs for things like the Switch, but most of their money goes toward graphics and the tech that goes along with it, such as RTX. While AMD isn't at that level, they're still keeping up in raster performance and generally offer good value for the money.

For reference, I'm on an Nvidia streak right now (GTX 690 -> GTX 970 -> RTX 2080 -> RTX 3080).

One last thing for me: AMD's reference 7900 XTX would actually fit in an SFF Lian Li H2O case, because it's just under 3 slots and less than 300mm long, so it would work in my dream build.
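For anyone checking their own case, the fit question really boils down to two comparisons. In the sketch below the case clearance numbers are assumed placeholders, not Lian Li's published specs, and the card dimensions are taken loosely from the post (under 3 slots, under 300mm):

```python
# Quick GPU-fit sanity check for an SFF build. The case limits are
# assumed placeholder values, NOT official Lian Li specs; the card
# numbers are loosely based on the post (under 3 slots, under 300 mm).
from dataclasses import dataclass

@dataclass
class Dims:
    length_mm: float
    slots: float

card = Dims(length_mm=287, slots=2.7)        # assumed reference 7900 XTX
case_limit = Dims(length_mm=320, slots=3.0)  # assumed SFF case clearance

fits = card.length_mm <= case_limit.length_mm and card.slots <= case_limit.slots
print(f"length: {card.length_mm} mm <= {case_limit.length_mm} mm")
print(f"slots:  {card.slots} <= {case_limit.slots}")
print("fits" if fits else "does not fit")
```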
 
  • Like
Reactions: GreytWun

Osprey

Registered User
Feb 18, 2005
27,922
10,805
The control panel is clunky, but it's super straightforward. Straightforward G-Sync controls, better downsampling and control over sharpness/smoothness when downsampling, and the ability to force AA, texture filtering, V-sync, framerate caps, ambient occlusion, etc. in older games that don't have it. It's very powerful.

Even GeForce Experience, as much flak as it gets, is pretty decent.

AMD really just has a bunch of "features" that do nothing. AMD Chill, AMD Rage Mode, etc. Nobody even knows what they do.
I do appreciate the straightforwardness of Nvidia. I'm not sure that their software is more powerful than AMD's, though. I think that AMD has most of the same features; they're just sometimes hidden behind confusing marketing terms, like you mentioned. For example, I believe that AMD Chill is their feature for setting framerate caps. I do prefer that Nvidia just calls it what it is, instead. Also, AMD has the equivalent of GeForce Experience built into the same app that their settings are in, and there are ways to monitor and overclock/undervolt the card. Nvidia seems to have only basic support for such things in the overlay, and you're probably better off using MSI Afterburner. AMD Adrenalin seems to be a more centralized, modern experience. They've put a lot of work into their software, probably because they have to, while it seems less important to Nvidia. I'd ideally like everything in one place with a modern UI, like AMD has, but with straightforward settings, like Nvidia has.

AMD introduced the chiplet design with Ryzen years ago and is now bringing it to GPUs with RDNA 3. As with Ryzen, it seems to be the way forward for the foreseeable future.
True, and Nvidia may have to play catch-up in that area, but it doesn't seem to be enough to give AMD an edge, at least not yet, as the 7900 XTX isn't expected to beat the 4090. I was really thinking about DLSS and frame generation and how it'd be nice if AMD developed something like that first, instead of playing follow the leader.
 
Last edited:

SolidSnakeUS

HFBoards Sponsor
Sponsor
Aug 13, 2009
49,504
13,349
Baldwinsville, NY
Really the only thing Nvidia has over AMD at this point is ray tracing hardware. With RDNA 2, AMD has generally matched or beaten the equivalent Nvidia card at raster, and it might stay that way (except for the 4090, because that thing is insane).

Basically, AMD NEEDS to push really hard on ray tracing. However, there is one key difference between AMD and Nvidia when it comes to ray tracing: Nvidia uses proprietary software to go with its proprietary hardware. AMD's ray tracing approach is its own, but the software side is much more open than Nvidia's RTX/DLSS. DLSS only works on Nvidia cards, while FSR works on many cards across different makers.
 

93LEAFS

Registered User
Nov 7, 2009
34,185
21,382
Toronto
Really the only thing Nvidia has over AMD at this point is ray tracing hardware. With RDNA 2, AMD has generally matched or beaten the equivalent Nvidia card at raster, and it might stay that way (except for the 4090, because that thing is insane).

Basically, AMD NEEDS to push really hard on ray tracing. However, there is one key difference between AMD and Nvidia when it comes to ray tracing: Nvidia uses proprietary software to go with its proprietary hardware. AMD's ray tracing approach is its own, but the software side is much more open than Nvidia's RTX/DLSS. DLSS only works on Nvidia cards, while FSR works on many cards across different makers.
I don't care about it at all, and I assume most don't unless they make income from it, but NVENC is big for streamers, and that further promotes the product to impressionable kids who actually follow these people.
 

93LEAFS

Registered User
Nov 7, 2009
34,185
21,382
Toronto
AMD introduced the chiplet design with Ryzen years ago and is now bringing it to GPUs with RDNA 3. As with Ryzen, it seems to be the way forward for the foreseeable future.

AMD also provides a USB-C port on the back of the RDNA 3 reference cards and supports DP 2.1, which the RTX 4000 series does not.

Nvidia, in general, puts all of its money into graphics and AI work, and most of the hardware side goes toward graphics in some way. Yes, they make SoCs for things like the Switch, but most of their money goes toward graphics and the tech that goes along with it, such as RTX. While AMD isn't at that level, they're still keeping up in raster performance and generally offer good value for the money.

For reference, I'm on an Nvidia streak right now (GTX 690 -> GTX 970 -> RTX 2080 -> RTX 3080).

One last thing for me: AMD's reference 7900 XTX would actually fit in an SFF Lian Li H2O case, because it's just under 3 slots and less than 300mm long, so it would work in my dream build.
As a company, I think Nvidia's big play is data centers, which is why they are valued at more than AMD and Intel combined. It's just not something consumers think about, as that is generally a B2B business. I believe they reported that data center revenue has either equaled or surpassed gaming GPU revenue in the past year.
 

93LEAFS

Registered User
Nov 7, 2009
34,185
21,382
Toronto
If the rumored recommendations for Jedi: Survivor are legit, they may be the most aggressive spec recommendations I've seen. I wonder if this is just terrible optimization or a sign of what's to come in AAA games made in Unreal Engine 5.
(Attached image: Jedi Survivor.png – the rumored system requirements)
 

Osprey

Registered User
Feb 18, 2005
27,922
10,805
I think that it's because they're no longer targeting last-gen consoles. They don't need to optimize the game for consoles with less than 8GB, so they're just not going to bother optimizing it for PC GPUs with less than 8GB, either. The GTX 1070 and RX 580 are the lowest end cards with 8GB, so that's probably why they're the minimums. The game could probably run fine on the GTX 1050 and 1060 (after all, Fortnite updated to Unreal Engine 5.1 runs fine on Medium and Low settings on the 1060), but they probably don't want to bother just for the sake of PC gamers with those cards.
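For a rough sense of why 8GB ends up as the floor, here's a back-of-envelope sketch; every number in it is an assumption for illustration, not data from the game. A plausible mix of 4K render targets, streamed textures, and miscellaneous buffers already lands around 5-6GB before the OS, the driver, and any headroom, which pushes 4GB and 6GB cards off the list.

```python
# Back-of-envelope VRAM estimate. Every number here is an assumption
# for illustration -- not data from Jedi: Survivor or any real engine.

def mib(n_bytes: float) -> float:
    return n_bytes / 1024**2

# 4K render targets: color, depth and a few G-buffer layers at 4 bytes/pixel
width, height = 3840, 2160
layers = 6                                  # assumed layer count
rt_bytes = width * height * 4 * layers      # ~190 MiB

# Streamed texture pool: assume ~3000 resident textures at ~1.5 MiB each
# after block compression
tex_bytes = 3000 * 1.5 * 1024**2            # ~4.4 GiB

# Geometry, shaders and misc buffers: assume a flat 1 GiB
other_bytes = 1 * 1024**3

total = rt_bytes + tex_bytes + other_bytes
print(f"render targets: {mib(rt_bytes):7.0f} MiB")
print(f"textures:       {mib(tex_bytes):7.0f} MiB")
print(f"other buffers:  {mib(other_bytes):7.0f} MiB")
print(f"total:          {total / 1024**3:.1f} GiB")
```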
 

aleshemsky83

Registered User
Apr 8, 2008
17,918
464
The Ryzen 1400 isn't that aggressive a spec, since the next oldest CPU would be a Bulldozer chip; I think there might be certain instructions that some of the older CPUs don't support. This happened with a recent game that I can't remember. The GPUs do seem pretty aggressive, though; I think it may be the VRAM, as mentioned.

I think that it's because they're no longer targeting last-gen consoles. They don't need to optimize the game for consoles with less than 8GB, so they're just not going to bother optimizing it for PC GPUs with less than 8GB, either. The GTX 1070 and RX 580 are the lowest end cards with 8GB, so that's probably why they're the minimums. The game could probably run fine on the GTX 1050 and 1060 (after all, Fortnite updated to Unreal Engine 5.1 runs fine on Medium and Low settings on the 1060), but they probably don't want to bother just for the sake of PC gamers with those cards.
Yeah, Lumen seems to run all right on console, but the consoles have 16GB (other than the Series S, which I assume requires some optimization).
 

Smelling Salt

Busey is life
Mar 8, 2006
7,230
3,668
Winnipeg
Very excited for my first custom build since 2005, haha. Since that one, it's been two $500-600 (CDN) Dells.

Currently I have a nine-year-old Dell i5-4440-based system. This thing has never really had a problem. It could use a fresh install of Windows, and even an SSD would spice it up, but I think it's time to move on. Hell, I even use an ATI 5450 GPU in it, which I have been using in two different PCs for 13 years. :laugh: This PC would still be perfect for my oldest kid for school work.

So I ordered these components from Memory Express. Some parts were in stock locally; others were only available at their other locations/online store, so I'm not sure how long I will have to wait for them. I considered a 12600, then figured, well, if I'm going 12600, why not a 13600. But ultimately I realized I'm not going to need that kind of power, and it's about $200-$400 more depending on how they're configured. Plus, I'm going to be using the iGPU for the foreseeable future, so why bother loading up on raw CPU power.

CPU - Intel i5-12400
MOBO - MSI MAG B660M MORTAR WIFI
RAM - G.Skill Trident Z RGB 16GB (2 x 8GB)
SSD - Kingston KC3000 NVMe M.2 1TB
PSU - Cooler Master MWE Gold V2 750W
CASE - Fractal Design Pop Mini Air RGB Black mATX mid tower case w/ glass side panel

Stock cooling (stock CPU cooler, three stock RGB case fans: two up front, one in the rear).

If my kid bugs me enough, down the road I may grab a 3060/3070/6600/6700 or whatever the equivalent is at that time, at which point I may add one or two more fans at the top of the case. But for now this is gonna do nicely.
 
  • Like
Reactions: GreytWun

Seedtype

Registered User
Sponsor
Aug 16, 2009
2,577
1,109
Ohio?!?!
Looks like Armored Core 6 will be the first AAA game that will push my new computer... Can't wait!:D
 

Osprey

Registered User
Feb 18, 2005
27,922
10,805


To summarize, the 7900 XTX is generally on par with the 4080 in rasterized gaming and on par with the 3090 Ti in ray-traced gaming. The reviewer questions whether the $200 savings is enough to make it more attractive than the 4080. Performance is most disappointing relative to the 6950 XT: AMD led us to expect 1.5-1.7x the performance of the 6950 XT, and the review saw only about 1.4x. Claims aside, the unexpected closeness to the 6950 XT could mean that there's room for driver improvements.
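For a sense of where a figure like "1.4x" comes from, reviews usually report the geometric mean of per-game framerate ratios. A quick sketch with made-up FPS numbers (not the review's actual data):

```python
# How a "1.4x average uplift" is typically computed: the geometric mean
# of per-game framerate ratios. The FPS numbers below are made up for
# illustration -- they are NOT the review's data.
from math import prod

benchmarks = {
    # game     (6950 XT fps, 7900 XTX fps) -- hypothetical
    "Game A": (62, 88),
    "Game B": (121, 165),
    "Game C": (140, 198),
    "Game D": (95, 131),
}

ratios = [new / old for old, new in benchmarks.values()]
geo_mean = prod(ratios) ** (1 / len(ratios))

for game, (old, new) in benchmarks.items():
    print(f"{game}: {new / old:.2f}x")
print(f"average uplift: {geo_mean:.2f}x")   # ~1.4x, short of the claimed 1.5-1.7x
```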
 
Last edited:
