New gaming rig - Intel or AMD?

SpyderTracks

We love you Ukraine
The X870E chipset is basically a renamed X670E board; the former just adds USB4, which I would probably never use or need anyway.
You may think that, but basically any connected device will now be USB-C (the connector USB4 uses, and it incorporates Thunderbolt 3 too). It's been written into law in Europe that all devices need a standardised port, to cut down on needless e-waste of perfectly good chargers and cables, and so you can have a single charger for your laptop, mobile, tablet etc. - and USB-C is what's been mandated.

You can also daisy-chain USB4 ports across several devices: use one port for the first device, then chain from that device to the next, and so on, rather than needing a separate port for every connection.

And manufacturers aren't going to fragment manufacturing by sales region, so the UK will get the same as the EU (which in turn is translating worldwide, because it makes sense). This is what finally made Apple adopt a decent standard rather than their confounded proprietary connectors, which they either changed far too frequently or left unchanged for decades when they were incredibly outdated and slow (Lightning port, anyone???)

So all devices from now on will be USB-C for the foreseeable future.

This was only written into law fairly recently.

 
Last edited:

H1N1

Active member
You may think that, but basically any connected device will now be USB-C (the connector USB4 uses, and it incorporates Thunderbolt 3 too). It's been written into law in Europe that all devices need a standardised port, to cut down on needless e-waste of perfectly good chargers and cables, and so you can have a single charger for your laptop, mobile, tablet etc. - and USB-C is what's been mandated.

You can also daisy-chain USB4 ports across several devices: use one port for the first device, then chain from that device to the next, and so on, rather than needing a separate port for every connection.

And manufacturers aren't going to fragment manufacturing by sales region, so the UK will get the same as the EU (which in turn is translating worldwide, because it makes sense).

So all devices from now on will be USB-C for the foreseeable future.

This was only written into law fairly recently.

I didn't consider the recent law change to USB-C and its impact on motherboards ... so again, thank you for the info.

Thank you for making me aware of ASUS and their quite despicable behaviour towards consumers; quite staggering, really. I wonder if that's why they've increased their prices - to make up for those avoiding their products after the stunts they pulled; it has certainly made me think twice!

If another brand like Gigabyte offers the same performance, build quality etc. and is £200 less (which is a huge saving!), then I will certainly look into buying one! Like you say, buying from a certain company for the sake of it can mean getting stung.
 

SpyderTracks

We love you Ukraine
I didn't consider the recent law change to USB-C and its impact on motherboards ... so again, thank you for the info.

Thank you for making me aware of ASUS and their quite despicable behaviour towards consumers; quite staggering, really. I wonder if that's why they've increased their prices - to make up for those avoiding their products after the stunts they pulled; it has certainly made me think twice!

If another brand like Gigabyte offers the same performance, build quality etc. and is £200 less (which is a huge saving!), then I will certainly look into buying one! Like you say, buying from a certain company for the sake of it can mean getting stung.
It's a crazy marketplace. The companies we're so used to being top dogs - Asus, EVGA, Intel - everything has changed so much in the last 5 years.

Competition has been so fierce industry-wide that it's really shaken things up for those who rested on their laurels. And there are so many new manufacturers (mainly on the case side of things) disrupting the market with extremely competitive products that haven't been done before.

Just wish the same was true in the high end GPU market!!!
 

H1N1

Active member
It's a crazy marketplace. The companies we're so used to being top dogs - Asus, EVGA, Intel - everything has changed so much in the last 5 years.

Competition has been so fierce industry-wide that it's really shaken things up for those who rested on their laurels. And there are so many new manufacturers (mainly on the case side of things) disrupting the market with extremely competitive products that haven't been done before.

Just wish the same was true in the high end GPU market!!!

So much in the market to get your head around, especially if you're someone (like me) who isn't tech-savvy at all ... It is quite overwhelming looking at all the motherboard brands and knowing who to exclude and who to consider, and then the headache of knowing which features are important and which are sales gimmicks.

I so hope GPUs will soon become available so that prices can come down; what should the 5090 be sitting at price-wise?
 

SpyderTracks

We love you Ukraine
I so hope GPUs will soon become available so that prices can come down; what should the 5090 be sitting at price-wise?
The MSRP of the 5090 Founders Edition (the reference model) is £2000.

So aftermarket overclocked models should be around £2200 upwards.

But MSRP has become a bit of a joke with GPUs - both AMD and NVIDIA are to blame for that. I wouldn't be surprised if regulators step in at some point, because it has got really bad, blurring the lines of price fixing IMHO.
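As a back-of-the-envelope check of those numbers (a throwaway Python sketch - the 10% premium is just inferred from the £2000 vs £2200 figures above, not an official markup):

```python
def aftermarket_price(msrp: float, premium: float = 0.10) -> int:
    # Aftermarket/overclocked cards typically list at MSRP plus a premium.
    return round(msrp * (1 + premium))

# 5090 reference MSRP of 2000 with an assumed 10% aftermarket premium
print(aftermarket_price(2000))  # 2200
```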
 

H1N1

Active member
Just thinking.

The 9800X3D is 8 cores with 16 threads = one CCD.
The 9950X3D is 16 cores with 32 threads = two CCDs - only one of which has 3D V-Cache, so when gaming the other CCD will shut down.

Does that mean the threads will drop from 32 to 16? If so, that will massively affect gaming performance in games such as Cities: Skylines II, which likes more cores and threads.

If that is the case, would the 14900KS - 24 cores and 32 threads, all of which stay active during gameplay - actually be the better CPU?
 

SpyderTracks

We love you Ukraine
Just thinking.

The 9800X3D is 8 cores with 16 threads = one CCD.
The 9950X3D is 16 cores with 32 threads = two CCDs - only one of which has 3D V-Cache, so when gaming the other CCD will shut down.

Does that mean the threads will drop from 32 to 16? If so, that will massively affect gaming performance in games such as Cities: Skylines II, which likes more cores and threads.

If that is the case, would the 14900KS - 24 cores and 32 threads, all of which stay active during gameplay - actually be the better CPU?
The 14900KS is miles behind the X3D chips.

In gaming the non-X3D CCD isn't disabled; the Windows scheduler just balances processes as required. The threads are still fully available in games - nothing gets disabled.
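For what it's worth, you can check this from software: a minimal Python sketch (nothing AMD-specific is assumed here) showing that every logical CPU the OS reports stays available to a thread pool - parking is a scheduling decision, not a hard cut to 16 threads:

```python
import os
from concurrent.futures import ThreadPoolExecutor

def run_on_all_logical_cpus():
    # os.cpu_count() reports logical CPUs (cores x SMT threads),
    # e.g. 32 on a 9950X3D, whichever CCD the work ends up on.
    n = os.cpu_count() or 1
    # One worker per logical CPU; the OS scheduler decides where each
    # thread actually runs, and can unpark cores to service the load.
    with ThreadPoolExecutor(max_workers=n) as pool:
        results = list(pool.map(lambda i: i * i, range(n)))
    return n, results

n, results = run_on_all_logical_cpus()
print(f"{n} logical CPUs visible, {len(results)} tasks completed")
```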
 

H1N1

Active member
I don't see how a CPU that shuts down half its cores, leaving 8, can perform better than a processor with over double the cores (24), such as the Intel 14900KS.
Everything online, and even ChatGPT, says the 14900KS would outperform the AMD chip and that the threads reduce. Sure, in some games - single-threaded titles such as first-person shooters - the AMD may come out on top. However, for multi-threaded games that like lots of cores, like Cities: Skylines II, is the AMD 9950X3D still the better choice?
 

H1N1

Active member
The 14900KS is miles behind the X3D chips.

In gaming the non-X3D CCD isn't disabled; the Windows scheduler just balances processes as required. The threads are still fully available in games - nothing gets disabled.
I have read that the non-X3D CCD is shut down when gaming and only the CCD with the 3D V-Cache runs - effectively an 8-core, 16-thread CPU; only under things like workstation loads do both CCDs work = 16 cores and 32 threads.
 

SpyderTracks

We love you Ukraine
I don't see how a CPU that shuts down half its cores, leaving 8, can perform better than a processor with over double the cores (24), such as the Intel 14900KS.
Everything online, and even ChatGPT, says the 14900KS would outperform the AMD chip and that the threads reduce. Sure, in some games - single-threaded titles such as first-person shooters - the AMD may come out on top. However, for multi-threaded games that like lots of cores, like Cities: Skylines II, is the AMD 9950X3D still the better choice?
AI chatbots are trained on openly available data, which will include sources like UserBenchmark and other comparison sites with bogus data: they don't perform any benchmarks, they just assign a metric to frequency and core count, none of which is enough to estimate real-world performance.

It sounds like you're looking at those comparison sites; they've been banned on most of the internet as it's just flat-out lies.
 

H1N1

Active member
Well, the initial question was something which occurred to me: the Intel uses all of its cores and threads (24 cores and 32 threads), whereas with the AMD CPU half of the cores shut down - does that mean half the threads are cut? It turns out they are. Therefore it has left me wondering: sure, for single-core/thread games like COD etc. it won't make much of a difference, which is why those games are comfortable on the one CCD and score highly in reviews. However, for games like Cities: Skylines II, which want as many cores and threads as possible, what impact does having the CPU power halved have?

There are little to no benchmarks for Cities: Skylines between Intel and AMD, and that's a shame!
 

TonyCarter

VALUED CONTRIBUTOR
If you want to compare high-core-count workloads, then I'd suggest looking at the productivity benchmarks (Premiere Pro, Blender, V-Ray, DaVinci etc.) from Puget Systems.

In some edge cases the Intel 14900 wins out; in others the AMD 9950X wins. There are also cases where the lower-tier 14700 or 9700X beats the higher-tier CPUs (usually due to higher single-core boosts). BUT the Intel is using 2-3x the power to do so (and producing a lot of heat), and on long runs it slows down if there's not enough cooling. To solve this, Intel has handicapped the "15th Gen" CPUs (i.e. the newly named 285K, to hide the fact that it's a handicapped 14900), which involves removing Hyper-Threading, lowering the power usage, and turning down all-core boosts.

 

H1N1

Active member
This is what someone put regarding the 7800X3D: "The 7800X3D wasn't even close, so don't even bother with anything less than 16 cores/32 threads."

This links to the concerns I set out above: one CCD working with the other 'parked' could cause a huge loss of performance if threads and cores are not available.
hmm
 

SpyderTracks

We love you Ukraine
This is what someone put regarding the 7800X3D: "The 7800X3D wasn't even close, so don't even bother with anything less than 16 cores/32 threads."

This links to the concerns I set out above: one CCD working with the other 'parked' could cause a huge loss of performance if threads and cores are not available.
hmm
The second CCD only parks if it's not in use. As already advised, the Windows scheduler will manage the cores as per the game's requirements. Cities: Skylines II will use the full 16 cores, just as with MSFS 2024.

It's not the case that the non-3D V-Cache CCD is disabled by default in games - ONLY if the scheduler recognises that those cores can't be used.

 

TonyCarter

VALUED CONTRIBUTOR
This is what someone put regarding the 7800X3D: "The 7800X3D wasn't even close, so don't even bother with anything less than 16 cores/32 threads."

This links to the concerns I set out above: one CCD working with the other 'parked' could cause a huge loss of performance if threads and cores are not available.
hmm
But that core-parking issue doesn't happen on the 9000 series.

You seem to be ignoring the majority of posts saying the 9950X/9950X3D will be the best, and parroting only those comments that support your view that the 14900K is the best... end of story?
  • The 7800X3D was faster than the 14700K/14900K in most games (there are always outliers).
  • The 7950X3D was slightly slower in most games due to having slower cores for heat management... but it also had Windows scheduling problems in some games, where the wrong CCD was used.
  • The 9800X3D is about 20-30% faster than the 7800X3D.
  • The 9950X3D is about the same speed as the 9800X3D (no problem with slower cores, as heat management is no longer an issue) - and the scheduling has been fixed.
The 'closest' comparisons I can find on YouTube are for sims where the cores matter for running the AI algorithms between turns (Stellaris, for example)...

But the only way you're really going to know is to do a public service: buy the two different systems (where only the CPU/motherboard differs) and share the results with everyone on the internet who's asking the same question.
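To make percentage claims like "20-30% faster" concrete, here's a trivial helper (illustrative only - the FPS figures in the example are hypothetical, not benchmark results):

```python
def percent_uplift(new_score: float, old_score: float) -> float:
    # Relative gain of new_score over old_score, in percent.
    return (new_score / old_score - 1) * 100

# Hypothetical: a chip averaging 120 FPS vs one averaging 100 FPS
print(round(percent_uplift(120, 100), 1))  # 20.0
```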
 
Last edited:

Ekans2011

VALUED CONTRIBUTOR
This is what someone put regarding the 7800X3D: "The 7800x3d wasn't even close so don't even bother with anything less than 16 cores/32 threads"

This links to the concerns which I set out above; the one CCD working with the other 'parked' could cause huge loss of performance if threads and cores are not available.
hmm
I think they forgot that the 14900s have silicon defects and cannot be used to their full potential without a proper cooling solution (at least a 420mm AIO or a custom open loop), so choosing Intel is pointless if you value stability and cost/performance.
 

H1N1

Active member
Guess what ... I've another question! lol

So I know AMD's sweet spot is 6000MHz.

Is it worthwhile getting a 7200MHz kit, just in case I can work out overclocking the RAM, or in case AMD support higher than 6000MHz in the future?
 

SpyderTracks

We love you Ukraine
Guess what ... I've another question! lol

So I know AMD's sweet spot is 6000MHz.

Is it worthwhile getting a 7200MHz kit, just in case I can work out overclocking the RAM, or in case AMD support higher than 6000MHz in the future?
AMD happily support up to more like 7600MHz, so it's not the support that's the issue - some boards can support over 8000MHz.

On any DRAM architecture (DDR5 in this instance), the apps using that architecture have a frequency at which they operate most efficiently, and above that they don't get any gains; that won't change until the DRAM architecture is updated.

So on DDR4 it was 3200MHz for gaming.

On DDR5 it's 6000MHz.

Overclocking the RAM will simply raise the speed the RAM operates at, but it won't improve gaming performance at all, as the game load doesn't see any benefit above 6400MHz.
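To put those frequencies in perspective, theoretical peak bandwidth scales linearly with transfer rate (a rough Python sketch; the marketed "MHz" figures are really MT/s, and real-world gaming gains also hinge on latency and the memory controller, not raw bandwidth alone):

```python
def ddr_peak_bandwidth_gbs(mts: int, channels: int = 2, bytes_per_transfer: int = 8) -> float:
    # Theoretical peak in GB/s: transfers per second x 8 bytes per
    # 64-bit channel x number of channels.
    return mts * 1e6 * bytes_per_transfer * channels / 1e9

# Dual-channel DDR4-3200 vs dual-channel DDR5-6000
print(ddr_peak_bandwidth_gbs(3200))  # 51.2
print(ddr_peak_bandwidth_gbs(6000))  # 96.0
```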
 
Last edited: