Not So Fast AMD! 13th Gen Soon

JUNI0R

VALUED CONTRIBUTOR
While everyone's been busy buying their AM5 CPUs, Intel has announced the 13th Gen K SKUs, so it might be worth holding off on that buy button for a while longer. Due to be released on the 20th of October, PCS already has them available for pre-order, although as always I'd wait for reviews before going through with a purchase. This generation is more evolution than revolution, but the increase to L3 cache and those extra E-cores could really tighten things up between Intel and AMD, and continued DDR4 support will certainly help Intel while DDR5 is at a premium.

Below are a few helpful resources about the launch. As always, take performance metrics with a pinch of salt!

[Image: Intel 13th Gen SKU chart]


"Intel 13th Gen Core Processors Revealed: Raptor Lake Unleashed" - Screen Grabs from the Event:

13th Gen Explained:
 

SimonPeters116

Well-known member
Seems to me Intel are panicking a bit. I wonder why they'd choose today of all days to release the pre-orders...
I also took note of the AMD CPUs they are comparing against.
These are Intel's new CPUs, meant to compete with the new AMD AM5 offerings, but Intel are comparing them to AM4 CPUs :unsure:
Aren't they? (I've put my size 12s into my big mouth before, frequently :D ).
 

JUNI0R

VALUED CONTRIBUTOR
Seems to me Intel are panicking a bit. I wonder why they'd choose today of all days to release the pre-orders...
I mean, I'd imagine it's an intentional tactic to take coverage/hype away from AMD.

It's pretty similar to what AMD did for the RDNA3 announcement event: the SVP and GM for AMD Radeon tweeted this out literally 2 hours before the NVIDIA 40 series event went live.

[Image: screenshot of the AMD Radeon SVP's tweet]
 

AccidentalDenz

Lord of Steam
I also took note of the AMD CPUs they are comparing against.
These are Intel's new CPUs, meant to compete with the new AMD AM5 offerings, but Intel are comparing them to AM4 CPUs :unsure:
Aren't they? (I've put my size 12s into my big mouth before, frequently :D ).
Putting my foot in it is basically my role on here.

Well that and talking about video games far too much. :ROFLMAO:

I mean, I'd imagine it's an intentional tactic to take coverage/hype away from AMD.

It's pretty similar to what AMD did for the RDNA3 announcement event: the SVP and GM for AMD Radeon tweeted this out literally 2 hours before the NVIDIA 40 series event went live.

That's exactly what I was thinking to be honest. That's why I left the "..." at the end of my post.

It seems a bit mad to me, but companies are constantly trying to overshadow competitors!
 

JUNI0R

VALUED CONTRIBUTOR
I also took note of the AMD CPUs they are comparing against.
These are Intel's new CPUs, meant to compete with the new AMD AM5 offerings, but Intel are comparing them to AM4 CPUs :unsure:
Aren't they? (I've put my size 12s into my big mouth before, frequently :D ).

They are indeed, although we've only just received the performance metrics, and these slides would've been confirmed weeks ago. I doubt AMD would freely be giving information on their performance to Intel ;)

That said, assuming Intel will be showing themselves in the best possible light, the 12900K on average seems to just about beat out the 7950X and 7600X, and they have the 13900K outperforming the 12900K in every benchmark they show, which implies it'll be a chart topper. Again, take it all with a grain of salt though, you know what they're all like!

[Image: Intel benchmark slide screenshot]



That's exactly what I was thinking to be honest. That's why I left the "..." at the end of my post.

It seems a bit mad to me, but companies are constantly trying to overshadow competitors!
My apologies! Yeah, shame they couldn't all have their own 5 minutes, eh?!
 

SimonPeters116

Well-known member
Putting my foot in it is basically my role on here.

Well that and talking about video games far too much. :ROFLMAO:


That's exactly what I was thinking to be honest. That's why I left the "..." at the end of my post.

It seems a bit mad to me, but companies are constantly trying to overshadow competitors!


Seems a bit mad to me too. There are really only the two of them in this particular race.
It might look like one-upmanship to their advertising executives (or whoever), but to me anyway, it looks like being a "distinctly male anatomical appendage".
 

SpyderTracks

We love you Ukraine
That's a really good video, and I was wondering what their opinion on Z790 was.

My concern is that although power at the same frequency hasn't changed, these run at significantly higher frequencies, and from what I'm seeing WILL use far higher wattage (no idea on temps). I'm a little worried it will turn out very much like Ryzen 5000: although there was compatibility back to B450 and X470, there's no way you'd spec those boards in a new system, as the VRMs just couldn't handle the increased wattage, and it generally led to instability for any processor higher than the 5600X. There's a big difference between "compatibility" and "suitability".

Roughly 240W PL2 on the 12900K at 5.5GHz, which had a TDP of 125W.

Roughly 350W PL2 on the 13900K at 5.8GHz, which has a TDP of 125W.
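To put those figures side by side, here's a rough back-of-the-envelope comparison (a sketch only; the PL2 numbers quoted above are approximate, and real power draw depends heavily on workload and board power limits):

```python
# Rough comparison of frequency gain vs PL2 power gain, 12900K -> 13900K,
# using the approximate figures quoted above.

def pct_increase(old: float, new: float) -> float:
    """Percentage increase from old to new."""
    return (new - old) / old * 100.0

# (max boost in GHz, approximate PL2 in watts)
i9_12900k = (5.5, 240)
i9_13900k = (5.8, 350)

freq_gain = pct_increase(i9_12900k[0], i9_13900k[0])
power_gain = pct_increase(i9_12900k[1], i9_13900k[1])

print(f"Boost clock: +{freq_gain:.1f}%")   # ~ +5.5%
print(f"PL2 power:   +{power_gain:.1f}%")  # ~ +45.8%
```

In other words, on these quoted figures you're paying roughly 46% more peak power for about 5% more frequency, which is the heart of the concern above.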


They're heavily pushing, with smoke and mirrors, that it's not using any more power, and usually with Intel that means they're trying to hide something.

Seems to me Intel are panicking a bit. I wonder why they'd choose today of all days to release the pre-orders...
It's very Intel, I'm trying not to hate on them and just give facts so won't say any more.

I think 13th Gen is going to surprise most of us haters; I don't doubt it's going to be impressive in raw performance. I'm just concerned about power and heat, based on the numbers we're seeing and the 12th Gen issues.

It seems a bit mad to me, but companies are constantly trying to overshadow competitors!


Seems a bit mad to me too. There are really only the two of them in this particular race.
It might look like one-upmanship to their advertising executives (or whoever), but to me anyway, it looks like being a "distinctly male anatomical appendage".
It's gotten so bad over the last decade that it's basically not worth listening to any marketing; only go by third-party reviews and leaks from reputable outlets you trust. This goes for AMD, Intel and NVIDIA alike, all of them; you can't take anything they say as truth.
 

SimonPeters116

Well-known member
They are indeed, although we've only just received the performance metrics, and these slides would've been confirmed weeks ago. I doubt AMD would freely be giving information on their performance to Intel ;)

That said, assuming Intel will be showing themselves in the best possible light, the 12900K on average seems to just about beat out the 7950X and 7600X, and they have the 13900K outperforming the 12900K in every benchmark they show, which implies it'll be a chart topper. Again, take it all with a grain of salt though, you know what they're all like!




My apologies! Yeah, shame they couldn't all have their own 5 minutes, eh?!

Aye, nothing has changed there, to be frank.
It's never been a concern to me before this. I've never had the finances to be buying a "latest, greatest" anything.
By the time I was buying, all the lumps and bumps had been smoothed out; there was a chance it was starting to go rusty, even.
This time I'm buying a new thing the minute it comes on stream 😮 😁

Oooooh! I wondered why there were only three bars on their chart @IRLRobinS .
I missed that tiny hyphen altogether 😳
That is a sneaky one.
 

Salmon Fisher

Enthusiast
And in six months: "Hey! We're launching a new generation of architecture!" And the roundabout carries on. It really doesn't seem that long ago that the Intel P100 was 'the chip to beat all chips'. I just can't get all shivery 'n' excited any more. 🤷‍♂️
 

SpyderTracks

We love you Ukraine
And in six months: "Hey! We're launching a new generation of architecture!" And the roundabout carries on. It really doesn't seem that long ago that the Intel P100 was 'the chip to beat all chips'. I just can't get all shivery 'n' excited any more. 🤷‍♂️
Every release I tell myself afterwards never to listen to hype again, and every release I go through the same cycle of lapping up all the rubbish that people are paid to put out to mislead.

It's a never ending circle.

But every release, I get insanely excited. For me, the leaps being made in the last 5 years are so huge. Hardware is really outpacing software capabilities in some respects; gaming is playing catch-up to the GPUs at the moment.
 

SimonPeters116

Well-known member
And in six months "Hey! We're launching a new generation of architecture!" And the roundabout carried on. It really doesn't seem too long ago that the Intel P100 chip was 'the chip to beat all chips.'. I just can't get all shivery n excited any more. 🤷‍♂️
I remember the P100, but that's all I remember, and only because you've mentioned it.
When was it? And what was it clocked at?
P.S. No need to be exact; was it a '90s chip? :D
 

JUNI0R

VALUED CONTRIBUTOR

LTT seems to be putting clickbait titles on their videos now; ignore the title, it's actually pretty positive.
First time seeing an LTT video? ;) Feel like they've been like that for a while now

They've also done a graph with a filled-in 5800X3D bar that @IRLRobinS noticed 👀 Doesn't look quite so impressive now! There was also an interesting part about them using 4800MHz DDR5 to run the 12900K tests; I'm assuming this means their percentage gains are bigger? Not sure, bit odd!
[Image: benchmark graph screenshot]
 

SpyderTracks

We love you Ukraine
There was also an interesting part about them using 4800MHz DDR5 to run the 12900K tests; I'm assuming this means their percentage gains are bigger?
The 13900K was tested with DDR5-5600 C28 in every test.

The 12900K was tested with DDR5-4600 C28 (absolutely no reason it should have been different; the platform has supported 5600MHz for over a year).

The 5800X3D was tested with DDR4-3200 C14 (arguably they should have tested with at least 3600MHz, as it's a competitive chip).

It also looks like the 5950X was tested at the lowest 105W TDP, which strongly suggests PBO was disabled, which would hugely affect performance. This hasn't been officially corroborated AFAIK.

Source: https://edc.intel.com/content/www/us/en/products/performance/benchmarks/desktop/
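For a sense of why those mismatched memory configs skew the comparison, theoretical peak bandwidth scales linearly with transfer rate. A quick sketch (assuming a standard dual-channel, 64-bit-per-channel desktop setup; real-world throughput is lower, and latency matters too):

```python
# Theoretical peak memory bandwidth for the test configs quoted above.
# Peak GB/s = transfers per second x 8 bytes per transfer x number of channels.

def peak_bandwidth_gbs(mt_per_s: int, channels: int = 2, bytes_per_transfer: int = 8) -> float:
    """Theoretical peak bandwidth in GB/s for a given transfer rate in MT/s."""
    return mt_per_s * bytes_per_transfer * channels / 1000.0

configs = {
    "13900K  / DDR5-5600": 5600,
    "12900K  / DDR5-4600": 4600,
    "5800X3D / DDR4-3200": 3200,
}

for name, rate in configs.items():
    print(f"{name}: ~{peak_bandwidth_gbs(rate):.1f} GB/s peak")
```

So in those tests the 13900K had roughly 22% more peak memory bandwidth to work with than the 12900K (about 89.6 vs 73.6 GB/s) before the silicon even comes into it.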

This is nothing new for Intel. They quite frankly flat-out lie, to the point that they got annoyed when people called them out for it and suggested all benchmarks should be removed, since they kept being found out to be lying through their teeth. And they didn't just try it once; they tried it several times over a few years, even making it a contractual requirement that board partners and anyone involved in their microcode were not allowed to publicise any real-world benchmarks. They quickly reversed this when partners basically said they were leaving.


 
Last edited:

SpyderTracks

We love you Ukraine
Worth watching.

I'm a little hesitant with Jayz after his recommendation to buy the RTX 3000 series while prices were still remarkably high, which seemed rather a shill move.

But it's still worth watching

 