
joelypolly

macrumors 6502a
Original poster
Sep 14, 2003
511
218
Bay Area
With AnandTech's analysis here, and assuming relatively good scaling, we are looking at GPU performance between a 6800 and 6800 XT out of the rumored 32-core GPU at around 50 to 55W, and between a 5700 and 5700 XT for the 16-core model at < 30W. Both are probably going to be game changers for battery life when it comes to any serious GPU-related work.
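A rough back-of-the-envelope on what those rumored wattages would imply for efficiency. The desktop board-power figures below (~250W TBP for the RX 6800, ~225W for the RX 5700 XT) are approximate public specs, not numbers from this thread:

```python
# Approximate desktop total board power (TBP) specs, in watts (assumption,
# from public AMD spec sheets, not from this thread).
desktop_tbp = {"RX 6800": 250, "RX 5700 XT": 225}

# Rumored Apple GPU power draw at comparable performance, per the post above.
rumored_watts = {"32-core (~RX 6800 class)": 55, "16-core (~RX 5700 XT class)": 30}

def efficiency_gain(desktop_watts, apple_watts):
    """Factor by which perf/W improves at roughly equal performance."""
    return desktop_watts / apple_watts

print(efficiency_gain(desktop_tbp["RX 6800"], 55))      # ~4.5x
print(efficiency_gain(desktop_tbp["RX 5700 XT"], 30))   # 7.5x
```

Even if the rumored numbers are optimistic by half, that would still be a multiple-fold perf/W lead over the desktop parts.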
 

sunny5

macrumors 68000
Jun 11, 2021
1,679
1,522
[Image: Apple M1X GPU performance estimates]
 

Pro Apple Silicon

Suspended
Oct 1, 2021
361
426
That seems like pretty high performance? IDK. I only pay attention to GPU for driving displays and gaming. Precious few things use the GPU for hardware acceleration.
 

sunny5

macrumors 68000
Jun 11, 2021
1,679
1,522
That seems like pretty high performance? IDK. I only pay attention to GPU for driving displays and gaming. Precious few things use the GPU for hardware acceleration.
For its performance and power consumption, it would be a revolutionary GPU that even AMD and Nvidia can't make. Mobile RTX 3070 performance at only 40W is almost impossible; the mobile RTX 3070 itself consumes up to 120W, so 40W would be somewhat amazing... if it actually works. We'll see.
 

- rob -

macrumors 65816
Apr 18, 2012
1,007
683
Oakland, CA
I put a very speculative estimate of Metal performance on M1X 16 and 32 core GPUs in the XDR Owners thread.

My estimate is based more around what Apple would need to do to outperform official GPU boosts available for the Intel Mac Mini: the Blackmagic eGPU products.

I make a ton of assumptions in this, so I'd be happy to have it broken down if someone wants to add comments or rework the spreadsheet.
 

Jorbanead

macrumors 65816
Aug 31, 2018
1,205
1,434
I put a very speculative estimate of Metal performance on M1X 16 and 32 core GPUs in the XDR Owners thread.

My estimate is based more around what Apple would need to do to outperform official GPU boosts available for the Intel Mac Mini: the Blackmagic eGPU products.

I make a ton of assumptions in this, so I'd be happy to have it broken down if someone wants to add comments or rework the spreadsheet.
That’s a really interesting analysis and I think you may be right on your reasoning. Personally, the GPU on the M1 is the single biggest reason why I haven’t upgraded my 2018 mini + eGPU. It just can’t compete and frankly I want something even better than my Vega card. If the new 32-core GPU mini comes close to your estimations I will be extremely happy. My wallet is ready Apple…
 
  • Like
Reactions: - rob -

zakarhino

Contributor
Sep 13, 2014
2,480
6,709
Let’s get this straight:

If this ends up being true and it can sustain this level of performance in a Windows virtual machine running a game, and it also delivers unbeatable battery life/efficiency, a mini-LED ProMotion display, the ability to run macOS and Asahi Linux natively plus Windows in a virtual machine, the best trackpad hardware on a laptop, and great industrial design, then Apple will have the definitive, unquestionably best laptop on the market.

It will be practically impossible to call it a bad laptop unless it’s got an astronomical price tag. Even the Windows gaming crowd will have to give this a serious look.
 

sunny5

macrumors 68000
Jun 11, 2021
1,679
1,522
Let’s get this straight:

If this ends up being true and it can sustain this level of performance in a Windows virtual machine running a game, along with unbeatable battery life/efficiency, a mini LED ProMotion display, the ability to run macOS as well as Asahi Linux natively, Windows in a virtual machine, the best trackpad hardware on a laptop, great industrial design, then Apple will have the definitive, unquestionable best laptop on the market.

It will be practically impossible to call it a bad laptop unless it’s got an astronomical price tag. Even the windows gaming crowd will have to give this a serious look.
If it's true, then it would be revolutionary, because nobody can make an M1X-grade chip so far. Even Nvidia's GPUs consume a lot of power. But we still don't know anything about the unified memory, because a GPU requires high-bandwidth, fast memory such as GDDR6 or HBM2e. It's not yet proven, so we'll see.
 
  • Like
Reactions: BenRacicot

jjjoseph

macrumors 6502a
Sep 16, 2013
503
643
I would hope with shared memory we will have a 32+ GB option, since I want a 32+ core GPU and a 12+ core CPU and enough RAM for everything else. The more RAM the better, IMHO, with this shared architecture...

I want the new M1X to compete with workstations.. hopefully that is the goal.
 

- rob -

macrumors 65816
Apr 18, 2012
1,007
683
Oakland, CA
That’s a really interesting analysis and I think you may be right on your reasoning. Personally, the GPU on the M1 is the single biggest reason why I haven’t upgraded my 2018 mini + eGPU. It just can’t compete and frankly I want something even better than my Vega card. If the new 32-core GPU mini comes close to your estimations I will be extremely happy. My wallet is ready Apple…
Thank you. Are you using the Blackmagic Pro Vega 56 or a custom build?

In Apple land, the Blackmagic eGPUs are the only comparable option for sub-Mac Pro workstations.

So I don’t know what Apple could possibly do apart from releasing a new Apple Silicon eGPU that lifts the GPU performance of any Apple Silicon product (this would be great imo) or simply starting to release increasingly competitive compute in the machines themselves.

It seems like it is about time to put more distance between Apple and fast-follow attempts (they did this with the iPhone repeatedly).
 

Jorbanead

macrumors 65816
Aug 31, 2018
1,205
1,434
Thank you. Are you using the Blackmagic Pro Vega 56 or a custom build?

In Apple land, the Blackmagic eGPUs are the only possible comparable for sub-Mac pro workstations.

So I don’t know what Apple could possibly do apart from releasing a new Apple Silicon eGPU that lifts the GPU performance of any Apple Silicon product (this would be great imo) or simply start releasing increasingly competitive compute in the machines themselves.

It seems like it is about time to put more distance between Apple and fast follow attempts, (they did this with iPhone repeatedly)
I’m using a Sonnet box with a Vega 64 inside. It works really well, but I’ve certainly pushed it to its limits. I’m really hoping they announce the mini next week. My ideal setup would be a 10-core mini, 32-core GPU, 32 GB RAM, 1 TB SSD, with the same I/O as the 2018 mini.

I think from what we’ve seen with M1 (a 15W chip) they certainly could rock some powerful GPUs in their 45-65W devices, so this is one area I’m eager to see how it plays out. I’m certain the CPU will be plenty powerful.
 

leman

macrumors Core
Oct 14, 2008
19,184
19,038
I put a very speculative estimate of Metal performance on M1X 16 and 32 core GPUs in the XDR Owners thread.

Sorry, but things don't really work that way. Performance improvements from increasing the cluster size are linear, not multiplicative. The difference between the 13 and 13 Pro is roughly 25% because 5 is 25% larger than 4. That's it. Assuming the same technology and config (and brushing away external factors like cooling and RAM bandwidth), a 16-core GPU cluster is expected to be 2x faster than an 8-core cluster, and a 32-core cluster is expected to be 4x faster. Hence: around 42k GB5 for the 16-core GPU and around 85k GB5 for the 32-core GPU.
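The linear-scaling arithmetic is simple enough to sketch. The ~21k Geekbench 5 Metal baseline for the 8-core M1 used here is an approximate public result, not an official figure:

```python
# Linear scaling of GPU compute with core count, as described above:
# projected score = per-core score * core count.
M1_CORES = 8
M1_GB5_METAL = 21_000  # approximate M1 Geekbench 5 Metal score (assumption)

def projected_score(cores, base_cores=M1_CORES, base_score=M1_GB5_METAL):
    # Pure linear extrapolation; ignores cooling and memory-bandwidth limits.
    return base_score * cores / base_cores

print(projected_score(16))  # 42000.0
print(projected_score(32))  # 84000.0, in line with the ~85k estimate above
```

Doubling cores doubles the projection, nothing more; there is no compounding factor to multiply in.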
 
  • Like
Reactions: Fomalhaut

leman

macrumors Core
Oct 14, 2008
19,184
19,038
In Apple land, the Blackmagic eGPUs are the only possible comparable for sub-Mac pro workstations.

So I don’t know what Apple could possibly do apart from releasing a new Apple Silicon eGPU that lifts the GPU performance of any Apple Silicon product (this would be great imo) or simply start releasing increasingly competitive compute in the machines themselves.

Loss of eGPU support is definitely among the drawbacks of Apple Silicon. You do get a much more competent GPU solution than before, but some users will lose flexibility. It is certainly a tough call: raise the GPU level for everybody, but potentially make the experience worse for a small group that relies on powerful eGPUs to do their work...

However, the good news is that professional applications (once properly optimized) run much better on Apple GPUs because they can leverage Apple's unique technology and thus utilize the hardware much more efficiently. For most pro-level workloads, a powerful GPU spends most of its time copying data around. That's why one usually sees very bad performance scaling (e.g. a GPU that's nominally 2x faster only sees a 10-20% improvement in performance). For many applications, Apple GPUs are going to be significantly faster than traditional GPUs that are supposed to be better on paper. You can already see it with the M1, which outperforms much larger machines for tasks such as image and video processing.

Personally, I believe that the isolated and controlled nature of Apple's development ecosystem will, maybe paradoxically, lead to a new golden age of the platform. Apple deprecated the "standard" GPU technologies, which forced the developers to move on and embrace new Apple technologies. The amount of professional software that has adopted Apple Metal and M1-specific features in the last year has been unprecedented. It seems clear that the developers do not want to lose the market, and that they rush to embrace the stability and performance advantages of the new platform.
 
Last edited:

Mayo86

macrumors regular
Nov 21, 2016
104
304
Canada
Let’s get this straight:

If this ends up being true and it can sustain this level of performance in a Windows virtual machine running a game, along with unbeatable battery life/efficiency, a mini LED ProMotion display, the ability to run macOS as well as Asahi Linux natively, Windows in a virtual machine, the best trackpad hardware on a laptop, great industrial design, then Apple will have the definitive, unquestionable best laptop on the market.

It will be practically impossible to call it a bad laptop unless it’s got an astronomical price tag. Even the windows gaming crowd will have to give this a serious look.
The M1 GPU already has decent performance somehow under Windows running in Parallels or through CrossOver. I want to say it outperformed some of the Intel-based MacBook Pros frequently. I have little doubt that the 32-core GPU variant of the Mx(y?) chip will be more than capable of what you are thinking it can do. I have to say I was downright astonished that even the 8-core M1 GPU held up decently in a virtualized gaming situation. And from what I gather the performance should scale relatively linearly. The only thing I am potentially squeamish about is the price bump from the 16-core GPU to the 32-core GPU.
 
  • Like
Reactions: EmotionalSnow

pshufd

macrumors G3
Oct 24, 2013
9,942
14,437
New Hampshire
This kind of GPU performance is a big boost, though I'm actually fine with M1 performance - my complaint is the number of monitors that you can run off of an M1. So I'll be happy with whichever GPU, I'm sure. It would be amazing if Apple's GPU efficiency is as great as in the second post.
 

UBS28

macrumors 68030
Oct 2, 2012
2,893
2,340
How about we wait a couple of days and we'll know the answers. There's no point in speculating anymore.
 
  • Angry
Reactions: N69AP

pshufd

macrumors G3
Oct 24, 2013
9,942
14,437
New Hampshire
How about we wait a couple of days and we know the answers. There is no point of speculations anymore.

We will really know in a few weeks when the reviewers get their hands on them and put them through their paces. There is nothing in my workload to really test the GPUs and I'm sure that my stuff will work fine with whatever they put out. I'll just watch the review videos to see how good Apple's M1X GPU is.
 