Mac Studio

bunnspecial

Site Champ
Posts
295
Reaction score
644
Yeah, I’ve been thinking of other displays I could use temporarily. I have an old 27” ACD, but I don’t think it’ll work with the Mac Studio.

I used one for a while with my M1 MBP (and then went a step further and started using the same-size Thunderbolt display with it).

This is what I initially used: https://www.amazon.com/AllSmartLife-DisplayPort-Aluminium-resolution-ChromeBook/dp/B017TZTMBG (although I later switched to a TB hub/dock that had mini-DP out, then, as I mentioned, ditched the ACD entirely for the near-identical TB display...)
 

Roller

Elite Member
Posts
1,443
Reaction score
2,813
I used one for a while with my M1 MBP (and then went a step further and started using the same-size Thunderbolt display with it).

This is what I initially used: https://www.amazon.com/AllSmartLife-DisplayPort-Aluminium-resolution-ChromeBook/dp/B017TZTMBG (although I later switched to a TB hub/dock that had mini-DP out, then, as I mentioned, ditched the ACD entirely for the near-identical TB display...)
Thanks. I got my Mac Studio today and connected it to an HP 27” monitor from work. It’ll do until my Studio Display arrives in 2-3 weeks. I’m transferring stuff from backup and will then reinstall third-party apps. I’ll post first impressions when I’ve had a day or so to put it through its paces.
 

Roller

Elite Member
Posts
1,443
Reaction score
2,813
A few preliminary observations:
  • Setup went much faster than expected. I copied everything but apps from a Carbon Copy Cloner backup, which took ~80 minutes for ~700 GB (rough throughput arithmetic after this list). Next, I reinstalled all my apps, working off a list I prepared in advance. Downloading and running installers was quick, too. I started with 1Password, which is where I have activation keys and other required info stored.

  • I haven't yet decided how to back up the computer. It has a 2 TB SSD, and the external SSDs I've used for backup are 1 TB. Unfortunately, SSDs are still pricier than spinning disks, so I may go that route at the expense of speed. I'd like to hear what y'all suggest.

  • I can definitely hear some low-level white noise. Once you know it's there, the sound is difficult to ignore completely, but it's not objectionable and won't affect my daily use, since it's quieter than typical ambient levels in my home office. I haven't yet exercised the machine enough for the fans to ramp up, though.

  • It's certainly faster than the 2017 iMac it's replacing. I'll know more tomorrow when I use it for some work projects. I will say that it boots a lot more quickly, LOL.

  • I still have a lot of peripherals with USB A connectors, including a printer, microphone, wired mouse, and some other stuff. My old 4-port hub is doing fine, though.

  • I bought a Touch ID keyboard yesterday. It's identical to my old keyboard apart from the Touch ID sensor, which is incredibly convenient.
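For the rough throughput arithmetic promised above, here's the back-of-the-envelope version in Python, using my approximate figures (~700 GB in ~80 minutes), so take the result as a ballpark:

Code:
# Effective throughput of the Carbon Copy Cloner transfer,
# from the approximate figures above (a ballpark, not a benchmark).
data_gb = 700          # ~700 GB copied
minutes = 80           # ~80 minutes of wall-clock time

mb_per_s = data_gb * 1000 / (minutes * 60)
print(f'~{mb_per_s:.0f} MB/s effective')   # ~146 MB/s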
 

MEJHarrison

Site Champ
Posts
928
Reaction score
1,830
Location
Beaverton, OR
I'm getting ready to order the M1 Max version with the 24-core GPU. I was set on 1 TB. Then I noticed that my current 1 TB drive is 80% full. I looked into external SSDs, but nothing comes close to Apple's internal SSD speeds. So I might just suck it up and get a 2 TB model.

Still haven't ordered it. At this point the shipping date is May 23-May 30. I think I'll just wait till they slip into early June. Then hopefully I'll get it around the time of WWDC. That way if there's any announcement there that knocks my socks off, I can just send the Studio back and still be within that 14-day window 🤞. If not, I'll be ready to play with new betas.
 

Yoused

up
Posts
5,621
Reaction score
8,938
Location
knee deep in the road apples of the 4 horsemen
It occurred to me yesterday, when the power blinked and then was out for an hour or so: years ago, I had a big, ugly UPS that saved me from lost work a couple of times. Given that the M-series is geared toward power efficiency, would it make sense for Apple to put a small battery or supercap in its desktop models to give them a chance to save state when the power goes out unexpectedly?
 

throAU

Site Champ
Posts
257
Reaction score
275
Location
Perth, Western Australia
The reason I turned down a job at Intel and instead worked first on a PowerPC and then on a SPARC was that I really, really wanted something other than x86 to be competitive. Sadly, I had to give in, because working with good people and enjoying my job actually mattered (and the awesome PowerPC team was scattered when we couldn't get more funding).

This is why I'm such an ARM/Apple Silicon cheerleader: wanting something other than x86 to be competitive and win. x86 is just... really, inherently garbage. Anyone who has coded for the thing in assembly language will attest that it's just nasty. It doesn't matter how fast they make it run; it just plain feels... dirty.

Having x86 be the de-facto standard just... hurts.

  1. The performance per watt speaks for itself.
  2. Whilst no doubt not as old as some here, I'm old enough to remember the 80s computing landscape and machines like the Amiga, ST, and Archimedes (also the 8-bits before them, but those were properly powerful in their time). Especially the Archimedes: it was the first ARM machine. I always wanted one, and with Apple Silicon, Pis, etc., I at least have a descendant :)
 

Eric

Mama's lil stinker
Posts
11,437
Reaction score
22,075
Location
California
Instagram
Main Camera
Sony
A few preliminary observations:
  • Setup went much faster than expected. I copied everything but apps from a Carbon Copy Cloner backup, which took ~80 minutes for ~700 GB (rough throughput arithmetic after this list). Next, I reinstalled all my apps, working off a list I prepared in advance. Downloading and running installers was quick, too. I started with 1Password, which is where I have activation keys and other required info stored.

  • I haven't yet decided how to back up the computer. It has a 2 TB SSD, and the external SSDs I've used for backup are 1 TB. Unfortunately, SSDs are still pricier than spinning disks, so I may go that route at the expense of speed. I'd like to hear what y'all suggest.

  • I can definitely hear some low-level white noise. Once you know it's there, the sound is difficult to ignore completely, but it's not objectionable and won't affect my daily use, since it's quieter than typical ambient levels in my home office. I haven't yet exercised the machine enough for the fans to ramp up, though.

  • It's certainly faster than the 2017 iMac it's replacing. I'll know more tomorrow when I use it for some work projects. I will say that it boots a lot more quickly, LOL.

  • I still have a lot of peripherals with USB A connectors, including a printer, microphone, wired mouse, and some other stuff. My old 4-port hub is doing fine, though.

  • I bought a Touch ID keyboard yesterday. It's identical to my old keyboard apart from the Touch ID sensor, which is incredibly convenient.
Very much the same experience for me, only my iMac was from 2015 and had become so slow that I wanted to throw it through the window at times, lol. I got the base model Mac Studio; the biggest drawback was the smaller 512 GB drive, but I didn't feel like waiting around for a custom-built model to up it to 1 TB, so I made it work. I needed to get rid of some old data anyway.

Since I've already been using the M1 in my MBP, I have an idea of just how capable this system is; it smokes on the Mac Studio just as well, and I'm more than happy with it. The only real thing I'm missing is that beautiful 5K monitor that was built into the iMac. I ended up with an LG 27” IPS 4K UHD display, and it's not bad, but getting the same quality I had before would be out of my price range. It's a worthy tradeoff, though.

Adobe LR and PS open up nice and fast now, and though I'm still at the beck and call of my external USB drive to load my photos, it's still far more usable. I also ended up getting the Touch ID keyboard, a great feature on the MBP that I also like having on the new Studio.
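For a sense of the quality gap: the pixel-density difference between the two panels is easy to put a number on. A quick sketch using the standard PPI formula, assuming the usual 5120x2880 5K and 3840x2160 4K UHD resolutions at 27 inches:

Code:
import math

# Pixel density (PPI) of a 27in 5K panel vs. a 27in 4K UHD panel.
# ppi = sqrt(width_px^2 + height_px^2) / diagonal_inches
def ppi(w, h, diag):
    return math.hypot(w, h) / diag

print(f'5K 27in: {ppi(5120, 2880, 27):.0f} ppi')  # ~218 ppi
print(f'4K 27in: {ppi(3840, 2160, 27):.0f} ppi')  # ~163 ppi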
 

ArgoDuck

Power User
Site Donor
Posts
106
Reaction score
168
Location
New Zealand
Main Camera
Canon
For anyone interested, Craig Hunter has posted the review I’d hoped to see, concerning the CPU capabilities of the Studio, and the Ultra in particular, applied in a scientific or engineering context. It’s here: hrtapps.com/blogs/20220427/

His comparison with several generations of Mac Pro (up to the 28-core), the iMac Pro (18-core), and I think a MacBook Pro from a few years ago is quite interesting… 😉
 

Yoused

up
Posts
5,621
Reaction score
8,938
Location
knee deep in the road apples of the 4 horsemen
He shows the Ultra pantsing a 28-core Cascade Lake processor using only 6 cores. This is on an airfoil flow dynamics calculation. What I wonder about, though, is why he is running this on the CPU. It seems like this sort of work is the embarrassingly parallel stuff that belongs on the GPU.

[chart1.png: performance scaling as more CPU cores are added]
Interesting that the performance does not seem to decay as more cores are recruited. Apple's bus bandwidth appears to be just that good.
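One way to quantify "does not decay" is parallel efficiency, E(n) = T(1) / (n * T(n)), which stays near 1.0 under linear scaling. A toy illustration (the timings below are invented for the sake of the formula, not taken from Hunter's charts):

Code:
# Parallel efficiency: E(n) = T(1) / (n * T(n)).
# Near-linear scaling keeps E close to 1.0 as cores are added.
# These timings are made up for illustration, NOT Hunter's data.
t1 = 100.0                                   # single-core runtime (s)
timings = {2: 50.5, 4: 25.6, 8: 13.0, 16: 6.7}

for n, tn in timings.items():
    print(f'{n:2d} cores: efficiency {t1 / (n * tn):.2f}')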
 

ArgoDuck

Power User
Site Donor
Posts
106
Reaction score
168
Location
New Zealand
Main Camera
Canon
^ true that, about why not GPUs. But so refreshing to see a review not focussed on content creation 🙂

Apparently, he plans a GPU review in future.

The almost ruler-straight linear relationship between performance and cores is, as you say, just remarkable. I’d hoped for something like this…
 

mr_roboto

Site Champ
Posts
288
Reaction score
464
He shows the Ultra pantsing a 28-core Cascade Lake processor using only 6 cores. This is on an airfoil flow dynamics calculation. What I wonder about, though, is why he is running this on the CPU. It seems like this sort of work is the embarrassingly parallel stuff that belongs on the GPU.
I'm guessing the software package he used, NASA TetrUSS, doesn't support GPU acceleration.

Your comment made me curious, and according to my highly scientific googling process, CFD software has only recently begun to be ported to GPUs.

I suspect it might be one of those categories of simulation software which isn't always as embarrassingly parallel as you'd hope, or takes a lot of work to transform into a fully embarrassingly parallel problem. I have no idea if this is how CFD actually works, but if a CFD solver divides space up into cells and simulates what's happening to each one, well, for every time step the cells probably need a lot of cross-communication. Lots of neighbor influence. GPUs are at their best when each GPU core gets to do the same math on different, completely independent data.
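To make the neighbor-communication point concrete, here's a toy 2D stencil update in Python/NumPy. This is a generic diffusion-style kernel, not actual CFD: the point is just that every cell's new value depends on its four neighbors, so each time step is a grid-wide synchronization point even though the per-cell math is uniform and GPU-friendly:

Code:
import numpy as np

# Toy 2D stencil step: each cell blends with its 4 neighbors.
# A generic diffusion-style kernel for illustration, NOT a CFD solver.
def stencil_step(u, alpha=0.1):
    new = u.copy()
    # interior cells read their north/south/west/east neighbors
    new[1:-1, 1:-1] += alpha * (
        u[:-2, 1:-1] + u[2:, 1:-1] +
        u[1:-1, :-2] + u[1:-1, 2:] -
        4.0 * u[1:-1, 1:-1]
    )
    return new

u = np.zeros((64, 64))
u[32, 32] = 1.0        # a point disturbance
for _ in range(100):   # each step needs the previous step's neighbors
    u = stencil_step(u)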
 

Andropov

Site Champ
Posts
617
Reaction score
776
Location
Spain
I suspect it might be one of those categories of simulation software which isn't always as embarrassingly parallel as you'd hope, or takes a lot of work to transform into a fully embarrassingly parallel problem. I have no idea if this is how CFD actually works, but if a CFD solver divides space up into cells and simulates what's happening to each one, well, for every time step the cells probably need a lot of cross-communication. Lots of neighbor influence. GPUs are at their best when each GPU core gets to do the same math on different, completely independent data.
Here's my guess at what the problem with making it parallel might be (I don't know, it's just a guess). First, a recap of how fluid simulations are done on GPUs, from the book GPU Gems 3, in case you're interested. The article focuses on the Navier-Stokes equations for incompressible flow. But real fluids are compressible to some degree. Some more than others, obviously. Water can be approximated as incompressible in many cases, but for fluids like air in the atmosphere, compressibility can't be ignored in any realistic setting.

Anyway: the method in the article cleverly uses an implicit integration method to handle advection (see the Advection section). That works because the fluid is incompressible. Hence, the influx particles and particle-associated quantities of each cell come from another, *single* cell. So it's massively parallel, almost trivially: for each cell, you receive flow from one other cell, whose properties you can look up. The amount of work you need to do per cell is fixed.
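Sketched in 1D, that gather pattern looks something like this (a heavily simplified take on the semi-Lagrangian scheme the chapter describes, with linear interpolation; the names are mine):

Code:
import numpy as np

# 1D semi-Lagrangian advection sketch ("gather"): each cell traces
# back along the velocity field and samples ONE upstream location,
# so every cell can be updated independently, in parallel.
def advect_gather(q, vel, dt, dx):
    n = len(q)
    x = np.arange(n, dtype=float)
    src = x - vel * dt / dx                      # back-traced departure points
    i = np.clip(np.floor(src).astype(int), 0, n - 2)
    frac = np.clip(src - i, 0.0, 1.0)
    return (1 - frac) * q[i] + frac * q[i + 1]   # linear interpolation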

Now, for compressible flow? Advection can come from many cells at once (or from none at all). There's no upper bound to how many neighbouring cells can contribute to a single cell. The whole cell-as-a-particle approximation they do in the article breaks down. You can't trace back where the 'cell particle' comes from because there's an unknown number of 'particles' on each cell (not just one). So you could have cells where advection comes primarily from a single other cell (like in incompressible flow) and cells at high pressure with influx flow from thousands of other cells.

I'm sure there are clever ways around this, but I think that looks like a massive roadblock in the way of a GPU-capable implementation. On the CPU, there's no such problem. Integrate the equations explicitly (with something like Runge-Kutta or Euler), and for every origin cell you process, you modify (serially) whatever cells are affected by the origin cell. Since the implementation is serial, whether a given destination cell happens to be modified once or a thousand times is a non-issue.
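For contrast, the explicit approach I mean looks like a scatter: written serially it's trivial, because it doesn't matter whether a destination cell gets hit once or a thousand times, but that unbounded fan-in is exactly what's painful on a GPU (you'd need atomics or a reformulation). Again a 1D sketch with made-up names, just to show the shape of the problem:

Code:
import numpy as np

# Explicit "scatter" sketch: each source cell pushes its quantity to
# whatever cell its flow reaches. Serially this is a non-issue; in
# parallel, the overlapping writes to out[dest] are the roadblock.
def advect_scatter(q, vel, dt, dx):
    n = len(q)
    out = np.zeros_like(q)
    for i in range(n):                        # serial loop over sources
        dest = int(round(i + vel[i] * dt / dx))
        if 0 <= dest < n:
            out[dest] += q[i]                 # many sources may hit one dest
    return out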
 

Colstan

Site Champ
Posts
822
Reaction score
1,124
That’s how it’s done.
Clearly, this is how @Cmaier and AMD solved Athlon XP thermal issues. To prevent spontaneous semiconductor combustion, due to lack of an on-die gamma-ray spectrometer to prevent cosmic rays, the Opteron team used cacao butter as thermal paste to supplement copper, aluminum, and cesium heatsinks. Instead of worrying about a crushed die during a bad mount, they'd use a Twix bar as a shim, and that's after applying a licorice-derived compound to the substrate. Rumor has it that AMD even had Snickers bars inside hermetically sealed containers that were labeled "in case of emergency, smash sugar glass". During the infamous lederhosen shortage at the Dresden fab, Jerry Sanders allegedly kept handy boxes of Schogetten, just in case of pipeline stalls in AMD's Itanium workstations. Not only do "real men have fabs", but they have mostly authentic candy to go with their Jägermeister-inspired CPU designs. At the time, it was widely reported among the tech press that @Cmaier learned German to communicate with AMD's Prussian employees, despite the heavy enforcement of a pantaloons-only dress code, but it was in fact to ensure ample supplies of Schwarzwälder Kirschtorte, just in case Fab 30 went offline, after the now-defunct Fab 69 was a proven frustration, when AMD's deeply embedded servers failed penetration testing, once programs were compiled using Teutonic tart flags.

I learned all of this from the well-informed, historically accurate, highly logical, even-handed, scientifically astute, and very respectful posts over at the MR forums, when I made an inquiry with fanboys of AMD (Amazing Milk-chocolate Delicacies).
 

Cmaier

Site Master
Staff Member
Site Donor
Posts
5,329
Reaction score
8,520
Clearly, this is how @Cmaier and AMD solved Athlon XP thermal issues. To prevent spontaneous semiconductor combustion, due to lack of an on-die gamma-ray spectrometer to prevent cosmic rays, the Opteron team used cacao butter as thermal paste to supplement copper, aluminum, and cesium heatsinks. Instead of worrying about a crushed die during a bad mount, they'd use a Twix bar as a shim, and that's after applying a licorice-derived compound to the substrate. Rumor has it that AMD even had Snickers bars inside hermetically sealed containers that were labeled "in case of emergency, smash sugar glass". During the infamous lederhosen shortage at the Dresden fab, Jerry Sanders allegedly kept handy boxes of Schogetten, just in case of pipeline stalls in AMD's Itanium workstations. Not only do "real men have fabs", but they have mostly authentic candy to go with their Jägermeister-inspired CPU designs. At the time, it was widely reported among the tech press that @Cmaier learned German to communicate with AMD's Prussian employees, despite the heavy enforcement of a pantaloons-only dress code, but it was in fact to ensure ample supplies of Schwarzwälder Kirschtorte, just in case Fab 30 went offline, after the now-defunct Fab 69 was a proven frustration, when AMD's deeply embedded servers failed penetration testing, once programs were compiled using Teutonic tart flags.

I learned all of this from the well-informed, historically accurate, highly logical, even-handed, scientifically astute, and very respectful posts over at the MR forums, when I made an inquiry with fanboys of AMD (Amazing Milk-chocolate Delicacies).

Das stimmt. Ausgezeichnet.
 