No, do not do that. It never works.

Cmaier

Site Master
Staff Member
Site Donor
Posts
5,295
Reaction score
8,453
In the talk, available online via YouTube, Keller discusses how when planning the Zen 3 core – now at the heart of AMD's "Milan" Epyc processor chips – he and other engineers realized that much of the architecture was very similar for Arm and X86 "because all modern computers are actually RISC machines inside," and hence according to Keller, "the only blocks you have to change are the [instruction] decoders, so we were looking to build a computer that could do either, although they stupidly cancelled that project."

No. Just no.

And I worked at a place where they tried the whole “just make a CPU that can do either!” thing. Then they gave up and made it RISC-only, leaving on the silicon some detritus that was to have supported x86.

It doesn’t work. It just doesn’t. It might work if you weren’t competing against companies that don’t carry this dual-ISA burden. But what you will end up with is a degraded RISC machine and a degraded x86.

And it’s not “just RISC on the inside”. It’s just not. It’s a nice quote, but…

Also: what would the purpose of this chip be?
 

throAU

Site Champ
Posts
256
Reaction score
273
Location
Perth, Western Australia
It doesn’t work. It just doesn’t. It might work if you weren’t competing against companies that don’t carry this dual-ISA burden. But what you will end up with is a degraded RISC machine and a degraded x86.

Pretty much.

See: Itanic

Not RISC but same premise.

Given Apple (and others before them) have proven that software translation gets you close enough for legacy-compatibility purposes - I'm not sure why you'd waste the hardware on it.

Not only are you probably getting inferior results to a purer design - you're encouraging people to keep compiling for the legacy architecture, thus ensuring your shiny new arch never gets proper support.

I mean, how long did it take for x64 to properly take off (as in, displace 32-bit x86 as the OS people installed by default, despite their hardware fully supporting 64-bit)?
 

Andropov

Site Champ
Posts
615
Reaction score
773
Location
Spain
[image attachment]
 

Colstan

Site Champ
Posts
822
Reaction score
1,124
"because all modern computers are actually RISC machines inside," and hence according to Keller

I'm an average user, not a chip architect, just someone who has had a passing interest in semiconductor design since I started building my own PCs as a teenager. In other words, I'm an average tech nerd, no more, no less.

I'm not old enough to be wise; I'm not young enough to know everything.

However, I've watched repeatedly as other tech enthusiasts, who are more or less at my experience level, worship Jim Keller as if he were the most brilliant man in the industry, the mythological Nikola Tesla of chip engineers. For instance, Moore's Law is Dead released a video with the title "Jim Keller's Royal Core designed to Kill Zen 5", with this as the thumbnail, despite Keller having no personal involvement with the video:

[thumbnail image]


In recent years, I've seen this cult-hero worship of Jim Keller repeated within PC circles, and have never seen another engineer placed on a pedestal of honor the way Keller has been. For Intel fanboys, he's the savior who architected the phoenix with which Intel will rise from the ashes, reborn into its rightful role as the premier technology company. This is the same genius savant who was ousted from Intel for "personal reasons", and the circumstances of his departure are still unclear.

The overall PC Master Race sees Keller's time at Intel as the kickstart of x86 designs finally overtaking Apple, leaving Arm designs in the dust and handily taking the efficiency and performance crown within the next couple of years. This miraculous new "Royal Core" project that Keller spearheaded at Intel, along with the brain drain caused by the Nuvia exodus, spells Apple's ultimate doom. The Mac has already fallen into the dustbin of history, a dead platform that simply hasn't realized its demise is already complete.

As PC fanboys have predicted for decades, Michael Dell's 1997 advice for Steve Jobs was right: Jobs should "shut it down and give the money back to the shareholders". Despite Apple being the most valuable company in the world, and having been deemed the most valuable brand in the world just three days ago, it's time for Tim Cook to close up shop and finally be done with it. Only then will PC fanboys, both at MacRumors and beyond, be able to take a well-deserved respite from their holy quest.

Some of this mythos surrounding Keller probably has to do with him being relatively media-savvy, at least for an engineer. He's been on Lex Fridman's podcast not just once but twice thus far. I watched parts of both, and the only thing I can recall from either of those podcasts is that Jim Keller is related to Jordan Peterson. That's literally the only detail I can remember from those interviews.

As I said, I'm by no means an expert in my own right. However, from the quotes that @Cmaier has provided, even I know that Keller is barking up the wrong tree. As I have learned from numerous intelligent individuals, including folks on this forum, the difference between RISC and CISC isn't trivial, and switching around a few gewgaws inside a chip with billions of transistors isn't going to magic a miraculous product into existence, one which can easily handle both x86 and Arm instructions with some translation circuitry tacked on.

In a previous thread I referred to Jim Keller as a "brilliant engineer", and @Cmaier said that he wasn't sure if he was brilliant, but that he has worked with folks, many of them not famous outside of select tech circles, who could be classified as such. I was saying that mainly based upon Keller's surface reputation (and also just wanting to be polite), but am now rethinking much of what I had assumed about Keller. He is, at the very least, a smart man. Given what he has accomplished, he has to be intelligent.

However, I'm now left wondering how much is myth and how much is reality. Nikola Tesla was undoubtedly brilliant, but much of his reputation today came about long after his death. I am left wondering how much of Keller's reputation is deserved, and how much is simply a cult of personality.

AMD most likely made a mistake in not keeping an Arm project going. Intel, meanwhile, was handed manna from heaven when StrongARM fell into their laps after they picked over DEC's desiccated bones, only to toss it all away when they sold XScale to Marvell. (Not the comic book company, though that would have had the same result.) Imagine where Intel would be now if they had kept an ARMv4 product designed by the DEC Alpha team.

Intel's historical blunders aside, Keller's sentiment may be correct, but his statement boiling modern CISC down into nothing more than an internal RISC design shows that he's either playing it up for the cameras like an attention whore, or making assumptions that are demonstrably untrue, as @Cmaier and others have repeatedly explained in great detail.
 

Cmaier

Site Master
Staff Member
Site Donor
Posts
5,295
Reaction score
8,453
I'm an average user, not a chip architect, just someone who has had a passing interest in semiconductor design since I started building my own PCs as a teenager. In other words, I'm an average tech nerd, no more, no less.

I'm not old enough to be wise; I'm not young enough to know everything.

However, I've watched repeatedly as other tech enthusiasts, who are more or less at my experience level, worship Jim Keller as if he were the most brilliant man in the industry, the mythological Nikola Tesla of chip engineers. For instance, Moore's Law is Dead released a video with the title "Jim Keller's Royal Core designed to Kill Zen 5", with this as the thumbnail, despite Keller having no personal involvement with the video:

[thumbnail image]

In recent years, I've seen this cult-hero worship of Jim Keller repeated within PC circles, and have never seen another engineer placed on a pedestal of honor the way Keller has been. For Intel fanboys, he's the savior who architected the phoenix with which Intel will rise from the ashes, reborn into its rightful role as the premier technology company. This is the same genius savant who was ousted from Intel for "personal reasons", and the circumstances of his departure are still unclear.

The overall PC Master Race sees Keller's time at Intel as the kickstart of x86 designs finally overtaking Apple, leaving Arm designs in the dust and handily taking the efficiency and performance crown within the next couple of years. This miraculous new "Royal Core" project that Keller spearheaded at Intel, along with the brain drain caused by the Nuvia exodus, spells Apple's ultimate doom. The Mac has already fallen into the dustbin of history, a dead platform that simply hasn't realized its demise is already complete.

As PC fanboys have predicted for decades, Michael Dell's 1997 advice for Steve Jobs was right: Jobs should "shut it down and give the money back to the shareholders". Despite Apple being the most valuable company in the world, and having been deemed the most valuable brand in the world just three days ago, it's time for Tim Cook to close up shop and finally be done with it. Only then will PC fanboys, both at MacRumors and beyond, be able to take a well-deserved respite from their holy quest.

Some of this mythos surrounding Keller probably has to do with him being relatively media-savvy, at least for an engineer. He's been on Lex Fridman's podcast not just once but twice thus far. I watched parts of both, and the only thing I can recall from either of those podcasts is that Jim Keller is related to Jordan Peterson. That's literally the only detail I can remember from those interviews.

As I said, I'm by no means an expert in my own right. However, from the quotes that @Cmaier has provided, even I know that Keller is barking up the wrong tree. As I have learned from numerous intelligent individuals, including folks on this forum, the difference between RISC and CISC isn't trivial, and switching around a few gewgaws inside a chip with billions of transistors isn't going to magic a miraculous product into existence, one which can easily handle both x86 and Arm instructions with some translation circuitry tacked on.

In a previous thread I referred to Jim Keller as a "brilliant engineer", and @Cmaier said that he wasn't sure if he was brilliant, but that he has worked with folks, many of them not famous outside of select tech circles, who could be classified as such. I was saying that mainly based upon Keller's surface reputation (and also just wanting to be polite), but am now rethinking much of what I had assumed about Keller. He is, at the very least, a smart man. Given what he has accomplished, he has to be intelligent.

However, I'm now left wondering how much is myth and how much is reality. Nikola Tesla was undoubtedly brilliant, but much of his reputation today came about long after his death. I am left wondering how much of Keller's reputation is deserved, and how much is simply a cult of personality.

AMD most likely made a mistake in not keeping an Arm project going. Intel, meanwhile, was handed manna from heaven when StrongARM fell into their laps after they picked over DEC's desiccated bones, only to toss it all away when they sold XScale to Marvell. (Not the comic book company, though that would have had the same result.) Imagine where Intel would be now if they had kept an ARMv4 product designed by the DEC Alpha team.

Intel's historical blunders aside, Keller's sentiment may be correct, but his statement boiling modern CISC down into nothing more than an internal RISC design shows that he's either playing it up for the cameras like an attention whore, or making assumptions that are demonstrably untrue, as @Cmaier and others have repeatedly explained in great detail.

Jim’s a smart guy. When I landed at AMD after Sun and Exponential, AMD eventually offered me a deal that made it impossible to leave; otherwise maybe I’d have switched jobs every few years too. I guess Tesla and Intel and Apple backed up trucks full of money to get him to jump around.

Anyway, I have nothing bad to say about Jim, and I wish him the best. I have no insight into what he’s done since we worked together, but I think you raise some good questions. Certainly, if I didn’t know him, and I got a resume where someone changed jobs every 2 or 3 years but was credited with finishing the complete architecture for a big CPU in each of those windows, I’d at least have some questions to ask at the interview.
 

Colstan

Site Champ
Posts
822
Reaction score
1,124
Jim’s a smart guy. When I landed at AMD after Sun and Exponential, AMD eventually offered me a deal that made it impossible to leave; otherwise maybe I’d have switched jobs every few years too. I guess Tesla and Intel and Apple backed up trucks full of money to get him to jump around.

Anyway, I have nothing bad to say about Jim, and I wish him the best. I have no insight into what he’s done since we worked together, but I think you raise some good questions. Certainly, if I didn’t know him, and I got a resume where someone changed jobs every 2 or 3 years but was credited with finishing the complete architecture for a big CPU in each of those windows, I’d at least have some questions to ask at the interview.
I think what I find most intriguing isn't Jim Keller himself; as you said, he's a smart guy. What is more interesting is how he became a symbolic representation of chip-design innovation to a certain subset of tech nerds and the press outlets that serve them. There are plenty of brilliant engineers, dozens if not hundreds of folks working on any one architecture generation, but he's become a folk hero. His reputation has gotten to the point where everything he touches turns to gold, and there is the assumption that whichever company he works for will automatically have the advantage in the chip generations he was involved with.

My question is how this phenomenon evolved, and whether it's truly deserved. Like you said, he's a smart guy, but his reputation is one of absolute genius. Having never been on the inside, I don't fully grasp how CPU engineering teams work, but I do believe that the success (or failure) of the end product depends on more than the contributions of a single individual, particularly given how much more complex these designs have become compared to just a decade ago.

Anyway, my point was never to disparage or diminish Jim Keller's contributions to the industry; I'm simply trying to wrap my head around the status he has acquired, even if he isn't cultivating it intentionally and the tech press and PC nerds are doing it all for him.
 

mr_roboto

Site Champ
Posts
282
Reaction score
453
Here's a question to ponder: if there's actually any possibility of this idea working well in reality, which pair of architectures would it be? I suspect there are at least some pairs of RISC ISAs which could work reasonably well.

In fact, this is something many Arm chips already do. An Armv8 core's decoders and execution units operate in one of three ISA modes - A32, T32 (Thumb), or A64. Support for all three ISAs in one core is optional, not mandatory. However, in practice, many Arm Holdings cores do support both A32 and A64. (Apple's early 64-bit Arm designs also supported both, but after a few years of that they deleted 32-bit support from their cores and 32-bit-only apps from the iOS App Store.)

While T32 instructions are mostly different encodings of A32 instructions, A64 is essentially a new ISA. That said, obviously A64's designers had the opportunity and motivation to make sure A32 support didn't cause too many headaches in a 64-bit Arm implementation.
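
To put the decoder question in concrete terms, here's a toy C sketch of my own - the two instruction encodings are real, but the "decoders" are purely illustrative and have nothing to do with actual front-end hardware. It hand-decodes the same register-to-register add in A32 and A64 form; even the register fields live in completely different bit positions, and that divergence is what a dual-ISA front end would have to carry for every instruction format:

Code:
#include <stdint.h>
#include <stdio.h>

/* Toy illustration only: the same operation, "add two registers
 * into a third", hand-encoded in two Arm ISAs.
 *   A32: ADD r0, r1, r2  ->  0xE0810002
 *   A64: ADD w0, w1, w2  ->  0x0B020020
 * The bit layouts share almost nothing, so a front end that accepts
 * both needs two distinct decode paths. */

static void decode_a32_add(uint32_t insn)
{
    printf("A32 add: Rd=r%u Rn=r%u Rm=r%u\n",
           (insn >> 12) & 0xFu,   /* Rd is in bits [15:12] */
           (insn >> 16) & 0xFu,   /* Rn is in bits [19:16] */
           insn & 0xFu);          /* Rm is in bits [3:0]   */
}

static void decode_a64_add(uint32_t insn)
{
    printf("A64 add: Rd=w%u Rn=w%u Rm=w%u\n",
           insn & 0x1Fu,          /* Rd is in bits [4:0]   */
           (insn >> 5) & 0x1Fu,   /* Rn is in bits [9:5]   */
           (insn >> 16) & 0x1Fu); /* Rm is in bits [20:16] */
}

int main(void)
{
    decode_a32_add(0xE0810002u); /* prints Rd=r0 Rn=r1 Rm=r2 */
    decode_a64_add(0x0B020020u); /* prints Rd=w0 Rn=w1 Rm=w2 */
    return 0;
}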

But that's kind of a digression from the more interesting (to me) question: which pairs of otherwise unrelated ISAs could work well? I suspect you could make a case for A64 coexisting with at least one of the non-Arm RISC ISAs. I know there are some you probably wouldn't want to do - e.g. POWER/PowerPC or PA-RISC, thanks to their oddities.

Maybe RISC-V would work? I haven't looked at it (or A64 for that matter) closely enough to think about the pain points myself.

No real reason to do it, obviously, it's just an interesting thought exercise.
 

throAU

Site Champ
Posts
256
Reaction score
273
Location
Perth, Western Australia
I guess the question is: why would you do it, even if you could?

Adding a second instruction set to the processor achieves what? Backwards compatibility - which, within a small number of years, you can get via emulation (or recompilation) at or near native speed anyway.

At what cost?
  • maintaining that block across future generations
  • slowing the transition to your native ISA
Surely we're now at the point in CPU design, at least from an instruction-set perspective, where we know what sorts of things work and what don't? Or is there still a bunch of research in this field? And even if there is, wouldn't you just do what you can with the best knowledge you have today, and when the time comes that you've determined a significant need to change, run the old code via emulation again?

Hasn't Apple basically been using intermediate bytecode via LLVM as a way of recompiling for different platforms at a later date?

Bitcode?

Why wouldn't you plan ahead and just do that sort of thing these days?
 

mr_roboto

Site Champ
Posts
282
Reaction score
453
Hasn't Apple basically been using intermediate bytecode via LLVM as a way of recompiling for different platforms at a later date?

Bitcode?
I've seen commentary by Apple engineers that bitcode was never intended to support migrating existing binaries to completely new architectures. It's some form of LLVM IR (intermediate representation), but not at a phase where it would be clean to retarget to a completely different ISA.
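
For what it's worth, here's a minimal sketch of why that is - my own illustration of ordinary clang/LLVM behavior, not something the Apple engineers said. Even a trivial C file lowers to IR with the target already baked in:

Code:
/* foo.c - why LLVM bitcode isn't target-neutral, in miniature.
 * Compiling with, e.g.:
 *   clang -S -emit-llvm foo.c
 * produces foo.ll, which on an arm64 Mac opens with lines like:
 *   target datalayout = "e-m:o-i64:64-i128:128-n32:64-S128"
 *   target triple = "arm64-apple-macosx13.0.0"
 * and the `long` parameters below are lowered straight to i64, so
 * pointer sizes, type widths, and other ABI details are frozen
 * before any bitcode exists. Retargeting that to a different ISA
 * (say, one where `long` is 32 bits) is not a clean transformation. */
long add(long a, long b)
{
    return a + b;
}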

Also... it seems that Bitcode's dead. From the Xcode 14 beta release notes:

Deprecations

  • Starting with Xcode 14, bitcode is no longer required for watchOS and tvOS applications, and the App Store no longer accepts bitcode submissions from Xcode 14.
    Xcode no longer builds bitcode by default and generates a warning message if a project explicitly enables bitcode: “Building with bitcode is deprecated. Please update your project and/or target settings to disable bitcode.” The capability to build with bitcode will be removed in a future Xcode release. IPAs that contain bitcode will have the bitcode stripped before being submitted to the App Store. Debug symbols for past bitcode submissions remain available for download. (86118779)
 

mr_roboto

Site Champ
Posts
282
Reaction score
453
Apple consider any sort of transition there (ARM 32-bit to ARM 64-bit) to be complete, I guess?
Yeah, I think that might be it - the upcoming watchOS 9 drops support for Apple Watch Series 3, which was the last 32-bit-only Arm chip in their entire lineup.

(The bad thing about sunsetting Series 3 support is that I said 'was', but it's really 'is'. The Series 3 is a current product; they've kept it around as the cheapest entry-level watch. It's very unusual, and customer-hostile, for Apple to sell devices as new that will lose OS updates less than a year from the purchase date.)
 

Cmaier

Site Master
Staff Member
Site Donor
Posts
5,295
Reaction score
8,453
Yeah, I think that might be it - the upcoming watchOS 9 drops support for Apple Watch Series 3, which was the last 32-bit-only Arm chip in their entire lineup.

(The bad thing about sunsetting Series 3 support is that I said 'was', but it's really 'is'. The Series 3 is a current product; they've kept it around as the cheapest entry-level watch. It's very unusual, and customer-hostile, for Apple to sell devices as new that will lose OS updates less than a year from the purchase date.)
If you’re buying a Series 3 at this point, I would hope you know what you’re buying - a very old device.
 