Moderator: E.J. Peiker

All times are UTC-05:00

  
by E.J. Peiker on Tue Nov 24, 2020 10:18 am
User avatar
E.J. Peiker
Senior Technical Editor
Posts: 86776
Joined: 16 Aug 2003
Location: Arizona
Member #:00002
Clearly Apple's new M1 chip is a major disruptive product that is shaking up the computer industry.  Here is an excellent video that explains how we got here.  For anyone interested in computers, this is 20 minutes well spent:
https://www.youtube.com/watch?v=OuF9weSkS68

If it happens to start in the middle, just drag the time back to the beginning of the video.
 

by Anthony Medici on Tue Nov 24, 2020 4:31 pm
User avatar
Anthony Medici
Lifetime Member
Posts: 6879
Joined: 17 Aug 2003
Location: Champions Gate, FL
Member #:00012
It started at the beginning for me. And you can spend a little less time if you play it back at faster-than-normal speed. ;)

My only disappointment so far is that Rosetta doesn't support both 32-bit Intel apps and the original PowerPC apps. A lot of older technology could have been removed from my home if it did!
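For anyone curious whether a given process is actually running translated, Apple documents a sysctl flag for exactly this; here is a minimal check in C (the sysctl.proc_translated name is from Apple's Rosetta translation environment documentation):

#include <stdio.h>
#include <sys/sysctl.h>

/* Returns 1 if this process is running under Rosetta 2 translation,
   0 if it is native, and -1 if the sysctl is unavailable (e.g. on
   Intel Macs or older macOS releases). */
static int running_under_rosetta(void) {
    int translated = 0;
    size_t size = sizeof(translated);
    if (sysctlbyname("sysctl.proc_translated", &translated, &size, NULL, 0) == -1)
        return -1;
    return translated;
}

int main(void) {
    int r = running_under_rosetta();
    if (r == 1)
        printf("Running as x86_64 under Rosetta 2\n");
    else if (r == 0)
        printf("Running natively\n");
    else
        printf("Translation status unavailable\n");
    return 0;
}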

I'm really interested in what the new machines will look like once they start redesigning them, and also in how fast the plug-in versions will be and what that chip will have in it.
Tony
 

by Scott Fairbairn on Wed Nov 25, 2020 6:44 pm
User avatar
Scott Fairbairn
Forum Contributor
Posts: 5131
Joined: 13 Jan 2005
Member #:00437
Pretty cool information, thanks for posting.
 

by lelouarn on Tue Dec 01, 2020 2:50 am
lelouarn
Forum Contributor
Posts: 154
Joined: 24 Mar 2006
I thought the video was a bit simplistic ("RISC is much better than CISC, and here's why")... So if you want to know (much) more:

https://erik-engheim.medium.com/why-is- ... 62b158cba2

Yes, RISC seems to have an edge on CISC, but there's much more to it (basically, Apple has integrated a lot more into the chip than just a CPU -- and they can do it because of their hardware/software vertical integration).
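As a toy illustration of the ISA difference, the same one-line C function typically compiles to a single fixed-width instruction plus a return on ARM64, while x86-64 uses variable-length encodings the front end must decode before dispatch (the assembly in the comment is typical clang -O2 output; exact instructions vary by compiler and flags):

/* add.c -- compile with "cc -O2 -S add.c" on an x86-64 and an arm64
   machine and compare the generated assembly. Typical results:

     x86-64:  leal (%rdi,%rsi), %eax   # variable-length CISC encoding
              retq
     arm64:   add  w0, w0, w1          # fixed 32-bit RISC encoding
              ret
*/
int add(int a, int b) {
    return a + b;
}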

I am really eager to see what they can do with "serious" machines that aim at high performance rather than long battery life.
 

by E.J. Peiker on Mon Dec 07, 2020 9:49 pm
User avatar
E.J. Peiker
Senior Technical Editor
Posts: 86776
Joined: 16 Aug 2003
Location: Arizona
Member #:00002
Here's a glimpse of the more powerful M1 processor coming for higher-end machines:
https://www.theverge.com/2020/12/7/2215 ... acbook-pro
 

by Brian Stirling on Tue Dec 08, 2020 6:52 pm
Brian Stirling
Lifetime Member
Posts: 2558
Joined: 23 Dec 2004
Location: Salt Lake City, UT USA
Member #:00446
E.J., I wonder what your take is on CISC versus RISC going forward, not only at the mobile end but also in desktop and server. It seems Apple was dissatisfied with the PowerPC CPU, which was more RISC-like, and went to Intel with the full x86 CISC model; but as ARM continued to develop alongside Apple, including the move to 64-bit, and particularly given the year-over-year performance improvement curve, the RISC model has risen again and is perhaps finally putting a nail in the CISC coffin. I have to believe this isn't something that makes you, a former Intel engineer, happy, just as I, a former IBMer, was disappointed at what became of my old company.

The M1 is planting a new flag and tearing down the old one, and it will only get worse for Intel as other ARM chip makers like Samsung see what Apple has done and apply the same approach to advance RISC beyond mobile. Apple has the easier job, as they have only their own hardware to worry about, but the lesson is: power matters. Not just processing power but power consumption: performance per watt. The question I have is... do you think Intel will abandon chip making as AMD did and go fabless, or even become simply an IP developer?

If the M1 is a death blow to Intel, and that's probably stretching things a bit, it must also give their friends in Redmond a scare too, as the connection between the two, the Wintel alliance, went hand-in-hand. MS has been moving towards services like the cloud and gaining more of their revenue from that, but if the Wintel paradigm is dead or dying, then the core of MS is also at risk.

Interesting to think that the money Apple made, hand-over-fist style, with the iPhone helped fund the development of their A-series silicon and now the M series. IBM, my former company, could never have imagined that a handheld device could generate more money than all the mainframes and peripherals IBM could make. Is Intel going the way of IBM? Towards the end of my time at IBM, when MS was clearly running the show, a reporter asked Bill Gates about his relationship with IBM and he quipped that "... IBM is irrelevant ..." -- are we there, or near there, with Intel?


Brian
 

by E.J. Peiker on Tue Dec 08, 2020 7:39 pm
User avatar
E.J. Peiker
Senior Technical Editor
Posts: 86776
Joined: 16 Aug 2003
Location: Arizona
Member #:00002
Apple's discomfort with the PowerPC architecture wasn't so much the architecture itself but rather the pace of development, which was perpetually behind what Intel and occasionally AMD were able to do, with the result that the pace of improvement of Apple's computers fell farther and farther behind. ARM has serious juice and money behind it, and with NVIDIA set to acquire the architecture and Apple holding a seemingly unlimited license to modify it, its rate of development is not in question.

Even Intel knew that there were significant advantages to RISC and made a product called the i960 for a while, which ultimately ended up being used mostly in higher-level microcontroller environments. Nobody wanted to build computers and then need a new OS plus an emulator. Apple doesn't have any qualms about obsoleting old things and supporting the old with an emulator until ultimately they just abandon it. That has always been a difference between the proprietary world of Apple and the open world of Wintel with its thousands of competitors.

I very much feel that Intel is in steep decline and perhaps could go the way of IBM. Intel made a few very strategic long-term errors back around 2005 that they are paying for dearly now. The company started losing its way with Paul Otellini at the helm, and the loss of direction went into overdrive when the company adopted a "bean counter" form of management over a technology-driven one.
 

by Brian Stirling on Tue Dec 08, 2020 8:30 pm
Brian Stirling
Lifetime Member
Posts: 2558
Joined: 23 Dec 2004
Location: Salt Lake City, UT USA
Member #:00446
The bean counter model, which Wall Street just loves, has been the death of many a company -- Boeing may just survive the 737 MAX fiasco, which had at its heart a need to quickly address a competitive disadvantage to Airbus created by Boeing's bean-counter decision to cancel the 737's replacement. In the case of IBM, I'm not sure it was all the fault of bean counters but rather a longer-term disconnection from engineering in the executive ranks -- IBM had long been driven by marketing-department MBA types, not completely dissimilar to bean counters.

The thing with Wall Street is that they want the highest return possible NOW, and they reward the executives, through the board, with compensation when they deliver. If the decisions a company makes end up killing it in the long run, Wall Street will just move on and strip-mine someone else. We are now at a point where company boards appear to actively seek sociopathic types who do what the investor class wants with little or no regard for the impact on the workforce, the company, or indeed the nation. This era can't end fast enough!


Brian
 

by E.J. Peiker on Tue Dec 08, 2020 9:35 pm
User avatar
E.J. Peiker
Senior Technical Editor
Posts: 86776
Joined: 16 Aug 2003
Location: Arizona
Member #:00002
Brian Stirling wrote:The bean counter model, which Wall Street just loves, has been the death of many a company -- Boeing may just survive the 737 MAX fiasco, which had at its heart a need to quickly address a competitive disadvantage to Airbus created by Boeing's bean-counter decision to cancel the 737's replacement. In the case of IBM, I'm not sure it was all the fault of bean counters but rather a longer-term disconnection from engineering in the executive ranks -- IBM had long been driven by marketing-department MBA types, not completely dissimilar to bean counters. The thing with Wall Street is that they want the highest return possible NOW, and they reward the executives, through the board, with compensation when they deliver. If the decisions a company makes end up killing it in the long run, Wall Street will just move on and strip-mine someone else. We are now at a point where company boards appear to actively seek sociopathic types who do what the investor class wants with little or no regard for the impact on the workforce, the company, or indeed the nation. This era can't end fast enough!


Brian
Agree 150%
 

by Brian Stirling on Wed Dec 09, 2020 12:53 am
Brian Stirling
Lifetime Member
Posts: 2558
Joined: 23 Dec 2004
Location: Salt Lake City, UT USA
Member #:00446
Moving beyond the impact on Intel: ARM, and RISC in general, is providing much better performance per watt, and not by a small degree either. This is very helpful for mobile applications, of course, but there are many other applications, just now being developed, that will also benefit from low power draw. Electric cars, particularly ones with FSD capabilities like Tesla's, need loads of processing power to keep the car on the road and out of harm's way, and battery limits make low-power processors a must, as power draw directly translates to greater or lesser range. Other things like drones, particularly consumer/prosumer drones at 250 g - 5 kg, are pretty processor hungry, not only to manage flight, obstacle avoidance, and navigation, but also 4K60 video with H.265 -- doing that on a battery of only 40 Wh (Mavic Air 2) with a flight time nearing 30 minutes requires some pretty efficient silicon.
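To put those drone numbers in perspective, some back-of-the-envelope arithmetic (battery and flight-time figures from above; the 10% compute share is purely an illustrative guess):

#include <stdio.h>

int main(void) {
    double battery_wh   = 40.0;  /* Mavic Air 2 battery, watt-hours */
    double flight_hours = 0.5;   /* ~30-minute flight time */

    /* Average draw the whole aircraft can sustain on one charge. */
    double avg_total_watts = battery_wh / flight_hours;   /* 80 W */

    /* Motors dominate; assume (illustrative guess) compute gets ~10%. */
    double compute_watts = avg_total_watts * 0.10;        /* ~8 W */

    printf("Average total draw: %.0f W\n", avg_total_watts);
    printf("Illustrative compute budget: ~%.0f W for flight control,\n"
           "obstacle avoidance, navigation, and 4K60 H.265 encoding\n",
           compute_watts);
    return 0;
}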

So, while Intel and AMD may have their backs against the wall, the value to all of us is undeniable. The other ARM chip makers, learning the lesson from Apple and their A- and M-series processors, will no doubt achieve similar results within a few years, and that will benefit us all. The shared/stacked memory approach made possible with the M1 appears to be a game changer by removing a great deal of overhead -- such an approach isn't really possible with CPUs that generate so much heat. It looks to me like the M1 may well be as significant a milestone as the first iPhone, and probably more so. As someone who's decidedly NOT an Apple fanboy, I have to tip my hat to them on this one.


Brian
 

by E.J. Peiker on Wed Dec 09, 2020 8:45 am
User avatar
E.J. Peiker
Senior Technical Editor
Posts: 86776
Joined: 16 Aug 2003
Location: Arizona
Member #:00002
Brian Stirling wrote:Moving beyond the impact on Intel: ARM, and RISC in general, is providing much better performance per watt, and not by a small degree either. This is very helpful for mobile applications, of course, but there are many other applications, just now being developed, that will also benefit from low power draw. Electric cars, particularly ones with FSD capabilities like Tesla's, need loads of processing power to keep the car on the road and out of harm's way, and battery limits make low-power processors a must, as power draw directly translates to greater or lesser range. Other things like drones, particularly consumer/prosumer drones at 250 g - 5 kg, are pretty processor hungry, not only to manage flight, obstacle avoidance, and navigation, but also 4K60 video with H.265 -- doing that on a battery of only 40 Wh (Mavic Air 2) with a flight time nearing 30 minutes requires some pretty efficient silicon.

So, while Intel and AMD may have their backs against the wall, the value to all of us is undeniable. The other ARM chip makers, learning the lesson from Apple and their A- and M-series processors, will no doubt achieve similar results within a few years, and that will benefit us all. The shared/stacked memory approach made possible with the M1 appears to be a game changer by removing a great deal of overhead -- such an approach isn't really possible with CPUs that generate so much heat. It looks to me like the M1 may well be as significant a milestone as the first iPhone, and probably more so. As someone who's decidedly NOT an Apple fanboy, I have to tip my hat to them on this one.


Brian
Brian, you may find this interesting:
https://www.extremetech.com/computing/3 ... erformance
 

by photoman4343 on Thu Dec 10, 2020 11:09 am
photoman4343
Forum Contributor
Posts: 1952
Joined: 1 Feb 2004
Location: Houston, TX
Review of the MacBook Pro with M1 (13-inch model):

https://www.houstonchronicle.com/techbu ... 785199.php
Joe Smith
 

by Scott Fairbairn on Thu Dec 10, 2020 11:22 am
User avatar
Scott Fairbairn
Forum Contributor
Posts: 5131
Joined: 13 Jan 2005
Member #:00437
photoman4343 wrote:Review of the MacBook Pro with M1 (13-inch model):

https://www.houstonchronicle.com/techbu ... 785199.php


Unfortunately, you have to be a subscriber to read that article.
 

by DChan on Thu Dec 10, 2020 12:19 pm
DChan
Forum Contributor
Posts: 2206
Joined: 9 Jan 2009
Scott Fairbairn wrote:
photoman4343 wrote:Review of the MacBook Pro with M1 (13-inch model):

https://www.houstonchronicle.com/techbu ... 785199.php


Unfortunately, you have to be a subscriber to read that article.

The last few paragraphs in the article:
...And I could not use an M1 Mac as the computer I use for working at the Chronicle. That’s because the software we use to write stories for the print edition won’t run on Big Sur, which is the only version of the macOS that works on M1 systems.

Normally, I’d get around this by firing up Parallels, the software that lets you run Windows and other operating systems (including older versions of macOS) on the Mac to handle the Chronicle’s application. But the latest release of Parallels doesn’t work at all on M1 Macs. There’s a new, compatible version coming in the near future — but again, it’s not here now.

And that pretty much sums up the experience of the initial M1 Macs. They’re incredibly powerful, but they remain a work in progress. I would love to own one, but it doesn’t make sense to do so now because key software I need doesn’t work. If you’re interested in owning one, check first to see if your critical apps will run on it and, if they don’t, when the software developer will release a version that will.

Or just wait for the second wave of Apple Silicon Macs, which should come some time next year. Those should be even faster — and more compatible.
 

by Scott Fairbairn on Thu Dec 10, 2020 1:13 pm
User avatar
Scott Fairbairn
Forum Contributor
Posts: 5131
Joined: 13 Jan 2005
Member #:00437
How important is it to get 16 GB of RAM (the max allowed) with these machines? I read one review that said they are so much faster that the reviewer didn't have problems with only 8 GB.
 

by E.J. Peiker on Thu Dec 10, 2020 1:56 pm
User avatar
E.J. Peiker
Senior Technical Editor
Posts: 86776
Joined: 16 Aug 2003
Location: Arizona
Member #:00002
Scott Fairbairn wrote:How important is it to get 16 GB of RAM (the max allowed) with these machines? I read one review that said they are so much faster that the reviewer didn't have problems with only 8 GB.
Very, if you plan on using programs like LR, C1, PS, etc. The video memory is shared, so you will be eating huge chunks of it just to perform graphics functions.
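Some rough arithmetic on why a shared pool fills up fast in photo editors (the image dimensions and 16-bit RGBA layout are illustrative assumptions, roughly a 45 MP camera file as an editor might hold it in memory):

#include <stdio.h>

int main(void) {
    long width = 8192, height = 5464;   /* ~44.8 megapixels */
    long bytes_per_pixel = 4 * 2;       /* RGBA, 16 bits per channel */
    long one_layer = width * height * bytes_per_pixel;

    /* ~358 MB for a single full-resolution layer; a few layers,
       undo states, and caches, plus the GPU's share of the same
       pool, eat into 8 GB very quickly. */
    printf("One full-resolution layer: %.0f MB\n", one_layer / 1e6);
    return 0;
}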
 

by Scott Fairbairn on Thu Dec 10, 2020 2:15 pm
User avatar
Scott Fairbairn
Forum Contributor
Posts: 5131
Joined: 13 Jan 2005
Member #:00437
E.J. Peiker wrote:
Scott Fairbairn wrote:How important is it to get 16 GB of RAM (the max allowed) with these machines? I read one review that said they are so much faster that the reviewer didn't have problems with only 8 GB.
Very, if you plan on using programs like LR, C1, PS, etc. The video memory is shared, so you will be eating huge chunks of it just to perform graphics functions.

Ok, thanks. It's annoying that you can't add RAM on your own afterwards. I have a points card and could get one of these machines, but the ones they offer only come with 8 GB of RAM.
 

by E.J. Peiker on Thu Dec 10, 2020 9:52 pm
User avatar
E.J. Peiker
Senior Technical Editor
Posts: 86776
Joined: 16 Aug 2003
Location: Arizona
Member #:00002
Scott Fairbairn wrote:
E.J. Peiker wrote:
Scott Fairbairn wrote:How important is it to get 16 GB of RAM (the max allowed) with these machines? I read one review that said they are so much faster that the reviewer didn't have problems with only 8 GB.
Very, if you plan on using programs like LR, C1, PS, etc. The video memory is shared, so you will be eating huge chunks of it just to perform graphics functions.

Ok, thanks. It's annoying that you can't add RAM on your own afterwards. I have a points card and could get one of these machines, but the ones they offer only come with 8 GB of RAM.
When the more powerful 16" models come out, possibly using the higher-end M1 variant that is under development, I would expect discrete graphics with its own memory and 32 GB as options.
 

by Scott Fairbairn on Fri Dec 11, 2020 9:37 am
User avatar
Scott Fairbairn
Forum Contributor
Posts: 5131
Joined: 13 Jan 2005
Member #:00437
I think I'll wait until the chip is supported by the software I use and the more powerful ones come out. Nice to see someone pushing the ball down the field though.
 

by Anthony Medici on Fri Dec 11, 2020 4:07 pm
User avatar
Anthony Medici
Lifetime Member
Posts: 6879
Joined: 17 Aug 2003
Location: Champions Gate, FL
Member #:00012
E.J. Peiker wrote:When the more powerful 16" models come out, possibly using the higher-end M1 variant that is under development, I would expect discrete graphics with its own memory and 32 GB as options.
Why would you want discrete memory when, by putting more shared RAM on the chip, the GPU and CPU don't need to spend time copying things from one memory pool to another?
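A minimal sketch of the copy overhead in question (pure illustration with stand-in functions, not a real graphics API): with discrete graphics the data has to be staged into a second allocation and transferred across the bus, while with unified memory both sides can address the same allocation:

#include <stdlib.h>
#include <string.h>

/* Discrete-GPU model: allocate device memory, then copy across the bus. */
void *upload_to_discrete_gpu(const void *host_buf, size_t n) {
    void *device_buf = malloc(n);     /* stand-in for a VRAM allocation */
    memcpy(device_buf, host_buf, n);  /* stand-in for the PCIe transfer */
    return device_buf;                /* two copies of the data now exist */
}

/* Unified-memory model: CPU and GPU share one pool, so "upload"
   is just handing over a pointer -- zero copies, zero extra memory. */
const void *share_with_unified_gpu(const void *host_buf) {
    return host_buf;
}

int main(void) {
    size_t n = 1 << 20;                                   /* 1 MB of pixels */
    char *pixels = malloc(n);
    void *vram = upload_to_discrete_gpu(pixels, n);       /* copy happens */
    const void *shared = share_with_unified_gpu(pixels);  /* no copy */
    (void)shared;
    free(vram);
    free(pixels);
    return 0;
}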
Tony
 
