Apple Thread - The most overrated technology brand?

What killed Steve Jobs?

  • Pancreatic Cancer
    Votes: 60 (12.2%)
  • AIDS from having gay sex with Tim Cook
    Votes: 431 (87.8%)
  • Total voters: 491
Well, on the ARM front, TSMC held their Technology Symposium yesterday; I'll just go over the choice bits of what they announced. (TSMC uses NX for its standard nodes and NXP for the refined versions. Think of it like Apple's A12 compared to Apple's A12X.)

The N5 has been in full production for the past few months and has been rolling out to customers, with products featuring it to be released later in the year. In addition, the N5 ramp is approximately three months ahead of where the 7 nm and 10 nm ramps were at the same point (credited to the more refined lasers being more efficient and precise), and they're currently experimenting with the N5P, which should be about 5% more powerful and 10% more efficient than N5.

They're also working on the N4 as part of a roadmap towards the N3 (yes, 3 nm), with the N3 to enter risk production in 2021 and full production in 2022; they say N3 should be approximately 25-30% more efficient and 10-15% more powerful than N5.
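Just to put those marketing percentages next to each other (purely illustrative arithmetic using only the figures quoted above, taking the midpoint of the N3 ranges; "powerful" and "efficient" mean whatever TSMC's slides mean by them):

```swift
// Relative to an N5 baseline of 1.0, using only the node-to-node figures quoted above.
let n5  = (performance: 1.0, efficiency: 1.0)

// N5P: quoted as ~5% more powerful and ~10% more efficient than N5
let n5p = (performance: n5.performance * 1.05, efficiency: n5.efficiency * 1.10)

// N3: quoted as ~10-15% more powerful and ~25-30% more efficient than N5 (midpoints used)
let n3  = (performance: n5.performance * 1.125, efficiency: n5.efficiency * 1.275)

print("N5P vs N5: \(n5p.performance)x performance, \(n5p.efficiency)x efficiency")
print("N3  vs N5: \(n3.performance)x performance, \(n3.efficiency)x efficiency")
// Even taken at face value, "3 nm" is an incremental ~12.5% performance bump over N5,
// not the doubling the jump in the name from 5 to 3 might suggest.
```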


I'm not sure about their 5/3 nm claims. I'm no electrical engineer, but I know from past reading that anything genuinely 5 nm would be a significant challenge, and if TSMC had a solution there would be more fanfare about the breakthrough. Nodes, like penises on the internet, are all measured in creative ways, and TSMC tries to have the smallest one.
 
I heard 5 nm was pretty much the end of silicon as a substrate. Below that, background radiation, electromigration, and similar issues mean it's ogre for silicon at 3 nm, and they'd have to move to InGaAs or graphene. At that minuscule transistor size, background radiation is energetic enough to flip bits that otherwise wouldn't flip, meaning the calculations come out all out of goose.
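For a sense of how badly one flipped bit wrecks a result, here's a toy sketch (ordinary integer bit-twiddling to illustrate the failure mode, nothing to do with real soft-error rates):

```swift
// A single-event upset is just one bit changing state in a stored value.
// Flip one high-order bit of an intermediate result and see how far it moves.
let original: Int32 = 1_000
let flipped = original ^ (1 << 30)   // the "cosmic ray" hits bit 30

print(original)  // 1000
print(flipped)   // 1073742824 -- the downstream calculation is now garbage
// This is also why servers use ECC memory: extra check bits let the memory
// controller detect and correct single-bit flips instead of silently using them.
```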
 
From what I heard, they began using a small amount of cobalt to help get past the 5 nm-3 nm range, something about it being dense enough to work around those issues.

When doping the crystal?

From what little I understand, lithography is also a problem. The UV band bottoms out at around 10 nm, so focusing and projecting enough of it through the mask and onto the wafer to (in the end) create a 5 nm transistor (gate?) becomes very tricky. But I don't know these things.
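For what it's worth, the usual back-of-the-envelope here is the Rayleigh criterion: minimum feature ≈ k1 × wavelength / numerical aperture. A rough sketch with commonly cited numbers (the k1 and NA values are illustrative assumptions, not TSMC's actual process parameters):

```swift
// Rayleigh criterion: smallest printable feature ~ k1 * wavelength / numerical aperture
func minimumFeatureNm(wavelengthNm: Double, numericalAperture: Double, k1: Double) -> Double {
    k1 * wavelengthNm / numericalAperture
}

// Deep UV (ArF excimer laser, 193 nm) with immersion optics
let duv = minimumFeatureNm(wavelengthNm: 193.0, numericalAperture: 1.35, k1: 0.3)
// Extreme UV (13.5 nm) with current ~0.33 NA optics
let euv = minimumFeatureNm(wavelengthNm: 13.5, numericalAperture: 0.33, k1: 0.3)

print("DUV single exposure: ~\(Int(duv)) nm")  // ~42 nm -- hence all the multi-patterning
print("EUV single exposure: ~\(Int(euv)) nm")  // ~12 nm
// Keep in mind the node names ("5 nm", "3 nm") stopped matching any physical
// dimension years ago, which is part of why the numbers sound impossible.
```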
 
Given Apple's switch to yet another processor architecture, it makes me wonder why they made Swift (a pre-compiled, LLVM-backed language) rather than a JIT runtime like the JVM or .NET, which can run the same software on multiple architectures without changes. In fact it's kind of strange; most ecosystems these days use a JIT runtime, so it's not like it's unfashionable or anything.
 
  • Thunk-Provoking
Reactions: Kot Johansson
I mean, it doesn't really hurt them. Apple aren't looking to support multiple architectures long term. .NET 4 supported what, three random architectures plus x86? If Microsoft were ever able to get anyone to start using Windows on ARM via .NET, they would still be supporting x86 for probably... two decades plus after people started switching over.

When will the last new x86 Mac be sold? Guessing it might take 2-3 years for the Mac Pro to disappear or go to ARM. And the x86 ones might still be really useful for five years after that, as long as Mac developers continue to cross-compile for both x86 and ARM, but it isn't that big a deal to compile for the two official architectures. It just means Mac developers will have to keep writing good cross-architecture code.
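A minimal sketch of what "good cross-architecture code" tends to look like in practice; the fastSum function and the intrinsic paths are hypothetical, but the #if arch() conditional compilation is real Swift:

```swift
// Most Swift code is architecture-agnostic and simply gets rebuilt per target.
// The only places developers usually have to care are hand-tuned intrinsic paths.
func fastSum(_ values: [Float]) -> Float {
    #if arch(x86_64)
    // hypothetical SSE/AVX-specific fast path would live here
    return values.reduce(0, +)
    #elseif arch(arm64)
    // hypothetical NEON-specific fast path would live here
    return values.reduce(0, +)
    #else
    return values.reduce(0, +)
    #endif
}
// Build the x86_64 and arm64 slices and glue them into a universal ("fat") binary,
// which is roughly what Xcode's "Any Mac" build destination does for you.
```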
 
Yeah, I suppose Apple just has this mentality of controlling their hardware. MS made .NET because they don't directly control hardware; they more or less just guide the overall hardware industry, but they always plan for the future. For Apple, the future is whatever they say it is, so I guess they don't see a reason to make a JIT'd language (even though JIT'd runtimes offer a lot of benefits from both a marketing and performance standpoint). Plus, they really don't give a shit about whatever burdens they put on their developers, lol.
 
It's hard to say without them actually being released, but we know from WWDC and the dev videos that Office and Photoshop were running natively on ASi/ARM within a week of development, and that the average time to recompile and modify/optimise a program to run natively is about three days.

Granted, it's an interesting point, since Apple's got the ability to just go "Right, we're doing this by this date, adapt or die," whereas Microsoft has more of a range to deal with, as well as all the various legacy programs and code lurking out there.
--
For the rumour side of things, the most interesting one is the iPad Air 4, which is supposed to be getting a processor boost.

The main problem is that the current Air has an A12, which leaves very little room between it and the Pro (A12Z, which is basically a very slightly faster A12X), or between it and the A14.

If Apple puts in an A12X, it gets dangerously close to a Pro. If Apple puts in an A13, then the Air will start to benchmark higher than the Pro in single core. If Apple puts in an A14 (fat chance), that would completely flatten the Pro.
 
My first ever laptop was an iBook G3, one of the last all-white ones with the translucent keyboard. I loved that thing. I was able to use that laptop for about five years without obsolescence getting in the way much. I had the combo drive replaced under warranty one time, but that was the only issue I had with it. After that I upgraded to a 17-inch PowerBook G4, and I used that laptop daily all the way up to around 2011, when I was finally able to get an Intel MacBook Pro.

Now, that computer? I hated it. It was so much less serviceable than my two previous laptops, and I had to send it in for warranty work for the stupidest things (one day the screen just died; the week after I got it back, the AirPort card died; etc.), so I ended up buying a Windows laptop for the first time in my life (an Acer TravelMate) and using that for a solid three years. Not as sexy as a MacBook Pro, of course, but at least it worked. But I was missing a lot of the things I could do on a Mac, so in 2016 I bought yet another MacBook Pro. Within a single year, its charging circuit went completely dead. Now, I buy an extended warranty with every computer I purchase, but Apple refused to honor this one and swore up and down it stopped working due to "spill damage", when I do not let liquids get NEAR any of my PCs. (And I mean that: I'll slap the fuck out of anyone who brings a capless drink near my computer.)

Sold the Mac for parts, and I'm still fucking using that same TravelMate I got years ago; it's going strong to this day with a new SSD I added just a month ago. Fuck Apple. Though I will say, throughout that whole time I had iPods. I had a 3rd-gen iPod, an iPod mini, a 2nd-gen iPod nano, a 4th-gen iPod nano, and two 3rd-gen iPod Touches, and all of those were great. And of course, the iPod is now dead. So, again, fuck Apple.
 
TIL Ronald Wayne was the third Apple founder alongside Woz and Jobs. He sold his 10% share of Apple for $2,300, which would be worth about $200 billion today. He also made the original Apple logo.

 
Nvidia Buys SoftBank’s Arm in Record $40 Billion Chip Deal (a)

Having all these licenses held by a company that didn't actually make anything meant there could be a degree of independence in the licensing deals, but having those rights held by a hardware company puts that company in a conflict of interest with licensees that make hardware in the same fields. Granted, that sort of thing isn't exactly new in the computing industry; see Samsung selling cell phone components to Apple.
 
I doubt Nvidia's going to do anything but let Apple pay them licensing fees forever. Apple money is reliable, on-time money, backed by a fucking tech giant with hundreds of billions in the bank. There's very little love between nVidia and Apple (mostly because the companies have the same mentality of "my hardware and my drivers are mine to work with and develop, deal on my terms or don't"), but I doubt nVidia's dumb enough to pull a Tim Sweeney and lose a steady, reliable income stream for a pretty nebulous outcome.
 
Sure, but I didn't have Apple in mind when I wrote that. I was thinking of more direct competitors, like VIA, or Qualcomm, or even AMD or Intel. It won't be in Nvidia's interest to give them great licensing deals on tech when they make some of the same products Nvidia does, or may wish to in the future. Perhaps fear of lawsuits or regulatory punishment will keep Nvidia in line, but I doubt it'll be 100% effective.
 
>Not thinking about the Apple angle
>In the Apple Tech thread

I'd tease you a bit about this, but tbh I just went on a memory-lane trip in the Win95 thread that's 80% Apple content, so he who is without sin cast the first stone.

I guess it's possible if nVidia wanted to cockblock some of its direct competitors, but ARM seems to be the wave of the future. Everything Apple does that people give it shit for, the rest of the computer industry will do in 2-3 years (if you don't count the cellphone market, where Apple used to be the frontrunner but now takes cues from Samsung and Google). They're switching to ARM, and Microsoft is hungry to do the same, as seen with products like the Surface Pro X, albeit with less "adapt now or die" conviction than Apple. AMD and Intel have their work cut out for them to keep x86 relevant in the coming years, and I suspect they'll start working towards some ARM or hybrid solution. Maybe if nVidia wants to make those coming gouges cut a little deeper, they could, but I suspect their investment in ARM will pay for itself and then some simply by licensing out to these companies, letting the free market do its thing, and collecting a paycheck no matter which side wins.

After all, nVidia's business is built on licensing if you really think about it. They design a big beefy video card that can do all this crazy shit and build a "Founders" model or showpiece card directly, but most of their cash comes from EVGA or MSI or Gigabyte licensing the card's specs and chips and building their own versions. Whether you buy nVidia's flagship Founders cards or the third-party ones, nVidia still gets paid. So why wouldn't they fold ARM into this tried-and-true model they've proven they're good at?
 
I've personally never had any issues with Apple products. I'm no shill for them or anything, and I certainly don't think they're necessarily amazing, but as far as I'm concerned they're the best product on the market that I can reasonably obtain... as far as phones go. I'd never buy an Apple computer or laptop.
 
  • Agree
Reactions: fartsnstuf

To be fair, I suspect the real money is in the industrial, commercial, and data centre markets. Their Quadro and Tesla lines of cards (which use the same chips and similar PCBs to the standard GeForce cards, but with firmware optimised for CAD and/or large-scale number crunching) go for huge prices, most of which is probably profit once the specialist-grade firmware and driver development costs are taken into account. The difference in the firmware is that a "gaming" card will basically do calculations to a "good enough" level of precision and guess at things far in the distance before purging everything to start on the next frame, whereas one of these cards will do all the calculations, in full and to high precision, every time. This is because when you're, say, designing a plane or a bridge or a building, there is no such thing as "good enough." You also pay for the support, like with professional-grade software.
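To illustrate the precision point with a toy example (this is just generic single vs. double precision accumulation, not anything specific to Nvidia's actual firmware):

```swift
// Add the same small increment ten million times in single and double precision.
var singleSum: Float = 0
var doubleSum: Double = 0
for _ in 0..<10_000_000 {
    singleSum += 0.0001
    doubleSum += 0.0001
}
print(singleSum)  // noticeably off from the exact 1000 -- per-step rounding piles up
print(doubleSum)  // ~1000.0, the kind of answer you need when "good enough" isn't good enough
```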
 