Apple Thread - The most overrated technology brand?

tl;dw: Lightning connectors are 8 years old and everyone else uses USB-C
Cool. HDMI is 20 years old or so. Should we stop using HDMI cables?

Anyway, now that USB-C is more or less at feature parity with Lightning, I'm sure they'll switch eventually. There have been rumors of it for a while now. One reason they might not is that a Lightning port is still a bit thinner than a USB-C port, and Apple is fetishistic about thinness.

I'm tech illiterate. Is this as ominous as Rossmann makes it out to be?
Rossmann is smart when it comes to hardware, but he's definitely guilty of clickbait titles and doomer-pilling. I'm not sure if what he's talking about in that video is true or not - it's the first I've heard of it.
 
Cool. HDMI is 20 years old or so. Should we stop using HDMI cables?
I get your point, but HDMI is a really awesome standard: it's fully backwards compatible all the way through, and it can carry Ethernet and 4K @ 120 Hz now. Not to mention it's on absolutely everything, not just one brand's platform.

Linus sort of skimmed over it, but the EU really has been putting pressure on everyone to stick to one kind of connector, for many different reasons, and Apple is about the one holdout left. Even Sony has finally cut the crap, and they're super notorious for proprietary cables and memory cards. For a company that proudly touts how environmentally friendly it is for no longer including USB-to-mains adapters and headphones with its $1000+ phones, Apple sure isn't doing what it can to stop the manufacture of more cables than are necessary.
I'm tech illiterate. Is this as ominous as Rossmann makes it out to be?
I don't see how anyone could have ever believed Apple's whole "We don't spy on you! Honest!" shtick after The Fappening way back in 2014, considering that was a massive iCloud breach where a lot of those pictures had supposedly been deleted from both the users' phones and the iCloud backups.
 
Rossmann is the man. Yeah, some of his video titles are clickbait-y, but if you don't play the trend game on YT, you don't get seen except by the hardcore subscribers. If a few more people gave Apple the level of shit he does, the Right to Repair bills wouldn't be having as hard a time as they currently do. A few more people publicly pushing Apple's shit in like he does and we might eventually see them learning from their missteps instead of touting them as quirky idiosyncrasies.
 
I think a Lightning port is still a bit thinner than a USB-C port and Apple is fetishistic about thinness.
Just once I want Apple or some other company to go in the opposite direction. Have a thick computer. Imagine the specs if Apple decided to make a laptop that weighs 10 lbs and is the width of your hand.
 
OK, you have native iOS development, but no Docker, no VirtualBox, no Boot Camp, no IDEs. Seriously, who is this MacBook "Pro" for?
Actually needing containers applies to very few developers, plus people who use shit-tier Linux distributions like Ubuntu that refuse to build up-to-date packages for insecure, high-risk programs like Google Chrome.
None of these devices use anything other than the integrated GPU, for that matter. Can't help but wonder if that's a limitation of the first generation of this thing: they didn't get around to supporting GPUs "external" to the SoC. After all, it's not a problem they had to solve for the iOS devices…
The only ARM product at this point where that even theoretically comes into play is that little version of the MacBook Pro. All those models had shitty Intel integrated GPUs already. Were there people carting around external GPU enclosures to show off at client sites who used 13" MBPs rather than 16" ones? I suppose. Probably tens if not hundreds of them. I guess those people are going to have to wait for an upgrade path.

It will be interesting to see where they go with the more capable models. Sure, the memory is currently integrated into the SoC package on these low-end units, but it would be surprising if they just kept expanding that for iMacs and the 16" MBP. I did get the impression from articles prior to this latest release that the memory architecture of ARM chips (or at least Apple's, to date) could be problematic for working with off-chip GPUs: the 'unified memory architecture' shares all that on-package RAM between the CPU and GPU, and apparently doesn't provide good options for fast access to off-chip GPUs. But one would expect at least one of AMD or Nvidia to suck it up and work with Apple to make it work. Forget finding any article that discusses this topic seriously now without filtering for dates before November, though.
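For what it's worth, here's roughly what "unified memory" means in practice. A minimal Metal sketch (the buffer contents are made up; assumes any Apple silicon Mac):

import Metal

// On Apple silicon the CPU and GPU share one pool of physical memory,
// so a buffer created with .storageModeShared is visible to both sides
// with no copy step. On a discrete-GPU Mac the "fast" path would instead
// be a private buffer in VRAM plus an explicit blit over PCIe.
guard let device = MTLCreateSystemDefaultDevice() else {
    fatalError("No Metal device available")
}

var values: [Float] = [1, 2, 3, 4]
let buffer = device.makeBuffer(bytes: &values,
                               length: values.count * MemoryLayout<Float>.stride,
                               options: .storageModeShared)!

// The CPU can read whatever the GPU writes into this buffer directly.
let contents = buffer.contents().bindMemory(to: Float.self, capacity: values.count)
print(contents[0])  // 1.0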
 
No, I agree, the lower end models almost always had integrated graphics (the 13 inch MBP has had a few custom options) and were usually bought with base RAM, or bumped up slightly. They were relatively capable machines for their size, but there was a bit of an Apple tax. And Intel's obsession with trying to make laptops set themselves on fire.

That being said, I'd argue that these aren't so much for the 2018-2020 Mac users. They're for guys like me, using older Macs or PCs and waiting to see what Apple could do with their own Mac CPU. For reference: 2014 Mac Mini, 8 GB RAM. It shouldn't be a contest for an M1 Mini.

Anecdotally, shipping dates for the new Macs have slipped to near 2021, last I checked.
Also of interest was Baldur's Gate 3 being shown off on them, and looking fairly smooth. Steam recommends a 4 GB graphics card as the minimum, 8 GB recommended. (Geekbench puts the M1 between a GeForce GTX 1050 and a GTX 1060 6 GB.)

Apple's going to crow about the M1, despite the limitations we saw, like only 2 Thunderbolt ports (the MBA and MBP are also limited in that regard, but Apple loves slathering the back end of Minis with ports) and a 16 GB RAM cap. And I'd say that Apple is justified in crowing about it.

10-watt processors should not be clocking above 2020 i9s in single-core, and they shouldn't be just behind a 2013 Xeon in multi-core. Integrated graphics should be able to load the main menu of BG3, but that's it.

An M1 in an iMac or Mac Pro would be comical. But an M1X or M2 that fixes those flaws?
 
Just once I want Apple or some other company to go in the opposite direction. Have a thick computer. Imagine the specs if Apple decided to make a laptop that weighs 10 lbs and is the width of your hand.
Hyperbole aside, I agree completely. There's a point of diminishing returns when it comes to making things smol, and Apple's been on the wrong side of it for the last few years. But they might be starting to get the message with the return of the full-size "cheese grater" Pro and the new iPhones going back to flat edges rather than the curved/tapered edges of the last few gens (which I personally found hard to hold without a cover).

Now if we can just find a way to get them to stop trying to make the Touch Bar a thing.
 
Cinebench single/multicore scores against mobile CPUs.
[Attached charts: applem11.jpg, applem12.jpg]
 
I don't see how anyone could have ever believed Apple's whole "We don't spy on you! Honest!" shtick after The Fappening way back in 2014, considering that was a massive iCloud breach where a lot of those pictures had supposedly been deleted from both the users' phones and the iCloud backups.
I believe Tim Cook has been pushing for years to get into the business of selling "services", so whatever privacy advantage Apple had over Microsoft, Google and the others would probably be slowly pushed out after that.

I've been reading some people saying that Apple Silicon is going to be the beginning of the end of x86. Is this true? I was just thinking of building a PC right about now …
 
I've been reading some people saying that Apple Silicon is going to be the beginning of the end of x86. Is this true? I was just thinking of building a PC right about now …
There is no world in which Microsoft and their partners get their shit together and move to a new, capable ARM (or other alternative architecture) platform for PCs, particularly laptops, in less than three years, if not considerably more - and it would need reasonable backwards compatibility with existing Windows programs, which is the only reason to have a Windows PC in the first place. No need to worry about it for now.
 
Apple announces App Store Small Business Program (a)

tl;dr: For developers who make less than $1 million in net revenue (so after Apple's 30% cut and excluding taxes), Apple will only take a 15% cut instead. This goes into effect next year. As I presume there's a massive long tail of developers who aren't making anywhere near seven digits on the App Store, this is a significant cut on Apple's side and a significant bonus to the dev - if your app does $50,000 in gross sales in a year, those fees drop from $15,000 to $7,500. Not a bad raise.
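Quick sanity check on that math, as a Swift sketch (the function name is just for illustration):

// Rough check of the commission numbers above.
func appStoreCut(gross: Double, smallBusinessProgram: Bool) -> Double {
    let rate = smallBusinessProgram ? 0.15 : 0.30
    return gross * rate
}

print(appStoreCut(gross: 50_000, smallBusinessProgram: false))  // 15000.0
print(appStoreCut(gross: 50_000, smallBusinessProgram: true))   // 7500.0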

Don't be at all surprised if Google and MS are prompted to make similar reductions in the near future - for the good of America's small businesses in these troubled times, of course.
 
Alright, we've had heaps of graphics tests and benchmarks done with the M1.

If it was a dedicated card, it would be average. Since it's integrated, it's fucking amazing - about 60% above Intel Xe. Civ 6 jumps from 7 fps on an early 2020 MBA to 37-46 fps with the M1. You won't play the latest and greatest on ultra high, but you will on medium.

If it's being run through Rosetta, you'll get about 30-40 fps; if it's native, you'll be looking at high 40s to possibly 60 on medium, or 30 on ultra.
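Side note for anyone benchmarking their own stuff: macOS exposes a sysctl that tells a process whether it's running translated, so you can tell which column you're in. A small Swift sketch (the helper name is mine):

import Darwin

// "sysctl.proc_translated" is 1 when the current process runs under
// Rosetta 2 and 0 when it runs natively; the call fails outright on
// systems that don't have the key (e.g. older macOS on Intel).
func isRunningUnderRosetta() -> Bool {
    var translated: Int32 = 0
    var size = MemoryLayout<Int32>.size
    let ok = sysctlbyname("sysctl.proc_translated", &translated, &size, nil, 0)
    return ok == 0 && translated == 1
}

print(isRunningUnderRosetta() ? "Translated (Rosetta 2)" : "Native")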
 
We've had a couple of other M1 models appear, although it's not likely you'll see them in the wild.

There's a 128 GB M1 MBA for education (purchasable in quantities of 5), and a 10 Gb Ethernet variant of the M1 Mac Mini.

Incidentally, Apple took a lesson from themselves: M1 Macs that run iPhone/iPad apps report themselves as a 10-inch iPad Pro. So if you're an app/website developer and you see iPad Pros clocking 3.2 GHz in your stats, that's why.
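If you're on the app side (website stats are stuck guessing from the reported model), iOS 14 added a direct flag for this, something like:

import Foundation

// iPhone/iPad apps running on an M1 Mac can check this instead of
// trying to infer it from the "iPad Pro" hardware model string.
if #available(iOS 14.0, *), ProcessInfo.processInfo.isiOSAppOnMac {
    print("iOS app running on an Apple silicon Mac")
} else {
    print("Running on an actual iPhone/iPad (or pre-iOS 14)")
}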
 