
Apple Removes Remaining Intel Components from M2 MacBooks

Apple sells over 228 million iPhones a year, all of which use Lightning, compared to around 20 million Macs per year.

Apple settled on the Lightning connector before USB-C existed. The Android industry then squabbled over a number of competing USB-based connectors for years before eventually arriving at the decent USB-C standard. It seems like the best business move: stick with the design that works best for you, wait until everyone else gets their act together, and then ride out your design until you're required to change.
 
But the news article is about the MacBook, no? Why talk about the iPhone?


Because the only reason Macs have grown by leaps and bounds is that Apple locks iOS developers into macOS.

Because of that lock-in, one implies the other - and anything that increases profits for these required Macs is another dime in Tim's pocket!
 
Because the only reason Macs have grown by leaps and bounds is that Apple locks iOS developers into macOS.

Got a correlational graph that shows this? And one that shows the correlation is anything other than coincidental?
 
Hey look, logic escapes the paid Apple troll.

So the answer is: No, you don't. You could try posting some actual info.

For instance: Apple sells 20M Macs a year and there are roughly 3M iOS devs. Assuming they replace their Mac every 3 years (a fast cadence), that's 5% of Macs sold going to iOS devs. It's reasonable to assume that a quarter to half of all devs would be Mac users anyway, which leaves about 2.5-4% of all Macs sold to "unwilling" iOS devs.

No, Mac sales have not grown by leaps and bounds from locking iOS devs to Macs.

Unless you think 2.5-4% is leaps-and-bounds.
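The back-of-envelope math above can be checked with a quick sketch. The inputs (20M Macs/year, 3M iOS devs, a 3-year replacement cadence, and a quarter-to-half of devs being Mac users anyway) are the poster's assumptions, not verified figures:

```python
# Poster's assumed figures -- not official Apple numbers
macs_per_year = 20_000_000   # annual Mac sales
ios_devs = 3_000_000         # iOS developer count
replacement_years = 3        # how often a dev replaces their Mac

# Macs bought by iOS devs each year, and their share of all sales
dev_macs_per_year = ios_devs / replacement_years      # 1,000,000
dev_share = dev_macs_per_year / macs_per_year         # 0.05 -> 5%

# If a quarter to half of those devs would buy a Mac anyway,
# only the remainder count as "unwilling" purchases
unwilling_low = dev_share * (1 - 0.5)     # 2.5%
unwilling_high = dev_share * (1 - 0.25)   # 3.75%

print(f"dev share: {dev_share:.1%}")
print(f"unwilling: {unwilling_low:.1%} to {unwilling_high:.1%}")
```

This lands on the 5% and roughly 2.5-4% figures quoted above.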
 
That's just how these products are used. CPUs have a one-click option that sets the maximum wattage you want them to draw. The only people who don't use it are the people who want to complain that a product isn't efficient.

That could be the start of a good joke:
- I have my CPU pulling 900 watts.
- Why?
- So I can complain on the forums that it's not efficient.

If you have a CPU at 100 watts that can finish a task in 100 seconds, and a CPU at 200 watts that can do the same task in 50 seconds, they are at the same efficiency.

There's no need for CPUs to be at the same wattage to compare their efficiency.
 
If you have a CPU at 100 watts that can finish a task in 100 seconds, and a CPU at 200 watts that can do the same task in 50 seconds, they are at the same efficiency.

There's no need for CPUs to be at the same wattage to compare their efficiency.
Yes, but if you run the CPU that ran at 200 watts at 100 watts instead, it will finish the task in 75 seconds, because power and performance don't scale linearly.

That's why you should normalise for either performance or wattage; otherwise the test is pretty much flawed.
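The arithmetic behind this exchange is energy per task (watts × seconds). A minimal sketch, using the posters' own example numbers (the 75-second figure is their illustration of non-linear power/performance scaling, not a measurement):

```python
def energy_joules(watts: float, seconds: float) -> float:
    # Energy consumed to complete one task run
    return watts * seconds

# Equal efficiency at different wattages: same joules per task
a = energy_joules(100, 100)   # 10,000 J
b = energy_joules(200, 50)    # 10,000 J

# Capped at 100 W, the 200 W chip doesn't simply take twice as long;
# in the example it finishes in 75 s, so it uses less energy per task
c = energy_joules(100, 75)    # 7,500 J

print(a, b, c)
```

This is why normalising for either performance or wattage matters: the ranking can flip depending on the operating point you measure at.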
 
Self-preservation is the only logical reason.
 
That's just how these products are used. CPUs have a one-click option that sets the maximum wattage you want them to draw.
My notebook doesn't have that button.

Nor does my PC.
 
Hey look, logic escapes the paid Apple troll.
If you make a statement, it's on you to prove that it's true. Condescension and excuses, on your part, don't make that less true.
 