What percentage of developers who use Macs do you imagine create macOS/iOS software, vs. POSIX software?
I’m fine with using an ephemeral toolkit to create long-lasting infra, and it even makes sense to me that some people want to sometimes write new pieces of the ephemeral toolkit to help me do that (like stylish GUI text editors or SFTP clients or notification-area service managers.)
That’s on the developers, to be honest, less so on Apple (though Apple is not entirely guilt-free). It was very clear in 2016 that they should’ve been targeting 64-bit.
Not really, Apple could have just kept supporting running 32-bit apps.
Google is pulling similar things in the Play Store. They will pull apps just because they are not 64-bit. They've delayed the deadline by a year now, because many developers really don't care.
And as a consumer it really does not matter. Why would a 32-bit app not work 10 years from now?
That's a stupid argument when you compare it to the Windows example, where this simply doesn't matter: you develop for the current version, and 20 years later it's still supported.
There is no technical reason for Apple to drop 32-bit support (in that they could technically keep it), since the x86 CPUs they rely on support it. I've heard that there were some architectural reasons with Cocoa (I think) that made keeping it harder, but they came up with that architecture themselves while knowing that people had invested in 32-bit support. They knowingly decided that making the work easier for a handful of people was more important than the many thousands of applications that would break - they put themselves above their customers.
Suppose the sole developer died five years ago. Had he created an application for Windows, his users would still get to enjoy it. Had he created a FOSS application and given his users the source code, his users would still get to enjoy it. But if he created a closed-source macOS application, then his application soon joins him in death.
Please be less patronising in the future; it's not a good look. In reality, of course, many people currently "sitting around typing" are doing plenty of useful work using a variety of different tools.
There are various tradeoffs to be made when considering the benefits and costs of a strategy for long-term backwards compatibility. Microsoft was traditionally on the "it will continue to work until the heat death of the universe" end of the spectrum, and Apple on the "we will be dropping support soon" end. That is, Microsoft invests a bunch of effort into ensuring old software will continue to run on new platforms, whereas Apple prefers to drop support for various platform features (68k, PPC, 32-bit) with a few years of notice.
In some cases and for some users, backwards compatibility has a value that is worth the cost paid in development time, testing, and complexity. For others, it is not. I've been a Mac user since the G4 days, and I don't think I've ever really been in the situation where I thought "oh I need to run this old software but it's no longer supported" – that is, the feature has no value to me. And I'm pretty confident that the things I produce will continue to have value in the future.
TLDR – other people are sometimes different from you.
Not every piece of software is a vape-delivery-via-drone-app.
I have about a dozen small, very specific scientific 'calculators' (only the Windows/Linux ones that still run; the Mac ones are all dead, which is a darn shame since there were some jewels out there) that were written by researchers 10-20 years ago and never updated since. They haven't been touched by the original developers in years; in fact many of their host websites have been lost to the ages, so they exist only through sharing on forums and whatnot. But since they do not need any sort of connection to the outside world (times were simpler back then, you see...) they still run fine in compatibility mode. For some of them the functionality could be replicated with modern software (probably in a web app or something), but the original tools still work perfectly for their jobs and are still useful today. Don't even get me started on all of the equipment which is stuck running Windows 98 because the serial-port library used to communicate with it relied on the DOS API...
I also use a number of paid tools that I bought back in the '90s or early 2000s and haven't felt compelled to 'upgrade'. They run great and give me the answers I need, so until the vendor adds actually useful features (instead of reskinning them with a flat/metro interface...) I see no reason to upgrade.
In 1991 I made my last update to a piece of software (DOS-based, with proprietary multithreading and graphics layers). The software controls a scientific device; it collects, processes and displays experimental data in real time and stores the data for later analysis.
About 2 years ago, I was shocked to discover that they're still using said device, with the software running it in unmodified form under DOSBox or something like that.
I wasn't claiming that there isn't any software built 20+ years ago which, despite seeing no updates since, still works and still has utility.
The point is there isn't much of it relative to the wider ecosystem.
One particularly important reason for that is security. Most of the dependencies for software written 20+ years ago have long since been EOL'd and are therefore vulnerable.
There is more software out there than just desktop apps.
At my work there is code from the 1980s, written for an IBM mainframe in PL/I, still running. It has seen minor changes over the years, but it is very much 40-year-old code at heart, and it has had basically continuous uptime since then. I work at an industrial plant, and I've heard similar setups with the same vintage of software are not uncommon at banks, insurance companies, power plants, etc.
We have C and C++ code from the '90s originally written for RS/6000-era Unix boxes. That stuff is pushing 30 now and still actively maintained.
There is Fortran with a genesis dating back to the '80s still being run today.
A 30-year lifespan for industrial equipment is not uncommon; it would be nuts to replace control software and the like so often.
Yeah! Screw those filthy Mac lovers! They do nothing important. You can tell by the OS they are using! They should get a real computer designed for serious business! I am still using Windows Vista and plan to for the rest of my life!
Dude, if you are trolling, don’t. If you aren’t, also don’t.
If there is anyone who'd think "screw those filthy Mac lovers", that would be Apple themselves, since it is they who are in a position to preserve backwards compatibility and yet they decide not to do so.
As a Mac user, I don’t feel it. I know people get really particular about this shit, but I spend roughly 4 hours every year making macOS do what I want. That’s it. When I ran Ubuntu, it was about 8-12 hours every 6 months. When I ran Windows it was... it was a lot more. Backwards compatibility isn’t as important to me as shit just working. My current setup hasn’t been changed in about 2.5 years. Early 2017 is the last time I opened system settings at all. Sure I periodically install a homebrew package or two, but I consider that to be a part of the normal workflow, not an OS tuning thing. All my other daily software and even stuff I don’t use frequently just works.
I know that sounds like I am some kind of fanboy. I assure you I am not. I wish I could go back to Ubuntu where I feel most at home. But the fact that I think so little about my OS that I forget it is there is what makes this so easy for me to stay.
> Backwards compatibility isn’t as important to me as shit just working.
Backwards compatibility is about shit working! You can't have that when a program you want to use stops working because the OS broke backwards compatibility.
I have used an iMac for a long time; every time there is a new version of the OS something breaks, and in some cases there isn't a way to fix it. My 2009 iMac went through several OS upgrades and pretty much half of the installed software doesn't work anymore, and in most cases I cannot even upgrade - either because I bought the software outside of the MAS (which didn't exist at the time I made some of those purchases), or because the developer stopped supporting the software (including some of Apple's own software - e.g. I really like iWeb, though Apple doesn't anymore), or simply because the developer doesn't exist anymore. Or I just prefer the version I have over the version the developer would want me to use (because they bloated it up, added ads, or whatever).
Yeah, when I first started using macOS I had pretty much the same stance as you. It went away when things started breaking left and right over the years. There is no "shit just working" if the OS breaks said shit.
(Of course in the 2009 iMac's case that isn't very relevant anymore, since Apple decided a 3GHz Core 2 Duo isn't enough to run their OS, but it still happens with some newer Macs I have.)
Backwards compatibility is for when software reliant on the OS isn't actively maintained. If it is, developers simply update it and it keeps working. So it depends on what you use in your stack/workflow.
> few of those people you see sitting around typing on their MacBooks are doing anything that will be of value 5, 10, or 20 years from now.
I mean, the people "typing on their MacBooks" combined together have resulted in a company with THE highest value ever in human history, and is copied by all the other companies that you champion.
Graphviz was written on Macs and Linux. Its native OSX container was developed by someone now working at Apple. Most of its code is OS independent. Like everyone, we are not thrilled with MacBooks these days (expensive, not-great keyboards, useful ports removed, and a seeming love-hate relationship with Unix * ), but if you want to run Office and Lightroom and also run typical Python data science code, what else is there? A colleague tried the Windows Subsystem for Linux and it kind of works, but not well enough for what we're doing. It's the same everywhere.
* Here's a fun exercise: try to dump your iCloud passwords in cleartext so you can scan for potentially compromised passwords on forgotten systems (using shell commands, not a proprietary password manager.) OSX doesn't even generate messages to acknowledge that it's designed to not let you do this.
> if you want to run Office and Lightroom, run typical python data science code, what else is there? A colleague tried the Windows Linux system and it kind of works but not well enough for what we're doing
Ubuntu, Debian, Fedora, Arch
The answer to Office/Adobe/gaming is simply don't do it, get someone else to do it, or get a different job. Your time is better spent on stuff like python data science, which Linux supports better than anything else out there.
If you must, run Windows 10 in a VM. If you're doing heavy duty video editing or Photoshop, get a second GPU just for the Windows 10 VM[1].
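For the curious, the heart of that setup under KVM/libvirt is handing the second card's PCI device to the guest. A minimal sketch of the relevant stanza in the domain XML, assuming (hypothetically) that the spare GPU sits at PCI address 0000:01:00.0 - check `lspci` for the real address on your machine:

```xml
<!-- Pass the second GPU through to the Windows 10 guest.
     The PCI address (01:00.0) is an example; find yours with lspci. -->
<hostdev mode='subsystem' type='pci' managed='yes'>
  <source>
    <address domain='0x0000' bus='0x01' slot='0x00' function='0x0'/>
  </source>
</hostdev>
```

The host also needs the IOMMU enabled (`intel_iommu=on` or `amd_iommu=on` on the kernel command line) and the card bound to `vfio-pci` before the guest starts.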
> The answer to Office/Adobe/gaming is simply don't do it, get someone else to do it, or get a different job.
I am not sure recommending that someone get a different job simply because of their choice of operating system is a good idea... I'm a huge Linux fan; both of my personal laptops run Linux, and I have been using it since 2006. But I still know that realistically it's not the best OS for certain tasks, like gaming. That's why my gaming PC has Windows on it.
> try to dump your iCloud passwords in cleartext so you can scan for potentially compromised passwords on forgotten systems (using shell commands, not a proprietary password manager.)
I suppose that depends on where you live or work. Around here there are perhaps 10% of users of any brand who act like that (except in the BlackBerry era where it was part brand, part BBM). The rest just uses what makes sense for their use case.
I currently do part-time consulting at a company that has a floor full of developers. All Apple. Not a single piece of software they develop actually targets the Mac. They do full stack, with cloud and browser targets.
Most of the companies I visited in the last few months offer a choice of systems plus a BYOD budget, but still most people prefer getting a Mac, and again, almost none of them are writing software for macOS or iOS; as in your example, they are mostly working on web or cloud software, or managing *nix servers.
The obvious conclusion is that there is quite a bit of prejudice and bigotry in your worldview.
And if you find this observation unwarranted: you not only made a sweeping generalization against everyone who does X, you authoritatively dismiss all FUTURE potential of their work. It must take no small amount of bitterness to see things that way.
Well looking at where the Microsoft ecosystem is now and where the Apple ecosystem is, I think Apple made the right choice. Apple alone sells more devices running macOS variants than all Windows PC makers combined.
As far as backwards compatibility goes, though - MS’s new ARM-based Surface won’t run 64-bit x86 Windows apps, runs 32-bit x86 apps slowly, and still gets half the advertised battery life of iPads.
Windows on the Surface took 12GB of hard drive space, compared to about 3GB for iOS on iPads.
I said macOS variants - iPhones, iPads, and Macs. Those are three areas where we have recent real numbers from Apple, from before they stopped reporting volume numbers.
I could include Watches and Apple TVs, but those are far from general-purpose computing devices.
macOS is a desktop operating system; the other devices you mentioned don't run desktop operating systems. You have to do an apples-to-apples comparison, because otherwise Linux wipes the floor with every OS if all we are talking about is market share on metal.
Why do you have to compare desktop operating systems? An iPad Pro running the latest OS is faster than most x86 PCs, runs Office and now Photoshop, and can easily take the place of a desktop computer for many people.
But regardless, you are referring to being “successful”, which is what I’m discussing, and success from a business standpoint is profitability. Apple is definitely making more in profit and revenue than Microsoft. As in, Apple had the more successful strategy.
There are plenty of people whose only computing device is a phone, and others who hardly ever use a computer for personal use. Even the iPhone can keep up, performance-wise, with some low-end PCs being sold.
Well, if I wanted to say that, that’s what I would have said....
But I said variants. And even if you take macOS out, the statement stands.
And seeing that iOS is now running Microsoft Office and a version of Photoshop, and is using more powerful processors than most x86-based PCs, we can completely take macOS out and count just iOS.
I said nothing about popularity; I was talking about success. Success in business isn’t marketshare or popularity, it’s profitability. Seeing how little profit OEMs are making selling either Android devices (most of whom are losing money - except for Samsung) or PCs, I wouldn’t be surprised if Apple is more successful selling Macs than all of the PC makers combined, and it’s well known that Apple captures more than 70% of the profit in mobile.
People don't use their phones to write documents or do professional image/video editing or any professional workload, really. And I think market share is as good a metric as any when it comes to defining "success". Profitability is due to many factors, and not always to creating the best product.
Can you spend “marketshare”? Can you sustain a business on “marketshare”?
A company can’t stay viable based on “marketshare”. Nor am I arguing what’s “best”. Do you think Dell, with its “market share”, would rather be in Apple’s position, or Apple in Dell’s?
There are plenty of people whose only personal computing device is their phone, either by choice or necessity. Heck, I am a developer, and the only thing I use my computer at home for is as a Plex server.
It’s on my list to get a NAS powerful enough to transcode movies, and I won’t even use it for that. I’ll run Plex, BitTorrent (to, uhh, download Linux ISOs) and a B2-backed cloud backup app directly on it.
My wife gave her computer to our son because she uses her iPad for everything including Office.
Now that Apple (finally) supports a mouse, I connect my same Bluetooth keyboard+mouse to my iPad that I use for my work computer when I bring it home.
As an outsider, it is very difficult for me to understand Apple's story on whether these devices are, or are not, on the same OS; what its/their "root" was; and what's the future direction (separate OS, or convergence, and if latter which one will form the core and which one will transform). The newest "iPad OS" or whatever it is called does not help matters - again, as an outsider, I cannot tell if this is a marketing differentiation or a true one.
All that being said, I rather thought that iPhones & iPads run "iOS", and Macs run "OSX", and that they were different.
First there were two operating systems: classic Mac OS and NeXTSTEP.
Apple acquired NeXT and combined some parts of classic Mac OS and NeXTSTEP to create OS X.
Apple then stripped OS X down, got rid of some Mac-specific parts, and added some new frameworks to create “iPhone OS”, which shipped with the iPhone. Initially they claimed that the iPhone was “running OS X”.
They introduced the iPod Touch later the same year and the iPad three years later; around that point, they renamed “iPhone OS” to “iOS”.
They introduced watchOS and tvOS as more variants of iOS. There is also some iOS variant running on HomePods.
Over the years the two operating systems have both somewhat diverged and they still share both some old code and new code between the two.
This year, they renamed the version of iOS running on the iPad “iPadOS” as they started adding more iPad specific features.
They also introduced the “Catalyst” framework to bring iOS specific frameworks to the Mac to make porting from iPad to the Mac easier.
Finally, they introduced SwiftUI as a common cross-platform framework for watchOS, iOS, macOS, iPadOS and tvOS.
They all run the same kernel and runtimes, but some diversified distributions have extra runtimes and frameworks for their task-specific implementations.
- Cocoa
- Objective-C
- XNU
It used to be all based on pure Darwin (also from Apple, but OSS). But since the iOS releases it has diverged too much, and Apple no longer wants to backport everything to their own OSS (though they still backport to all GPLv2-and-lower components).
What does “variant” mean? It’s not some obscure word that no one understands. Would it have been more clear if I said Darwin variants? Who the heck knows what Darwin is besides a few nerds?
I didn’t say that MS isn’t good at maintaining backwards compatibility. My mom in fact uses my old 1.66GHz Core Duo Mac Mini (introduced in 2006) running Windows 7, and I just retired a 2.66GHz Core 2 Duo laptop (introduced in 2009) that had been running my Plex server on Windows 10.
I’m saying that it hasn’t been a sound business strategy as the rest of the tech industry has moved on. Windows (not Microsoft) has failed in the cloud, the web browser market, and mobile.
> Windows (not Microsoft) has failed in the cloud, the web browser market, and mobile.
I don't agree about the cloud; Azure is the strongest competitor to Amazon. And in the other businesses there is no competition: mobile and the browser belong to Google.
That’s what I meant. Microsoft hasn’t failed in the cloud - Windows has.
As far as mobile being “Google”: in terms of revenue, it came out in the Oracle lawsuit that Android has made Google only about $33 billion over its entire existence. That's less than the amount that Apple makes in two quarters on iPhones. Google also reportedly still pays Apple $8 billion a year to be the default search engine on Apple devices.
> Microsoft hasn’t failed in the cloud - Windows has.
As a server, maybe.
As a client, not really.
> Android has only made Google about $33 billion its entire existence
Yeah, they are good at hiding profits
and they are an ad company which dominated the mobile market because it was strategic; they don't need to profit from selling (and manufacturing) the hardware, they just need your screen time.
There are between 2.5 and 3 billion Android devices around the world.
And it's almost impossible to own an Android device without Google software on it.
> Less than the amount that Apple makes in two quarters on iPhones.
That's not really true, and iPhone revenues are declining every year.
In 2018 they made $33.36 billion, down 9.2% from the previous year.
If Apple loses the mobile market, it's finished.
But people would still watch YouTube ads on iPhone replacements.
> Yeah, they are good at hiding profits, and they are an ad company which dominated the mobile market because it was strategic; they don't need to profit from selling (and manufacturing) the HW, they just need your screen time
You think Google lied under discovery? Oracle wasn’t just counting their meager hardware sales; they were also counting Google Play revenue etc. as sales.
> And it's almost impossible to own an Android device without Google SW on it.
There is a country that has over 1 billion people running Android with no Google Services.
> That's not really true, and iPhone revenues are declining every year.
And it’s still about the same as MS’s revenue last quarter, at $38 billion.
> If Apple loses the mobile market, it's finished.
Apple is far more diversified than Google. Almost all of Google’s profits come from advertising; 48% of Apple’s revenue comes from the iPhone.
The Mac business by itself is about the size of McDonald’s the last time I checked.
Estimates for YouTube are that it’s barely profitable, if at all.
"Well looking at where the Microsoft ecosystem is now and where the Apple ecosystem is, I think Apple made the right choice. Apple alone sells more devices running macOS variants than all Windows PC makers combined."
This is a fancy way of saying that Apple sells more watches and phones than Microsoft does computers, which is a whole lot less impressive than the disinformation version you wrote above.
When I look at the Microsoft ecosystem (something like 90% of all computers, nearly every corporation and business in the world, etc.), I don't see anything to be ashamed of.
> This is a fancy way of saying that Apple sells more watches and phones than Microsoft does computers, which is a whole lot less impressive than the disinformation version you wrote above.
How is it “disinformation”? Do you think anyone on HN doesn’t understand the statement to mean what I intended it to mean? I said nothing about the computers MS sells (the Surface line). I said Windows PCs in general.
Microsoft sells a lot of Windows licenses to business and consumers but all of the energy from a development and usage standpoint is on the web. I bet most businesses could replace a lot of their computers with Chrome OS boxes and not miss a beat.
Apple sells a >$1000 phone labeled for "Professionals" which comes with 32GB of on-board storage... total. When you upcharge high-end customers $150 for a $15 stick of NAND, yeah, people are more touchy about 8GB of system files.
It’s not about the “system files”. It’s also about RAM usage. Low-end Surface laptops come with the same amount of storage as iPads, but between using x86 chips that are a lot slower and less energy-efficient and the bloat of Windows, it’s nowhere near the same experience.
No iPhone Pro comes with less than 64GB of storage. Do you really want to talk about marginal price and marginal cost? A Windows license has no marginal cost.
> How is it “disinformation”? Do you think anyone on HN doesn’t understand the statement to mean what I said for it to mean?
At least one person[0] didn't understand and publicly stated as much. While iOS was originally spun off from macOS, it's very much its own system, with no ability to run macOS applications on iOS, and vice versa (although the latter is slowly changing with recent macOS releases). Comparing the two is disingenuous at best.
Most iOS apps can run on macOS - the iPhone simulator compiles iOS code natively to x86 and links against an x86 version of the iOS frameworks.
Apple also introduced both Catalyst and SwiftUI to make porting back and forth easier.
And yes “words mean things”. I specifically said “variants.” So if again you want to be pedantic, I could just as easily say that Apple sells more iOS devices than all Windows PC makers combined.
Also, if you want to try to exclude iPads from the iOS ecosystem because Apple now calls it “iPadOS”, iPads can still run iPhone apps.
You can't run After Effects or do software development on iOS. You can't run PhpStorm or Docker or Final Cut Pro. I see the iPad/iOS as usable for a very small slice of professionals.
Guess what? Most people aren’t developers, nor do they run After Effects. They do, however, run Office, and Photoshop was also just introduced for mobile. Heck, most people don’t do anything with their personal computers but “consumption”, and even that has been moving to mobile.
What part of my post were you addressing? My whole argument is that in the long term, MS’s focus on the Enterprise and backwards compatibility caused them to miss out on all of the next waves of tech industry - including the web and mobile.
Even in their crowning achievement, the cloud, they are a distant second to AWS, and Amazon always brags that it runs more Windows instances on AWS than Microsoft runs on Azure.
MS’s revenue is half that of Apple’s and they have lower profits. Heck, last quarter Apple made about the same as Microsoft even if you subtract iPhone revenue.
> My whole argument is that in the long term, MS’s focus on the Enterprise and backwards compatibility caused them to miss out on all of the next waves of tech industry - including the web and mobile.
They missed out on mobile for sure, but it's not because of their focus on the enterprise or backwards compatibility, and you'll have a very hard time proving so. Also they haven't missed out on the web either, and neither can you prove that they did. Just because there is a bigger cloud provider doesn't mean that they 'missed out'. And there will be bigger fish to fry, new technologies are coming out every year.
Apple and Microsoft are in vastly different spaces. Apple focuses on consumer products; Microsoft makes its money from the enterprise. An apples-to-oranges comparison is not meaningful.
You don’t call having to use its biggest rival’s browser engine, and Bing’s position, missing out on the web?
As far as mobile, Apple was able to take its much smaller, less bloated OS and cut it down to a size that could fit on a device with 128MB of RAM, while still having what at the time was a full-featured web browser. Did you ever use IE on a WinCE device?
And are you forgetting that Microsoft was the dominant consumer operating platform before mobile? MS didn’t have any choice but to retreat to Enterprise.
So what “bigger technologies” do you see coming that are bigger than mobile? The smartphone already has an 80%+ worldwide penetration rate among adults.
They run Windows because their customers demand it for their VMs. You really think Microsoft spent 24 years fighting the browser wars and liked just giving up and handing control over to Google?
Microsoft for once acted very differently from its usual self, and Windows Phone repeatedly broke backward compatibility. Arguably even more than iOS did - and this was one of the factors in its failure. So this is not the argument against backward compatibility you think it is.
Besides, if you're comparing MS and Apple, you need to include all the VMs running Windows (it's not a physical device, but MS makes the same amount of money on it), and also MS's foothold in the cloud.
Apple practically does not exist in those spaces, and Microsoft would have been in the same position had it broken compatibility as often as Apple does on the desktop.
> The obvious conclusion: few of those people you see sitting around typing on their MacBooks are doing anything that will be of value 5, 10, or 20 years from now.
Maybe to a troll or someone with an axe to grind, but my point of view is different.
If you’re doing the same exact thing 5 years later (let alone 10 or even 20) without any tweaks in your process, then frankly you have a job a robot should be doing.
I’ve got production code I wrote twenty years ago still up and running, despite multiple upgrades to the language and the hosting infrastructure it runs on. Should that be the case? Not in my opinion. Business needs change, and that code should’ve been updated. Instead developers just added more around it, and that’s how you get bloated beasts like MS Office, Photoshop, etc.
> If you’re doing the same exact thing 5 years later (let alone 10 or even 20) without any tweaks in your process, then frankly you have a job a robot should be doing.
Banks and insurance companies tend to disagree with you.
Maintaining software is just like maintaining buildings: if you don't, they fall apart, but mostly it's just about checking that everything is still the same as when it was built.
You don't change elevators in a building just because the old model is not supported anymore.
There's no concept of "not maintained anymore" for elevators.
So maybe the poster was trolling, but it is true that you cannot rely on Macs if your software has a predicted life span longer than a couple of years.
> Business needs change
Again, many established businesses work because they don't change much over time.
They just keep doing what they do best.
> that code should’ve been updated.
That code worked; why on earth risk introducing new bugs?
I've worked on software packages made of millions of lines; you don't just update them because your supplier can't be bothered to support your workflow for at least 10 years.
Even GitHub, a modern fast-changing company, was running on Rails 3.2 (released in 2012) until September 2018, when they switched to 5.2 - and it took a tech company with some of the smartest engineers around, including Rails contributors, a year and a half.
"If you’re doing the same exact thing 5 years later (let alone 10 or even 20) without any tweaks in your process, then frankly you have a job a robot should be doing."
Find me a robot that can navigate 4x4 trails and identify rocks with accuracy.
None exist.
And that kind of prospecting hasn't changed in centuries.
Those are some very naive assumptions you've got there, padawan. Sure, some software needs to fade away, but rewriting something just for the sake of rewriting it is stupid.