A friend of mine found himself working for a school in the mid-1990s, deploying a new lab of Apple Mac computers intended for teaching graphic design. Before the deployment, he went to the technology decision-makers and told them that they could build a much faster lab for half the cost, since their Adobe software could also run on PCs. He also noted that Macs played well only with other proprietary Apple stuff, whereas PCs played well with most technologies and had a much longer useful lifetime thanks to their upgradability. It was where the industry was moving, and it made logical sense to move in that direction.
They were stupefied. All they knew was Apple, and moving to something else was blasphemy. In the end, they deployed a very slow and expensive lab of Apple Macs.
In my circles, Apple was a footnote during the rise of the Internet and networking in the 1990s. We called them beige toasters, because they were beige in color and about as powerful as a toaster. Most people in the tech industry viewed them as expensive tinker toys for people who feared technology (or two mouse buttons ;-). Since Apple had its own proprietary versions of most things, including connectors, cables, file formats, and protocols, it was a pain to get Macs to work with anything outside the Apple ecosystem. And their OS (typically Mac OS 8/9) had a lot of issues, especially with certain programs such as Internet Explorer for Mac (at the time, Internet Explorer was the best Web browser, but it was prone to crashing on Mac OS if your hardware wasn't up to it).
Of course, that changed in the early 2000s with the release of the Mac OS X operating system, which was essentially a rebranded version of NeXTSTEP, a UNIX-based OS. Because Mac OS X was UNIX, it was fundamentally open: you could easily port and run a plethora of open source software written for UNIX/Linux, or customize the operating system to suit your needs. Plus, it played well with everything else on the network.
Ditto for Apple hardware in the 2000s. While Apple used relatively obscure PowerPC CPUs in the early 2000s before switching to Intel, nearly all of their other hardware was non-proprietary and easily upgradable. Plus, their laptops at the time had a multitude of different ports, an Escape key, and standard Phillips screws. Apple even published step-by-step visual guides on their website for upgrading or repairing just about anything.
They were still more expensive than equivalent PC hardware, but not tremendously so, and the added build quality alongside their no-questions-asked warranty policy (at the time) justified the extra cost.
To tech-minded people in the 2000s, Apple was no longer this closed, proprietary, expensive tinker toy. They were a decent UNIX workstation manufacturer that fostered openness. And I purchased my first Mac in 2003 as a result.
The iPhone and iPad made Apple a massive company and encouraged people to buy Mac computers, especially software developers who wanted to create mobile or Web apps. If you had asked me in 2012 what computer to buy, I would have said “Mac,” because their hardware, operating system, and support were excellent at the time.
But since then, it seems as if Apple has continually made decisions that move it back toward the 1990s. Rather than adopting open standards in their operating systems, they started developing their own, intended to work only within the Apple ecosystem. More and more parts of the operating system became closed and unalterable. On the hardware side, prices kept rising to reflect their newly attained brand cachet, and corners were cut to squeeze out even more profit. Big hardware quality problems started popping up, resulting in class action lawsuits and lengthy repair programs. Expandability and upgradability of the hardware were reduced or eliminated altogether. Independent repair shops were threatened with legal action, while Apple stores told customers that small repair issues would cost thousands of dollars to fix (to encourage them to buy a new Mac). And fixed storage is now held hostage by a security chip that prevents you from installing non-macOS operating systems like Linux natively on the hardware.
In short, Apple’s earlier openness has been crushed by their desire for a closed ecosystem.
And in today’s age of open source, it’s clear that the future is open, and Macs are starting to look more and more like expensive tinker toys for rich kids.