Guardian Talks iPod and Mac History

I took a look at this story at the Guardian earlier today, which relates Apple's decision not to license the Mac to the current state of things with the iPod. There have been a number of stories with the same general conclusion: not licensing the Mac was a mistake and Apple should license its music products.

That's the mainstream, headline-only wisdom about the history of the Mac, but I think it's wrong and really misses the point. Apple's mistake with the Mac was removing Steve Jobs from the project and the company. He was the heart of both. The executives that followed him just didn't understand the idea of the thing. It's no coincidence that the Mac experienced a major resurgence in the years following 1997.

Now you could argue that Steve Jobs getting fired is the best thing that could have happened to Apple in the long run, since he had time to go out and start NeXT, which would lead to Mac OS X and a new generation of software. It also meant that he had time to buy Pixar before returning to Apple and driving it to make the iPod.

But back to the article. The author tries to paint a picture that the desire for "control" is at the root of Apple's past missteps.

The whole history of Apple Computer is a series of astonishing successes followed by disastrous collapses. And two clear threads run through each cycle. One is chief executive Steve Jobs. Co-founder in 1976, kicked out of the company in 1985 [...] Then there is Apple's corporate culture, which is independent, creative, imaginative and determined.


This is incredibly misleading. At the bottom of the story, the author highlights what she feels are Apple's top three successes: the Apple II, the Macintosh, and iPod/iTunes. Guess who was running the company when all of these were developed? Guess who was the driving force behind at least two of them?

The author then lists what she feels are the top three failures: the Lisa, the Newton and the G4 Cube. Steve was at Apple for the Lisa, but shifted his attention away from it towards the Mac. He wasn't even at Apple for the Newton (that was Sculley's). The Cube did happen recently and was too expensive, but it did pave the way for the Mini.

So this doesn't help her point much. Statistically, Steve's been right about technology more often than he's been wrong. It's also worth noting that Apple's "independent, creative, and imaginative" culture was sorely lacking in Jobs's absence, but this isn't really a story about Steve. It's about the Mac and the iPod. The author continues:

It is a little-known fact that in 1985, Microsoft chief executive Bill Gates wrote to Apple pleading with it to license the Mac operating system. Apple refused.

It had built its business by keeping strict control of both the hardware (the computer itself) and the software (the operating system), and feared the money from software licenses would never cover lost hardware sales. It was the biggest mistake Apple ever made.


Now let's meditate on this for a moment. These articles tend to have a business slant. Why? Because the business side of these issues at least appears easier to understand, and business people actually talk to the press. Unfortunately, this leaves out the most important part of the story: the hardware and software. In other words, the stuff that people actually use.

Microsoft won from a business perspective, but was it worth it? Did they really win in the grand scheme of things? More to the point, if they did win, was it because they licensed Windows, or because of their general ruthlessness at a time when the entire industry was naive to their intentions? IBM (the original Microsoft) in its earlier days monopolized the industry without licensing its products to competitors.


The State of Things Today

In my opinion, Windows computers are hopelessly complicated and frustrating. They've pushed more than a few people to the edge of insanity.

How did this happen? I think it's because Microsoft and the hardware vendors have always had too much faith in the ability of programmers to write enough code to solve any business problem. There's this idea that it's practical to have Windows run on a wide variety of different hardware from different manufacturers and still give the consumer a good experience. If something doesn't work, it's just a bug to be fixed.

I think this is a lie. There's no way to fix bugs at a fast enough clip to overcome the fundamental issue. Either Microsoft and its licensees didn't anticipate all of the user experience problems, or they just didn't care. Either way, these issues have caused millions of people endless calls to tech support, wasted days, and miles of needless frustration. So why is this so rarely addressed by journalists? Because they don't know any better. In fact, most executives don't know any better.

Using the tools we have today, I just don't believe it's possible to create easy-to-use, appealing computers with the wintel-style business model. That is, one company makes the OS and applications and about a billion other companies of various levels of competency attempt to make hardware that works well with this software. It's classic Design by Committee.

In this equation, there's no true continuity or accountability. Even on their best days, many hardware vendors just don't have the experience or culture to make a computer product that's easy to use and predictable, even if they are in control of the software. Take software control out of that equation, and all bets are off.

A consumer believes that when they're buying a computer, they're buying a single, complete product. If they buy an HP computer, they believe that HP is actually responsible for their experience. In reality, they're buying two separate things -- a computer and an operating system -- that were developed independently. The fact that this is acceptable is somewhat unique to the computer industry.


Video Games

All of the consoles from "big 3" game companies are proprietary. You need to obtain a license to publish software for a console and there's only one manufacturer per platform. In other words, there's no "Panasonic Xbox" or "Sharp Xbox." Microsoft didn't license the Xbox OS and name to other manufacturers.

Why is this? I believe a big part of it is that consumers simply wouldn't tolerate the hassles that arise from multiple manufacturers implementing a spec in different ways. QA is hard enough for games these days, but supporting multiple implementations of the hardware would be an even bigger burden. PC gamers have tolerance for patches, drivers and updates. Console gamers (that is, most people who play games) do not.

On top of that, there are the really basic experience issues. Consumers wouldn't cope well with different manufacturers employing different on-screen interfaces, peripherals, and manuals. It has to be easy to use, and a single, consistent implementation is (for now) the only reasonable way to guarantee that.


Draw a Line

If this is the conventional wisdom for game consoles, why is it not for computers? After all, the stakes are much higher. For some reason, technology companies and journalists just don't draw a line between point A and point B. To me, it seems obvious: wintel computers can't be easy to use because they're set up for complexity.

Microsoft would say that the Windows model is best for the consumer, as choice enables them to get the lowest prices for the best specs, but does this matter if the experience is frequently frustrating? The wintel model does not encourage good experiences. It's also particularly ironic that Microsoft's PR talk about providing choice in hardware is cleverly designed to divert attention away from the lack of choice in software.

Let's say that Apple had licensed the Mac to other manufacturers after Steve left the company in 1985. Or what if Apple hadn't ended the Mac clone market in the late nineties? Where would we be today? I think we could find ourselves with a very Windows-like situation on our hands. Consumers would be unknowingly buying Macs from people that had no business selling them. We even saw this with the clones that did ship.

We'd have no true oasis from the wintel model. Just one, big chaotic landscape where everyone tries to smash incompatible hardware and software together. As a result, I think Windows would be in a worse state because it wouldn't look so bad next to the Mac.

Thankfully, we do have a choice. People that value their sanity can buy a computer -- a complete product -- built by Apple. In the same sense, you can buy an iPod and get a complete, working hardware and software system for your music.

The public embraced the iPod/iTunes solution because it's easy to understand and it's consistent. The fact that it makes money in the face of P2P networks is a testament to how important convenience is to the equation. Unfortunately, executives at various stakeholder companies don't quite get this, and they want Apple to fragment the system at the cost of usability.


So Licensing is Bad?

Can everyone pull off this strategy? Absolutely not. Apple is in a unique position. You have to know what you're doing and have real conviction to make something like the Mac work. Apple is willing to make radical and controversial choices to get to where they want to be. Most organizations just don't have this sort of culture.

It's popular to use the word "control" when describing Apple's approach to these things, but that has an overly negative connotation. I think the assumption on the part of the media is that Apple's hardware + software strategy is exclusively for business reasons -- that they want to make money from both.

While that's certainly true, it only takes a quick look at history to see the deeper motivation: Jobs has no tolerance for things that don't just work. The best hope Apple has of making something that works properly is to make it themselves. Apple's agenda may not be as much about control as it is about responsibility for the experience.

This is why iPod, iTunes and the Music Store succeed. They're all designed to work with each other, and they're created with the idea that the best thing technology can do is get out of the way. This is what consumers want, and this is something that pure technology or pure media companies can't do.


The Other Side
    
Of course, the business people don't want to hear any of this. The business culture is one of licensing fees, joint press releases and stock photos of handshakes. Rarely does any of this make technology easier to use. Human factors are too often an afterthought. They don't spend a lot of time on these things at business school.

Contrary to what several journalists have said about Apple recently, licensed platforms don't always win. The video game industry is a prime example. It's anything but open, but it's radically profitable. This wouldn't be the case if consoles were a chore to use. Also, let's not confuse "licensed" with "open." Microsoft tries to use them interchangeably, but they're not the same.

Apple could probably make a lot of money in the short term licensing the music products, and I'm sure the executive team realizes this. The question is whether that's really the smartest play if usable technology is the goal.

Apple seems to make a conscious effort to not be reactive. It's easy to forget, but more than a few predicted iTunes' demise because the Windows version didn't come out right away. The press was sure that somebody would swoop in and capture the Windows market before Apple got there.

Good, usable, reliable technology just can't be created by -- or trusted to -- a committee. The iPod wasn't the first MP3 player, and iTunes wasn't the first music store. So why did the two succeed? Because Apple knows that you have to take responsibility for the complete solution.

It may seem that all of this is a rather elaborate homage to Apple, but really my reasons for writing this are much more selfish. I don't want to live in a world where all computers and devices are complicated and intrusive. It just so happens that Apple is one of the few groups that actually get this.
Posted Nov 4, 2005 — 3 comments below




 

Chris — Nov 04, 05 508

You are correct, sir. I've long been annoyed with those who think Apple's mistake was not licensing the OS. This conventional wisdom was wrong in 1985, 1996, and 1998, and it continues to be wrong.

IMHO, Apple's biggest original mistake, aside from not handling Steve Jobs better, was overpricing the Mac. Andy Hertzfeld makes it clear that this was Sculley's decision in 1984, and it continued until he was ousted in the early 90's. By that time, the total lack of design "taste" engendered by Steve's absence -- plus an influx of useless middle management and a weak board of directors -- was making Macs look like marginally upscale Dells. To say nothing of the ridiculous OS situation.

(*Steve's* biggest mistake, of course, was John Sculley. Good marketing guy, worst possible CEO.)

My recollection (from reading various books) is that Steve was removed from the Lisa project because he was being perceived as a nuisance.

Uli Kusterer — Nov 05, 05 513

Generally agree, though from the information I have right now, I wouldn't say it was a mistake to oust Jobs the first time. He really had some deficits when it came to business sense, and the experience he gained between being thrown out of Apple and his return during the NeXT takeover apparently made him a more rounded person.

Jason — Nov 06, 05 514

I wrote a letter to the Guardian complaining about the simplistic nature of this article and the reinvention of the Apple is dead 'story' (non-story). It's very boring to see it reinvented. If Apple struggle, they are on the way out. If Apple succeed, it'll all end in tears. Good to see it properly dissected and the 'truth' about licensing Mac OS challenged (look what's happened to Palm...!)




 

Comments Temporarily Disabled

I had to temporarily disable comments due to spam. I'll re-enable them soon.





Copyright © Scott Stevenson 2004-2015