Yesterday James, my co-founder at Percolate, sent me a really interesting nugget about how Apple structures its company, about 35 minutes into this Critical Path podcast. Essentially Horace (from Asymco) argues that Apple’s non-cross-functional structure actually allows it to innovate and execute far better than a company structured in a more traditional, cross-functional way. As opposed to most other companies, where managers are encouraged to pick up experience across the enterprise, Apple encourages (or forces) people to stay in their role for the entirety of their career. On top of that, roles are not horizontal by product (head of iPhone) but vertical by discipline (design, operations, technologies), and also quite siloed. He goes on to say that the only parallel he could think of is the military, which basically operates that way. (I know I haven’t done the best job articulating it; that’s because, as I listen again, I don’t necessarily think the thesis is articulated all that well.)
Below is my response back to James:
While I totally agree with what he says about the structure (that they’re organized functionally and it works for them), I’m not sure you can just conclude that’s ideal or drives innovation. The requirement of an org structure like that is that all vision/innovation comes from the top and moves down through the organization. That’s fine when you have someone like Jobs in charge, but it’s questionable what happens when he leaves (or when this first generation he brought up leaves, maybe). Look at what happened when Jobs left the first time as evidence for how they lost their way. Apple is a fairly unique org in that it has a very limited number of SKUs and, from everything we’ve heard, Jobs was the person driving most, if not all, of them.
My question back to Horace would be what Apple will look like in 20 years. IBM and GE are 3x older than Apple, and part of how they’ve survived, I’d say, is that they’ve built the responsibility of innovation into a bit more of a cross-functional discipline + centralized R&D. I don’t know if it matters, but if I were making a 50-year bet on a company I’d pick GE over Apple, and part of the reason is that org structure and its ability to retain knowledge.
The military is actually a perfect example: Look at the struggles it has had over the last 20 years as the enemy stopped being similarly structured organizations and became loosely connected networks. History has shown us over and over that centralized organizations struggle against decentralized enemies. Now the good news for Apple is that everyone else is pretty much playing the same highly organized and very predictable game (with the exception of Google, which is in a functionally different business, and Samsung, which, because of its manufacturing resources and Asian heritage, exists in a bit of a different world).
Again, in a 10-year race Apple wins with a structure like this. But in a 50-year race, in which your visionary leader is unlikely to still be manning the helm, I think it raises a whole lot of questions.
Matthew Yglesias makes a decent argument that Apple Maps, while a terrible product, is succeeding at its intended goal:
To get out of that bind, Apple has never needed to make a product that’s actually superior to Google Maps. What they’ve needed to do is produce an application that clears two bars. One is that it has to be good enough that your typical doesn’t-care-too-much phone consumer doesn’t reject iOS out of hand. The other is that it has to be good enough such that if Google doesn’t want to lose the entire iOS customer base it has to scramble and release a great Google Maps app for iOS and not just for Android. Apple’s Maps app easily clears both of those bars. Before the release of iOS 6, the inferiority of Apple’s Google-powered iOS Maps app to Android’s Google Maps was a real reason to prefer an Android phone. Today, there is no such reason. Not because Apple Maps is as good as Google Maps, but because Google Maps for iOS is as good as Google Maps for Android.
This was actually part of the original Chrome strategy as well. While Google released the product because long-term they couldn’t afford to have their biggest competitor (at the time) controlling the majority of their usage, they also did it to push Internet Explorer to innovate so that Google could deliver a better and faster experience for its customers. By entering the browser market, Google was able to light a fire under Microsoft in a way that a product like Firefox never could, and the versions of IE that followed were a thousand times better than what had existed before.
Ran across an interesting quote (reportedly) by Jony Ive about the difference between measurable product attributes (speed, hard drive size, etc.) and non-measurable ones:
But there are a lot of product attributes that don’t have those sorts of measures. Product attributes that are more emotive and less tangible. But they’re really important. There’s a lot of stuff that’s really important that you can’t distill down to a number. And I think one of the things with design is that when you look at an object you make many many decisions about it, not consciously, and I think one of the jobs of a designer is that you’re very sensitive to trying to understand what goes on between seeing something and filling out your perception of it. You know we all can look at the same object, but we will all perceive it in a very unique way. It means something different to each of us. Part of the job of a designer is to try to understand what happens between physically seeing something and interpreting it.
I think about this a lot. One of the things that inspired Brand Tags originally was a similar quote from my friend Martin Bihl’s 2002 AdWeek article: “The way I look at it, a brand only exists in the consumer’s mind. That other product isn’t a brand yet because consumers don’t really know about it. It’s still a product.”
The most interesting part of this interview with Horace of Asymco about the Surface is his take on the difference between how Apple and Microsoft view tablets:
I have some guesses but I don’t think it’s something that is defensible. Too many things can change. Fundamentally I believe Microsoft sees the tablet as a PC and intends to migrate a substantial portion of would-be PC customers to tablet forms. If they are successful then they preserve the existing PC user base and allow it to grow a bit.
In contrast Apple sees the iPad as a new type of device that is used for things not directly related to PC style computing. In that sense the iPad competes with PC non-consumption. It means people may own both a PC and an iPad and some will own only an iPad. The iPad will expand the market while taking share from the PC. Windows tablets will try to hold the Windows share steady.
Douglas Rushkoff asks some interesting questions about the lengths we’re going to in the patent battle between Apple and the rest of the industry:
But when it comes to gestures, such as the now ubiquitous “pinch and zoom” technology through which users stretch or shrink pictures and text, well, that no longer feels quite the same. They are gestures that may have begun on the device, but which have become internalized, human movements. When my daughter was three I used to watch her attempt to enact those same swipes and stretches on the television screen – a phenomenon so prevalent that many television dealers now keep a supply of Windex handy to clean their giant flat screens of children’s fingerprints on a regular basis.
I’m assuming you’ve heard this, but the other day Microsoft announced a new tablet they’ve developed, and it has lots of people talking. Anyway, I had one thought I wanted to share, roughly based on this quote from The Verge:
There is a gray area that exists for me with the iPad. I love using it to read, to browse the web, to share content, to occasionally create content. But there is a moment when I have to put the iPad down and grab my laptop. I travel with both. I keep both nearby when I’m at home. And I think this is true for a lot of people (it’s certainly true for a lot of people I know in the tech press).
Basically, what I find most interesting about Surface is that it seems to be a nothing-to-lose move and exactly the sort of thing Apple wouldn’t do. By that I mean Apple created a new computing category with the iPad: It put a computer in a place (bed or couch) where it never really existed before. This wasn’t a replacement device; it was additive. I, like many I know, use both an iPad and a laptop, and Apple’s laptop business is doing pretty well for them. Surface tries to imagine a future where there is just one device. It’s not to say it will happen, but it feels like something Apple wouldn’t do, and for that I applaud Microsoft.
After reading about Apple iOS VP Scott Forstall selling off almost $40 million in shares, I was curious to learn more about him. I found this BusinessWeek profile from last October that had an interesting tidbit about the Apple ecosystem:
Yet even critics don’t deny his accomplishments or ability to troubleshoot. Before the introduction of the iPhone, Forstall supported Jobs’s view that Apple didn’t need to create an ecosystem of third-party developers. Back then they figured the device would stand out for combining a phone with an iPod plus a superfast browser. For the most popular activities—watching YouTube videos, for example—Forstall’s team would simply partner with market leaders such as Google (GOOG) to create apps built specifically for the iPhone.
That worldview changed fast, as consumers began tweaking their iPhones to run unauthorized apps from hundreds of developers inspired by the new device. Forstall oversaw the creation of a software developer’s kit for programmers to build iPhone apps as well as an App Store within iTunes. Forstall’s flexibility impressed even his rivals. “Scott’s a pretty amazing guy,” says Vic Gundotra, a senior vice-president at Google. “In terms of running an operating system team, he’s one of the best I’ve ever seen.”
Forgot that the ecosystem wasn’t there from the start.
Felix Salmon has an excellent piece breaking down the whole Mike Daisey/This American Life thing and what it really means. Included is this quote from Rebecca Hamilton, author of Fighting for Darfur:
To build a mass movement quickly, it helps to have an over-simplified, emotive narrative with a single demand. It also helps to tell people that by doing easy tasks – sharing a link on Facebook, buying a bracelet – they can save lives. Central to the formula is that the agency of local actors gets downplayed to hype up the importance of action by outsiders. But all those ingredients inevitably lead to eventual failure when the simple solutions can’t fix the complex reality. The movement walks away, disillusioned. And in the meantime untold resources have been expended on solutions that have been out of step with what local activists need.
Steve Jobs was an asshole. That seems to be the overwhelming conclusion of anyone who read the biography. Genius for sure, but also not very nice and a fairly tortured soul. I used to worry that people’s takeaway from the Jobs era was that managing by being a massive jerk was the way to go, but I actually think we are past that … Anyway, that’s all a long-winded intro to this paragraph about Jobs that I would have agreed with 8 months ago (but still think is well said):
The biggest thing that bothers me about the “Cult of Jobs” is that people often seem to mistake the unfortunate, frequently counterproductive, side effects of the personality that made him great for the very cause of his greatness. Steve has long been, and always will be, one of my heroes, but I really worry that an entire generation of entrepreneurs is learning the folkloric lesson that the secret to success is to be a mercurial asshole who abuses everyone and listens to no one. There’s a reason people like Steve start successful companies: because they believe in themselves, envision their success unwaveringly, and don’t compromise. But there can be a dark side to that fanatical self belief: a disdain for the ideas of others. I think there are a lot of reasons for Steve’s late-in-life success at Apple, but I suspect one of the biggest is that he finally managed to surround himself with brilliant people (like Chiat Day’s Lee Clow) who knew how to handle him, curb his worst tendencies, and present important ideas to him in a way that he would accept.
Malcolm Gladwell’s New Yorker review of the new Steve Jobs book is excellent. In it he makes a point I haven’t seen elsewhere, essentially categorizing Jobs as an innovator, not an inventor (Gladwell calls him a tweaker, but who’s counting):
In the eulogies that followed Jobs’s death, last month, he was repeatedly referred to as a large-scale visionary and inventor. But Isaacson’s biography suggests that he was much more of a tweaker. He borrowed the characteristic features of the Macintosh—the mouse and the icons on the screen—from the engineers at Xerox PARC, after his famous visit there, in 1979. The first portable digital music players came out in 1996. Apple introduced the iPod, in 2001, because Jobs looked at the existing music players on the market and concluded that they “truly sucked.” Smart phones started coming out in the nineteen-nineties. Jobs introduced the iPhone in 2007, more than a decade later, because, Isaacson writes, “he had noticed something odd about the cell phones on the market: They all stank, just like portable music players used to.”
I know I must sound like a broken record at this point, but I feel like the distinction between invention (creation of a new thing) and innovation (commercialization of an invention) is a great way to understand how things really come to be.