I am an unduly blessed person. My talk, "Shakespeare in Dev" from Øredev this year was, in part, about just that: how fortunate I've been. I mentioned, for instance, that I got my really-real start in "computers" due to a mix-up in paperwork. I was intended to be working in a warehouse, at Compaq, unboxing returned computers and putting them on a conveyor belt, from 11 PM to 7 AM. I didn't escape the night-shift, but the paperwork mix-up did put me on the other end of that conveyor belt, in charge of refurbishing the returned computers. I had not seen the inside of a computer, but for a brief experiment in swapping the 5.25" floppy drive in my 486SX for a 3.5" floppy drive, about ten years earlier.
A similar opportunity came about six months ago. Co-worker Jesse Cravens asked me if I'd be interested in co-writing a book with him about Ember JS, a book we're hoping to have finished before year's end. That, and Jesse's extensive conference speaking experience (he's done 12 this year, and he's not done), led Øredev to invite him to speak at their conference this year. Jesse was kind enough to ask if they'd be interested in having the two of us co-present on our book, and the Øredev organizers were beyond kind in extending the invitation to include me.
They like to ask their speakers, many of whom they're flying into Sweden from all over the world, to do two talks, so as to get their money's worth. The theme of the conference this year was "The Arts": the parallels between the programmer's craft and the artist's, as well as the pure inspiration that can be drawn from the arts.
I submitted a few ideas for talks that I could do, trying to stick to that theme, and the one they chose was "Shakespeare in Dev." It was pitched as a survey, a crash-course in the use of story-telling in user experience/interaction design.
The trouble was, right off the bat, I felt like a fraud. I have some experience. I have some knowledge. But I didn't feel like I would be able to talk for 50 minutes to a crowd of people who could easily be ten times as knowledgeable and experienced in user experience design. I decided I would study up. I'd pick up all the books, and become an expert over the summer.
For starters, this was a bad plan. A really bad plan. I have two children and a full-time job, for Pete's sake. I barely had time to write a talk (I didn't, really), let alone read 15 books first and then write one. Not to mention that part-way into this period, I decided to take on some freelance work. Goodbye, free time.
This worked out for the best, though. If I had found or made the time to read all 15 books and synthesized them into a succinct overview of the topic of story-telling as a design tool, I would have succeeded in creating a painfully boring talk. It might have been insightful. It might have been well-informed. It would have been boring. I wouldn't have provided much, if anything, that the audience couldn't have had by reading the same books.
Thankfully, at the last minute (I finally finished up my freelance work just a couple weeks before the conference), inspiration struck. I realized that the aspect of story-telling most interesting to me was that it's a discipline-agnostic tool. It's not a super-specialized tool that you have to spend years honing in order to be good at it. It's not like oil-painting, or using Photoshop, or mastering object-oriented programming. I believe it's a skill nearly universal among humans. Some of us are better at it than others, and you certainly can hone it into a skill like oil-painting. But at basic proficiency, the average person can tell a story that has a profound impact on the world around them.
I was interested in how this could affect the dynamic among a team working on a software project. It's difficult for everyone on the project to have an impact, or to even participate in the same discussion, much of the time. Story, I thought, might be just the thing, just the tool—one that everyone on the team can implement with roughly equivalent skill—to be shared by the whole team, giving everyone equal impact on the work to be done.
While I was thinking about all of this, I reflected on my own career, on the saga of developer versus designer versus QA reviewer versus project manager. I remembered the day it all clicked, the day it all began to make sense to me, because of something a designer named Michael Chang said to me, a story he'd told me.
That was it!
I started over, after spending four or five nights on the talk, about half the time I had left. But I felt hopeful, now, finally, that I had something to say. If I'm telling my own story, there's no right or wrong. There's nobody with more expertise in the room on the topic. And, best of all, I was now telling a story, not just talking about telling stories.
This was my first time speaking at a conference. I was terrified. I've spoken publicly plenty of times, at company functions, at church. I've acted in plays and skits in front of large groups. I've played guitar and sung in front of relatively large crowds a couple times at least. Nonetheless, I was terrified. I barely slept the night before, and my stomach was upset the whole day of my talk. I barely ate.
And to put a little bow on it, my two talks—my personal talk and my joint presentation with Jesse—were back-to-back. Well, at least I'd be done afterward!
Oh, and they put me in the big conference hall—the same one they were using for the keynote presentations.
I'm told it went well. Watching the video, I'm pretty happy with what came out. It's not perfect, by any means, but I do think it's about the best I can do right now.
If you're reading this and considering talking at a conference, here's what I wish I'd known, or thought about:
- Have something to say. Figure out what someone attending your talk needs to know, or what you most want them to know, and figure out whether it really matters. If not, start over. While you're at it, have something unique to say. If the conference attendee could get your content somewhere else, why would they have paid likely thousands of dollars in conference admission and travel expenses to hear it? If your content isn't unique in some way, start over.
- Learn how to use a microphone, and be aware of it. Don't be shy. Make it easy for your audience to hear you. Don't blow your nose, scratch your beard, etc. into the mic. If you know you have a cold, work out with your sound guy how you'll mute the mic when you need to blow your nose, use your handkerchief, etc.
- Don't compare yourself to other speakers. This should be an idea meritocracy, if there ever was one.
- Do entertain. Don't be boring. You don't have to tap-dance, but it doesn't hurt to relax a bit, share something personal, crack a joke (but don't force it) or generally demonstrate that you don't take yourself too seriously.
- Your slides matter. You may not be a designer. You may fancy yourself a designer. Either way, get some help. Ugly, confusing slides are going to distract your audience. NO CLIP ART. NO CHEESY STOCK PHOTOGRAPHY.
- Don't look at your own slides. Definitely don't read them. Look at your audience. Presenter's notes are okay, but don't read them. Practice your talk enough that you pretty much have the outline memorized. Put the outline in your presenter's notes. Use them to help you remember where you are, maybe even specific data points (any actual hard numbers you need to get exactly right, quotations, etc.), but do not read them out loud.
- Keep your slides simple. You don't need transitions, builds, and all that stuff. Once you're good at using slides without that stuff, you'll have a better idea of when to use them. Don't use sound or video unless you absolutely need them. The audience wants to hear you, not something canned that they could have heard without spending all that money to get here.
- Bring some water. If you're nervous, it can save your life.
- Consider the possibility that people are going to ask questions afterward, either while you're still on-stage or otherwise. If you can't answer a few questions well, you can squander whatever credibility you built up during your talk. You can end up discrediting yourself and your talk.
- Give yourself a break. It's harder these days. With Twitter and such, you can't fail in private anymore, but neither can anyone else. Your first talk might suck. That might be enough for you to figure out this isn't your deal. It might be motivation to practice. Either way, it won't end your life or career. And, you might not suck! You definitely might not suck as much as you think you did!
For reference, you can watch my talk, "Shakespeare in Dev," here.
Pro tip: If…
- You're in, say, the top third of the S&P 500
- Your last report card wasn't so good, or you think your next one might not be
- Wall Street shuts down for a day or two due to a storm
Let's just say maybe it's a good time to start buttering up your LinkedIn "friends."
Just ask Scott Forstall and John Browett, who, at least by the end of the year in Forstall's case, no longer work for Apple.
This is big.
Big enough to pull me out of an unplanned hiatus from the blog. Family emergency. My mother-in-law is recovering quite well, if slowly, from cardiac arrest while competing in a—not her first—triathlon.
Big enough that it brings a pretty obvious final answer to a series I'd been working on that could have been titled, "Who the hell is in charge of User Experience at Apple?" (Part 1: "What's missing from Apple's Org Chart?" & Part 2: "Apple and The CXO")
It's clearer now than it has ever been who it was that answered for user experience design across Apple. Just look at the language of that press release, on what responsibilities are being transferred from Forstall to Ive:
Jony Ive will provide leadership and direction for Human Interface (HI) across the company in addition to his role as the leader of Industrial Design. His incredible design aesthetic has been the driving force behind the look and feel of Apple’s products for more than a decade.
Who's the CXO at Apple? Well, now we know. It's Jony Ive. The oh-so-obvious, but oh-so-wrong answer that so many people would have offered for so many years now has become the right answer.
And I think this might be the biggest news since October 5th, 2011.
I don't think Forstall called all the UX shots for all his time as VP (later senior VP) of iOS. I believe that while Steve Jobs was alive, this—along with whatever else he cared about that day—was Jobs's purview. But in the classic business-organization sense of "a throat to choke," that throat had been Forstall's, I think, for some time.
But the most dramatic subplot of this whole story is Sir Jony Ive's. Ive doesn't have any UI/UX design experience under his belt, at least none that anyone knows of. He has seen great success designing hardware for Apple for two decades, but from what we can see from the outside he hasn't touched a pixel.
This could go two ways. In the first scenario, Ive turns out to be an incredible UI/UX designer as well as an industrial designer, or, perhaps more likely, proves able to lead a team of UI/UX designers effectively. In the second, he is terrible at the job, or merely mediocre, either of which would have the same outcome.
Apple is more vulnerable than it has been in some time. If it turns out that Ive is no good at this, Apple's reputation will be shaken like it hasn't been since the Newton. People already have high expectations of the man who was knighted for being such a design badass. People are looking for someone on whom to pin the Steve Jobs legacy. If Ive fails, "beleaguered" will be the nicest thing the press says about Apple, which will certainly catch the attention of both customers and Wall Street.
There is a hell of a lot (around $604 per share at the moment, not to put too fine a point on it, but, of course, Wall Street is closed for a few days) riding on that young man right now. For once I don't envy him.
This must be a first. PetaPixel reports that Ive will be designing a very limited edition Leica camera.
I don't recall Apple ever loaning out key team members like this. Ever.
This is for a charity event, and it does involve Bono, so all of the "just this one time" flags are flying, but, this is very unusual.
If I were paranoid, I'd say Ive was looking for greener pastures. There was a kerfuffle last year over reports that Ive really wanted to leave Apple and move back home to the UK, so that his children could go to school there. Of course, this May, when he was knighted, he said he's staying put.
Apple doesn't let stuff like this happen without thinking it through. Friends who work there have reported that they can't speak at any public event as an Apple employee without clearance from marketing. This is calculated. More than likely, this is Apple sacrificing a bit of Ive's time for a good cause, and a tax write-off.
Quite possibly, this is Apple and Ive saying to the rest of the industry, "well, we've lapped you enough times we're going to take a break now."
I don't know about you, but one major facet of my cognition until about, oh, say the age of 20, 25 maybe, was the conspiracy theory.
Unseen forces pooling their resources and efforts to thwart me were the most obvious, and therefore best explanation for so, so many frustrating parts of my youth.
The craziest thing, though, is when you find out your instincts were right.
As a young man of about 11, I vividly remember sussing out that the programmers who produced much of Nintendo's content intentionally made the games nearly impossible to finish—"conquer" in the parlance of the time—in order to make them last longer.
As an older, obviously wiser man of 20-ish, I decided that my logic had been flawed. Surely if you finished a game faster you would rush out to buy a new one sooner, and give Nintendo more money. My childish paranoia about mean game developers, then, was just that.
According to that article, I was right. The first time. Sure, you might not be as quick to go out and buy a new game if the current game you were playing took longer to complete. However, the more investment you made in that game before completing it, the greater your perceived success, the more dopamine discharged into your pre-pubescent nervous system, the more hooked you became, the more likely you were to go out and buy another game or two.
It was a bet on the long tail.
And those bastards knew what they were doing. Grab any male between 30 and 45 and say, "but our princess is in another castle" and look for eye-twitching and phantom thumb movements. Ask him to mime entering the Konami Code.
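For anyone who only half-remembers it: the Konami Code is the fixed button sequence Up, Up, Down, Down, Left, Right, Left, Right, B, A. A minimal sketch of how a game loop might detect it from a stream of button presses (the function names and string-based input representation here are my own illustration, not anything from an actual Nintendo game):

```python
# The famous sequence, as it would arrive from a controller, one press at a time.
KONAMI_CODE = ["up", "up", "down", "down",
               "left", "right", "left", "right", "b", "a"]

def make_konami_detector(code=None):
    """Return a function that consumes one button press at a time and
    returns True only on the press that completes the full code."""
    code = code or KONAMI_CODE
    progress = [0]  # how many consecutive presses of the code have matched

    def press(button):
        if button == code[progress[0]]:
            progress[0] += 1
        else:
            # Naive reset: the wrong button might itself start a new attempt.
            # (A fully correct matcher would use KMP-style failure links to
            # handle overlapping prefixes, e.g. an extra "up" mid-sequence.)
            progress[0] = 1 if button == code[0] else 0
        if progress[0] == len(code):
            progress[0] = 0  # reset so the code can be entered again
            return True
        return False

    return press

# Feed presses one at a time; only the final press returns True.
detect = make_konami_detector()
results = [detect(b) for b in KONAMI_CODE]
# results[-1] is True; every earlier entry is False
```

The closure-over-a-counter shape mirrors how a per-frame input handler would work: the game loop calls `press()` once per button event and unlocks the cheat when it returns True.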
UPDATE: It's in the zeitgeist today. Just got this amazing Pac-Man-as-Kafka-story SMBC comic link from a co-worker.
Previously, on Bash Modern Quantity…
Coming up on a year ago, I asked the Internet "What's missing from Apple's Org Chart?". My premise went…
- Apple's biggest advantage over its competitors is its superior user experience,
- this superior user experience is the result of having a strong UX team at Apple and that
- a key to maintaining or growing this team and its strength would be strong, empowered leadership.
After lots of digging I could only find evidence of a director-level position within the UX discipline at Apple (also here). No vice presidents. No senior vice presidents. Nobody with a C in their title. It seemed obvious enough that Steve Jobs would have seen himself as the C-level representation of UX concerns at Apple, but it seemed equally obvious—to me, at least—that Tim Cook is not similarly capable of wearing that hat. It seemed to me it was time to appoint a high-level head of user experience design at Apple.
This week, on Twitter…
The crux of the problem is that building great experiences is everyone’s responsibility and nobody’s job.
If anyone was to have a CXO, wouldn't it be Apple?
Well, I think they do have a CXO, of sorts, and I'll tell you who it is. Well, actually, I'll let Steve Jobs tell you what he told Fast Company:
Think of it this way. If you look at your own body, your cells are specialized, but every single one of them has the master plan for the whole body. We think our company will be the best possible company if every single person working here understands the whole master plan and can use that as a yardstick to make decisions against. We think a lot of little and medium and big decisions will be made better if all our people know that.
John Siracusa, if he's reading this, just thought the phrase, "hippie-dippy," and who can blame him? This sounds like idealist, weirdo, airy Steve Jobs rambling, doesn't it? But here's the science behind it.
James Allworth thinks "Steve Jobs Solved The Innovator's Dilemma." I think he's right. And I think this is a big part of how he did it.
In case you aren't familiar with The Innovator's Dilemma [yes, that's a dirty, dirty affiliate link], it was the 1997 Harvard Business School Publishing release by Clayton Christensen wherein he coined the term "disruptive innovation." Disruption theory is beyond [me and] the scope of this post, but it describes the vicious cycle in which what we would call a startup can become a big, slow-moving beast of a corporation, and can, therefore, stagnate, stop innovating, and fail to thrive while another startup comes along and steals its market. In short, it's not enough to come up with an incredible product. You have to keep coming up with incredible products, even if the new ones threaten sales of your old, even currently successful, products. It means taking risks, getting into markets where you have no proven ground, and not holding onto anything too tightly. It's being able to change what your company is and does when the market changes, or, preferably, before the market changes. Like turning "Apple Computer," manufacturer of Macintosh personal computers, into "Apple," the consumer electronics and media company.
I'll leave it to the Harvard guys'n'gals to go any further with that line of thought, but there's a nugget in there that's germane to our topic (no, I haven't forgotten what it was). How do you keep your finger so close to the pulse of the market that you know how and when to change what your company is and does? This is where the "User Experience Design" and "Business Model Innovation" circles of the Venn diagram overlap, and I'm not the only one who thinks so.
In "The hiring and firing of milkshakes and candy bars," episode 19 of Horace Dediu and Dan Benjamin's "The Critical Path," Dediu describes his own independent arrival at Christensen's theoretical solution to the innovator's dilemma, while observing user experience researchers at work:
"The idea is that rather than asking people what they want—showing them things and asking, 'What do you think of that?'—you would observe them using the product… It was very useful in identifying why people were clicking in the wrong places. This was a process of cleaning up the interface and finding out where people might be led astray. And I remember trying to actually suggest that method—and I was learning about this at a time before I knew job-to-be-done theory at all, I mean, it was actually before the second book was published, which I think is where it was introduced, in The Innovator's Solution [TQB: yes, another affiliate link]—and so it sort of clicked in my mind… that observation of actual behavior is more important than asking wishes, or asking of people what they want."
This is job-to-be-done theory: the idea that you can predict a market's behavior by looking at why your customer wants your product—what your customer hires your product to do—and optimizing your product to do that job well. If you're really good at this, you can figure out that customers are hiring unlikely products to do certain jobs because there are no better options, in which case you've just found an invisible untapped market. Or you might figure out that a sizable portion of the market is hiring a particular product because it's the best suited to do the job for which they've hired it, but that it's not really getting the job done. It's a "successful" product in terms of metrics such as sales or brand recognition, but customers may ultimately be very frustrated with it, even if they aren't aware of their frustration. This is how RIM's wildly "popular" BlackBerry, among several other products, could be toppled in such short order by such an inexperienced little company as Apple.
And how do you find out what your customer has hired your product to do? As Dediu said, you do user research, in the tradition of the user experience designer.
Obviously, then, I'm all the more justified in my cry for a C-level representative of the UX discipline at Apple, right?
I don't think so.
I think I was right when I said, "Steve Jobs was the de facto [head] of UX at Apple," but I think I was only half right. Where Steiger put it so poignantly, as quoted earlier in this article, "building great experiences is everyone’s responsibility and nobody’s job," I think at Apple building great experiences is everyone’s responsibility and everyone's job, especially if you have a C in your title. I think this is what Steve Jobs was talking about with his each-cell-knowing-the-master-plan analogy.
The executive leadership at Apple has been in charge of this for years. Think about keynote events. Who does the demos? Sure, while he was alive, Steve Jobs did the lion's share (yes, an intentional pun), but come on. Steve Jobs doesn't sit on the bench. More and more, though, even while he was still doing the majority of demos, executives from the top several levels demoed their own hardware and software. As far back as 2000 you'd see these guys in the promotional videos released alongside the G4 Cube or the first aluminum PowerBooks. Yes, I realize that even Microsoft executives demo their own software, but I challenge you to compare those demos favorably. On one side you'll get a lot of boilerplate, stiff, clearly-rehearsed deliveries of speeds and feeds. On the other you'll hear someone speak with obvious first-hand, deep knowledge of the practical benefits of what they're showing you—the improvements to the user experience.
Not enough to convince you that the executive leadership at Apple is the apparent co-CXO of the company? How about this one, quite possibly the most important UX design detail in the history of Apple, the feature that could be credited with bringing Apple back to life: the iPod's click wheel? It was invented by Senior VP of Worldwide Marketing Phil Schiller.
This is the body-and-cell analogy quoted above. I don't think Steve Jobs tried to hide his solution to the innovator's dilemma; I think he just phrased it in ways he knew his competitors would never even try to understand. Here he is spilling the beans in Steve Jobs, by Walter Isaacson:
My passion has been to build an enduring company where people were motivated to make great products. Everything else was secondary. Sure, it was great to make a profit, because that was what allowed you to make great products. But the products, not the profits, were the motivation. Sculley flipped these priorities to where the goal was to make money. It's a subtle difference, but it ends up meaning everything.
Sounds a lot like one of Steve Jobs's heroes, Walt Disney:
We don't make movies to make money. We make money to make more movies.
It also sounds a lot like something one of the other cells in the Apple body—Jon Ive—was quoted saying to Wired:
We are really pleased with our revenues, but our goal isn't to make money. It sounds a little flippant, but it's the truth. Our goal and what makes us excited is to make great products. If we are successful people will like them and if we are operationally competent, we will make money.
That's good user experience design summed up quite nicely by someone who neither came from a UX background nor occupies a UX role at Apple. People often credit Ive with all things design at Apple, but he and his team are industrial designers. To be sure, what he does is a major part of the experience of an Apple product, but he doesn't work alone, or even head the division. Ive likely doesn't call any shots when it comes to pixels.
At most places, a user experience designer, if that title even exists, works in the domain of pixels. If it's a really enlightened company, they might get to sit at the table when decisions about hardware or services are being made. At Apple, they don't stop at pixels, they don't stop at power buttons and they don't stop at unibody construction. They don't stop at the packaging, and they don't even stop at the store display. They keep going. It's why you can buy most items in an Apple store right from your phone, without having to stop and wait in a checkout line. It's why you can get first-class support in person at the Genius Bar. It's why I haven't had to call them more than once in a decade, and why I never heard hold music that one time I did.
It's way too late for that header, isn't it?
This seemingly fussy little organizational detail may hold half of the secrets to Apple's wild success. They don't have a CXO because they don't need one. They don't need one because they've infused their very business model with the concerns, the metrics and even the techniques of user experience design.
Horace Dediu, responding via Twitter:
@thomasqbrady That's right. The CXO's job description is a "value" or priority that should be embedded in every employee.
As much as Twitter's recent API "warnings" have made us want it to be so, App.net is not, according to this excellent essay by Orian Marx, "How App.net Can Change Everything", just another Twitter alternative.
It's a platform. It's very close to what I was just begging new Yahoo! CEO Marissa Mayer for, when I said,
Whereas a Twitter client is the current Hello World, give us a platform that makes a Twitter-like service the "MyFirstWebApp.html" experience. Do for web application development what Blogger and Wordpress did for content development.
Along those lines, Marx suggests we call the social application currently featured at alpha.app.net "Alpha," to distinguish it from the infrastructure, the platform, on which it was built, which is App.net. If this truly is the direction in which Dalton, et al. are headed, I think good things are in store.
Not many people are funny for 40 years. Even fewer people are successfully in charge of funny for 40 years. Saturday Night Live has been relevant and funny (most of the time, anyway) for over 40 years because of the brilliant oversight of Lorne Michaels.
That's why I'm saying, whatever you're doing right now, stop it. Go listen to this episode of Alec Baldwin's podcast, "Here's The Thing," or, if you prefer, read it.
This is probably the part that hit me hardest with its head-smacking insight:
No one believes that we do what we do here in six days ‘cause there’s not much an approval process.
Out of context that sentence structure is a little weird, so I'll rephrase. Saturday Night Live can do in a week what most production teams can't do in a month because there's not much of an approval process.
He goes on:
Exactly. With the movie business, because it’s way better run as is primetime television, every paragraph is scrutinized and reviewed and I say it every week, we don’t go on because we’re ready, we go on because it’s 11:30. It somehow focusses people and I trust that process.
And to sum it up with a gut-punch:
The pace of "SNL" was like think of it, do it, and then think of something else. And that puts the creative people in charge.
And there it is. Looking back at my career so far, this was the difference between the places where I saw creativity thrive and where I saw it writhing in agony. It's a big part of why I just can't stomach most companies with more than a hundred employees.
Ken Segall, in Insanely Simple (that's a dirty, dirty affiliate link), argues that this is one of the ways Steve Jobs kept Apple from acting like a "big company."
P.S. I can't not include this quote:
Producing, for me anyway, is like an invisible art. If you’re any good at it you leave no fingerprints.
So if you're any kind of Mac nerd you've by now seen numerous photos of early iPhone prototypes now made public domain by inclusion as evidence in the ongoing Apple v. Samsung case.
The Verge featured write-ups, with galleries, on the 26th and the 30th, and Network World today posted a deposition from Douglas Satzger, an industrial design lead who worked at Apple at the time the iPhone was being developed.
There are a few interesting things to me about all this. On the snarky end, the inexcusably poor coverage has been a bit of a surprise. The number of headlines, and even whole articles, accusing Apple of ripping off Sony's design, written by people who clearly read none of the words in the source materials beyond what appeared in the photos of the prototypes, is appalling. It's pretty clear, if you bother to read any of this, that a designer (or some designers) was (were) asked to design something in Sony's style.
The most striking thing of all, to me, is the design itself. This composition from The Verge tells the story best:
It's clear, looking at that 2005 design, that Apple envisioned the iPhone as we know it now—the iPhone 4 and 4s industrial design—before they designed the original iPhone and the 3G/3GS.
I'm very impressed by a company that can not only devise what is, in its estimation, the perfect design and eventually realize it in a shipping product, but can also ship iterative, real-world-constraints-compatible versions on the way there—iterative, real-world versions, by the way, that disrupt entire industries, several at a time. The iPhone was clearly two steps forward for Apple, despite the one-step-back design and capabilities of the first generation.
Apple had to make some compromises to get that design to market. They had to choose: do we make something that is as powerful as we want, but is maybe a tad way-too-gigantic, or do we sacrifice some power to get the right size? What can we ship now that will be a good jumping off point for the next version, which can be another step toward the product we dream of shipping?
Two of my favorite Steve Jobs quotes come to mind.
And, of course, "Real artists ship."
The latest episode of the Talk Show, “The Next Big Thing (feat. MG Siegler)” got me thinking about Yahoo!, and what Marissa Mayer could do to make it a great company again.
I’m sure Ms. Mayer is full of great ideas for what to do with Yahoo!. I’m sure, too, that she’s receiving far more suggestions than she requires. But I haven’t seen anyone suggesting this one.
In case you need/want it, here’s a TL;DR jump.
From my woefully under-informed position, it seems that Yahoo! is currently crammed into a corner while Google, Microsoft, Amazon and maybe even the likes of Apple and Thomson Reuters take up most of the dance floor. Trying to displace Google in the search market seems a tall order. Yahoo! is currently partnered with Microsoft for search capabilities, which makes competing with them on any of their other hobbies awkward. As John Gruber points out in this episode, Yahoo!'s content generation/syndication efforts seem to have given up ground to Google News and real players like Thomson Reuters. A mobile play—at the top of anyone's list these days—seems crazy, too, as that club has a line wrapping around the corner. Only one of the companies in that list above doesn't have a mobile platform/device.
What does Yahoo! do really well? What could they do even better? What could they do better than anybody else?
Before I let loose this cerebral flatulence, let me point out the obvious flaw in my idea: I don’t know how to monetize it. That’s your job. I’m just an idea guy. The only dollar signs in any of the textbooks involved in my higher education were either in variable names or word problems having to do with the cost of a ticket for a train leaving Chicago at 7:20 AM, whilst another train left Philadelphia at 7:35 AM.
John Gruber and MG Siegler set a goal for Yahoo! in that episode (paraphrased):
Get an app on the home screen of every mobile device.
I have no idea what app Yahoo! could create that would achieve that goal. I do have an idea, though, that could easily net them several apps on that home screen.
Yahoo! and The Web
For the past few years, if you were learning to write software for just about any platform, the second application you’d write, after a “Hello World,” has been a Twitter client. For many years before that, it was a Flickr client. Yahoo! has always been one of the companies that best understood how the Web really needed to work: that you didn’t really have a service until you had an API.
As far as I can tell, Yahoo! has a nigh-unchallenged stronghold on API-enabled online content platforms. If you’re Apple, apparently, and you want stock market data, or weather data, or sports scores and news, you turn to Yahoo!. That’s a ringing endorsement. What I definitely don’t understand is how this part of the business works. At Polycom I worked with a team of lawyers and outreach people to try to contact someone at Yahoo! to negotiate a similar contract, to embed Yahoo! data in Polycom products. I scoured the Yahoo! site for contact information; there’s a single, unpromising form. I found phone numbers on message boards and called them all. Our lawyers called people. One of our outreach people tried to call in a favor from a college friend. We never got a return call from Yahoo!. Apparently Yahoo! doesn’t make money from this, or they really like exclusivity.
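For developers who don't have Apple's leverage, some of that same data has long been reachable through Yahoo!'s public YQL (Yahoo Query Language) endpoint. A minimal sketch, assuming YQL's public endpoint and its `weather.forecast` table as Yahoo! has documented them; `buildYqlUrl` is a hypothetical helper, not a Yahoo! API:

```javascript
// Sketch: constructing a YQL query URL against Yahoo!'s public data
// tables. The endpoint and table name are assumptions drawn from
// Yahoo!'s YQL documentation; buildYqlUrl is a hypothetical helper.
function buildYqlUrl(query) {
  var endpoint = "https://query.yahooapis.com/v1/public/yql";
  return endpoint + "?q=" + encodeURIComponent(query) + "&format=json";
}

// Current conditions for Sunnyvale, CA (WOEID 2502265), ready to hand
// to an XMLHttpRequest or a JSONP script include:
var url = buildYqlUrl(
  "select item.condition from weather.forecast where woeid = 2502265"
);
```

The point isn't the helper; it's that the whole "negotiate a contract" dance above collapses into a single SQL-ish string when the data is behind a public, self-serve API.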
Yahoo! has an oddly quiet but impressive technology story. Someone lured Douglas Crockford there in 2005, though he left for PayPal this May. While there, Crockford worked on YUI, one of the web’s first big UI kits. These days you can choose from a few dozen: jQuery UI, Twitter Bootstrap, Zurb Foundation, a couple from Sencha, etc. Not only was YUI one of the first sets of interface elements, it was one of the very first libraries to ship highly interactive, animated, AJAX-powered components.
There are lots of labs-type projects at Yahoo! that seem to be waiting to be discovered. Yahoo! Pipes is something I can’t believe hasn’t taken off. This is another great example of Yahoo! understanding the true nature of the web: a network of semantically rich objects with APIs to connect them.
Peanut Butter, Meet Jelly
If you’re building a web app right now, one of the most difficult stages is the one in which you pick your technology stack. There’s always the big ugly framework shoot-out chart, wherein you narrow down your giant list of frameworks to the few that really have all the features you’re excited about, then to the two that actually support all your requirements (including accessibility, localizability, etc.).
Then you get to go figure out where you’re going to get your data. Licensing said data is often a headache.
Next you get to deal, constantly, with the “Shouldn’t we just make a native application?” question, which no one with a trustworthy opinion can yet answer in any final way. We are at a crossroads. This may lead you down the path of tools such as PhoneGap to achieve “native” installation as an app.
In far more cases than one would hope, this lands you with:
- A web application
- A framework like jQuery
- A framework like jQuery UI
- Possibly another framework for mobile, like jQuery Mobile
- A “native” app built with something like PhoneGap, which requires
  - A fork of your web application to make use of device features
While we’re on the topic, let’s get something out of the way. I don’t think anyone denies that an application built with native application frameworks (whether we’re talking about an application binary written for OS X, iOS, Android, Windows, or whatever you’re using) will outperform a web application running as though native, at least for now. The reality for many of us, though, is that supporting the collection of codebases necessary to produce native applications for two, three, four or eight native platforms is just not an option. Leveraging web technologies as cross-platform development tools might mean less-than-the-absolute-best performance and user experience, but that’s a trade-off many people are willing to make in order to reach 2x, 3x and larger audiences.
Here it is, my complimentary billion-dollar idea.
Become the platform for web application development.
A plan so simple as to sound ridiculous. If we were talking about just about any other company, it would be ridiculous. I think Yahoo!, though, is uniquely equipped to do this.
Give us a one-stop shop for web application development. You already have most of what we need. Make each of the tools best-of-breed, stack them up, and get a PhoneGap-like tool online. Keep offering YUI, Mojito, Manhattan and the like, but also market a single, all-encompassing toolset that includes a license to integrate Yahoo! data services like Yahoo! Finance, Yahoo! Weather, Fantasy Sports and the rest. Give us the promise of tools like Sencha Touch and PhoneGap, but deliver what they haven't so far: a dependable release schedule in lock-step with the platforms we're targeting, giving us new features as they're available, not tens of months later. Give us a development platform in the sky (perhaps another Cocktail) that makes it feasible to share a core library of web application logic across instances tailored for use as a web app, as a native app, and as a service for someone else to integrate with Pipes. A one-click build server that spits out web apps, native binaries and SaaS servers. Whereas a Twitter client is the current "Hello World," give us a platform that makes a Twitter-like service the "MyFirstWebApp.html" experience. Do for web application development what Blogger and WordPress did for content development.
I know I can’t be the first person to come up with that idea. And, Yahoo!, while you may not be the first company to attempt to do such a thing, you’re uniquely positioned to do it.