Results tagged “technology”

Talking to Steve Case

February 25, 2014

I had the chance to interview Steve Case for Social Media Week the other day, and though it was a brief conversation, I was really pleased with how it went. Steve's earliest work on Quantum Link, a predecessor to what would someday become AOL, was formative in my understanding of what computers could be used for, and it was great to get to talk to him in some depth about that.

At the other end of the spectrum, it was also truly refreshing to talk to a tech billionaire who recognizes the social obligations that the tech industry and its leaders have to their communities. Whether it was talking about how to truly address the high unemployment rate for which the tech industry bears some responsibility, or discussing immigration in a broader context than simply importing more programmers, or more fundamental issues of inclusion and opportunity, Steve didn't shy away from any of it, and I think it makes the conversation well worth watching.

There's a peculiar and unsettling feeling that arises when looking up background information for a piece and finding a blog post I wrote more than fourteen years ago as one of the top results.

One of the dead links from that post led to the text of the message Steve sent to Quantum Link subscribers just before the service shut down in November 1994:

Dear Members,

As you know, QLink was originally launched in November, 1985. In the years that followed you, as our loyal members, have helped us build a unique online community for Commodore computer users. I want to thank each of you for your contribution, your support and your feedback over the years.

The computing industry has changed dramatically since those first days of online communications. Commodore ceased to produce Commodore brand computers in 1993. Sadly, the company has recently closed its doors entirely. The Commodore computer, once a leader in the industry, has been replaced by faster, more powerful systems. Many software vendors no longer support the Commodore operating system.

Now we find, with great regret, that we simply can no longer support the QLink service. It has become impossible for us to maintain the product up to a standard of quality that we can be proud of. Many of you I'm sure have noticed a diminished level of product quality in the last few months due to these technical limitations. Without technical support from the industry, we are not able to add new services, fix existing problems, or prevent new ones. Therefore we have made the sad decision to discontinue QLink as of November 1, 1994.

We would like to thank each of you for your long and continued support and, if at all possible, keep you as part of our online community.

If you now have the ability to use America Online (PC-DOS, Windows or Macintosh), we invite you to convert your membership to one of these other systems. For details on what these versions have to offer and the system requirements needed to run them, see the document in this area entitled "Converting to America Online."

For details on the last month of service for QLink, important dates and billing information, see the document in this area entitled "Your Final Bill."

We have enjoyed serving you. Thanks again.

Sincerely,
Steve Case

<PRESS F5 FOR MENU>


Also courtesy of the Web Archive is this old page that captured many details of the Quantum Link experience.

Astoundingly, the full-screen loading images we watched while waiting for Quantum Link content to download at 300 baud were only 368×240 in resolution. A few of the images I remember especially well include the People Connection (chat) and Music screens.

And of course, the one image I saw most often was the main menu, which is both completely analogous to, and completely different from, the home screen on my phone that I use every day.

[Image: the Quantum Link main menu]

Respecting Cinema in the Digital Age

August 8, 2013

I'm definitely one of those guys you'd think of as a tech nerd. I spend all day jumping on and off different social networks, I've got tons of followers on Twitter, and I've been blogging here for almost a decade and a half.

But when it comes to film, despite all my gadget-wielding bona fides, I've been something of a purist. I've never had my phone on during a movie, let alone texted or talked. I've never even tried to watch a movie on my phone, and have barely done so on an iPad when on a plane. My Kickstarter history betrays a predilection for backing independent works that tend to be about artists or marginalized folks, like dream hampton's recent TransParent. So I'm okay with technology being used to engage with film, but I've never personally been interested in mediating film through technology.

Recent days have brought a debate that's forced us all to reckon with the fact that lots of people bring phones into public theaters, with the attendant light and noise, and the potential distraction of texting. As someone who's never done it, I find the habit a bit inexplicable, but the reality is that millions of Americans do it every single day.

Amazingly, this behavior goes on despite the stern warnings before films, despite the rise of independent theaters like Alamo Drafthouse (which we should praise even though they try to distract movie patrons with food and alcohol), and despite the increasing number of other options for watching films at home.

In short, what we've done to encourage reverent, single-focus movie watching hasn't worked.

So we should have the courage of our convictions. If we truly believe that a viewing experience without second screens or distracting sounds or lights is vastly superior to any other way of experiencing this art form, then let's bet on it. Let's let people choose, and offer up screenings where people are allowed any manner of digital diversions during the show.

Unless we're egregiously wrong, we'd only have to offer these alternate experiences for a few months, as people would come slinking back to the superior screenings almost immediately. A medium that's weathered the arrival of Smell-O-Vision and digital 3D is certainly robust and resilient enough to withstand a test of whether people want to give it their undivided attention.

Light and Sound

Cinema has never resisted technological innovation; It's where people first discovered moving pictures and color images and stereo sound. And the stories we discover there, shared with a crowd of strangers all moved by the same dreamlike images, have withstood time and crossed cultures to knit together people all over the world, even with their different cultural standards and social norms.

The most important thing we can do for film, and for the film industry, is to make sure we're accommodating new viewers, bringing them into the magic of the movies, and making them fans for life. It might take them a little while to understand why we care so much about the experience, but surely if we can meet them halfway, they'll eagerly make the rest of the leap by themselves.

Ten Tips Guaranteed to Improve Your Startup Success

March 28, 2013

Having had the good fortune to work with a broad range of entrepreneurs and get a front-row seat to the foundations of their success, I thought it'd be good to share 10 key tips that I've found work 100% of the time to increase your odds of startup success. Try to execute on as many of these as you can!

  1. Be raised with access to clean drinking water and sanitation. (Every tech billionaire I've ever spoken to has a toilet!)
  2. Try to be born in a region that is politically and militarily stable.
  3. Grow up with a family that is as steady and secure as possible.
  4. Have access to at least a basic free education in core subjects.
  5. Avoid being abused by family members, loved ones, friends or acquaintances during the formative years of your life.
  6. Be fluent in English, or have time to dedicate to continuously improving your language skills.
  7. Make sure there's enough disposable income available to support your learning technology at a younger age.
  8. If you must be a member of an underrepresented community or a woman, get comfortable with suppressing your identity. If not, follow a numbingly conventional definition of dominant masculinity.
  9. Be within a narrow range of physical norms for appearance and ability, as defined by the comfort level of strangers.
  10. Practice articulating your cultural, technological or social aspirations exclusively in economic terms.

By following these ten simple tips, you'll massively increase the odds of success of your startup! I guarantee it, or your money back.

The Ascendance of Tech Execs

January 25, 2013

One of the weirdest things about the tech industry is that, despite its reverence for the Cult of the Coder, pretty much the only way a programmer or engineer gets to be senior management or in charge of a company as its CEO is by founding it. The classic pattern is that a techie founds or cofounds a technology startup, and then it either doesn't succeed and the VCs and board push them out, or it does succeed and once the founder's gotten sufficiently rich, they're replaced by a business person who's usually got a management or finance background.

But some recent counter-examples have given me a bit of hope. At Etsy, Chad Dickerson was named CEO after coming up the ranks as CTO, despite the fact that he didn't start the company. Even more notably, Marissa Mayer was named CEO of Yahoo, which is remarkable since she's not only a product person but someone with serious engineering chops. (Full disclosure: my Activate co-founder Michael Wolf is on Yahoo's board, and our company is thus involved with Yahoo.)

Oddly, though, there are very, very few examples of tech companies where a founder leaves, or a mature company seeks new leadership, and the successor named is someone who earned the opportunity through technical work or development. Being a coder or engineer, or having a technical background, shouldn't put a ceiling on how one can advance through an organization, and I'm hoping there are lots more examples like these that I've somehow missed. Where are the CTOs who've become CEOs at major tech companies?

Rebuilding the Web We Lost

December 18, 2012

We have the obligation to never speak of our concerns without suggesting our solutions. I've been truly gratified to watch the response to The Web We Lost over the last few days; It's become one of the most popular things I've ever written, and it has inspired some great replies.

But the most important question we can ask is: How do we rebuild the positive aspects of the web we lost? There are a few starting points, building on conversations we've been having for years. Let's look at the responsibilities we must accept if we're going to return the web to the values that a generation of creators cared about.

  • Take responsibility and accept blame. The biggest reason the social web drifted from many of the core values of that early era was the insularity and arrogance of many of us who created the tools of the time. I was certainly guilty of this, and many of my peers were as well. We took it as a self-evident and obvious goal that people would even want to participate in this medium, instead of doing the hard work necessary to make it a welcoming and rewarding place for the rest of the world. We favored obscure internecine battles about technical minutiae over the hard, humbling work of engaging a billion people in connecting online, and setting the stage for the billions to come. To surpass the current generation of dominant social networks and apps, which have unsurprisingly become arrogant and inflexible during their own era of success, we'll have to return to being as hungry and as humble as we were when the web was young. Because last time, we were both naive and self-absorbed enough that we deserved to fail.
  • Don't just meet the UX standards, raise the bar. Obviously, the single biggest reason the new era of social apps and sites has succeeded where the early efforts did not is their massively superior user experience, from the front-end user interfaces to the back-end performance. The expected thing to do would be to hope that a new generation of user-respecting apps comes along and matches the best that Facebook and Twitter and Pinterest have to offer. But actually, given the profound entrenchment these platforms already have across culture, the new apps will have to be an order of magnitude better in user experience. The good news is that as the rest of the web transitions from making pages to making streams, everyone will be revisiting the tools and technologies they use to connect, and that will be a big opportunity for new players to participate.
  • Rethink funding fundamentals. As we've seen over and over, the giant social networks seem to inevitably piss off their user bases by changing product features and terms of service in ways that catalyze huge waves of user-generated discontent. But the fundamental reason these sites refuse to accommodate so many user demands is economics: those sites make their revenues on models dictated by the terms of funding from the firms that backed them. But as we've discussed before, it's possible to fund contemporary startups either without venture capital, or with a level of efficiency that allows mom-and-pop startups to reach web scale. To be clear, venture funding powered much of the first wave of social startups and was a big reason they were able to achieve many of their successes, so VC will be part of the ecosystem in the next wave as well. But the terms and dynamics can be profoundly different, supporting startups that are intentionally less efficient, perhaps even making use of the skills of blue collar coders to provide a lot of people with good, solid middle-class jobs instead of optimizing, as current companies do, for making a small number of people enormously wealthy.
  • Explore architectural changes. One of the fundamental reasons the economics of doing a startup at web scale are different is the proliferation of cloud computing and of very high-performance, reliable open-source components that provide advanced functionality which was prohibitively expensive a decade ago. Instead of backing a truckload of Dell servers up to a data center and then installing a few hundred thousand dollars' worth of Oracle software, we can pick and choose a few components off the shelf to get started. More importantly, consumers will start to be able to use the cloud themselves, which removes the current constraint of having to build single, centralized services to provide a great consumer experience. Today, big social apps have to spend millions of dollars handling DMCA takedown requests and FBI investigations into illegal content, and in general fighting the web's fundamental desire to be decentralized. New apps don't need to obey those constraints.
  • Outflank by pursuing talent outside the obvious. The current wave of the social web doesn't just demonstrate its arrogance through its product decisions. The people who create these platforms are hired from a narrow band of privileged graduates of a small number of top-tier schools, overwhelmingly male and focused narrowly on the traditional Silicon Valley geography. By contrast, the next wave of apps can hark back to many of the best of the early social startups, which often featured mixed-gender founding teams, attracted talent from geographically diverse regions (Flickr was born in Canada!), and were often created by people with liberal arts degrees or no degree at all. Aside from being the responsible thing to do, having a diverse team generates a variety of unexpected product features and innovations that don't come from the groupthink of homogeneous cultures.
  • Exploit their weakness: Insularity. Another way of looking at the exclusionary tendencies of typical Silicon Valley startups is by considering the extraordinary privilege of most tech tycoons as a weakness to be exploited. Whether it's Mark Zuckerberg's unique level of privilege limiting his ability to understand why a single, universal public identity might ruin people's lives, or the tendency to launch apps first to a small, clubby circle of insiders, new startups don't have to repeat these mistakes. And by broadening their appeal from the start, new apps and networks can outflank the big players, paying attention to audiences that hadn't been properly respected last time. That insularity even extends to the tech industry typically ignoring the world of policy and regulations and government until it's too late. While the big tech players have formed their own RIAA, the best case is that they'll focus on overall issues like spectrum policy and net neutrality, ignoring the coming reality of policy changes that will try to protect regular users.
  • Don't trust the trade press. Another essential step for breaking out of the current tech industry's predictable patterns will be for entrepreneurs and creators to educate themselves about the true history of the tech industry and its products. Our business tends to follow a few simple, repeating cycles, like moving from centralization to decentralization and back, or from interoperable communications to silos and back. But as we've discussed, you can't trust the tech press to teach you about the tech industry, so you'll have to know your shit. Fortunately, a lot of us old-timers are still around, and still answer our emails sometimes, so it's possible to just ask. Imagine if Instagram had simply asked the folks who used to work at Flickr, "Did you ever change your terms of service? What freaked people out?" And even better, we can blog our own progress, because if you didn't blog it, it didn't happen. In that way, we form our own community of practice, our own new peer review process for what we learn about making the web work the right way.
  • Create public spaces. Right now, all of the places we can assemble on the web in any kind of numbers are privately owned. And privately-owned public spaces aren't real public spaces. They don't allow for the play and the chaos and the creativity and brilliance that only arise in spaces that don't exist purely to generate profit. And they're susceptible to being gradually gaslighted by the companies that own them.

Overall, there are lots of ways that the current generation of social sites are vulnerable. There are users that the current tech industry considers undesirable, and technology choices that are considered taboo, and traditions around hiring and product strategy that force them to concede huge opportunities right out of the gate.

As is obvious from the responses I've gotten, many, many people care about a social web that honors certain human and creative values. As I've spent years thinking about the right way to write for this blog, and to build ThinkUp, and to sit on the board at Stack Exchange, and to advise clients at Activate, and to work on all the other stuff I do, I just keep running into the fact that there's a huge opportunity to make a great new generation of human-friendly apps with positive social values.

These new companies will be recognizable in that they'll impact culture and media and government and society, and they'll invent great new technologies. They'll still make a bunch of money for the people who found them. But they'll look different, both in terms of the people who make them and the people they serve. And they'll be more durable: not optimized for current fashions in financing, but built on the accurate belief that there are people who care deeply about the web they use, the works they create, the connections they make, and the humans on the other side of those connections.

The Blue Collar Coder

October 5, 2012

Much of the conversation about the shortage of technology talent in the United States focuses on how we can encourage more young people to go to college to become Computer Science graduates. Those programs are admirable and should be supported, but I suggest we need to focus on some other key areas as well to sustain our tech industry:

  • Education which teaches mid-level programming as a skilled trade, suitable for apprenticeship and advancement in a way that parallels traditional trade skills like HVAC or welding
  • Less of a focus on "the next Zuckerberg", in favor of encouraging solid middle-class tech jobs that may be entrepreneurial, but are primarily focused on creating and maintaining technology infrastructure in non-tech companies
  • Changing the conversation about recruiting technologists from the existing narrow priesthood of highly-skilled experts constantly chasing new technologies to productive workers getting the most out of widely-deployed platforms and frameworks

Put another way, our industry can grow in a very meaningful way by giving lots of young people the knowledge they need to do jQuery work straight out of high school, or by teaching MySQL database maintenance at a trade school, without requiring a graduate degree in computer science. That's not to say that CS students aren't also important — we'll need the breakthroughs and innovations they discover. But someone has to run that intranet app at an insurance company, and somebody has to maintain the internal iOS app at a law firm, and those are solid, respectable jobs that are as key to our economy as a 22-year-old trying to pivot and iterate their way into an acqui-hire.

High Tech Vo Tech

High schools have long offered vocational education, preparing graduates for practical careers by making them proficient in valuable technical skill sets which they can put to use directly in the job market right after graduation. Vocational-technical schools (vo-tech) provide trained workers in important fields such as healthcare, construction trades, and core business functions like accounting. For a significant number of my high school peers, vo-tech was the best path to a professional job that would pay well over the duration of an entire career.

Now it's time that vo-tech programs broadly add internet and web technologies to the mix. We need web dev vo-tech.

I'm happy about other efforts being made to teach kids to become tech entrepreneurs; As I write this I'm a few blocks from the Academy for Software Engineering. And it's enormously valuable to teach that school's students about coding and building companies.

But in other schools in America, and outside of big cities like my own, and for kids who aren't going to go all-in by attending a tech-focused high school, we need better options. There are many small-town jobs to be built around hands-on technology implementations.

Part of our challenge is that the tech sector has to acknowledge and accept that a broad swath of jobs in the middle of our industry require skills but need not be predicated on a full liberal arts education at a high-end university. The Stanford CS grads are always going to be fine; It's the people who can't go into the same trade as their dad, or who are smart but not interested in the eating-ramen-and-working-100-hours-a-week startup orthodoxy, who we need to bring along with us into tech.

Middle Class Jobs

Though I know there are many more implications to choosing the phrase "blue collar" to describe these jobs, it's a deliberate choice. First, there's a broad and noble history of blue collar workers organizing to strengthen workers' rights and improve working conditions for their peers; It's a tradition we'd do well to maintain in the tech world.

More importantly, though, we must confront the fact that our current investment infrastructure for tech companies optimizes for a distribution of opportunity and wealth that looks almost feudal. As I outlined in To Less Efficient Startups, venture capital today generally strives to make a handful of early founders and employees of a company enormously wealthy (alongside the investors, of course), and then to have a subset of employees profit when there's a liquidity event.

But that's a recipe for continued income inequality. I am proud of, and impressed by, Craigslist's ability to serve hundreds of millions of users with a few dozen employees. But I want the next Craigslist to optimize for providing dozens of jobs in each of the towns it serves, and I want educators in those cities to prepare young people to step into those jobs.

Public education serves many roles in society, from the intrinsic social value of an educated populace making decisions at the ballot box, to the indispensable role schools play in introducing many kids to the arts, music, science, and other fundamental aspects of culture.

Today, most Americans also rely on our public schools to prepare their children for their careers. And if we in the tech industry want to keep claiming that we'll continue to be the biggest driver of those new jobs, then we have to engage in a serious conversation about how the public high schools of our country can help prepare just as many future employees of our companies as the handful of highly regarded computer science programs in the country do today.

Why you can't trust tech press to teach you about the tech industry

April 30, 2012

If there were one lesson I'd want to impress upon people who are interested in succeeding in the technology industry, it would be, as I've said before, know your shit. Know the discipline you're in, know the history of those who've done your kind of work before, understand the lessons of their efforts, and in general look beyond the things that are making noise right now in order to understand bigger patterns of how technology works, both literally and socially.

This is a difficult challenge, because today's media about the technology industry will not teach entrepreneurs and creators what they need to know about the history of the technology industry.

I don't just mean this in the obvious way — nobody thinks you can earn a PhD in computer science by reading a tech blog. I mean that the broader landscape of sites that attract attention from technology developers and startup aficionados is woefully myopic in its understanding and perspective of the disciplines it covers. [Disclaimer: This post mentions lots of sites that write about tech; I write for Wired (ostensibly a competitor) and advise Vox Media (parent of The Verge, mentioned below), as explained on my about page.]

Open For Comment

Let's take one example from a month ago. A blogger named Saud Alhawawi reported (judging by Google's translation) that Google is going to introduce a blog commenting system powered by their Google+ platform. If you work at a company which makes tools for feedback on sites, or if you care about the quality of comments on the web, this would be important news, so it's a great thing that it got picked up by WebProNews and TheNextWeb.

Given that Google generally refuses to comment on such pronouncements, and would be unlikely to confirm or deny Alhawawi's blog post, the burden is on the rest of the tech blogosphere to explain to their readers the implications and importance that such a product would have, if Google were to launch it.

Fortunately, we have a very good record of how the major tech blogs covered this story, if they did. Techmeme has admirably preserved links to the many pieces written a month ago about this story. As you might expect, most were regurgitating the original stories, with a few mentioning Alhawawi's source post. These reposts showed up all over the place: 9to5 Google, BetaBeat, Business Insider, CNET (which oddly credits ReadWriteWeb but links to TNW), DailyTech, MarketingLand, Marketing Pilgrim, MarketingVox, MemeBurn, SlashGear, The Verge and VentureBeat.

Lots of linking with just the barest amount of original reporting, which is actually a fairly efficient way of getting a story out. But while I admire many of the smart people who work at these outlets, apparently no one who linked to this story had more than the slightest bit of knowledge about the discipline they were covering.

What's Missing?

As you might expect, nearly every story mentioned that Facebook has a commenting widget similar to what Google is presumably creating. Google and Facebook are competitors, so that's a wise inclusion. Most also mentioned Disqus, and sure, that's relevant since they're a big independent player. I don't expect these stories to be comprehensive overviews of the commenting space, so it's fine that other minor players might get overlooked.

What is ridiculous, and absurd, is that not a single one of these outlets mentioned that Google itself had provided this exact type of commenting functionality and then shut it down. Google provided this service for years. And that last Google commenting service, called Friend Connect, was shut down just three weeks prior to this news about a new commenting service being launched.

That's insane. Whether you're a user trying to understand if it's worth trusting a commenting service, a developer judging whether to build on its API, an entrepreneur deciding if you should incorporate the service or worry about competing with it, or an investor who wanted to evaluate Google's seriousness about the space, the single most salient fact about Google's attempt to create this new product was omitted from every single story that covered it.

Worse, the sites themselves suffered for this omission — when everyone is covering the exact same story, if one site had gone with a headline that said "Google's New Commenting Service: The Secret History of How They've Failed Before!" they could have actually gotten more page views and distinguished themselves from the endless TheNextWeb regurgitation.

This isn't a case where a few lesser outlets omitted a minor point from a headline. It's a case where a story interesting enough to earn a full Techmeme pile-on lacked the context necessary to understand it at even the most superficial level. As you might expect, a few of the larger outlets have big enough audiences that their commenter communities were able to add the missing salient facts to the story, but on both The Verge and Business Insider, the comments that mentioned Friend Connect were buried deep in their respective threads and, as of a month later, not highlighted in the original posts.

Do Your Homework

Fortunately, whether or not Google makes a commenting widget isn't that big a deal on its own. Maybe they will or maybe they won't, and maybe it'll fail again or maybe it won't. But the key lesson to take away here is that we know a few things are wrong with the trade press in the technology world:

  • In tech financial coverage, there is a focus on valuation, deals and funding instead of markets, costs, profits, losses, revenues and sustainability.
  • In tech executive coverage, there is a focus on personalities and drama instead of capabilities and execution.
  • In tech product coverage, there is a focus on features and announcements instead of evaluating whether a product is meaningful and worthwhile.
  • Technology trade press doesn't treat our industry as a business so much as a "scene"; If our industry had magazines, we'd have a lot of People but no Variety, a Rolling Stone but no Billboard.

There are many more examples of the flaws, but these are obvious ones. What we may not know, though, is that there's another flaw:

  • For all but the biggest tech stories, any individual article likely lacks enough information to make a decision about the topic of that article.

Imagine if Apple launched a new version of the iPad and a story did not mention that any prior versions of the iPad existed. This is the level of analysis we frequently get from second-tier tech stories in our industry. And that's true despite the fact that technology trade press is actually getting better.

We need a tech industry that values history, perspective, and a long-term view. Today, we don't have that. But I'm optimistic, because I see that people who do value those things have a decided advantage over the course of their careers. One place to start is by filling in the blanks on the stories we read ourselves, perhaps by making use of a comment form?

There's No Such Thing as "Cyberbullying"

October 1, 2010

For more than a decade, an intellectually bankrupt habit of maligning new media has reared its head in traditional media outlets, perpetuating a false impression that technology is bad for society. Worse, this tendency masks the actual social ills behind awful actions like the ones described below, creating the facade that technology is to blame when the fault more likely lies with racism, homophobia, classism, or intolerance.

Some recent examples:

  • The Associated Press wrote about the suicide of Tyler Clementi after a dorm room hookup of his was broadcast by some of his acquaintances. Geoff Mulvihill and Samantha Henry wrote:
    The Associated Press found at least 12 cases in the U.S. since 2003 in which children and young adults between 11 and 18 killed themselves after falling victim to some form of "cyberbullying" — teasing, harassing or intimidating with pictures or words distributed online or via text message.
  • The New York Times extends the demonization even further, with a six-person debate on cyberbullying that never once questions the rhetorical premise of the word "cyberbullying" itself. Searching the New York Times archive generates no results for "bibliobullying" or even "telebullying", despite their own definition of "cyberbullying" including text messages sent from phones.

This isn't new territory; danah boyd covered the dishonesty of this term thoroughly on her own blog years ago. But the persistence of this descriptor demonstrates a consistent agenda of blaming these horrible displays of intolerance and inhuman unkindness on technology.

When my nose was broken by a bully who assaulted me in the seventh grade, it took me some time to figure out the source of his enmity, since the attacker was a guy I barely knew. As it turned out, he had misheard a phone conversation that several kids had conference-called into. I've either forgotten or never knew most of the details of what the conversation was about, but at no time did the school administrators refer to the incident as telebullying, or blame the phone for causing it. They also didn't blame the locker that my nose was smashed into, presumably because school lockers are a technology of sufficient vintage as to be immune from idiotic epithets.

Why They Made Up This Word

It's important to note that blaming technology for horrendous, violent displays of homophobia or racism or simple meanness lets adults like parents and teachers absolve themselves of the responsibility to raise kids free from these evils. By creating language like "cyberbullying", they abdicate their own role in the hateful actions, and blame the (presumably mysterious and unknowable) new technologies that their kids use for these awful situations. Somehow, when I was frequently cross-dressing or wearing makeup or identifying as queer as a high schooler, I was still able to be threatened with violence, even though my tormentors had no mobile phones or laptop computers. (I will point out, for nerd cred, that I was the first person in my school to bring a mobile phone or laptop to class.)

I was thinking of this obliquely when Jose Antonio Vargas asked me a bit about my perspective on Hollywood's take on social media as exemplified by the new Facebook film. Despite my own misgivings about many of Facebook's social impacts, I still think old media as exemplified by the Associated Press and the film industry has a concerted agenda to demonize new media and social media, and Facebook and its creators bear the brunt of that in The Social Network. There's also the ugly reality that coining bullshit words like "cyberbullying" will sell papers or page views. I put it more broadly in the Huffington Post piece:

The movie is written in the abstract, based on what they feel Facebook, and the social Web, represent. It's exoticism. It's the 1940s, when you had a white actor in yellow-face play a Chinese character, you know? Those foreigners talk like this, and it's why they're inscrutable and evil.

The truth of it is, calling the cruelty that kids show to one another, based on race or gender identity or class or any other imaginary difference, by a name like "cyberbullying" is a cop-out. It's a group of parents, school administrators, and lazy reporters working together to shirk their own responsibility for the mean-spirited, hateful, incomprehensible things their own kids do.

And it's a myth. There's no such thing as cyberbullying. There's only the cruelty in all of us, and the cowardice of making words to hide from it.

Our Biggest Challenge Yet

April 12, 2010

The White House tweeted that they want feedback on the Grand Challenges in science and technology that face our country. That's not so new. But today, if you reply to the White House's tweet to share your ideas, the White House will actually see your response.

Wait, what?

These days, I often sound like a skeptic or a curmudgeon when it comes to the technology industry. But ultimately, I'm profoundly optimistic about what the Internet can be, and today is one of those days where I hope we can demonstrate exactly why so many of us love the web.

For the past several months, I've been leading an effort at Expert Labs to help policy makers use social networks to collect feedback on policy. Today marks our first experiment. To participate, all we have to do is suggest ideas as ambitious as the moon landing or the sequencing of the human genome, or prizes like the X Prize or the Netflix Prize — ideas so inspiring that they prompt a ton of new innovations.

So do it. Just reply to the White House on Twitter or Facebook, and they'll hear your suggestions; if you've got a good idea, they'll use the feedback to help shape policy. The President has eight items on his list of Grand Challenges, but there's no reason your idea couldn't be number nine.

This is just a first step, but it's a pretty good one.

How'd We Get Here? Where Next?

It's been a long, interesting road to get to this first tentative experiment in broad-scale policy feedback on social networks. Fundamentally, one of the biggest opportunities has been that the current administration has embraced the President's Open Government Directive, encouraging public feedback using every avenue possible, with a special focus on new technologies.

But if you dive into the specifics of some of the plans, it's even more remarkable what's going to be possible in the future. For example, the White House's Office of Science & Technology Policy posted its own open government plan, which includes a specific nod towards Expert Labs, acknowledging that we can be a small part of their overall effort to allow for public feedback.

And we've been working like crazy to step up to the challenge. Gina has been leading an amazing community that's built one hell of a little app called ThinkTank. It aggregates all those tweets and Facebook replies and will collect them for sharing back with the White House and with the public. It's even matured quickly enough that we're a Google Summer of Code project, with some fantastic proposals coming in from students who want to make ThinkTank even smarter. Gina describes the potential brilliantly in her post on Smarterware, too.
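To make the aggregation idea concrete, here's a minimal sketch of the collection step, written in Python rather than the PHP that ThinkTank is actually built in. It's illustrative only: the hashtag is the real #whgc, but the unauthenticated Twitter search endpoint shown here is the 2010-era API, which has long since been retired, and the real app does far more than a single fetch.

    # Hypothetical, minimal version of the collection step an aggregator like
    # ThinkTank performs: fetch public tweets tagged #whgc via the 2010-era
    # unauthenticated Twitter search API (since retired) and list who said what.
    import json
    import urllib.request

    SEARCH_URL = "http://search.twitter.com/search.json?q=%23whgc"

    def fetch_suggestions(url=SEARCH_URL):
        with urllib.request.urlopen(url) as response:
            data = json.load(response)
        # Each result in that era's payload carried the author's screen name
        # and the tweet text.
        return [(item["from_user"], item["text"]) for item in data.get("results", [])]

    if __name__ == "__main__":
        for author, text in fetch_suggestions():
            print("@%s: %s" % (author, text))

The fetch is the easy part, of course; ThinkTank's real job is storing all those replies and preparing them for sharing back with the White House and the public.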

How You Can Help

Here's the thing: I need your help. This is a complicated, unfamiliar new idea. So I need help telling people a few things:

  1. The White House wants to hear policy feedback through channels like Twitter and Facebook.
  2. Expert Labs has built tools that will let them do this.
  3. The success of this first question about the Grand Challenges in science and technology will do a lot to demonstrate how every part of government could use these tools.
  4. This is just the start; We're going to be doing this in bigger and better ways in the future.

So, if you've got a blog, or a Twitter account (and if you don't, what the hell are you doing here?!), please spread the word to your readers. Reply to the White House's tweet using the hashtag #whgc, and then stay tuned as we start to share our findings with the world.

In Defense of the Punditocracy

August 31, 2009

Michael Arrington. Dave Winer. Tim O'Reilly. Jason Calacanis. Add a few names of your own.

Within the navel-gazing little corner of the tech world that I inhabit, the mere mention of these names is among the most evocative things you can say. As much as any of the companies or tech executives they write about, the pundits who opine each day on the profound and mundane developments in the world of gadgets and the web are a surprisingly polarizing bunch. But it's hard to figure out exactly why that's the case.

Opinions are like...

Interestingly, the consensus on lots of these people (at least when they're not in the room) is pretty negative. For almost all of them, I've had someone say to me flat out "That guy's an asshole." Hearing it for years myself (especially when I didn't really know any of them except by reputation), I was inclined to agree. "Who does that guy think he is? What a hack." They seemed prone to bluster, at times self-important, reflecting our entire industry's frequent lack of real-world perspective, so I figured the conventional wisdom about these guys was actually correct. Even if I share all of those traits myself.

Recently, I took stock of my personal experiences with most of these men, and with the few other high-profile tech pundits with whom I have at least a casual acquaintance. In nearly every case, those experiences had been pretty much positive. Sure, I've cringed when the work I've done (either personally or as part of Six Apart) has been criticized or, worse, ignored. But it's hard to find a time when a response to something I did was wildly unfair, or when any factual errors weren't quickly corrected. More importantly, they've consistently been generous and welcoming in encouraging me to speak up, not just about the opinions I have about technology or tech companies, but about the way our industry as a whole needs to evolve.

I've had a bit of time to reflect on this because, obviously, I've been engaging in a bit of armchair punditry myself lately. Hopefully I'm not quite so hyperbolic as the worst excesses of contemporary tech punditry, but I've unabashedly been trying to be provocative and ambitious in what I'm writing. And I realize the key difference between me and the harshest critics of the current reigning powers in tech punditry is that the critics have often put the pundits on a pedestal, and then attacked them for being in a position of power, not for any particularly egregious problems with the content of what they're saying. I've said it before: We hate most in others that which we fail to see in ourselves.

Call it arrogance on my part, or naivete, but I have never seen any tech pundit on the web as more qualified to opine than I am, and I've never ascribed more power to any blogger just because they have a bigger audience than my site, or because they happen to run a conference that people pay to attend. As a result, their shortcomings don't bother me, and that perspective helped me get over the feeling that I should have strong feelings (positive or negative) about a bunch of guys I barely know. When they're doing well, the tech pundits are just another bunch of good bloggers that I read, and when they're screwing up, that just means more room for me to do what I do.

A Little Perspective

Perhaps the biggest lesson has come from my conversations with those outside of the tech industry. I always ask who they get their tech news from, and what their opinion is of those pundits. Nearly every outsider has said they're very pleased with how the prominent tech pundits represent our industry. Those with a little bit of distance from the petty politics of the tech world are uniformly astonished at how much negativity and even contempt those within the tech industry have for our most prominent voices.

Now, I'm not saying there is nothing to criticize in the work of the major influencers in the world of web technology. You may have noticed that the example names above, along with a dozen others I could have added, mostly fall into the category of American white male millionaires. That's a demographic with whom I have no quibble ("Some of my best friends are...!"), but I think we can safely declare our outreach to this group a Mission Accomplished, and move on to accommodating the voices of additional groups. But most of my criticisms of their work are, I have found, really criticisms of our industry in general. An emphasis on the novel instead of the meaningful, a tendency to overemphasize minor news and downplay bigger stories, a focus on the technical details of a new technology instead of its social impact — I think the blog posts and conferences that we all participate in demonstrate these flaws as a reflection of the faults of our culture overall. I can't judge any individual too harshly for failing to consistently rise above the culture that surrounds them.

I'll gladly call any of these pundits on the carpet for mistakes they make, or for shortcomings in the work they produce. Hopefully, my track record of arguing for inclusiveness will be a positive nuisance, encouraging them to follow the better angels of their nature. And of course, I'll be accused of sucking up to them, even though I have no agenda in defending them except to note that the tactic of quietly insulting the tech pundits has not been particularly effective in diminishing their influence.

But as I've begun to (re-)dabble in punditry, I think it's telling that private conversations (and the occasional ranting blogger) direct so much vitriol at the people who lead much of the conversation in the world of technology. It would seem the more effective form of criticism is obvious and relatively easy: Just do better yourself.

Apple: Secrecy Does Not Scale

July 31, 2009

Apple is justifiably revered in the worlds of technology and culture for creating one of the most powerful brands in the world from a combination of key elements: Great user experience and design, and extraordinary secrecy punctuated by surprising reveals. But the secrecy required to maintain Apple's mystique has come at an increasingly steep price. Apple must transform itself and leave its history of secrecy behind, not just to continue being innovative and to protect the fundamentals of its business, but because the cost of keeping these secrets has become morally and ethically untenable.

Some recent history:

  • Sun Danyong, a young man in Dongguan, China, who worked for Foxconn, one of Apple's most important iPhone suppliers, killed himself after misplacing a prototype iPhone device.
  • Apple prohibited the Google Voice application from being distributed on its iTunes application store, with no public explanation of why, no suggestions for changes that could permit the application to be distributed, and no process for appealing the decision.
  • Apple removed third-party Google Voice-compatible applications, explaining only that they violated a policy against applications that duplicate native iPhone functionality, a rule whose enforcement has been wildly inconsistent. Again, Apple offered no suggestions for how developers could comply with the guidelines, and no process for appealing the decision.

The circumstances of Sun's suicide are murky -- it's possible that he was involved in supplying the iPhone prototype to copycat manufacturers who would create knockoff devices, but the theory has also been advanced that he was simply unable to cope with the stress of the extreme secrecy required for his work. Regardless of the reason for Sun's death, copycat manufacturers are a fact of doing business in China; It is only the extraordinary veil drawn around the product that makes such disclosures so particularly fraught.

Similarly, every carrier (and nearly every mobile application platform) has some arduous or even capricious limitations on the applications that can be created by developers. But for better or worse, those limitations are spelled out clearly, in a way that developers can anticipate, and decisions to prohibit particular applications are explicit even when they are annoying or offensive to those of us who believe in open platforms.

This means that those of us who support Apple with our dollars and attention are supporting a company that chooses to operate with an extreme and excessive layer of secrecy, even when making reasonable business decisions. This squelching of communication about Apple's products leaves customers unhappy or uncertain of the future value of their purchases, developers too afraid to bet their livelihoods on a platform whose fundamental opportunities could be destroyed at any time, and suppliers forced to inflict unreasonable or even inhumane restrictions on their employees. And that's in addition to the incredible stress that Apple employees themselves have had to endure, from missing Christmas to get products ready for Macworld without being able to tell family members why, to public-facing communications staff enduring the misery of telling developers that their products or businesses are being terminated by fiat, without so much as an explanation.

I'm certain the web's usual contingent of soulless Randists will believe this level of suffering is somehow acceptable despite its moral cost, because The Market has made Apple a success. But there's even a financial argument: Apple spends an enormous amount of money on protecting and obfuscating normal business operations that any other company can do in the open. It's hard to estimate just how much the overhead of this extreme secrecy costs the company, but it's obviously many millions of dollars extra per year. And it will only get more expensive as large-scale realtime communications get more and more commoditized.

The Case for Secrecy

Now, if being ultra-private about announcements has such a terrible cost, then why does Apple go to all the trouble? Apologists would say that Apple gets three significant benefits from its incredible secrecy:

  • An extremely disproportionate amount of extraordinarily favorable press from its "surprise" product launches
  • A significant lead time on the rest of the market being able to copy Apple innovations
  • An intangible benefit to the brand being so tightly controlled by the company

These benefits are real to some extent today, but in each case, the benefit is almost certainly not viable over the long term. Let's look at why:

"But they get so much free press from the element of surprise in their announcements!" This isn't true -- for almost every major announcement of the past several years, we've known the major points days, or even weeks, in advance. In fact, they earn the majority of their press from the extraordinary appeal of their products in design and user experience, as well as the pure showmanship they put into their signature launch events, which are unequalled thus far in the industry.

"But if they don't keep stuff a secret, other companies will be able to copy them!" Other companies already do copy Apple, and always have. And — dirty little secret — Apple has always copied other companies as well. This is a normal part of the business cycle (indeed, before its current bastardization, the patent system was designed to encourage this behavior), and no amount of secrecy will stop it. More to the point, if the only reason people are buying your product is because it has no viable competitors, then your standing in the marketplace is too tenuous to be defended anyway.

"But people love Apple's brand because it's so micromanaged!" This is the most insidious and inaccurate of all the justifications. In fact, since Apple's brand began to recover in the late 90s, two of the greatest and most influential global brands in the world have emerged: Google and Barack Obama. In both cases, they've embraced openness, transparency, and letting their communities define their brand. Despite my belief in my recent pointed criticisms of Google, it's worth noting that a number of high-profile Googlers responded personally, both privately and publicly, to the issues that I raised, all indicating that they took the discussion to heart. And President Obama has taken his penchant for talking things through to such an extreme that it's nearly become a let's-have-some-beers parody of itself.

In contrast, Apple's employees will be too cowed to publicly respond to this post, though I know they'll see it. Partners are tired of being bullied or facing petulant sanctions for accidental disclosures of relatively innocuous bits of information. And eventually, anyone talented and independent-minded enough to participate in the kind of innovation practiced at Apple is going to chafe at being constrained in how they can express themselves.

Real Artistry

Self expression matters because Apple has always explicitly tied itself to the world of the arts and expression. One of my favorite (possibly apocryphal) Steve Jobs quotes is "Real artists ship", a testament to the fact that an invention that never sees the light of day can't affect anyone. But if we're talking about real artists, then let's consider all of their traits.

Real artists also expose themselves, making themselves vulnerable through honest expression so that their audience can see their humanity, and thus form a connection to something universal in all of us. Apple is still holding on to the centralized, Pravda-style public relations that artists used in 1984 when the Mac was introduced. Back then, giant record labels and a few powerful media outlets could tightly control the flow of information around a tiny cluster of superstars. The superstars of 1984 -- Michael Jackson, Prince, Madonna -- subscribed to the doctrine of doing no interviews or press, and having their only communication with the public happen through tightly-managed events where they had total control.

Today's biggest and most influential artists, from Kanye West to Trent Reznor to Radiohead, are very nearly competing to see who can be most transparent. The immediacy and intimacy with which they communicate and create their works is dramatic, and they encourage their communities to get involved in a ritual that Apple used to encourage: Rip, Mix, Burn.

Jobs as Big Brother

The sad truth is that Apple is still stuck in an anachronistic, 1984 mode of communicating with the world. If Apple doesn't evolve, it'll become a pathetic-looking giant, constantly playing whack-a-mole with information leaks and diminishing its relevance by antagonizing the very creators it has so long sought to identify with. Worse, while the fashions of 1984 might be back in style, the ability to tightly control a message is never going to come back into vogue, and the one thing Apple's brand can't withstand is suddenly becoming uncool. (I'm pretty sure Apple's also had a word or two to say about why today's world shouldn't be like 1984.)

Look Around And Learn

Every company, when facing a serious problem, suddenly starts blogging. From the giant auto manufacturers to troubled banks, it's been astounding to see how frequently companies figure out that embracing transparency yields an enormous improvement in how much their customers and community trust them. When Amazon screwed up by abusing their DRM powers over Kindle owners, they were a little slow to respond, but absolutely flawless in their message when they had Jeff Bezos himself post a simple, straightforward apology to Kindle owners in their own community, complete with open comments for people to respond. And it was an easy leap for Amazon to make -- they have extensive experience not just with consumer-facing blogs, but in talking directly to developers or business partners as well. While much was made of Amazon recalling George Orwell's titles, it's Apple's behavior that is most Orwellian overall.

This lesson isn't entirely lost on Apple; once in a great while a missive arrives from on high in the form of a one-page letter from Steve Jobs on a significant issue. And when the debacle of MobileMe's bumbling launch got bad enough, Apple even launched a short-lived blog to address the issue. So it's not impossible for Apple to start communicating in at least a semi-human, responsive way. Even better, Apple clearly has some parts of its corporate culture that want to do the right thing, as evidenced by its unusual willingness to offer refunds to a variety of disgruntled classes of customers over the years.

But the reason for Apple to embrace some open communications channels isn't merely the practical necessity of talking to customers, developers and partners. It's that this is the right thing to do. Apple has long been able to pride itself on being innovative even when the market wasn't demanding bold moves. Nothing could be more courageous than for Apple to take a decisive step to redefine a core part of its brand's history to be more in keeping with contemporary communication. Moving from the classic Mac OS to OS X, or from PowerPC to Intel, would be nothing compared to a transition from ultra-secretive to collaborative and expressive. It would show that Apple has the self-awareness to evolve into a better, more humane organization than it has been in the past.

The reckoning Apple has reached, whether it's admitted or not, is that its secrecy is compromising its humanity. Some of the smartest and most innovative developers on any platform are leaving and taking their creativity with them. The trade press that embarrassed itself with effusive cheering for Apple in the past is rushing to cover absurdities like entire sites dedicated to Kremlinology about Apple's platform decisions. If losing your cool doesn't move you, Apple, then what about people losing their lives to this domineering, outdated mindset?

It's incumbent upon Apple to do the moral thing here. Treat your employees, customers, suppliers and partner companies better, by letting them participate in the thing most of your products are designed for: Human self-expression. If the ethical argument is unpersuasive, then focus on the long-term viability of your marketing and branding efforts, and realize that a technology company that is determined to prevent information from being spread is an organization at war with itself. Civil wars are expensive, have no winners, and incur lots of casualties.

There is a path out of the current quagmire. Apple can start to see its customers as collaborators, and encourage them to use the very Apple products they've purchased as a conduit for sharing messages about the company and its products. Apple's fans have already shown a willingness to create fictitious print, television, and online advertising that exceeds other companies' actual efforts in quality while still being slavishly faithful to Apple's brand guidelines. And being an open company doesn't mean that there can't be the occasional big surprise — in fact, companies like Google often find it easier to have things "hide in plain sight" because so much of what they do is open that the curious often don't dig past the surface to find out what else is going on.

Finally, there is the opportunity for Apple's employees themselves to act as ambassadors for the brand. Frankly, those Geniuses in the Apple stores aren't the most flattering face for the company. But instead of prohibiting the thousands of other Apple employees from engaging in conversations about their professional lives on the web and in social media, perhaps they could be empowered to express the company's ideas in their own words. That's an enormous resource, and it would be unleashed by Apple's evolution into a communicative company.

So Apple: Do the right thing. End your addiction to secrecy.

Google's Microsoft Moment

July 9, 2009

I'm not sure Google's new Chrome OS announcement is that big a deal, or that the eventual product that gets released will actually have that much impact, but it's a useful milestone in marking Google's evolution towards becoming an older company with a distinctly different culture than they used to have.

This is, for lack of a better term, Google's "Microsoft Moment". This is the point when a company's internal conception of itself starts to diverge just a bit too far from the public's perception of it, and even starts to diverge from reality. At this inflection point, the reasons for doing new things at Google start to change.


Let me be clear: I don't think Google is "turning evil". Hell, I've caught a lot of flak for saying that I basically don't think Microsoft was evil, either. But there are some notable trends going on across Google today that, if not corrected, could cause the company to compromise its stated values, and that will certainly cause people to think Google is being evil. I'll try to outline a few key cultural indicators from around Google.

Designing for corporate synergy, not for users

Google's recent development work on applications for mobile devices has often been delivered exclusively as applications for their own Android platform instead of as iPhone applications, despite the fact that the iPhone is roughly forty times more popular in the marketplace. The iPhone is also much more popular outside of the United States than Android is, further limiting the actual audience served by these applications. Now, it's obviously good company policy to make sure to support Google's own platforms, and Google does an admirable job of using generic open web technologies where possible to avoid having to choose between platforms at all. But choosing to leave the majority of users in a given market unaddressed because they are on a platform that is not part of your corporate goals is short-sighted and leaves a lingering sense of mistrust.

If you look at Microsoft ten years ago, or even as recently as five years ago, they had a tendency to say "Well, we've got a version that works on Windows Mobile" or "This works on Internet Explorer" and feel that they'd done their job of addressing mobile or the web. Or Windows Media Player would connect to Xbox, but not to any other systems for sharing media. They were putting their corporate agenda ahead of what the marketplace had chosen as its preferred platforms. But after all these years, Microsoft's internal teams have finally started to develop the web or mobile versions of their products to work on competitors' browsers and competitors' mobile platforms, recognizing that they have to go where the users are, instead of favoring only the platforms created by their corporate siblings. Google appears to be headed the other way.

Forgetting what the real world uses and favoring what's convenient for your own business goals is a quick way to make customers think you don't care, and to signal to partners and developers that pleasing Google is more important than pleasing customers.

Multiple competing product lines: Chrome OS and Android

This is one of the simplest and most obvious examples, after this week's announcements: Google is now offering not one, but two mobile operating systems. While they undoubtedly share code, I can't help but think back to ten years ago, when Microsoft was vehemently insisting on how much code was shared between the Windows NT/Windows 2000 operating systems and the Windows 95/98/ME operating systems. If I make a screen two inches smaller, should I use Android instead of Chrome OS? If the keyboard works with my fingers instead of my thumbs, should I use Chrome OS and not Android? I know Google is convinced its employees are smarter than everyone else in the world, but this is a product management problem, not a computer science problem.

Changing methods of communication

Within Google, I'm sure the perception is that their public-facing communications are still very "Googley". Now, Google does an excellent job of maintaining and using an enormous number of official corporate blogs in dozens of languages for a rapidly-blossoming number of products and initiatives. But despite my admiration for that effort, and their commendable willingness to forgo the usual boring press releases, the way that the company communicates with the public has fundamentally changed, and not necessarily in a more human direction.

In lieu of blog posts or the simple word of mouth that helped popularize the Google search engine itself ten years ago, efforts like Chrome are being accompanied by television ads, complete with all of the production values of primetime TV. Instead of launching a new developer initiative by promoting an SDK on their blog, Google is filling convention centers, Apple-style, with day-long developer presentations and an Oprahesque giveaway of free phones under every seat. Instead of white papers, there are highly-produced comic books being distributed to the press to explain the value of Chrome.

Now, I actually support these types of outreach. Getting outside of the insular tech bubble requires higher production values and clearer messaging. But when Google evokes Apple or Microsoft or Oracle in its style of communicating ideas, and when cell phone ads on TV say "Powered by Google", an average consumer's conception of Google essentially shifts to seeing this company not as "those guys who do the search engine" but instead as another consumer electronics company, like Samsung or Sony, but a little more hip.

This would be okay, except that I doubt Google's internal self-image as an organization has changed to reflect this new reality. "We're not like some giant company with flashy TV ads — we're just a bunch of geeks in Mountain View!" And while that might be true for the vast number of engineers who define the company's internal culture, the external impression of Google as just another tech titan like Microsoft will take hold, making the audience for Google's messages less tolerant of ambiguity and less forgiving of mistakes.

Only the last generation of companies can be evil, not us!

Though it's almost impossible to picture now, in the era when Microsoft was formed, IBM was synonymous with an almost Orwellian dominance of information technology. It's been a full 40 years since the antitrust actions against IBM, and IBM is seen as a bastion of open source now, but Microsoft's founding mindset was clearly shaped by the idea that "those old guys from the last generation are evil, and we're the nimble, smart upstarts who are going to humanize this industry". Sound familiar?

Though it's hard to believe, the FTC's first investigations of Microsoft began eighteen years ago. When Microsoft reached its apex in terms of public perception and industry respect, with the launch of Windows 95, the culture inside the company still largely saw itself as an upstart battling old, proprietary behemoths. At the time of Windows 95's launch, Microsoft had about 17,000 employees; its headcount has increased roughly fivefold since then.

Google's headcount just passed 20,000 employees. And most of those staff members are firmly convinced that evil, or at least incompetence, is a trait of the last generation's dominant tech player: Microsoft. The idea that developers or customers might start to bristle at Google's dominance is met with the (true, yet irrelevant) argument about how open its data and platforms are. Eric Schmidt said yesterday that Chrome OS is so open that Microsoft could make Internet Explorer for it, though of course the effort of porting the browser would be prohibitively complex. By neatly inverting the framing of the conversation ("We didn't bundle a browser with our OS, we bundled an OS with our browser!"), Google has avoided having to confront the parallels between this moment in its corporate culture and Microsoft's similar moment of ascendancy 15 years ago.

Still haven't developed Theory of Mind

And finally, as I outlined two years ago, Google still hasn't developed theory of mind. From my piece then:

This shortcoming exists at a deep cultural level within the organization, and it keeps manifesting itself in the decisions that the company makes about its products and services. The flaw is one that is perpetuated by insularity, and will only be remedied by becoming more open to outside ideas and more aware of how people outside the company think, work and live.

Worse, because most of the dedicated detractors of Google have been either competing companies or nutjobs, it's been hard for Googlers to take criticisms seriously. That makes it easy for defensiveness or dismissal to become the default response.

Conclusion

Google has taken commendable steps towards communicating with those outside of its sphere of influence in the tech world. But the messages will be incomplete or insufficient as long as Google doesn't truly internalize and accept that its public perception is about to change radically. The era of Google as a trusted, "non-evil" startup whose actions are automatically assumed to be benevolent is over.

Years ago, Gmail introduced context-sensitive ads and was unfairly pilloried for being anti-privacy or intrusive. And while there have been a few similar hand-slappings along the way, Google has yet to face a widespread backlash from average consumers against its influence or dominance. Today, protestations of "but it's open source!" are being used to paper over real concerns about data ownership, and the truth is that open code doesn't necessarily imply that average users are in control.

And ultimately, once a tech company becomes dominant in its space, it's susceptible to a kind of reverse Hanlon's razor: Anything caused by stupidity or carelessness will instead be attributed to malice. Similar to the Law of Fail ("Once a web community has decided to dislike an idea, the conversation will shift from criticizing the idea to become a competition about who can be most scathing in their condemnation."), Google is entering the moment where it has to be over-careful not to offend, and extremely attentive to whether it is treading lightly.

Is Google evil? It doesn't matter. They've reached a point of corporate ambition and cultural change that means they're going to be perceived as if they are. Whether they're able to truly internalize that lesson, accept it, and act accordingly will determine whether they're able to extend their dominance in the years to come.

(Illustration courtesy of Federico Fieni.)


Update: There's been a phenomenal reaction to the ideas discussed here. I rounded up a lot of the responses in a follow-up post. But it's also worth noting that a number of people from both within and without Google have pointed out that in many cases, the release of an Android application has preceded its iPhone counterpart due to delays in Apple's opaque approval process for applications on that platform, or because the Android applications were only created as hobbyist projects by Googlers in their free time. Similarly, a number of people have pointed out significant differences between Chrome OS and Android, such as the primary development environments (HTML5 and Java, respectively), memory limitations for applications, and the distribution model.

While I certainly don't mean to dismiss any of these clarifications as insignificant, and I appreciate the additional information, the key argument I'm advancing here is about the overall impact of changes in Google's culture and perception. Many more examples can be (and have been) identified to support that larger trend, and I'm pleased that the larger dialogue has focused on that bigger issue, inspiring some great conversation.

Sticking with Last Year's Model

April 22, 2009

Here's the idea: We can fix the false impression that the newest gadgets are the only interesting ones by simply promoting the fact that we're getting a lot out of our existing products.


I am lucky — I get to talk to some of the smartest geeks in the world, and to learn from their example about cutting-edge technologies. One of the most interesting things I've seen is that, while so much of the talk in tech circles is about the latest-and-greatest, even alpha geeks often don't run out and buy the newest gadgets and electronics the minute they come out.

But you wouldn't know it from the way we talk about our gadgetry.

Instead, there's an incessant focus on what's just been released on the market, or what's becoming available in the future. It makes even those of us who have great, fancy, expensive devices feel like, well, we're slipping behind.

It ain't necessarily so. I bounced this idea off of a few tech experts I know, and they all agreed that the constant pursuit of novelty over actual value takes a lot of the joy out of loving great technology. So, to help promote the idea of being thoughtful about what we buy, and how long we hold on to it, I created Last Year's Model, with a design from my friend Mike Monteiro of Mule Design.

Today is Earth Day — I don't want to diminish the fact that being thoughtful about our consumption is good for the planet. But it's just as important to me that we really think about what we're doing with these tools and toys.

Fortunately, I'm not alone.

  • Gina Trapani was one of the first people to really encourage me to put the site together, and she's already given a testimonial to the idea for the site and helped spread the #lastyears tag on Twitter with her announcement.
  • Kevin Rose is on board, too, showing that there's no contradiction between loving the latest in technology and still not chasing every new shiny gadget.
  • Joel Johnson at BoingBoing Gadgets has a really thoughtful take on the idea.
  • Chris Pirillo's got a personal testimonial of how he's getting the most from his current laptop.

And a ton more examples are popping up — I'll be sharing them on my own Twitter account as new ones appear. You can also join the Facebook Cause to show your support.

I hope you'll participate. I'm very thankful to all my friends who've helped out with shaping this simple little site and the slightly-bigger idea behind it. If you've got a story of how you're getting the most out of the gear you've already got, all you have to do is visit Last Year's Model and share your story.

Freedom From Choice

September 27, 2007

A.J. Jacobs, master of the year-long book stunt, spent a year trying to live by all the rules dictated in the Bible. As stunts go, it's not that interesting to me ("Hey, I grew a beard!"), but one of the lessons he mentioned learning in this Newsweek interview indicates he really did go in with an open mind:

We all talk about freedom of choice, but there’s something very attractive about freedom from choice. Religion provides structure, mooring, anchoring. Should you covet? No. Should you give 10 percent to the needy? Yes. It really structures your life. After my year I felt unmoored, overwhelmed by choice. I have adjusted, but I’m still overwhelmed by choice, as we all are in America.

There's an analogy here about why those who preach simplicity in the realm of technology sound so much like they're preaching religion, and why those who agree with them often take on a near-religious fervor, but I'll leave that as an exercise for the reader.

Evanescence of the Treekillers

August 24, 2007

I like PC Magazine, and I've been reading it for pretty much my whole life, but I still can't help noticing that the homepage for its opinion columns currently features "goodbye" articles from two different Editors-in-Chief. I'm the kind of nerd who still enjoys reading computer magazines, and as often as not I'll grab a PC Mag or PC World or something like that before I hop on a plane, just as a reminder of how interesting it can be to see technology in that context. (Any news or reviews covered by the print issue have almost always been discussed to death online by the time the magazine comes out.)

It's interesting to see how this has played out across the tech magazine space. InfoWorld recently killed their print magazine entirely. And eWeek just redesigned its print edition, which now features a narrative-based lead section called "Upfront" that pretty openly apes the New Yorker's Talk of the Town, but in a nerdy context. Surprisingly, it works pretty well, and it makes me enjoy reading the content in print form, as opposed to just skimming online. I'm hoping at least a handful of these magazines find a way to make a go of it in print, now that their audiences aren't relying on print for any time-sensitive tech news.

The Enterprise, Apple, and Insufficient Ambition

August 12, 2007

The Premise: Anyone who creates technologies that aspire to have significant cultural or social impacts on the developed world has to focus on both our lives at home and our lives at work. Anything less is an abdication of potential, or a failure of ambition, and settling for less denies many people the chance to discover tools or technologies that can improve their lives.

I was struck by John Siracusa's 'Stuck on the enterprise', which he wrote a few days ago. His assertion:

Sure, Apple makes periodic overtures in to big business. It even redirects apple.com/enterprise to someplace sensible. But nearly every Apple product or service ostensibly aimed at enterprise customers can also be seen as a natural part of some other, "non-enterprise" market where Apple is strong (e.g., creative professionals).

Unfailingly, Apple markets only to the end user these days. ... What Apple does not do is sell products to corporate IT that are meant for direct use by non-IT employees. That is, desktop PCs, and more recently, cellular phones.

Siracusa then goes on to list a series of enterprise desires for phones that he claims look "quite different than the iPhone", mainly centering on manageability and predictability. This is followed by a contention that these aims are incompatible with usability.

This is, to be blunt, horseshit. It's apologist blathering to cover up a failure of imagination and ambition. And it's saying that people cease to be people when they're at work, and are instead Enterprise Employees. These are the excuses that let the tech industry off the hook for failing to engage as many people as it should.

This leads to an alarmingly wrongheaded conclusion:

[T]he decision to ignore markets where you must sell to someone other than the end user is pretty high-minded (for a corporation). It's also perhaps the only way to ever create great products, products that customers actually love.

No, this decision is elitist and lazy. Here's the truth: You can meet all the (reasonable) requirements of an Enterprise while still creating a product that delights and inspires the people who make up that organization.

In fact, you have to do so.

The only tools that succeed in an enterprise situation are those which are so compelling that people choose to use them in their free time. Look at email, instant messaging, hell -- look at the telephone. These staples of business communication are so popular because they meet the "I want this as part of my life" threshold. They can even be so good as to inspire addiction, complete with withdrawal in their absence.

If you create a tool as powerful as instant messaging, for example, you won't be able to stop adoption in the enterprise -- you'll just need to add enterprise features. And to those who proudly insist that the iPhone is "too cool to ever go to work": you can't simultaneously claim that enterprise IT will have to deal with it because it's popular. Unless you want to perpetuate the myth that we somehow transform into emotionless robots when we go to work, you have to acknowledge that Apple's going to make more and more improvements to accommodate enterprise customers, and that's a good thing.

Of course, I have a dog in this fight. I'd advocated for years that blogging should be an enterprise tool, and helped my company ship Movable Type Enterprise, which was the first enterprise blogging app around and is still the most popular. I wrote a little bit about why in "Why do you care about business blogs so much?"

For the normal people, the ones who kind of maybe have heard of blogs, but certainly haven't tried them out yet themselves, discovering blogging as part of work will lead them to thinking about how blogs can change every part of their life. It's just like the millions of people who first used a web browser as part of their job, or the people who had an email address at work or school before they ever signed up for Hotmail or Gmail.

When I talk to companies about blogging, I ask them how their Knowledge Management or Enterprise Content Management deployments have succeeded. And they almost invariably mumble a bit about "it's sort of underperforming...". This is the dark outcome of people trying to draw a line between who we are at work and who we are at home. You end up with shoddy, compromised products like KM or groupware. And the folks in IT aren't unfeeling, tyrannical monsters; when I tell them "well, we'll give you LDAP integration, but it'll also have a UI that's easy enough that people choose to use these tools in their free time as a hobby", their eyes light up. They want to delight people, too.

That's the truth of it -- if you don't change the way people work, you can't claim to be changing their lives for the better. In the developed world, we spend most of our waking hours at work, and the impact is enormous. The success of PCs in the enterprise helped indirectly subsidize computers getting cheap enough to buy at home. The requirements for reliability and stability in a lot of enterprise software make for better consumer user experiences. And of course, most of the shopping on eBay or Amazon, and most of the ad-clicking on TMZ or Gizmodo, happens while people are at work too. If the anti-enterprise advocates had their way, none of us would have web browsers at work, but we'd still be ideologically pure and stickin' it to the man. Yeah!

Except we'd be sticking it to ourselves, for 8 to 10 hours a day. If you believe in a technology, like I believe in blogging, or you believe in a company, like many fans believe in Apple, then expect more. Don't settle for compromises where we're supposed to have crappy tools for the work we do -- any good craftsman takes pride in using the best tools he can.

And above all, stop making excuses for the arrogant and exclusionary voices that want to limit promising new technologies to just those who can afford to pay for them at home, or who have the interest to chase down the latest tech. Everybody deserves to benefit from this stuff.

Meaningful Catches On

July 20, 2007

Two of the posts I'm most proud of having written last year are Making Something Meaningful and How do we judge our tools?. It looks like the sentiment behind those posts is catching on.

  • Nick Bradbury on Conserving your limited attention: "When I hear someone complaining about all the feeds competing for their attention, I have to wonder why they don't just unsubscribe from most of them."
  • Jeremy Zawodny on Getting off the hype treadmill: "I made a conscious decision to drop virtually all "news" sources from my subscription list that felt like breathless hype machines that provided little new insight."
  • And Steve Rubel, who seems to have gotten a lot of conversations started with the conclusion that "[T]he bigger story in the long run is how these sites change business and our society."
  • Mike Torres captures a related point about insularity: "It used to be fun watching the "A-list" bloggers discover the obvious things that folks outside the U.S., little kids, and even big companies have been tracking for months; sometimes years."
  • There was a nice nod from O'Reilly Radar last week, too.

And of course we visited the blogosphere's reality distortion field yesterday. Now we just have to see whether this is merely a blip of self-criticism, or if people actually want to change what they pay attention to.

What I do for a living

November 29, 2006

One of the most common questions I get from people who know about Six Apart is "What the hell do you actually do there?" These days, that question's easier than ever to answer, but it involves explaining one of the goofiest parts of my job: My title.


You see, these days my business cards describe me as "Chief Evangelist". On the plus side, it's the first time in the history of the company that I've basically only had one job (though I still help out with as much stuff as I can), but on the downside, the title is fucking ridiculous. I hate the word "evangelist" as a description for people who advocate technology, not merely because of its religious connotations, but also because it implies a degree of proselytization that I'd like to think I don't participate in. Most of the time, my job is really just simple education.

Unfortunately, there's no better title to describe this kind of work. So, evangelist it is, and the title has stuck. The last time I saw Guy Kawasaki, I made sure to mention that it's his fault I have a title that makes no sense outside of Silicon Valley. Fortunately, it should be a lot more fun the next time I see Guy, which is at the Global Network of Technology Evangelists event next week.

GNoTE is an interesting organization that is just getting started. At its core, it seems to be a group of people who recognize that technology can have a great impact on people's lives, but only if some of us are dedicated to explaining technologies and to helping make them accessible to a wider range of audiences.

If that sounds interesting to you, and you can get to Santa Clara, join us on Monday for GNoTE's inaugural event. (More event details are on Upcoming.) I'm very flattered to be in the company of counterparts from Yahoo, Amazon, Microsoft, and Sun, among others. As a bonus drinking game, you can take a swig every time the word "evangelist" or some variation thereof is mentioned, and walk out of the place blind stinking drunk!

The Starting Line is not the Finish Line

November 27, 2006

There weren't a whole lot of really new things announced at the Web 2.0 conference, mostly large companies saying what you'd expect. But one of the launches that stood out was Stikkit. There are plenty of reviews of the service; I'm not here to talk about that.

I got a chance to talk to the folks behind Stikkit a bit at the event, and I've been friends with them for years. So instead of "hey, what does it do, what are the features?" we ended up talking a little more generally about what starting a business, and launching a product, actually means.


Michael sums it up well on his blog:

Talking to Anil at the conference, I realize something now that I only sort of had at the back of mind before. He described how he just got back from watching the NYC Marathon, and how gruelling it can be just to arrive at the starting line. You need to fly there, take taxis, ferries, subways, then register, warm up, and finally start running. He said "You've just now arrived at the starting line, and your marathon has just begun."

And there's no doubt he's right. I see much more clearly now that we've launched that a lot of attention has to be paid to pacing ourselves, and making sure we're tapping into the collective intelligence of our rapidly growing user base. Some of those little things we put off prior to the launch are now beginning to take center stage, and we're spending good quality time getting things right.

Too often, I see people, especially in the new wave of startups, treating their launch as the finish line. Or putting all their eggs in a single basket -- a big press story or coverage on a prominent blog. Maybe a partnership or endorsement from some company. Any of these things are great (hell, I work on that kind of stuff every day), but none of them, on their own, is enough.

Launching something meaningful is about every day, every minute, that happens after that start. Honestly, it makes me feel a lot like when I was talking about getting married: "If you tell people you're engaged, they start talking to you about that one day, and almost never about the other half century you're signing up for."

I am, frankly, tired of reading reviews of new technology that omit the commitment of the team, that don't mention how the success of the product almost feels like life-or-death to the people making it, or that ignore the people who make the damn thing happen. I'd settle for one product review that said, "we're not sure which direction this service is going, but the people behind it have a history of making magic happen". The technologies I use most every day were almost all conceived as something else entirely, and evolved into their current, indispensable forms through the dedication of people who were interested in running the marathon, not just entering the race.

(Thanks to David for the photo.)

Life or Death for Web 2.0

October 16, 2006

A month ago, I began a series of posts outlining some common themes:

  • Any system faces danger when it becomes a monoculture
  • Diversity offers many broad-ranging and sometimes unexpected benefits
  • There are many parallels between biological systems and technological networks like social software on the Internet.

In this context, "Web 2.0" isn't an overhyped and under-defined buzzword, but rather an umbrella term describing all of these kinds of social software that make use of Ajax-style design patterns to serve a useful, meaningful purpose.

Today, most individuals and companies making social web applications exist in a monoculture that robs them of the broad perspectives, influences, and understanding necessary to create a community that's sustainable over the long term. In short:

The lack of diversity in Web 2.0 poses a life-or-death threat to its viability.

If the success and influence of the social web is to continue, we must make it a priority to include the cultures and communities that we've been ignoring, overlooking, or excluding. A failure to broaden our view will ultimately be fatal if uncorrected. How could this be true? To start, let's look at some of the ideas that inform this view, taken from a variety of disciplines including astronomy, biology, sociology and even cooking.

Some Background

No community can thrive without the perspectives of outsiders, especially if it's trying to serve those outsiders. The key to getting good results is understanding just how much value that variety of cultures can offer. We've all seen that communicating using all the tools of social media can make people's lives better. The reality is, those benefits can apply just as much to one's professional life as to one's personal life.

But the thing that strikes me as equally important is remembering that even the most powerful, influential, or pervasive lines of business are always in a tenuous position. You can have the power of the legal system in your hands, or the ability to talk to almost everyone in the country at home or in their cars, and still end up in a defensive position if you're not able to have a dialogue with your community.

In the real world outside of Silicon Valley, people are busy solving problems that we often overlook, trivialize, or deliberately ignore. It's instructive to be immersed in a culture outside of the one where we create new technologies. For us, encouraging everyone to take advantage of social media is a fundamental necessity.

Hundreds or thousands of years ago, the greatest danger that faced societies was the introduction of a foreign culture's physical threats... the greatest threat to cultures today comes from not intermingling. Whether it's expressed in agriculture ("hybrid vigor") or in the context of a cocktail party (being a "social butterfly"), making an effort to avoid cultural isolation is rewarded by making an individual or a society healthier. That's not to mention the bonus of additional opportunities, higher potential for recognition, a larger market for trade or commercial interests, and a broader audience for communication of messages.

In biology, species with little genetic variation -- or "monocultures" -- are the most vulnerable to catastrophic epidemics. Species that share a single fatal flaw could be wiped out by a virus that can exploit that flaw. Genetic diversity increases the chances that at least some of the species will survive every attack. Building an industry around a monoculture places the entire economy in danger from unanticipated threats. And it's only the adoption and embrace of a broader range of cultures that can help an industry protect itself from that danger, or sustain itself when facing a downturn.

It strikes me that something as big as, well, the whole world can look fragile if you step back far enough to really look at it. And a work that took enormous resources to support, unbelievable imagination to create, and true courage to execute can seem downright ordinary once it becomes ubiquitous.

The Good News

So, are we doomed? I don't think so. It turns out, this kind of groupthink or myopia is actually pretty common, or at least common enough that it can make the news today. From this morning's Washington Post, Shankar Vedantam's article says:

While the instinct for homophily in politics and other areas seems hard-wired, technology may be fueling our nature. Cable television and the Internet have allowed enormous numbers of people in distant areas to form virtual groups that are very similar to what you see in the office cafeteria.

...While there is nothing wrong with being around others who are similar to yourself, both Smith-Lovin and Small said that people and organizations pay a price for homogeneity. In politics, for example, the fact that people rarely have friends with different views makes it difficult to seek common ground or to examine one's positions closely.

So why all these words? Is a post with pics of a petri dish, a pizza pie, and a planet going to help? Well, the truth is, telling people to be more inclusive just because it's the right thing to do just plain doesn't work. I'm hoping that explaining that our self-absorption presents a mortal danger is enough to get people to do the right thing out of enlightened self-interest. Fortunately, some people have already taken some great steps forward.

When I wrote about what it's like at the Web 2.0 conference last year, I had despaired somewhat, thinking things could never change. Today, they still mostly haven't. But while I was complaining again, some other conversations popped up that started to give me a little bit of hope. "Be the fucking role models the situation calls for." "monocultures produce monotonous culture." "We should be learning from it and improving ourselves, not using the rhetoric of the past to brush off criticisms we're just too lazy or unwilling to deal with."

The people who are most likely to be threatened or insecure about the embrace of diversity are recognizing not just the opportunity of a broader view, but the necessity of it. Sometimes good ideas do rise to the top. All of us who've been in groups that were outside the monoculture have been aware of this danger, but now those on the inside are aware as well. That's real progress, and real cause for optimism.

The truth is, we need to fight monoculture for the same reason many of us abhor DRM, or fight sterile GMO crops, or argue in favor of Creative Commons licenses. The tools of expression, of communication, must be able to reach everyone, they must be able to bear fruit for those who would reuse or recontextualize them, and they must be available for anyone to expand on or build on.

The people in our communities who are most likely to make an unexpected leap, or to add value that we didn't anticipate, are the people who we aren't even making part of our communities. And it's not too late to include them. But if we keep thinking that diversity or rejection of monoculture can wait for version 3.0, we're dooming all of Web 2.0 to fail.

Resources

Most of the content for this post came from my own earlier posts on these topics over the past few weeks. See:

  • A Very Small Planet: Covers Jack Schmitt's remarkable "Blue Marble" photo of the Earth, also seen in this post.
  • Pizza Requires Culture talks of Jeff Varasano's amazing, obsessive pizza recipe, from which the pizza photo above is taken. A key to his success is in understanding various yeast cultures.
  • Lawyers, Broadcasters, and Bloggers ... Oh My! talks about some of the audiences outside of the tech world that I've been trying to talk to.
  • Hit the Road is about creating events for non-technical professionals to learn about social media online.
  • The Threat of Extinction previews Steven Johnson's Ghost Map, as well as a host of other books about plague and epidemics. This also inspired me to include Jack Mottram's petri dish photo, which is Creative Commons licensed.
  • Revising the Software Monoculture gives an update on Dan Geer's seminal look at software monoculture.
  • Monoculture Considered Harmful gives some background on the boll weevil infestation that devastated the cotton monoculture of the American South.