The 401st Blow :: Thoughts On Media

I Hate Hyperbole

Posted in Technology by Noah Harlan on July 18, 2010

I hate hyperbole.

Today, in response to Apple pointing out that other phones sometimes have issues with their antennas, several other manufacturers released statements. Nokia’s was particularly silly, largely because of the following quote:

As you would expect from a company focused on connecting people, we prioritize antenna performance over physical design if they are ever in conflict.

Really? You never make a design decision that might impact performance? Of course you do. If you didn’t, your phones would never drop a call, would have a huge battery, and would have a ridiculous-looking antenna. Your sexy N97 would now look something like this:

How To Run A Great Company

Posted in Technology, Theory by Noah Harlan on June 1, 2010

Steve Jobs is on stage tonight at D8 talking to Walt Mossberg and Kara Swisher from the Wall Street Journal. He said the following about how Apple is organized, and I think it’s a fantastic mindset (via Engadget):

One of the keys to Apple is that Apple is an incredibly collaborative company. You know how many committees we have at Apple? Zero. We’re organized like a startup. We’re the biggest start up on the planet. We meet for 3 hours every morning and talk about all the business, about what’s going on everywhere. We’re great at figuring out how to divide things up into great teams, and we talk to each other. So what I do all day is meet with teams of people. To get great people is to let them have great ideas.


Steve Ballmer Still Lives?

Posted in Software, Technology, Uncategorized by Noah Harlan on May 26, 2010

Today Apple’s market cap surpassed Microsoft’s to make Apple the second largest company in America. At the end of trading today, Apple’s market cap was $222.12 billion versus Microsoft’s $219.18 billion. My only question is:


Why does Steve Ballmer still have a job?

Since he took over Microsoft from Bill Gates, there has hardly been a single standout executive move by Microsoft. Nothing that would seem innovative has come to market. And I say ‘come to market’ with care.

People howled that Microsoft had an iPad killer in the works with the HP Slate. The problem? HP realized that Windows Mobile 7 (or whatever hack OS Microsoft tried to develop for the device) wasn’t worth building a platform on, so they bought Palm and Palm’s much more sophisticated WebOS platform. The Microsoft Courier dual-screen slate? Yep, that was killed just a couple of weeks ago, and Bill Gates is now saying they’re focusing on “a number of different tablet projects, with a focus on stylus-based input.”

Where is the vision?

The problem is that Steve Ballmer is exactly the wrong man for the job.

Steve Ballmer is a sales guy. Look around Silicon Valley and you won’t find many of the big boys being run by sales guys. Steve Jobs, Larry Ellison, Eric Schmidt: none are sales guys. They’re either technical visionaries or efficient operators. When Ballmer took over Microsoft, he was running a company that was already ubiquitous. Why would it need a marketer? Everyone already knows Windows. It needed someone with a bold vision and the technical confidence to see where things should be going. When you are as dominant in a market as Microsoft is, you can set the conversation. Don’t believe me? Ask Adobe…

You need a sales and marketing guy when you are Palm, after the development of WebOS and before the purchase by HP: you have a great product, but nobody knows who you are. You either let the world know about your great product, or you just make creepy videos.

The question that remains is how long Microsoft’s board and shareholders will let the wrong man run their company into the ground.

Let It Beep

Posted in Content, Software, Technology by Noah Harlan on February 27, 2010

I love the stories behind the story: those moments where you get to hear someone break down the thoughts and events that went into something big. Something they often didn’t know would be big but that wound up being profound, or revolutionary, or just plain ubiquitous.

Here is an interview with Jim Reekes. You’ve probably never heard of him, but you’ve definitely heard his work. Jim worked at Apple for 12 years starting in 1988 as an engineer specializing in sound, and he did much of the foundational work for things like QuickTime, Final Cut, and the original Sound Manager. What’s Jim probably best known for, though? He created the Mac startup chime.

I’ve owned Macs since the Mac SE and I now even develop software for Apple mobile devices. I have heard that startup chord thousands of times, maybe tens of thousands. My wife’s laptop had a hard drive freakout last week in the middle of the rain forest and would continually restart, making that chime every time (we fixed the computer without replacing the drive thanks to a copy of the glorious DiskWarrior software we picked up at a Mac shop in Melbourne). Here Jim tells the story of how the chime came to be and what he thinks about his legacy. He also shares, in the latter part of the interview, the story behind the Sosumi system sound, which is surprisingly entertaining.

The video intro is in Dutch but the whole interview is in English.


The Evolution Of Interaction

Posted in Technology, Theory by Noah Harlan on January 30, 2010

I guess today is my day for following up on earlier posts… Last week I posted about claim chowder, and in particular the following assertion by PC World’s Bill Snyder about the impending Apple tablet announcement:

[If] you run a small business and want to avoid wasting money and brain cells on superfluous technology, forget about the iSlate or whatever Apple is going to call its tablet computing device. It’s going to be too expensive, it does things you don’t need to do, and it will add a messy layer of complication to your company’s computing infrastructure.

Sure, the tablet we expect Apple to launch on January 27 will probably have more than its share of cool factor. But do you want to spend $1,000 or so for bragging rights? For that price, you could buy two perfectly serviceable Windows netbooks, four iPhones, or–if you want to go the Apple route–cover most of the cost of a 13-inch MacBook Pro, getting proven technology that’s useful right out of the box.

So, was it more claim or more chowder?

Now everyone under the sun has chimed in with their thoughts on the device, but indulge me and allow me to share a few thoughts on why this device is more important than you may think.

The iPad changes the way we relate to media and content.

The ENIAC - 1946

In the earliest days of computing we related to technology, and to the media contained in that technology (which is why we call it ‘content’, remember), symbolically. Toggle switches would be flipped, lights would turn on and off, and a code would be returned to us to decipher. This was, in a way, much like the ancient abacus. It had the potential to process certain linear tasks more efficiently than our own minds, and it pointed to the possibility of processing power extending our abilities, proving out our theories, and making tangible our imagination. We gained greater efficiency within this structure (cards containing series of input commands replaced step-by-step manual manipulation, and early printers could print results that could be studied later) but we remained at a distance from the machine and from the content. We had to adopt the language of the machine, even if it was a language we created.

By the early 1960s, scientists had merged the teletype technology of the time with cathode ray tube displays and used them as an input/output system for computing. This moved our interaction with computers from primitively symbolic to linguistic. We could enter commands in language and the computer could send responses in language. We moved into a dialogic relationship with our technology. We could describe what we wanted to have happen and the computer, in response, could describe what happened. It lacked the ability to represent, unless what it was representing was language. That was fine, even amazing, except that it separated the computer from whole parts of the human experience. It was a device for the sciences. Explain a problem and it will explain the answer.

Then, in the 1970s, researchers began experimenting with new ways to represent information and control your interaction with computers. Out of this research emerged the general-purpose Graphical User Interface (GUI), created by the legendary Xerox PARC laboratory. (Yes, for those of you who don’t know the history of computing, it’s that Xerox company. Xerox was also responsible for the WYSIWYG editor, bitmap graphics, Ethernet, and Smalltalk’s pioneering brand of object-oriented programming, and it refined the mouse into the form we know. But yeah, the copier company…) In 1981, Xerox released the Star (aka the Xerox 8010 Information System), a computer whose primary form of interaction was a non-textual, visual representation of content that you manipulated by pointing and clicking. It was this system that a young Steve Jobs and Steve Wozniak saw at the Xerox PARC labs, and it took them from the early Apple computers to the launch of the single most groundbreaking machine in the history of personal computing: the Macintosh.

The Macintosh

The Mac became wildly successful and was the blueprint for Microsoft’s Windows operating system. Over the following two and a half decades, we related to our content primarily in a visual and mechanical way. However, we remained abstracted from our content and from the information contained in our devices. The mouse became second nature, but it was always second nature. Think of all the times you struggled: the mousepad wouldn’t cooperate, the mouseball jammed with lint, the cord snagged on the keyboard, the button wouldn’t click (or clicked too easily), the line you were dragging wasn’t precisely where you wanted it to go. These frustrations were all mechanical in nature and they kept us at a distance from our content. We came up with solutions to each (the mouseball gave way to optical tracking, the cord vanished with trackpads and Bluetooth) but we remained abstracted.

The first commercial touchscreens emerged in the early 1980s. These primitive devices relied either on infrared grids positioned around the edge of a screen (a solution that was not truly a touchscreen, as the sensing device was not part of the screen itself) or on resistive touchscreens, which were based on pressure-sensitive pads: when you touch a spot on the screen, a slight gap between two layers is closed and the screen registers that touch. Both of these systems are, again, mechanical, as one requires you to physically block rays of light and the other requires you to physically press a circuit closed. But then came capacitive touchscreens.

Capacitive touchscreens have a light electrical charge running over the surface, and when your finger comes in contact with that surface, the conductivity of your body forms a capacitor that the device can detect and respond to. You are no longer mechanically manipulating anything; instead, you are gesturing.

This move to the gestural changes how we interact with content. The last physical bridges between us and the content in our devices begin to crumble. Gestural control is intuitive, it is fluid, and it needs no interpretation. I pinch and it shrinks. I drag and it follows. I tap and it zooms. (A short code sketch after the two observations below shows just how directly those gestures map onto software.) I remember standing in the Apple Store in SoHo after the iPhone was first launched, watching people play with the device. It was a fascinating experience and I made two observations:

1) People who walked in off the street and started playing with it smiled. And I mean smiled immediately. The device amazed them. It was fun to make gestures and see a computer respond. To make gestures and watch your content respond. When I was in elementary school, my parents made my brother take a typing class. He would go to a teacher, then come home, put blank stickers on his typewriter, and practice touch-typing until he had mastered the skill. In order to have the most basic interaction with a computer, he had to learn a new way of communicating and then channel his thoughts into that communication methodology. With the gestural computing of the iPhone (and a big part of this is the iPhone OS: Apple figured out how to build a user interface to match the technology) you no longer needed to be taught, because you already knew.

2) It was the first technology device I have ever seen that seemed to amuse and entertain men and women equally. Men thought it was cool and women thought it was cute. Why? Because it was comfortable for both. We do think differently, we view the world differently, and we interact with the world differently. But we all have innate knowledge, reinforced by an entire lifetime of experience, of how to gesture. Yesterday I visited a friend’s house. His one-year-old daughter was in his arms when he answered the door and, when she saw me, she reached out her hand and pressed on my nose. I laughed, she laughed, and then she pressed again. She had learned a gesture. She won’t learn to type for years. Which way of communicating will be more intuitive to her…
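For the developers reading: part of what makes this possible is that the iPhone exposes gestures as first-class events rather than as raw touch coordinates to be decoded. Here is a minimal sketch of the pinch/drag/tap vocabulary using Apple’s UIKit gesture recognizers. It is written in modern Swift for readability (in 2010 this would have been Objective-C), and the image name and zoom factor are illustrative assumptions, not anything Apple ships:

```swift
import UIKit

final class GestureDemoViewController: UIViewController {
    // A hypothetical piece of content we want to pinch, drag, and tap.
    private let contentView = UIImageView(image: UIImage(named: "photo"))

    override func viewDidLoad() {
        super.viewDidLoad()
        contentView.isUserInteractionEnabled = true
        view.addSubview(contentView)

        // "I pinch and it shrinks."
        contentView.addGestureRecognizer(
            UIPinchGestureRecognizer(target: self, action: #selector(handlePinch(_:))))
        // "I drag and it follows."
        contentView.addGestureRecognizer(
            UIPanGestureRecognizer(target: self, action: #selector(handlePan(_:))))
        // "I tap and it zooms."
        contentView.addGestureRecognizer(
            UITapGestureRecognizer(target: self, action: #selector(handleTap(_:))))
    }

    @objc private func handlePinch(_ gesture: UIPinchGestureRecognizer) {
        // Scale the content by the pinch amount, then reset so scaling is incremental.
        contentView.transform = contentView.transform.scaledBy(x: gesture.scale, y: gesture.scale)
        gesture.scale = 1.0
    }

    @objc private func handlePan(_ gesture: UIPanGestureRecognizer) {
        // Move the content exactly as far as the finger has dragged.
        let translation = gesture.translation(in: view)
        contentView.center.x += translation.x
        contentView.center.y += translation.y
        gesture.setTranslation(.zero, in: view)
    }

    @objc private func handleTap(_ gesture: UITapGestureRecognizer) {
        // Zoom in by a fixed factor on tap (an illustrative choice, not Apple's behavior).
        UIView.animate(withDuration: 0.3) {
            self.contentView.transform = self.contentView.transform.scaledBy(x: 2.0, y: 2.0)
        }
    }
}
```

Notice how little translation there is between intent and code: the gesture itself is the unit of interaction, which is exactly the shift described above.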

The iPad - Gestural Computing

The iPad is the gestural idea brought to the next step. It is large enough that we don’t feel constrained and limited. When using an iPhone you have a perpetual sense that you’re watching part of something larger (the rest of the email is off the screen, the rest of the web site is off the screen), and the device must, at its core, serve as a phone, so its operating system gives primacy to its phone features. The iPad breaks down those limitations and opens up more surface for our content and more space for our gestures. We can now interact with all our content through direct motion and movement, no longer moving proxy instruments like keys and mice. This proximity to our media makes it more personal and frees our methods of expression.

I played music for a long time and studied jazz through high school and into college. I listened to interviews with great jazz musicians, and they would talk about practicing and practicing until the instrument became instinctive. Until they no longer thought about how to play the note they wanted but, rather, thought of the note and the sound came out of the instrument. Musicians at the top of their abilities talk about the instrument becoming part of their body and, when improvising, reaching the point where they can turn off their brain and just connect their soul to their hands. Few people ever have the talent, and spend the time, to reach that point: the point where they no longer have an idea, think about how to execute it, and then execute it but, instead, simply think and perform. Gestural computing lowers the bar to proficiency. It removes the conscious thought of technique from the act of creation. It opens a whole new world, and this week we moved one step further towards it.


Claim Chowder

Posted in Software, Technology by Noah Harlan on January 20, 2010

Step 1: Open Mouth, Step 2: Insert Foot

There’s a term called Claim Chowder that was, as far as I can tell, coined by Daring Fireball’s John Gruber. It refers to when someone makes a prediction with an aura of certainty and knowledge that turns out to be horribly wrong. A good example from the film business was last May, when one Wall Street analyst, after seeing 20 minutes of Pixar’s UP, downgraded Disney’s stock. As the New York Times reported:

Richard Greenfield of Pali Research downgraded Disney shares to sell last month, citing a poor outlook for “Up” as a reason. “We doubt younger boys will be that excited by the main character,” he wrote, adding a complaint about the lack of a female lead.

UP did $293 million domestically and $727 million worldwide theatrically.

That is claim chowder.

So next week Apple will be announcing a new product. It is widely expected that it will be some form of tablet computer. Nobody has seen it. Nobody has any specs on it. Nobody knows the price. To borrow William Goldman’s words, nobody knows anything.

But that’s not stopping the claim chowder. PC World published this piece by Bill Snyder today. Mr. Snyder, apparently, is clairvoyant, because he seems to know a lot about something he’s never seen. To wit:

[If] you run a small business and want to avoid wasting money and brain cells on superfluous technology, forget about the iSlate or whatever Apple is going to call its tablet computing device. It’s going to be too expensive, it does things you don’t need to do, and it will add a messy layer of complication to your company’s computing infrastructure.

Sure, the tablet we expect Apple to launch on January 27 will probably have more than its share of cool factor. But do you want to spend $1,000 or so for bragging rights? For that price, you could buy two perfectly serviceable Windows netbooks, four iPhones, or–if you want to go the Apple route–cover most of the cost of a 13-inch MacBook Pro, getting proven technology that’s useful right out of the box.

Now he may turn out to be right. I’m not a betting man. But if I were, I wouldn’t bet against Apple. Let’s take a look at some of Mr. Snyder’s predecessors in the claim chowdering of Apple:

Microsoft CEO Steve Ballmer in 2007 on the iPhone:

There’s no chance that the iPhone is going to get any significant market share. No chance.

John Dvorak, writing an article entitled “Apple should pull the plug on the iPhone” on MarketWatch, also in 2007:

As for advertising and expensive marketing this is nothing like Apple has ever stepped into. It’s a buzz saw waiting to chop up newbies

The problem here is that while Apple can play the fashion game as well as any company, there is no evidence that it can play it fast enough. These phones go in and out of style so fast that unless Apple has half a dozen variants in the pipeline, its phone, even if immediately successful, will be passé within 3 months.

There is no likelihood that Apple can be successful in a business this competitive.

Stewart Alsop, writing in Fortune Magazine in 1997 on Apple’s acquisition of Steve Jobs’s NeXT Software company:

Let’s get this straight right away: Apple Computer did the wrong thing. On December 20, Apple announced that it would spend $400 million to purchase Steve Jobs’s company, Next Software. The company said it would adopt Next’s NextStep operating system for future versions of the Macintosh computer. Most of the commentary I’ve seen about this decision is off the mark, especially the talk about Jobs coming back to save Apple. That is sheer nonsense. He won’t be anywhere near the company. People seem to have a real desire, perhaps even a need, to make excuses for Apple. Everybody wants to find a way to justify what Apple did.

You can’t justify it. Apple did precisely the wrong thing. Now the only future for the company is to get smaller and smaller until there’s nothing left. In fact, the only sensible conversation to have about Apple is the one in which you argue about how long it will take to die.

[snip]

It takes a long time to kill an $11-billion-a-year company. Apple’s already down to around $8 billion a year. I give it another three years, until the millennium, to fall the rest of the way to the ground.

And another piece from John Dvorak (how does this guy still get work?), this time from the San Francisco Examiner in February 1984 following the debut of the original Macintosh (the first mainstream personal computer with a mouse and a graphical user interface; before this, almost everything was done at the command line):

The nature of the personal computer is simply not fully understood by companies like Apple (or anyone else for that matter). Apple makes the arrogant assumption of thinking that it knows what you want and need. It, unfortunately, leaves the “why” out of the equation – as in “why would I want this?” The Macintosh uses an experimental pointing device called a ‘mouse’. There is no evidence that people want to use these things.

As Samuel Clemens once said, “the reports of my death are greatly exaggerated.” Apple will introduce something next week. It may not change the world, but fair warning to those who bet against them.

Audio From ‘Where Film & Internet Collide’ Panel

Posted in Distribution, Self Promotion, Technology by Noah Harlan on June 8, 2009

I want to thank everyone for coming out yesterday to the panel discussion at the Apple Store in SoHo. We had a packed house and IndieGoGo did a great job of organizing. Scott Kirsner moderated a very interesting discussion between Gary Hustwit, Chris Roberts, and myself. If you’d like to listen, Scott recorded the event on his iPhone and has posted the MP3 and streaming audio on his CinemaTech blog. It’s a very lively and wide-ranging discussion with a lot of concrete advice about where we, as content creators, are going in an increasingly digital world. Thank you in particular to Slava Rubin, and a hat tip to Lance Weiler for asking me to step in.

Charting The Wrong Course Through Digital Waters

Posted in Financing, Policy, Software by Noah Harlan on March 22, 2009

I just returned from a great weekend in Virginia (everyone should, at some point, spend time in a house built 50 years before the Revolution), and on the drive back I listened to the fascinating breakfast conversation at SXSW between producer Ted Hope, filmmaker Lance Weiler, conference organizer and producer Liz Rosenthal, technologist Brian Chirls, outreach guru Caitlin Boyle, filmmaker Brett Gaylor, producer and Filmmaker Mag editor Scott Macaulay, and journalist and film technologist Scott Kirsner. All are very smart people and the conversation is definitely worth a listen (the audio quality is not great, but it’s worth soldiering through).

As I listened to them wrestle with questions relating to finding revenue in a digital age, I got the sense that a battle had been fought and already lost: the battle over payments for content. The semi-consensus view, and one I know Lance in particular espouses, is that the days of people paying directly for content (or at least paying up front) are rapidly disappearing and we should step forward into a share economy (I’m not sure that Scott was totally in agreement, but I don’t want to put words in anyone’s mouth). There was much discussion of putting your work out for free and then asking for contributions from consumers. This model, I feel, is akin to going back to the shareware model on computers.

Software started as free and then (as is mentioned in passing during the audio, interestingly enough) became a product to be paid for. Through that transition emerged a third tier of software, the product of independent software developers, loosely called Shareware. It came in a few different varieties:

  • Freeware: Software that was freely distributed and free to be passed around.
  • Shareware: Software that was freely distributed but, if you chose to use it, you were asked to pay a small amount (on the honor system) to the creator.
  • Crippleware: Software that was freely distributed but was limited in its features; if you wanted to unlock the full feature set, you paid the creator.

This system has some analogies to the ideas being explored by a lot of people in the transmedia world, notably in Brett Gaylor’s “RIP: A Remix Manifesto”. We’ve seen variations of Freeware and Shareware espoused through Creative Commons, and even Crippleware from the likes of Nine Inch Nails, whose (ok, “his”) release of Ghosts reserved the higher audio quality and the full set of tracks for paying customers.
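To make the crippleware model concrete, here is a minimal sketch of the pattern in Swift (the track names and license key are hypothetical, purely for illustration): the full product ships to everyone, and payment simply flips a switch.

```swift
import Foundation

// A minimal sketch of the crippleware pattern: everything ships in the
// release, but the premium tier stays locked until the user pays.
struct MediaRelease {
    let freeTracks = ["Track 1", "Track 2", "Track 3"]     // available to everyone
    let premiumTracks = ["Track 4", "Track 5", "Track 6"]  // reserved for buyers
    private(set) var isUnlocked = false

    // Hypothetical check; a real product would validate the key for real.
    mutating func redeem(licenseKey: String) {
        isUnlocked = (licenseKey == "PAID-IN-FULL")
    }

    var availableTracks: [String] {
        isUnlocked ? freeTracks + premiumTracks : freeTracks
    }
}

var release = MediaRelease()
print(release.availableTracks)          // the free tier only
release.redeem(licenseKey: "PAID-IN-FULL")
print(release.availableTracks)          // the full release, unlocked
```

Shareware is the same sketch with the gate removed and payment left to the user’s conscience; freeware removes payment entirely.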

What interested me, though, was that no one looked at the iTunes App Store as an example of how to bring payments back into the system.

Recently one of the most successful developers of shareware for the Mac, Pangea Software, announced they were abandoning shareware development in favor of the App Store after the staggering success they’ve found on the pay-to-use platform. Numerous independent developers have had similar success. (Full disclosure and self-promotion: I have two apps on the store now and more coming.) I believe there are several takeaways we can gain from the App Store example:

  • One: When offered a seamless way to pay and affordable, quality content to buy, people will pay for content.
  • Two: A seamless system of purchase and usage is vital to a financial model. The App Store only works because of its seamless integration with the iPhone (a code sketch after this list shows just how little friction that flow involves). This is the same lesson the record companies failed to learn, and why they were crushed by the iTunes music store.
  • Three: The consumer must remain conditioned to pay for content. One of the biggest threats to payments for content is that consumers begin to assume content is free. Does this mean legally hunting them down? NO! The RIAA has done a terrible job on that front. What it means is keeping people aware that by paying a small amount they can get more reliable, better-quality content and better support the creators.
  • Four: We need to coalesce the online market. The single greatest obstacle we have right now is, ironically, the sheer multiplicity of options for where to view content. The App Store works because there is only one. If there were fifty, each with different content, it would be less successful. Blockbuster worked this way when we were bricks-and-mortar-bound. Netflix worked this way when we were DVD-bound. Now we need a new solution. This doesn’t mean there needs to be only one online exhibitor (for why I say exhibitor and not distributor, please visit this article on the Filmmaker Magazine blog) but, rather, that we need consolidated places to find the content. There are some efforts underway to do just that, including SpeedCine and the UK Film Council’s Find Any Film, but these are just the beginning.
  • Five: Lastly, and this relates directly to point two above, we need better ways to move our media around. The tyranny of a particular box as viewing platform undercuts any effort to simplify the process. Boxee and the Apple TV are both good moves in that direction, but Boxee is in a tough fight. The studios decided to hamstring Boxee by forcing Hulu to pull its content (a move that even Hulu thought was wrong) in a ridiculously narrow-minded attempt to keep control of content (and an approach to DRM that is deeply reminiscent of the RIAA’s moronic and self-destructive resistance to iTunes). Until filmed content can seamlessly move from computer monitor to TV screen and back, we are going to be behind the eight ball, as it were.
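On the seamlessness of point two: the entire App Store purchase flow is a handful of calls against Apple’s StoreKit framework. Here is a minimal sketch, in modern Swift syntax rather than the Objective-C of the era, with a hypothetical product identifier and error handling omitted:

```swift
import StoreKit

// A minimal sketch of an in-app purchase via Apple's original StoreKit API.
final class PurchaseManager: NSObject, SKProductsRequestDelegate, SKPaymentTransactionObserver {
    // Keep a strong reference so the request isn't deallocated mid-flight.
    private var productsRequest: SKProductsRequest?

    func buyFullVersion() {
        SKPaymentQueue.default().add(self)
        // Ask the store for the product's details (price, title, and so on).
        let request = SKProductsRequest(productIdentifiers: ["com.example.fullversion"])
        request.delegate = self
        productsRequest = request
        request.start()
    }

    func productsRequest(_ request: SKProductsRequest, didReceive response: SKProductsResponse) {
        guard let product = response.products.first else { return }
        // One call puts Apple's familiar payment sheet in front of the user.
        SKPaymentQueue.default().add(SKPayment(product: product))
    }

    func paymentQueue(_ queue: SKPaymentQueue, updatedTransactions transactions: [SKPaymentTransaction]) {
        for transaction in transactions {
            switch transaction.transactionState {
            case .purchased, .restored:
                // Unlock the content here, then tell the queue we're done.
                SKPaymentQueue.default().finishTransaction(transaction)
            case .failed:
                SKPaymentQueue.default().finishTransaction(transaction)
            default:
                break // .purchasing and .deferred need no action yet
            }
        }
    }
}
```

The consumer never leaves the app and never types a credit card number. That frictionlessness, not the catalog, is what the record companies failed to build for themselves.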

These are a lot of things to ask, but if it means that content creators can be paid for their work, then it is worth it. We need to embrace and fight for the technological innovations that can support our need to support ourselves. Releasing media for free and asking for contributions may work on a micro scale, or for a few amazingly talented promoter-marketers like Lance and Arin, but for many talented filmmakers promotion is not their best skill, and they should still be able to make amazing work, pay back their supporters, and earn a living. I do not believe that throwing in the towel, declaring that we live in a Pirate Bay world now, and giving up on paid content is the right attitude; doing so will potentially hamstring future generations of content creators in their endeavors to make lives from their work.

Our App Now Recommended By Apple!

Posted in Self Promotion, Software by Noah Harlan on February 20, 2009

Ok, ok – this is total self-promotion, but I’m very pleased to announce that our second iPhone app, the Party Planner, is now being recommended by Apple for your Oscar party, according to VentureBeat. We’re on VentureBeat’s front page!

Try out the app now; it’s totally free and available worldwide for the iPhone and iPod touch. Also, check out our other apps for the iPhone, and check back as we have some really exciting apps on the way!