Behold: The Future!

A couple of years ago, I stumbled across a few issues of a company newsletter I’d written back in 1999, when I was running a seed fund targeting startups founded by graduate students or undergrads. Each edition of that newsletter included a “here’s something cool and new” segment, where I highlighted such new-fangled ideas as [a search engine called Google](https://www.joshuanewman.com/2009/07/the_new_new_thing/), or [an email pager soon to be called a Blackberry](https://www.joshuanewman.com/2009/07/unripe_berry/).

Recently, however, I came across a few more issues of that newsletter, most of which have likewise stood the test of time. Consider this short piece, for example, from February 2000, where I float the radical notion that ever-dominant Microsoft might one day face real threats – not from antitrust litigation over their unshakable monopoly (an antitrust trial was, at that point, underway), but rather from companies moving more and more of the OS to the cloud:

>By now, Bill probably isn’t a happy camper. First, Judge Thomas Penfield Jackson unleashes scathing findings of fact. Then the states and the Justice Department recommend chopping up Microsoft and monitoring its activities. Of course, in the short term, Microsoft will probably come out on top – by all indications, the appeals court will overturn a Jackson ruling before things get too ugly. In the long term, however, Microsoft’s problems are far from over.

>Consider sites like XDrive, which allows users to store documents online, or AnyDay.com, an online personal information manager. Like the new browser-based email sites, these companies are busy moving operating system functionality onto the web. From a user perspective, moving the OS into the browser is great. Imagine being able to sit down in front of any web-connected computer and access all of your own applications and documents. For anyone who uses a different computer at work and at home, or who travels frequently, an online OS would be the best thing since sliced bread.

>For Microsoft, however, a web OS is bad, bad news. If the only software you need on your computer is a simple web browser, then the new breed of super-cheap internet appliances could very easily give the Windows/Intel cartel a run for their money. And Microsoft can’t effectively enter into the web OS competition themselves without cannibalizing their traditional OS sales base.

>Sure, Microsoft won’t be out of business any time soon. But like the smug mainframe manufacturers who pooh-poohed PCs, Bill and Co. may not always be top dog. On the other hand, perhaps we here at Paradigm Blue should have faith in Microsoft’s ability to pull through. After all, it is a student startup.

Crunched

TechCrunch is more or less required reading in the tech world. Yet slogging through the huge number of posts previously took me nearly twenty minutes each day.

Recently, however, I started routing TechCrunch’s RSS feed through [FeedBurner](http://feedburner.google.com/), to create a daily summary email – each of the posts boiled down to just 300 words. Now I can skim the headlines and summaries, and click through to the rare article I want to read in its entirety, all in about two minutes.
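Under the hood, a digest like that is simple to describe: parse the feed, cap each post at a word limit, and stitch the results into one skimmable summary. Here’s a rough sketch of the idea in Python – the sample feed and the word cap are my own illustration, not FeedBurner’s actual pipeline:

```python
# Sketch of the kind of digest FeedBurner builds: parse an RSS feed,
# truncate each item to a word limit, and emit skimmable summaries.
# The sample feed and word cap below are invented for illustration.
import xml.etree.ElementTree as ET

def summarize_feed(rss_xml, max_words=300):
    """Return (title, summary) pairs, each summary capped at max_words."""
    root = ET.fromstring(rss_xml)
    digest = []
    for item in root.iter("item"):
        title = item.findtext("title", default="(untitled)")
        words = item.findtext("description", default="").split()
        summary = " ".join(words[:max_words])
        if len(words) > max_words:
            summary += " ..."  # mark truncated posts
        digest.append((title, summary))
    return digest

SAMPLE = """<rss version="2.0"><channel><title>TechCrunch</title>
<item><title>Startup Raises Round</title>
<description>A very long post about a funding round.</description></item>
</channel></rss>"""

for title, summary in summarize_feed(SAMPLE, max_words=5):
    print(f"{title}: {summary}")
```

Run over a day’s worth of posts, the headlines-plus-summaries output is exactly what makes the two-minute skim possible.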

Try it yourself.

Filmmaker == Hacker

Having split my professional life between the tech and movie worlds, I’ve always been struck by how similar filmmakers and hackers are. For example, both groups:

  • Think about the world as a collection of fascinating material to be mined / problems to be solved;
  • Disdain things that are boring and have already been done;
  • Distrust tradition/authority as a sufficient rationale in and of itself;
  • Respect competence and support meritocratic structure;
  • Work collaboratively and share ideas and solutions (even with ‘competitors’);
  • Are willing to put in huge amounts of work, even when unpaid, just for the love of the game.
  • And, most importantly, want to share the things they pour their hearts and souls into making with as many people as they possibly can.

In the tech world, that’s easy for hackers to do: they start startups, build stuff on their own terms, and then share their stuff with users by building direct customer relationships.

In the movie world, however, filmmakers haven’t had such a direct route; instead, they’ve traditionally had to rely on studios and distributors to build those relationships for them.

Now, sites like YouTube allow filmmakers to share directly. But those sites don’t generate real revenue for filmmakers. And while filmmakers (like hackers) don’t actually care all that much about getting rich, they do at least want to make enough money from their work to live comfortably, and to show their investors strong enough returns to play again as soon as they come up with their next big idea.

With more and more films being made each year, it seems almost inevitable to me that new solutions will emerge somewhere between the studio and YouTube models – solutions that help filmmakers build broad audiences, profitably, and in ways they directly control.

I’ve been giving that a lot of thought of late. Because it seems like that’s a big problem waiting for a solution – and an equally big business waiting to be built.

In F.lux

If you, like 40% of Americans, sometimes have trouble falling asleep, consider blaming your computer.

Turns out, melatonin (the sleep hormone) is largely regulated by blue light. That makes evolutionary sense, as the sun gives off blue light during the day, while the moon, and fire, both give off much redder light at night. So your body monitors blue light levels, emitting hormones accordingly, to create a circadian rhythm: tired at night and alert during the day.

Problem is, we screw with those signals on both ends: we spend too many of our waking hours inside, getting less blue light than we should; and then we spend much of our post-sunset evening in front of boxes like computer screens, getting too much blue light.

To fix the day side of things, you’d need to spend more time outdoors, with more of your skin exposed to the sun. Which, during the winter, probably entails moving to Hawaii.

But fixing the evening side of the equation is much easier: just download F.lux, a great little piece of freeware for Macs, PCs, and Linux.

In short, after sunset, f.lux changes the color temperature of your display, from its default 6500K (even bluer than the 5000K of daylight) to something between 2700K and 4200K (depending on whether the rest of your room’s lighting is tungsten, halogen, or fluorescent).

Give f.lux a whirl for a week. Though it may take a few days of adjustment – your screen will look awfully pink/orange to you at first – by week’s end, I’m betting you’ll have a tough time using your computer without it.

Kindle UI

And while I’m griping about usability, two ways to greatly improve the Kindle:

1. With paper books, I (and every other reader I know) will occasionally glance ahead to see how many pages are left in a chapter, to see if it’s worth pushing through for a last few pages before putting the book down mid-chapter. A ‘lines remaining to the end of the chapter’ count, under the current location data, would accomplish the same thing, and would be trivial to implement. (Or, at least, it should be; as the Kindle knows the chapter head locations for use in the table of contents, Amazon must already be encoding this information somehow.)

2. Several Kindle books I’ve read have footnotes. But skip ahead to a footnote, and when you return to the body proper, the Kindle has registered the footnote page as the ‘furthest read location’. Every time I open the book on a Kindle or Kindle app (on iPhone, iPad, or laptop), it then tries to sync the book to that footnote page, and never again remembers my place in the body text. Surely, Amazon could add the option to set the ‘furthest read location’ to the current one, easily solving this problem.
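The ‘lines remaining’ count from point 1 really would be trivial, given chapter-start locations like the ones the Kindle already uses for its table of contents. A hypothetical sketch – the data layout here is invented for illustration, not Amazon’s format:

```python
# Hypothetical sketch of the 'lines remaining in chapter' count from point 1.
# Given the chapter-start locations the device already stores for its table
# of contents, the feature is a binary search away. The data layout below
# is invented for illustration.
import bisect

def lines_left_in_chapter(current_line, chapter_starts, total_lines):
    """Lines between the current position and the start of the next chapter."""
    i = bisect.bisect_right(chapter_starts, current_line)
    next_start = chapter_starts[i] if i < len(chapter_starts) else total_lines
    return next_start - current_line

chapter_starts = [0, 120, 305, 560]  # first line of each chapter
print(lines_left_in_chapter(290, chapter_starts, total_lines=700))  # 15
```

One lookup per page turn – cheap enough to display under the location data, exactly where a reader’s eye would glance.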

What a Tool

The Washington Post reports that a handful of colleges recently dropped the ubiquitous dining hall tray, and found that wasted food decreased by as much as 25-30% as a result.

An excellent result from so small a change, and one that makes intuitive sense: without a tray to pile food upon, the amount people can carry much more closely matches the amount they can actually eat.

But that glosses over an interesting question: why do people take more than they want to eat, even if they can carry it? Because, it turns out, they have no idea how much they want. Research has increasingly shown that, across the board, we’re terrible at assessing up front what’s going to make us happy at some point later, even if that just means determining how much food will make us feel pleasantly full fifteen minutes from now.

And, I think, it glosses over a second, even more interesting issue: we hugely underestimate the degree to which our tools affect our behavior. While scientists may not have previously researched trays, they’ve repeatedly researched plates, demonstrating, for example, that manipulating the size of the plate on which food is served changes the amount of that food we eat before feeling full; smaller plates lead to smaller portions, even as eaters believe they’ve eaten more.

Of course, it isn’t just dinner plates and dining hall trays. Indeed, nearly all of modern life seems to operate at the same juncture of manufactured stuff and unclear self-assessment; thus, we make things, which in turn re-make us. Which is to say, we create technology (say, a plate) to assist us with an ill-understood instinctive behavior (eating food), and then find that the technology has led to unexpected consequences in the very behavior itself (how much of the food we eat).

For the behavior of communication, we’ve at least long acknowledged that we’re shaped by our tools – it’s been more than 45 years since McLuhan pointed out that the medium is the message. But as more of our life becomes mediated by technology – how we share with friends, how we find our mates – the effect becomes exponentially greater. We’ve thrown ourselves into this crazy experiment without much thought, and we plow ahead, increasingly unthinkingly, shaped by our tools, unable to self-assess or future-predict, each brand new day.

As most non-tech folks missed this the first time through, a great, in-depth recap of the Stuxnet worm, which secretly derailed Iran’s nuclear program:

>The construction of the worm was so advanced, it was “like the arrival of an F-35 into a World War I battlefield,” says Ralph Langner, the computer expert who was the first to sound the alarm about Stuxnet.

Was this us or the Israelis? I suspect the latter, and am extremely impressed either way.

Peak to Peak

There’s a lot of research behind the idea that we measure how well we’re doing in life not by absolute measures, but by relative ones.

Most people would (perhaps obviously) choose to earn $75,000 over $50,000, all else being equal.

Yet change that choice to be between earning $50,000 while your friends and colleagues earn $40,000, or earning $75,000 while your friends and colleagues earn $100,000, and the popular option flips. Most people choose to earn less overall, rather than to earn more overall while still earning less than those around them.

Evolutionarily, we’re wired to look for our standing within a group. We determine how we’re doing by checking how well we compare.

And that, I think, is the danger of Twitter.

Most people’s average days are, well, pretty average. Yet within any given day, at least one relatively interesting thing is likely to happen. That’s the part people tweet about:

“I’m at [fill in the blank interesting place]!”

“Just ran into [fill in the blank important person]!”

“OMG! I love [trendy thing]!”

Basically, you get the highlight reel of all your acquaintances’ lives, 140 characters at a time. All of whom, extrapolating from there, seem to spend their entire lives attending parties, being fabulous, and generally living very well.

But, as in reality TV, the trick is in the editing. You live the entirety of your life (the highs, middles, and lows), and only read about their lives’ peaks.

So, rather than let Twitter depress you with comparison-driven angst, consider a thought experiment I personally enjoy: the tweets your friends would have to publish to reflect the full balance of their lives – but probably never will:

“Still working on [busywork related to current mind-numbing project]!”

“Eating a tub of Haagen Dazs alone on the couch while watching TV again!”

“Holy crap, I just had really explosive diarrhea, and boy did it burn!”

Wu Wei

While the rumor of a Verizon iPhone floats every six months or so, it appears we may really be coming down the home stretch now. As a seasoned Apple fan (dating back to an original Macintosh, and even passing through a Newton en route to the present day), my gut tells me we’ll actually see a Verizon iPhone at the start of 2011. And, when we do, I’ll still be sticking it out with AT&T.

Sure, service on AT&T blows goats. But not because there are fundamental flaws with AT&T’s network. Instead, it’s because all the iPhone users are on the same network, totally overwhelming AT&T’s infrastructure.

A dropped call, for example, isn’t a network flaw – it’s a feature. Let’s say you have network capacity for two calls, and three people dialing away. You can piss off all three by connecting all of their calls, sharing the bandwidth equally, and making all three calls equally unintelligible. Or you can make two calls work perfectly, and simply boot the third off the network. That’s what most cell networks do. So, how do you fix the problem? Increase the capacity of the network. Which, in fact, AT&T has been doing quickly over the past few years. Problem is, Apple has been selling iPhones and iPads even more quickly. Net net, that leaves AT&T’s network falling even further behind.
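That tradeoff is easy to put in toy-model terms: with capacity for two calls and three callers, a share-everyone policy degrades every call, while an admit-and-drop policy keeps the admitted calls at full quality. A sketch, with quality numbers invented purely for illustration:

```python
# Toy model of the call-admission tradeoff described above. Quality is on a
# 0-1 scale, where 1.0 is a perfect call; the numbers are invented.

def share_everyone(capacity, callers):
    """Connect every caller; each call gets an equal slice of capacity."""
    quality = capacity / callers
    return [quality] * callers

def admit_and_drop(capacity, callers):
    """Admit up to `capacity` calls at full quality; drop the rest."""
    admitted = min(capacity, callers)
    return [1.0] * admitted + [0.0] * (callers - admitted)

print(share_everyone(2, 3))   # three equally muddy calls
print(admit_and_drop(2, 3))   # two perfect calls, one dropped
```

Most cell networks pick the second policy, which is why your experience of congestion is a dropped call rather than three-way mush.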

Of course, the other solution is to keep the network the same, but to get rid of users. Imagine if, as some recent surveys indicate, half of all iPhone owners jumped to Verizon, given the chance.

At that point, AT&T’s network would work great. They’d have tons of bandwidth to spare.

Whereas Verizon’s would suddenly be the network blowing goats, overwhelmed by the huge deluge of incoming iPhone users.

And the kicker: a large percentage of those escapees would have paid hefty termination fees for the privilege of heading to the now shittier network.

In other words, as ever, a combination of sloth and totally ignoring the problem turns out to be the best possible choice.

Innovation Overshoot

Among the laundry list of other features Steve Jobs demonstrated this morning on the brand new iPhone 4 was a secondary, front-facing video camera, allowing users to video-chat with each other.

Amazing! Straight out of the Jetsons!

Or, honestly, not so amazing. At least not to me. While I appreciated the wow factor intellectually, Jobs’ demo didn’t leave me much viscerally impressed. After all, Jess and I already video chat whenever one of us is on the road, using Google Video on our respective MacBooks.

This afternoon, however, I was truly bowled over. I sent a two-page fax. And, as happens each time I use one of those machines, seeing paper going in one end of a fax machine in my office and knowing that a copy was coming out the other end of a fax machine somewhere hundreds of miles away completely boggled my mind.

Obviously, as compared to even plain-text email, the fax machine and its simple transmission protocol are roughly akin to cave painting. Which, perhaps, is why it so impresses me. I can just barely comprehend the engineering involved in faxing, the difficulty of somehow turning my paper into a series of screeches that another machine can translate back to scribbles on a page.

Whereas by the time I think about email – or certainly video conferencing – my mind can’t even begin to grasp the complexity.

As Arthur C. Clarke famously observed, any sufficiently advanced technology is indistinguishable from magic. Which, perhaps, is the problem.

Growing up, I loved magic – learning tricks, watching magicians on TV. But magicians like David Copperfield, whose tricks (I recall seeing him walk through the Great Wall of China) were completely inscrutable, never really stuck with me.

My heart, instead, belonged to Penn & Teller. The plucky pair would gleefully give away the secret to their tricks, then re-perform them. And, the second time through, I’d be doubly impressed, marvelling at the skill and dexterity I suddenly realized that pulling off the tricks required.

So, perhaps, to really appreciate that iPhone 4 video chatting, I’d simply need to spend some time puzzling through the technology involved. Apple engineers, if you want to send along a crash course, feel free. And if you really want to wow me, you can send it via fax.