Why Is Email So Complicated? Part 562: People Lie About What They Want

Email software is written by programmers, most of whom think of themselves as software engineers.  They approach their tasks, for the most part, with a highly rational set of techniques.  In the best of cases, this includes a careful analysis of user requirements, a specification of software functionality, and a decomposition of that functionality into a rational, modular architecture.  In short, programmers approach the business of building software as a rational and coherent task.

That's a good thing in general -- it's hard to imagine a better approach! -- but it has its blind spots, mostly in those areas where reality is least rational: the human element.  People can be tricky, inconsistent, and even maddening to a programmer who has, at root, set out to do nothing more than please them.

I learned this lesson rather dramatically back in 1985, when I took my fresh Ph.D. in Human-Computer Interaction and applied it to Carnegie Mellon University's Andrew project, with the mission of building the world's greatest email system.  I had all sorts of grand ideas -- including the vision of multimedia mail that ultimately came to fruition as the MIME standard -- but my team and I quickly figured out that none of our ideas would matter until we made campus email reliable.

It's hard to imagine today, but email in the 1980s was often terribly unreliable.  Carnegie Mellon had it particularly bad, because the university had taken the visionary approach of scattering Sun workstations (running UNIX) all over the campus.  The goal was to give everyone on the campus a super-powerful "3M" machine -- a machine with 1 megabyte of RAM, 1 MIPS of processing power, and a megapixel screen, with a target price of no more than a megapenny, or $10,000.  Quaint as this all sounds now, it was utterly audacious at the time.  (IBM, which was funding the project, didn't yet make a computer that qualified, hence the Suns.)

By the time I joined, the campus had been sprinkled with hundreds of machines, each of which was running the already-venerable sendmail program to deliver email.  Sendmail was quite reliable, but the human ability to understand and customize its configuration file was not, and the net result was that email often got mis-delivered to a local mailbox on a machine that was never used by the intended recipient.  From the user's perspective, a significant fraction of email was never delivered.  People would routinely send email, and then call the recipient on the phone to make sure it had arrived.  No one was going to take any other aspect of email seriously until we made it reliable.

I was blessed with brilliant partners in this project, and together we gradually made the system work reliably.  Periodically our new software would discover caches of ancient, undelivered messages, and would deliver them -- not always to the delight of users who showed up for meetings several months late!  But we persevered, and we told everyone who would listen that we wanted to know about any delivery problems.   We became email delivery detectives, with zero tolerance for delivery failures -- until human nature intervened.

Our sleuthing eventually led us to the office of a message's intended recipient, who closed the door with a nervous look and then admitted that the message was in fact delivered perfectly, but he had found it most convenient, socially, to deny receiving it.

The first time this happened, we just shook our heads.  But when it happened three times in a row, we realized that our system's email delivery was as reliable as it needed to be, if not more so.  Plausible deniability is a great social lubricant, and our users simply didn't want a technology so reliable that claims of delivery failure were obvious lies.

Engineers view email as a technology, but it has also become a vital part of our social structure, which we rely on for navigating the most complex of social situations.  These situations can twist our systems' requirements in ways no engineering analysis would ever suggest.  When people say they want email to be reliable, it's probably more believable than "the check is in the mail" or "I'll respect you in the morning," but that doesn't mean it's true.  People don't want technology to enforce more honesty than they're used to.  Understanding what people really want from their email systems is a subtle endeavor that, after nearly 30 years, I'm still struggling to get right.

Image via discoodoni on Flickr