
Communications of the ACM


Computing for Humans


Communications Editor-in-Chief Moshe Y. Vardi

Gottfried Wilhelm Leibniz (1646–1716) has been called the "Patron Saint of Computing." Leibniz is most famous for the development—independently of Isaac Newton—of the infinitesimal calculus. (In fact, it is his mathematical notation we use today, rather than Newton's.) He was also a prolific inventor of mechanical calculators and developed the binary number system.

Leibniz conceived of a universal mathematical language, lingua characteristica universalis, in which all human knowledge can be expressed, and calculational rules, calculus ratiocinator, carried out by machines to derive all logical relationships. Leibniz's goal was nothing short of prophetic: "Once the characteristic numbers are established for most concepts, mankind will then possess a new instrument that will enhance the capabilities of the mind to a far greater extent than optical instruments strengthen the eyes."

This definition of computing, as an "instrument for the human mind," captures, I believe, the essence of our field. On one hand, our discipline is a technical one, focusing on hardware, software, and their theoretical foundations. On the other hand, the artifacts we build are meant to enhance the human mind. This duality of our field is witnessed by the two pioneers we lost last October: Steve Jobs and Dennis Ritchie, who passed away within a week of each other.

Dennis MacAlistair Ritchie (September 9, 1941–October 12, 2011) was the techies' techie, as the creator of the C programming language and the codeveloper of the Unix operating system. The C language paved the way for C++ and Java, while Unix was the basis for many of today's most widely used operating systems. Before the development of C and Unix, programming—especially systems programming—was tightly connected to the underlying hardware. C and Unix, in contrast, were highly portable. The citation for the 1983 Turing Award that Ritchie received together with Ken Thompson refers succinctly to "their development of generic operating systems theory." There is no computer scientist who is not familiar with C and Unix, but it is unlikely your cousin has heard of them, unless she is also a computer scientist. Undoubtedly, however, your cousin is familiar with Steve Jobs.

Steven Paul "Steve" Jobs (February 24, 1955–October 5, 2011) was the cofounder of Apple and the founder of NeXT and Pixar. His death received a tremendous amount of worldwide news coverage and is addressed by three articles in this issue of Communications. It is hard to think of anyone in recent memory whose passing received so much global attention. This level of interest is by itself worthy of observation. As Jaron Lanier makes clear in his essay, "The Most Ancient Marketing," Jobs was very much not an engineer. In fact, the title of one of the many essays published in the wake of Jobs' death is "Why Jobs Is No Edison." Yet, it is difficult to point to anyone who had as much impact on computing over the last 30 years as Jobs. In fact, as Genevieve Bell points out in her essay, "Life, Death, and the iPad: Cultural Symbols and Steve Jobs," his impact goes beyond the world of computing, well into the realm of culture. (For a discussion of Jobs' business strategy, see Michael A. Cusumano's "Technology Strategy and Management" column.)

Undoubtedly, Jobs' uniqueness was his relentless and singular focus on the human side of computing. To start with, the Apple I and Apple II were personal computers, and the Mac's claim to fame was its user interface. The sequence of products that revolutionized computing over the past 10 years, the iPod, iPhone, and iPad, was unique in its focus on user experience. In fact, the very term "user experience" was coined at Apple in the mid-1990s. The success of Apple's products over the past decade has made the term quite fashionable.

Yet the user has not always been and is probably still not at the center of our discipline. A quick perusal of ACM's Special Interest Groups shows that their general focus tends to be quite technical. In fact, one often encounters among computing professionals an attitude that regards the field of human-computer interaction as "soft," implying it is less worthy than the "harder" technical areas. In my own technical areas, databases and formal methods, I almost never encounter papers that pay attention to usability issues.

The almost simultaneous departure of Jobs and Ritchie should remind us of the fundamental duality of computing. As Leibniz prophesied, computing is "an instrument for the human mind." Let us keep the human in the center!

Moshe Y. Vardi, EDITOR-IN-CHIEF


©2011 ACM  0001-0782/11/1200  $10.00

Permission to make digital or hard copies of part or all of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and full citation on the first page. Copyright for components of this work owned by others than ACM must be honored. Abstracting with credit is permitted. To copy otherwise, to republish, to post on servers, or to redistribute to lists, requires prior specific permission and/or fee. Request permission to publish from permissions@acm.org or fax (212) 869-0481.



Comments


Anonymous

We lost three pioneers in October. The article neglected to mention John McCarthy, the AI researcher who invented Lisp (one of the most influential languages in the history of computing), and who in fact coined the term "artificial intelligence". Though he's certainly no better known than Ritchie among the general public, and far fewer of today's professionals work directly with his inventions, McCarthy's influence on our field is as great as Ritchie's. He will be missed.


Anonymous

John McCarthy passed away after the December issue of Communications went to print. The January issue is dedicated to his memory.

--Moshe Vardi


Anonymous

Leibniz wasn't talking about computers, but about a new calculus able to distinguish correct from incorrect reasoning. Also, Jobs and no McCarthy?


CACM Administrator

The following letter was published in the Letters to the Editor of the March 2012 CACM (http://cacm.acm.org/magazines/2012/3/146236).
--CACM Administrator

In his Editor's Letter "Computing for Humans" (Dec. 2011), Moshe Y. Vardi said, "Jobs was very much not an engineer." Sorry, but I knew Jobs fairly well during his NeXT years (for me, 1989–1993). I've also known some of the greatest engineers of the past century, and Jobs was one of them. I used to say he designed every atom in the NeXT machine, and since the NeXT was useful, Jobs was an engineer. Consider this personal anecdote: Jobs and I were in a meeting at Carnegie Mellon University, circa 1990, and I mentioned I had just received one of the first IBM workstations (from IBM's Austin facility). Wanting to see it (in my hardware lab at the CMU Robotics Institute), he first asked for a screwdriver, then sat down on the floor and proceeded to disassemble it, writing down part numbers, including, significantly, the first CD-ROM on a PC. He then put it back together, powered it up, and thanked me. Jobs was, in fact, a better engineer than marketing guy or sales guy, and it was what he built that defined him as an engineer.

Robert Thibadeau
Pittsburgh, PA


