Those of us of a certain age (i.e., once able to use a slide rule) remember when the university computer (note the singular) was a scientific and engineering shrine, protected by computer operators and secure doors. We acolytes extended offerings of FORTRAN, ALGOL or COBOL via punched card decks, hoping for the blessings that accrued from a syntactically correct program that compiled and executed correctly.
The commonality across all our experiences was the need to husband computer time and plan job submissions carefully, particularly when one’s job might wait in the queue for six to ten hours before entering execution. I distinctly remember spending many evenings laboriously examining my latest printout, identifying each syntax error and tracing the program flow to identify as many logic errors as possible before returning to the keypunch room to create a new punched card deck.
Because computing time was scarce and expensive, we devoted considerable human effort to manual debugging and optimization. (The subject of manual memory overlays before virtual memory shall remain for another day.) Today, of course, my wristwatch contains roughly as much computing power as that vintage university mainframe, and we routinely devote inexpensive computing time to minimize human labor. Or do we?
Yes, we routinely use WIMP interfaces for human-computer interaction, cellular telephony is ubiquitous, and embedded computers enhance everyday objects, from microwave ovens to thermostats and running shoes. However, I suspect much of computing is still too socially conditioned by its roots in computational paucity to recognize fully the true opportunity afforded by computational plethora.
Many of us are still wed to a stimulus-response model of computing, where humans provide the stimulus and computers respond in preprogrammed ways. For example, traditional web search (traditional indeed; how quickly the new becomes commonplace) requires typed or spoken search terms to initiate a search. In a world of plethora, computing could glean work, personal, and even emotional context, anticipating information queries and computing on our behalf rather than in response. My computer could truly become my assistant.
In economics, the Jevons paradox posits that a technological increase in the efficiency with which a resource can be used stimulates greater consumption of that resource. So it is with computing. I believe we are just at the cusp of the social change made possible by our technological shift from computational paucity to computational plethora.
I find this a fascinating idea, and I agree that computers play a far more passive (and therefore less effective) role than they could.
I used to develop computer games, and one genre in particular, real-time strategy, seemed ripe for (semi-)intelligent agents acting on your behalf, freeing you from micro-managing so you could plan general strategy. (There was actually a game released called Command & Conquer: Generals, and I was thrilled to think the Generals were these agents, but unfortunately that was just a title, and the game held little innovation.)
A few years after leaving that sector, I started thinking about how an OS stands to benefit from a similar idea. I think it won't be too long before we see an 'electronic secretary' which tries to predict your needs, makes suggestions, and moves the computer from being a tool one uses to being a work partner. I imagine it will take lots of little steps, but it's certainly an area to watch.
My mother has a new desktop: eight cores, 6 GB of RAM, a nifty graphics card, and the latest MS OS. Surely that should make her (electronic) dreams come true? Yet she has problems copying files, working out how to open files (unknown or incorrect extensions), finding and organising her files, etc. These are the simplest of examples where an agent could help and give suggestions.
Some starting behaviours:
* Helping you find the appropriate application to open based on previous use
* Helping you find the appropriate file to open based on previous use
* Informing you of important emails, where "important" is inferred from your history - which senders/topics/threads have made you stop your work, and which have you left unread for a while?
* Informing you if a related email comes in while you're writing a doc or email
* Informing you of new info on the internet connected with your docs/emails, especially on sites you visit often
* Inferring a thumbs-up, thumbs-down for previous heuristic searches based on your use
* Keeping track of time spent on different activities, warning if something important's being left out
Some of these are partially implemented already, for example the most-recently-used list in Windows. Some, I think, need direct user input, such as warning about time spent on different activities; but I feel there's lots of scope either for the software to infer a great deal, or for the user to enter data _occasionally_ while its influence persists long-term.
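To make one of the bullets above concrete, the "important email" heuristic could be sketched roughly like this. Everything here (the `EmailEvent` and `ImportanceModel` names, the signal weights, the threshold) is hypothetical, just an illustration of inferring importance from whether mail got a quick read or a reply versus sitting unread:

```python
from dataclasses import dataclass
from collections import defaultdict

@dataclass
class EmailEvent:
    sender: str
    replied: bool            # did the user reply?
    hours_until_read: float  # how long the message sat unread

class ImportanceModel:
    """Scores senders from observed behaviour: replies and quick reads
    raise a sender's score; mail left unread for days lowers it."""

    def __init__(self):
        self.scores = defaultdict(float)
        self.counts = defaultdict(int)

    def observe(self, ev: EmailEvent) -> None:
        signal = 0.0
        if ev.replied:
            signal += 1.0            # strongest hint of importance
        if ev.hours_until_read < 1:
            signal += 0.5            # the user stopped work to read it
        elif ev.hours_until_read > 48:
            signal -= 0.5            # ignored for days
        self.counts[ev.sender] += 1
        n = self.counts[ev.sender]
        # keep a running average of the signal per sender
        self.scores[ev.sender] += (signal - self.scores[ev.sender]) / n

    def is_important(self, sender: str, threshold: float = 0.5) -> bool:
        return self.scores[sender] >= threshold
```

A real agent would of course fold in topics and threads as well as senders, and decay old observations; the point is only that the raw signals (reply, time-to-read) are already sitting in the mail client's history.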
I could even accept the help coming from a paperclip if it was genuinely useful. :)
Sorry for the giant post, just an idea I've been waiting to see implemented for so long!
I found Andy F's comments more provocative than the post they responded to. Modern (operating) systems are still for the most part relatively passive, in that they support the running of applications and respond to requests made by those applications. But perhaps the time has come to think seriously about environments in which the OS (for lack of a better term) also has more "knowledge" of what its user is doing and what that user's style and level of sophistication are, and maybe even has a less narrow interface to its applications and to its user.
Andy gives some good examples of possible improvements. Here's another: Is the (by now quite old) notion of a tree-structured file system really right for all users? Could we augment it with some effective way of organizing files automatically, one that didn't prevent the user from keeping the old way if he chose, while not requiring each application to break as a result?
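One way to read that question is as a secondary index layered over the existing tree, so the hierarchy (and every application that depends on it) is left untouched while the user gains an attribute-based view. A minimal sketch, with purely hypothetical function names and mechanically derived tags standing in for whatever a real system would infer from content and usage:

```python
import os
from collections import defaultdict

def auto_tags(path):
    """Derive tags mechanically from a path. A real system might add
    content keywords, recency, or usage history instead."""
    tags = set()
    ext = os.path.splitext(path)[1].lstrip(".").lower()
    if ext:
        tags.add("ext:" + ext)
    # every enclosing directory becomes a tag, so the old tree
    # remains one view among several
    for part in path.split("/")[:-1]:
        if part:
            tags.add("dir:" + part.lower())
    return tags

def build_index(paths):
    """Map each tag to the set of files carrying it."""
    index = defaultdict(set)
    for p in paths:
        for t in auto_tags(p):
            index[t].add(p)
    return index

def find(index, *tags):
    """Files matching every given tag: an alternative view of the
    same files, not a replacement for the directory tree."""
    sets = [index.get(t, set()) for t in tags]
    return set.intersection(*sets) if sets else set()
```

Because the index is derived from the tree rather than replacing it, nothing breaks for applications that still walk directories; the index can simply be rebuilt (or incrementally updated) as files change.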