Posts Tagged ‘Obituaries’
I am, and ever will be, a white-socks, pocket-protector, nerdy engineer — born under the second law of thermodynamics, steeped in the steam tables, in love with free-body diagrams, transformed by Laplace, and propelled by compressible flow.
Neil Armstrong (speaking in 2000)
For those who may ask what they can do to honor Neil, we have a simple request. Honor his example of service, accomplishment and modesty, and the next time you walk outside on a clear night and see the Moon smiling down at you, think of Neil Armstrong and give him a wink.
In what has become a bad couple of months for computer science, John McCarthy, the father of artificial intelligence, died in late October at the age of 84.
McCarthy was a giant in the field of computer science and a seminal figure in artificial intelligence, defining the field for more than five decades. After studying at Caltech and Princeton, with brief professorships at Stanford and Dartmouth College, he went to MIT, where he and his colleague Marvin Minsky founded the MIT Artificial Intelligence Project. In 1962, he moved to Stanford’s newly formed Department of Computer Science, where he founded and led the Stanford Artificial Intelligence Laboratory (SAIL). McCarthy was known for bringing scientific rigor to every aspect of life and for his wry, often self-deprecating sense of humor. This was best exemplified in a personal philosophy he termed “radical optimism” – a positive outlook so strong that he said “Everything will be OK even if people don’t take my advice.” Even in his early days at MIT, he was already affectionately referred to as “Uncle John” by his students.
What is most remarkable about his contributions is their diversity, their depth, and how they span both theory and practice: logical AI, advances in lambda calculus and the theory of computation, the Lisp programming language, garbage collection, the design of the ALGOL programming language, popularising time-sharing, modern program verification (with the benefit of hindsight, it seems he came remarkably close to denotational semantics), circumscription logic for non-monotonic reasoning, and computer chess. He won the ACM Turing Award in 1971, the Kyoto Prize in 1988 and the US National Medal of Science in 1990.
There have been a huge number of excellent obituaries for John McCarthy, for example by SAIL, Stanford, CACM, The New York Times and The Guardian, as well as an excellent article in Nature by Vladimir Lifschitz. There is also much to be mined from his SAIL web pages.
As soon as it works, no one calls it AI any more.
John McCarthy (in response to the over-promises of AI in the late 1970s and early 1980s)
During the media storm over the death of Steve Jobs, there was another, arguably more significant, passing: Dennis M. Ritchie, creator of the C programming language and (with long-time colleague Ken Thompson) the Unix operating system, who was found dead at his home in New Jersey on the 12th October after a long illness.
Dennis Ritchie was a computer scientist who truly “helped shape the digital era”: for his work on C and Unix, he and Thompson were jointly awarded the 1983 ACM Turing Award (see his Turing Award lecture, “Reflections on Software Research”), as well as the IEEE Richard W. Hamming Medal (1990) and the National Medal of Technology and Innovation (1998), with the citation describing their contributions as having “led to enormous advances in computer hardware, software, and networking systems and stimulated growth of an entire industry, thereby enhancing American leadership in the Information Age.” He was the ‘R’ in K&R C and was commonly known by his username, dmr. When he retired in 2007, Ritchie was head of the System Software Research Department at Lucent Technologies (formerly Bell Labs, now part of Alcatel-Lucent).
Ritchie had a key role in shaping today’s computing environment; his influence rivals, if not surpasses, Steve Jobs’ — it is just less visible. While the response from the tech press was excellent, the collective eulogy from the press at large did not quite do justice to Ritchie’s sweeping influence on the modern world. As Rob Pike, Principal Engineer at Google, who spent 20 years working across the hall from Ritchie at Bell Labs, said:
Pretty much everything on the Web uses those two things: C and Unix. The browsers are written in C. The Unix kernel — that pretty much the entire Internet runs on — is written in C. Web servers are written in C, and if they’re not, they’re written in Java or C++, which are C derivatives, or Python or Ruby, which are implemented in C. And all of the network hardware running these programs I can almost guarantee were written in C.
It’s really hard to overstate how much of the modern information economy is built on the work Dennis did.
Much of my research and teaching has been based upon leveraging Ritchie’s work (or its derivatives); for my undergraduate dissertation I even developed a GCC front end for BCPL, a precursor to C (see Ritchie’s The Development of the C Language). With computer science being a relatively young discipline, it is a pleasure to be able to meet and hear the early pioneers speak (for example, Donald Knuth at last year’s BCS/IET Turing Lecture); unfortunately, this type of news may start to come all too frequently (such as the sad news in late October of the death of John McCarthy, the father of AI).
But as Brian Kernighan so eloquently puts it:
There’s that line from Newton about standing on the shoulders of giants…we’re all standing on Dennis’ shoulders.
As I’m sure you are all aware from the avalanche of media attention, Steve Jobs passed away on the 5th October 2011, after stepping down from his role as CEO of Apple in August; he was 56 years old.
There have been numerous extensive obituaries for Jobs, who is widely regarded as one of the most visionary and disruptive technologists the world has seen. Whether or not you appreciate the products he designed, or the socio-technical philosophy he promulgated, it is hard to deny the impact he has had on commercial computing and how we use technology.
While much of the media focus has been on his achievements during his second spell as Apple’s CEO, I think his contributions in the late 1970s (with Steve Wozniak) and the 1980s are of more significance to modern computing: for example, the Apple II, the Apple Lisa, the Macintosh, NeXT and Pixar. (N.B. if you are interested in the history of modern computing, especially in the 1980s, I highly recommend Steven Levy’s book Hackers: Heroes of the Computer Revolution.)
With the notoriously outspoken free software pioneer Richard Stallman being quick to offer his opinion on Jobs’ death (remarks widely regarded as being in poor taste), there have also been a number of interesting discussions critically analysing Jobs’ and Apple’s wider impact on technology and society. Nevertheless, I still think we need people who Think Different:
Reasonable people adapt themselves to the world. Unreasonable people attempt to adapt the world to themselves. All progress, therefore, depends on unreasonable people.
George Bernard Shaw (1856-1950)