Archive for November 2011
In what has become a bad couple of months for computer science, John McCarthy, the father of artificial intelligence, died in late October at the age of 84.
McCarthy was a giant in the field of computer science and a seminal figure in artificial intelligence, defining the field for more than five decades. After studying at Caltech and Princeton, with brief professorships at Stanford and Dartmouth College, he went to MIT, where he and his colleague Marvin Minsky founded the MIT Artificial Intelligence Project. In 1962 he moved to Stanford, where he founded and led the Stanford Artificial Intelligence Laboratory (SAIL). McCarthy was known for bringing scientific rigor to every aspect of life and for his wry, often self-deprecating sense of humor. This was best exemplified in a personal philosophy he termed “radical optimism”: a positive outlook so strong that he claimed “Everything will be OK even if people don’t take my advice.” Even in his early days at MIT, he was already affectionately referred to as “Uncle John” by his students.
What is most remarkable about his contributions is their diversity, their depth, and how they span both theory and practice: logical AI, advances in the lambda calculus and the theory of computation, the Lisp programming language, garbage collection, the design of the ALGOL programming language, the popularisation of time-sharing, modern program verification (where, with the benefit of hindsight, he came remarkably close to denotational semantics), circumscription logic for non-monotonic reasoning, and computer chess. He won the ACM Turing Award in 1971, the Kyoto Prize in 1988 and the US National Medal of Science in 1990.
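To give a flavour of the elegance of early Lisp, here is a minimal sketch (my own illustration in Python, not McCarthy’s code) of the primitive operators from his 1960 paper, with lists modelled as nested pairs ending in `None`:

```python
# Lisp's primitive operators, modelled with Python tuples as cons cells.
# This is an illustrative sketch, not a faithful Lisp implementation.

def cons(a, d):
    """Build a pair (a cons cell)."""
    return (a, d)

def car(p):
    """First element of a pair."""
    return p[0]

def cdr(p):
    """Rest of a pair."""
    return p[1]

def atom(x):
    """True for non-pair (atomic) values, including the empty list None."""
    return not isinstance(x, tuple)

def append(xs, ys):
    """Recursive list append, in the style of early Lisp."""
    return ys if xs is None else cons(car(xs), append(cdr(xs), ys))

# (1 2) appended to (3) gives (1 2 3), i.e. nested pairs ending in None.
lst = append(cons(1, cons(2, None)), cons(3, None))
```

The whole of list processing falls out of these few primitives plus recursion, which is part of why Lisp remains such an influential design.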
There have been a huge number of excellent obituaries for John McCarthy, for example by SAIL, Stanford, CACM, The New York Times and The Guardian, as well as an excellent article in Nature by Vladimir Lifschitz. There is also much to be mined from his SAIL web pages.
As soon as it works, no one calls it AI any more.
John McCarthy (in response to the over-promises of AI in the late 1970s and early 1980s)
During the media storm over the death of Steve Jobs, there was another, arguably more significant, passing: Dennis M. Ritchie, creator of the C programming language and (with long-time colleague Ken Thompson) the Unix operating system, who was found dead at his home in New Jersey on the 12th October after a long illness.
Dennis Ritchie was a computer scientist who truly “helped shape the digital era”. For his work on C and Unix, he and Thompson were jointly awarded the ACM Turing Award in 1983 (see his Turing Award lecture, “Reflections on Software Research”), as well as the IEEE Richard W. Hamming Medal (1990) and the US National Medal of Technology (1998), whose citation described their contributions as having “led to enormous advances in computer hardware, software, and networking systems and stimulated growth of an entire industry, thereby enhancing American leadership in the Information Age.” He was the ‘R’ in K&R C and was commonly known by his username, dmr. When he retired in 2007, Ritchie was head of the System Software Research Department at Lucent Technologies (formerly Bell Labs, now part of Alcatel-Lucent).
Ritchie had a key role in shaping today’s computing environment; his influence rivals, if not surpasses, Steve Jobs’ — it is just less visible. While the response from the tech press was excellent, the collective eulogy from the press at large did not quite do justice to Ritchie’s sweeping influence on the modern world. As Rob Pike, Principal Engineer at Google, who spent 20 years working across the hall from Ritchie at Bell Labs, said:
Pretty much everything on the Web uses those two things: C and Unix. The browsers are written in C. The Unix kernel — that pretty much the entire Internet runs on — is written in C. Web servers are written in C, and if they’re not, they’re written in Java or C++, which are C derivatives, or Python or Ruby, which are implemented in C. And all of the network hardware running these programs I can almost guarantee were written in C.
It’s really hard to overstate how much of the modern information economy is built on the work Dennis did.
Much of my research and teaching has been based upon leveraging Ritchie’s work (or its derivatives). Even my undergraduate dissertation developed a GCC front end for BCPL, a precursor to C (see Ritchie’s Development of the C Language). With computer science being a relatively modern discipline, it is a pleasure to be able to meet and hear the early pioneers speak (for example, Donald Knuth at last year’s BCS/IET Turing Lecture); unfortunately, this type of news may start to come all too frequently (such as the sad news in late October of the death of John McCarthy, the father of AI).
But as Brian Kernighan so eloquently puts it:
There’s that line from Newton about standing on the shoulders of giants… we’re all standing on Dennis’ shoulders.
Je n’ai fait celle-ci plus longue que parce que je n’ai pas eu le loisir de la faire plus courte.
I would have written a shorter letter, but I did not have the time.
(literally: I made this [letter] very long, because I did not have the leisure to make it shorter.)
Unfortunately, this is how I tend to feel about most things I write or pretty much every talk I give.
On Thursday 24th November, I will be speaking at the 2011 Microsoft UK Partners in Learning Forum, a free one-day conference for teachers and educators at Microsoft HQ, Thames Valley Park in Reading. This year, the workshops and keynotes are all STEM-focused and address the theme of “Teach more, learn more, inspire more”. As with last year’s event held in Manchester, they will also be announcing this year’s Microsoft UK Partners in Learning Teacher Awards.
This year’s keynotes are Ian Livingstone (of Games Workshop and Eidos fame, as well as co-author of the recent NESTA Next Gen. report on the video games and visual effects industry), Alex Bellos (author of the popular science book Alex’s Adventures in Numberland) and Ollie Bray (the National Adviser for Emerging Technologies at Education Scotland).
I am leading one of the featured workshop sessions on how we need to develop and encourage the next generation of technology innovators in the UK. I will be discussing the work of the Computing at School (CAS) working group, as well as highlighting the importance of computing from both an educational and an economic perspective. Since many of the other workshops are more hands-on (such as using the Kinect SDK, building gadgets with .NET Gadgeteer and using Skype in the classroom), I intend to stimulate discussion in my session by drawing attention to some of the problems with the current state of ICT education in the UK (and how we can try to resolve them), as well as how computing is a core STEM discipline.
What do you call a 9999-sided polygon?
While polygons are important in computer graphics, what’s special about a 9999-gon? Well, not much really, but it made me think about the nomenclature for polygons: the word “polygon” comes from the Greek polygōnon, meaning “many-angled”. Polygons are usually named according to the number of sides, combining a Greek-derived numerical prefix with the suffix -gon, e.g. pentagon (5-gon), dodecagon (12-gon) or icosagon (20-gon) — with the triangle, quadrilateral and nonagon (9-gon) being notable exceptions. But beyond decagons, professional mathematicians generally prefer the numeral notation (n-gon).
Nevertheless, a 10000-gon is called a myriagon.
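As a playful aside, the naming convention described above can be sketched in a few lines of Python; the lookup table and the n-gon fallback are my own illustration, not any standard library:

```python
# Common Greek-derived polygon names for small n (plus the exceptions
# noted above, like "triangle" and "quadrilateral"), falling back to
# the "n-gon" notation that mathematicians prefer for larger polygons.

NAMES = {
    3: "triangle", 4: "quadrilateral", 5: "pentagon", 6: "hexagon",
    7: "heptagon", 8: "octagon", 9: "nonagon", 10: "decagon",
    11: "hendecagon", 12: "dodecagon", 20: "icosagon", 10000: "myriagon",
}

def polygon_name(n):
    """Return the common name of an n-sided polygon, or 'n-gon'."""
    if n < 3:
        raise ValueError("a polygon needs at least 3 sides")
    return NAMES.get(n, f"{n}-gon")
```

So `polygon_name(9999)` gives the rather anticlimactic “9999-gon”, while `polygon_name(10000)` gets its proper name.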
My iPhone 3GS had a little accident today: after nearly two years of faithful service (without a case), some spilt milk in the dairy aisle of Tesco caused a comedy slip (and juggle) which resulted in the following:
While it was most likely excellent viewing on CCTV, we’ll see what Tesco say tomorrow…