Monthly Archives: November 2011

John McCarthy (1927-2011)

In what has become a bad couple of months for computer science, John McCarthy, the father of artificial intelligence, died in late October at the age of 84.

McCarthy was a giant in the field of computer science and a seminal figure in artificial intelligence, defining the field for more than five decades. After studying at Caltech and Princeton, with brief professorships at Stanford and Dartmouth College, he went to MIT, where he and his colleague Marvin Minsky founded the MIT Artificial Intelligence Project. In 1962 he returned to Stanford, joining what would become its Department of Computer Science, and founded and led the Stanford Artificial Intelligence Laboratory (SAIL).

McCarthy was known for bringing scientific rigor to every aspect of life and for his wry, often self-deprecating sense of humor. This was best exemplified in a personal philosophy he termed “radical optimism”: a positive outlook so strong that he said, “Everything will be OK even if people don’t take my advice.” Even in his early days at MIT, he was already affectionately referred to as “Uncle John” by his students.

What is most remarkable about his contributions is their diversity, their depth, and how they span both theory and practice: logical AI, advances in the lambda calculus and the theory of computation, the Lisp programming language, garbage collection, the design of the ALGOL programming language, popularising time-sharing, modern program verification (with the benefit of hindsight, he came remarkably close to denotational semantics), circumscription logic for non-monotonic reasoning, and computer chess. He won the ACM Turing Award in 1971, the Kyoto Prize in 1988 and the US National Medal of Science in 1990.
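To give a flavour of his best-known artefact: at the heart of Lisp is the idea that data are lists built from pairs, taken apart and put together with a handful of primitives. Below is a minimal, purely illustrative sketch of those primitives in Python; the names cons, car and cdr are McCarthy’s, while everything else (including using None for nil) is my own framing.

    # A toy sketch of Lisp's core list primitives: cons builds a pair,
    # car and cdr take it apart. A list is a chain of pairs ending in
    # nil (represented here by None).

    def cons(head, tail):
        return (head, tail)

    def car(pair):
        return pair[0]

    def cdr(pair):
        return pair[1]

    NIL = None

    xs = cons(1, cons(2, cons(3, NIL)))  # the list (1 2 3)

    def length(xs):
        # Structural recursion over the list, in the style of
        # McCarthy's 1960 Lisp paper.
        return 0 if xs is NIL else 1 + length(cdr(xs))

    print(car(xs), car(cdr(xs)), length(xs))  # -> 1 2 3

It is no accident that garbage collection appears in the same list of contributions: programs that build and discard such structures freely need someone to reclaim the pairs.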

There have been a huge number of excellent obituaries of John McCarthy, for example from SAIL, Stanford, the CACM, The New York Times and The Guardian, as well as a fine article in Nature by Vladimir Lifschitz. There is also much to be mined from his SAIL web pages.

John McCarthy at Stanford (1974)

As soon as it works, no one calls it AI any more.

John McCarthy (in response to the over-promises of AI in the late 1970s and early 1980s)


Dennis M. Ritchie (1941-2011)

During the media storm over the death of Steve Jobs, there was another, arguably more significant, passing: Dennis M. Ritchie, creator of the C programming language and (with long-time colleague Ken Thompson) of the Unix operating system, was found dead at his home in New Jersey on 12th October after a long illness.

Dennis M. Ritchie

Dennis Ritchie was a computer scientist who truly “helped shape the digital era”: for their work on C and Unix, he and Thompson were jointly awarded the 1983 ACM Turing Award (see his Turing Award lecture, “Reflections on Software Research”), the 1990 IEEE Richard W. Hamming Medal and the 1998 National Medal of Technology and Innovation (whose citation described their contributions as having “led to enormous advances in computer hardware, software, and networking systems and stimulated growth of an entire industry, thereby enhancing American leadership in the Information Age”). He was the ‘R’ in K&R C and was commonly known by his username, dmr. Ritchie was head of the System Software Research Department at Lucent Technologies (formerly Bell Labs, now part of Alcatel-Lucent) when he retired in 2007.

Ritchie had a key role in shaping today’s computing environment; his influence rivals, if not surpasses, that of Steve Jobs. It is just less visible. While the response from the tech press was excellent, the collective eulogy from the press at large did not quite do justice to Ritchie’s sweeping influence on the modern world. As Rob Pike, Principal Engineer at Google, who spent 20 years working across the hall from Ritchie at Bell Labs, said:

Pretty much everything on the Web uses those two things: C and Unix. The browsers are written in C. The Unix kernel — that pretty much the entire Internet runs on — is written in C. Web servers are written in C, and if they’re not, they’re written in Java or C++, which are C derivatives, or Python or Ruby, which are implemented in C. And all of the network hardware running these programs I can almost guarantee were written in C.

It’s really hard to overstate how much of the modern information economy is built on the work Dennis did.

Much of my research and teaching has been built upon Ritchie’s work or its derivatives; my undergraduate dissertation even involved developing a GCC front end for BCPL, a precursor to C (see Ritchie’s The Development of the C Language). With computer science being a relatively young discipline, it is a pleasure to be able to meet and hear the early pioneers speak (for example, Donald Knuth at last year’s BCS/IET Turing Lecture); unfortunately, this type of news may start to come all too frequently (such as the sad news in late October of the death of John McCarthy, the father of AI).

But as Brian Kernighan so eloquently puts it:

There’s that line from Newton about standing on the shoulders of giants…we’re all standing on Dennis’ shoulders.


Apologies for the length

Je n’ai fait celle-ci plus longue que parce que je n’ai pas eu le loisir de la faire plus courte.

Lettres Provinciales (Letter XVI)
Blaise Pascal (1623-1662)

or translated:

I would have written a shorter letter, but I did not have the time.
(literally: I have made this one longer only because I have not had the leisure to make it shorter.)

(Such statements have also been attributed to Cicero, Mark Twain and T.S. Eliot, amongst others.)

Unfortunately, this is how I tend to feel about most things I write or pretty much every talk I give.


Proof by intimidation

xkcd #982

First-year students, beware! The axiom of choice may be used on Monday…

(As always, a big thanks to xkcd.)


The importance of playing

As you may already be aware, Richard Feynman is a hero of mine. I highly recommend his book, Surely You’re Joking, Mr. Feynman!, an edited collection of reminiscences published in 1985. While there are light-hearted anecdotes about safe-cracking, art, languages and samba, what he says about playing and actually doing the things you love has always resonated with me:

But when it came time to do some research, I couldn’t get to work. I was a little tired; I was not interested; I couldn’t do research!

Then I had another thought: Physics disgusts me a little bit now, but I used to enjoy doing physics. Why did I enjoy it? I used to play with it. I used to do whatever I felt like doing — it didn’t have to do with whether it was important for the development of nuclear physics, but whether it was interesting and amusing for me to play with. When I was in high school, I’d see water running out of a faucet growing narrower, and wonder if I could figure out what determines that curve. I found it was rather easy to do. I didn’t have to do it; it wasn’t important for the future of science; somebody else had already done it. That didn’t make any difference. I’d invent things and play with things for my own entertainment.

So I got this new attitude. Now that I am burned out and I’ll never accomplish anything, I’ve got this nice position at the university teaching classes which I rather enjoy, and just like I read the Arabian Nights for pleasure, I’m going to play with physics, whenever I want to, without worrying about any importance whatsoever.

Within a week I was in the cafeteria and some guy, fooling around, throws a plate in the air. As the plate went up in the air I saw it wobble, and I noticed the red medallion of Cornell on the plate going around. It was pretty obvious to me that the medallion went around faster than the wobbling.

I had nothing to do, so I start to figure out the motion of the rotating plate. I discover that when the angle is very slight, the medallion rotates twice as fast as the wobble rate — two to one. It came out of a complicated equation! Then I thought, “Is there some way I can see in a more fundamental way, by looking at the forces or the dynamics, why it’s two to one?”

I don’t remember how I did it, but I ultimately worked out what the motion of the mass particles is, and how all the accelerations balance to make it come out two to one.

I still remember going to Hans Bethe and saying, “Hey, Hans! I noticed something interesting. Here the plate goes around so, and the reason it’s two to one is…” and I showed him the accelerations.

He says, “Feynman, that’s pretty interesting, but what’s the importance of it? Why are you doing it?”

“Hah!” I say. “There’s no importance whatsoever. I’m just doing it for the fun of it.” His reaction didn’t discourage me; I had made up my mind I was going to enjoy physics and do whatever I liked.

I went on to work out equations of wobbles. Then I thought about how electron orbits start to move in relativity. Then there’s the Dirac Equation in electrodynamics. And then quantum electrodynamics. And before I knew it (it was a very short time) I was “playing” — working, really — with the same old problem that I loved so much, that I had stopped working on when I went to Los Alamos: my thesis-type problems; all those old-fashioned, wonderful things.

It was effortless. It was easy to play with these things. It was like uncorking a bottle: Everything flowed out effortlessly. I almost tried to resist it! There was no importance to what I was doing, but ultimately there was. The diagrams and the whole business that I got the Nobel Prize for came from that piddling around with the wobbling plate.
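For the curious, the two-to-one factor drops out of the standard torque-free symmetric top analysis; here is a quick sketch (a textbook result for a thin uniform disc, not Feynman’s own derivation):

    % Torque-free symmetric top: the angular momentum L is fixed in space.
    % The symmetry axis wobbles (precesses) about L at rate \dot{\phi},
    % while the plate spins about its own axis at rate \omega_3.
    \[
      \dot{\phi} = \frac{L}{I_1},
      \qquad
      \omega_3 = \frac{L\cos\theta}{I_3}
      \quad\Longrightarrow\quad
      \frac{\dot{\phi}}{\omega_3} = \frac{I_3}{I_1\cos\theta}.
    \]
    % A thin uniform disc has I_3 = 2 I_1, so for a small wobble angle:
    \[
      \dot{\phi} \approx 2\,\omega_3 .
    \]

So the two rates really are locked in a two-to-one ratio for small angles, although, as is often pointed out, this analysis makes the wobble twice as fast as the spin, the opposite of the way Feynman remembered it above.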

I truly believe in the importance of playing and of solving problems that you find interesting; in fact, that is why I do research. While I strive to adhere to this philosophy, it is not always possible, especially in the academic research environment in which we currently work, with its focus on the minimum publishable unit (and the shadow of the Research Excellence Framework in 2014).

But the main point I wanted to draw from this extended Feynman quote is how important it is in education to stimulate curiosity by playing: tinkering, fiddling and finding interesting real-world problems to solve. And while Feynman was talking about physics, I think this is especially relevant for computing: it is crucial that we give students something to play with! It should be trivial to engage students in computing and technology, but I think this is something that is (in general) missing from UK schools.

However, at a TeachMeet I attended in Reading last night as part of the 2011 Microsoft UK Partners in Learning Forum (where I am giving a talk today), I met teachers showcasing incredible examples of innovative teaching that engages students in computing and technology. But, as with talking to members of the Computing at School (CAS) group, it is very easy to preach to the converted; it is therefore crucial that this “network of excellence” reaches the people currently outside it who need help and support.

So let’s try and get a Raspberry Pi, Arduino, LEGO Mindstorms, .NET Gadgeteer, Kinect and the like into the hands of kids at school (as well as exposing them to Scratch, Kodu, Alice, Greenfoot, etc.), so they can program, hack and solve interesting problems.

But more importantly, play.


Microsoft UK Partners in Learning Forum 2011


On Thursday 24th November, I will be speaking at the 2011 Microsoft UK Partners in Learning Forum, a free one-day conference for teachers and educators at Microsoft HQ, Thames Valley Park in Reading. This year, the workshops and keynotes are all STEM-focused and address the theme of “Teach more, learn more, inspire more”. As at last year’s event in Manchester, this year’s Microsoft UK Partners in Learning Teacher Awards will also be announced.

This year’s keynotes are Ian Livingstone (of Games Workshop and Eidos fame, as well as co-author of the recent NESTA Next Gen. report on the video games and visual effects industry), Alex Bellos (author of the popular science book Alex’s Adventures in Numberland) and Ollie Bray (the National Adviser for Emerging Technologies at Education Scotland).

I am leading one of the featured workshop sessions on how we need to develop and encourage the next generation of technology innovators in the UK. I will be discussing the work of the Computing at School (CAS) working group, as well as highlighting the importance of computing from both an educational and economic perspective. Since many of the other workshops are more hands-on (such as using the Kinect SDK, building gadgets with .NET Gadgeteer and using Skype in the classroom), I intend to stimulate discussion in my session by drawing attention to some of the problems with the current state of ICT education in the UK (and how we can try and resolve them), as well as how computing is a core STEM discipline.

There are still a few places remaining, including the informal TeachMeet the night before. I hope to see you there!


A 9999-sided polygon

What do you call a 9999-sided polygon?

A nonanonacontanonactanonaliagon.

While polygons are important in computer graphics, what’s special about a 9999-gon? Well, not much really, but it made me think about the nomenclature for polygons: the word “polygon” comes from the Greek polygōnon, meaning “many-angled”. Polygons are usually named according to the number of sides, combining a Greek-derived numerical prefix with the suffix -gon, e.g. pentagon (5-gon), dodecagon (12-gon) or icosagon (20-gon) — with the triangle, quadrilateral and nonagon (9-gon) being notable exceptions. But beyond decagons, professional mathematicians generally prefer the numeral notation (n-gon).

Nevertheless, a 10000-gon is called a myriagon.
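If you want to play along, here is a rough sketch in Python of how the Greek-prefix scheme assembles names for 20- to 99-gons. (Prefix spellings vary between sources; this follows one commonly tabulated set, e.g. on MathWorld.)

    # Systematic polygon naming for 20 <= n <= 99:
    #   tens prefix + "kai" + units prefix + "gon"
    # NB: the prefix tables below are one common convention, not the
    # only one in use.

    ONES = ["", "hena", "di", "tri", "tetra", "penta",
            "hexa", "hepta", "octa", "ennea"]
    TENS = ["", "", "icosi", "triaconta", "tetraconta", "pentaconta",
            "hexaconta", "heptaconta", "octaconta", "enneaconta"]

    def polygon_name(n):
        """Name an n-gon for 20 <= n <= 99 (illustrative only)."""
        tens, ones = divmod(n, 10)
        if ones == 0:
            # 20 is irregular: icosi -> icosagon
            return "icosagon" if tens == 2 else TENS[tens] + "gon"
        return TENS[tens] + "kai" + ONES[ones] + "gon"

    print(polygon_name(24))  # icosikaitetragon
    print(polygon_name(35))  # triacontakaipentagon
    print(polygon_name(99))  # enneacontakaienneagon

Extending the same idea through the hundreds and thousands is what produces monsters like the 9999-gon’s name above.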


An iPhone tragedy

My iPhone 3GS had a little accident today: after nearly two years of faithful service (without a case), it met some spilt milk in the dairy aisle of Tesco, causing a comedy slip (and juggle) that resulted in the following:

Broken iPhone

While it was most likely excellent viewing on CCTV, we’ll see what Tesco say tomorrow…
