I don’t think solving this problem will do much for Angela and Brian.

(see more excellent examples of the world’s finest academic writing at *Thanks, Textbooks*)

Santa must have a way to hash humans. There's no way he's using names. There'd be too many collisions on the Naughty list.

— InfoSec Taylor Swift (@SwiftOnSecurity), December 25, 2014

While there are alternative explanations for how the naughty/nice list is generated, hashing is important: Santa could be using a Bloom filter, in which false positive matches are possible, but false negatives are not (i.e. a query returns either “possibly in set” or “definitely not in set”, thus it has a 100% recall rate).
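The one-sided error of a Bloom filter is easy to demonstrate; here is a minimal, hypothetical sketch in Python (the class name, array size and salted-SHA-256 hashing scheme are illustrative choices of mine, not anything Santa is known to use):

```python
import hashlib

class BloomFilter:
    """Minimal Bloom filter: k hash functions over an m-bit array.
    Queries can return false positives, but never false negatives."""

    def __init__(self, m=1024, k=3):
        self.m, self.k = m, k
        self.bits = [False] * m

    def _positions(self, item):
        # Derive k bit positions from k salted SHA-256 digests
        for i in range(self.k):
            digest = hashlib.sha256(f"{i}:{item}".encode()).hexdigest()
            yield int(digest, 16) % self.m

    def add(self, item):
        for pos in self._positions(item):
            self.bits[pos] = True

    def __contains__(self, item):
        # True means "possibly in set"; False means "definitely not in set"
        return all(self.bits[pos] for pos in self._positions(item))

naughty = BloomFilter()
naughty.add("alice")
naughty.add("bob")
print("alice" in naughty)  # True: an added item is always reported
```

Note that membership tests never touch the original items, only the bit array, which is why the structure is so compact; the price is that a query can occasionally say "possibly in set" for an item that was never added.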

And while we’re on this subject, remember Santa’s delivery route represents a nested Travelling Salesman Problem, compounded by the naughty/nice list changing every year…

(Merry Christmas…and watch out!)

(read the full NYT review here)

If you enjoy mathematics as well as the BBC quiz series Pointless, hosted by Alexander Armstrong and Richard Osman, then you will most definitely enjoy the following blog post by Mathistopheles:

This article is based on a gloriously irrelevant mathematical sequence that is derived (rather appropriately) from the episodes of the television show Pointless. It is the sort of idea that has me scribbling calculations on the back of envelopes for hours on end, despite there being absolutely no hope of an outcome that could in any way justify this investment of effort. In this first part, I introduce the sequence and explain how it is related to some well-known mathematical objects called Markov chains. In the vain hope that I might convey my enthusiasm for this topic to others, I have tried to write this piece in a fairly accessible way. Almost no mathematical knowledge is assumed, beyond a rough idea of what probability is.

It provides a clear and accessible analysis of how the quiz show works by using directed graphs, matrices and Markov chains: read the full post here.

An integer is either a perfect square or its square root is irrational. Essentially: when you compute the square root of an integer, there are either *no* figures to the right of the decimal or there are an *infinite number* of figures to the right of the decimal and they don’t repeat. There’s no middle ground — you can’t hope, for example, that the decimal expansion might stop or repeat after a hundred or so terms.

The proof of this theorem is surprisingly simple, not much harder than the familiar proof that the square root of 2 is irrational.

Suppose $p/q$ is a fraction in lowest terms, i.e. $p$ and $q$ are co-prime (i.e. their gcd is 1), and $p/q$ is a solution to $x^k = n$, where $k$ is a positive integer and $n$ is an integer. Then:

$$\left(\frac{p}{q}\right)^k = n$$

and so:

$$\frac{p^k}{q^k} = n$$

Now the right side of the equation above is an integer, so the left side must be an integer as well. But $p$ is relatively prime to $q$, and so $p^k$ is relatively prime to $q^k$. The only way $p^k/q^k$ could be an integer is for $q$ to equal 1 or -1. And so $p/q$ must be an integer.

Another way to get the same result is to assume $p/q$ is an irreducible fraction and is not an integer (i.e. $q > 1$), and consider $(p/q)^k = p^k/q^k$. Clearly $p^k$ and $q^k$ are co-prime and the denominator $q^k > 1$, so $p^k/q^k$ is not an integer.

So what we said about square roots extends to cube roots and in fact to all integer roots (for example, the fifth root of an integer is either an integer or an irrational number). In other words: **no (non-integer) fraction, when raised to a power, can produce an integer**.
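As a quick sanity check on this result, here is a hypothetical Python sketch that finds the exact integer $k$-th root where one exists (by the theorem, in every other case the root is irrational); the function name and the binary-search approach are my own illustrative choices:

```python
def integer_root(n, k):
    """Return the integer r with r**k == n, or None when no such r exists
    (in which case, by the result above, the k-th root of n is irrational)."""
    if n < 0 or k < 1:
        raise ValueError("expects n >= 0 and k >= 1")
    # Binary search for the smallest r with r**k >= n (exact integer arithmetic,
    # so no floating-point rounding can fool us)
    lo, hi = 0, max(n, 1)
    while lo < hi:
        mid = (lo + hi) // 2
        if mid ** k < n:
            lo = mid + 1
        else:
            hi = mid
    return lo if lo ** k == n else None

print(integer_root(144, 2))  # 12
print(integer_root(145, 2))  # None: the square root of 145 is irrational
print(integer_root(32, 5))   # 2
```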

(reblogged from John D. Cook’s blog)

Somehow, I don’t think solving this problem will do much for the couple…

(HT Thanks, Textbooks)

Inspired by a recent discussion on the wonders of TeX, I started thinking about how easy it would be to generate prime numbers in TeX. Well, unsurprisingly, it was presented as an example by Knuth using trial division in *The TeXbook* (download) in 1984:

```tex
\documentclass{article}
\newif\ifprime \newif\ifunknown % boolean variables
\newcount\n \newcount\p \newcount\d \newcount\a % integer variables
\def\primes#1{2,~3% assume that #1 is at least 3
  \n=#1 \advance\n by-2 % n more to go
  \p=5 % odd primes starting with p
  \loop\ifnum\n>0 \printifprime\advance\p by2 \repeat}
\def\printp{, % we will invoke \printp if p is prime
  \ifnum\n=1 and~\fi % ‘and’ precedes the last value
  \number\p \advance\n by -1 }
\def\printifprime{\testprimality \ifprime\printp\fi}
\def\testprimality{{\d=3 \global\primetrue
  \loop\trialdivision \ifunknown\advance\d by2 \repeat}}
\def\trialdivision{\a=\p \divide\a by\d
  \ifnum\a>\d \unknowntrue\else\unknownfalse\fi
  \multiply\a by\d
  \ifnum\a=\p \global\primefalse\unknownfalse\fi}
\begin{document} % usage
The first 100 prime numbers are:~\primes{100}
\end{document}
```

You can also do it by sieving; check out the examples in my GitHub repo.
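For comparison with Knuth’s trial division, a sieve of Eratosthenes takes only a few lines in a general-purpose language; this Python sketch is purely illustrative and is not the code from the repo:

```python
def primes_up_to(limit):
    """Sieve of Eratosthenes: return a list of all primes <= limit."""
    if limit < 2:
        return []
    is_prime = [True] * (limit + 1)
    is_prime[0] = is_prime[1] = False
    for p in range(2, int(limit ** 0.5) + 1):
        if is_prime[p]:
            # Multiples below p*p were already crossed off by smaller primes
            for multiple in range(p * p, limit + 1, p):
                is_prime[multiple] = False
    return [n for n in range(limit + 1) if is_prime[n]]

print(primes_up_to(30))  # [2, 3, 5, 7, 11, 13, 17, 19, 23, 29]
```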

Found Turing’s plaque today near King’s College, Cambridge (his alma mater).

There has been much discussion online of yesterday’s CiF article by Simon Jenkins (*For Britain’s pupils, maths is even more pointless than Latin*). Click-bait aside, he has been here before; ignoring the derivation of the now-pervasive “*x is the new Latin*” meme, as well as overlooking the majority of the straw men and other logical fallacies, the main thrust of the article presents a false dichotomy. It appears to reiterate an antiquated *Two Cultures*-style divide between mathematics and “creativity and social and emotional capacities” (which also frequently crops up in discussions on programming and computer science education). Furthermore, it implies the drive to reform mathematics education in the UK is ultimately misguided, with few jobs requiring advanced mathematical skills (STEM agenda? No thank you!), and that we would be better served by focusing on numeracy as well as encouraging “key industries”:

If British schools are to be slaves to Gove’s economic dogma, they should be turning out accountants, lawyers, administrators and salespeople. That is where the money is. Britain needs literate and presentable young people, sensitive to culture and the world around them, skilled in health, entertainment, finance, the law and citizenship. The truth is that Gove, like most of Cameron’s ministers, is an old socialist planner at heart.

Now, this is not to say that there are no issues with mathematics education in the UK; ACME has been arguing for a mathematics curriculum fit for the 21st century, supported by Ofsted and reports highlighting the importance of mathematics in the other sciences. Conrad Wolfram has long maintained we have the wrong focus in how we teach mathematics — in a similar way for computer science, contexts and problems must come first. I have also long maintained that it is socially acceptable to be bad at mathematics: it is rare for people to publicly admit they are unable to read or write, yet many happily proclaim a lifelong inability to perform basic calculations.

Jenkins has thus thrown together a ragbag of prejudices (a love of the arts, a dislike of international education markers, a sympathy for progressive education) with personal anecdote and concocted an argument completely detached from reality. As epitomised by this quote:

I learned maths. I found it tough and enjoyable. Algebra, trigonometry, differential calculus, logarithms and primes held no mystery, but they were even more pointless than Latin and Greek. Only a handful of my contemporaries went on to use maths afterwards.

…which reminds me of this xkcd comic:

From an article by Edward Frenkel in today’s New York Times:

Many mathematicians, when pressed, admit to being Platonists. The great logician Kurt Gödel argued that mathematical concepts and ideas “form an objective reality of their own, which we cannot create or change, but only perceive and describe”. But if this is true, how do humans manage to access this hidden reality?

We don’t know. But one fanciful possibility is that we live in a computer simulation based on the laws of mathematics — not in what we commonly take to be the real world. According to this theory, some highly advanced computer programmer of the future has devised this simulation, and we are unknowingly part of it. Thus when we discover a mathematical truth, we are simply discovering aspects of the code that the programmer used.

This hypothesis is by no means new; in *Are You Living in a Computer Simulation?*, Nick Bostrom argues that at least one of the following propositions is true:

- the human species is very likely to go extinct before reaching a “posthuman” stage;
- any posthuman civilisation is extremely unlikely to run a significant number of simulations of their evolutionary history (or variations thereof);
- we are almost certainly living in a computer simulation.

Also see: *Constraints on the Universe as a Numerical Simulation*.

In 1990, Spanish philosopher Jon Perez Laraudogoitia submitted an article to the journal Mind entitled “*This Article Should Not Be Rejected by Mind*”. In it, he argued:

- If statement 1 in this argument is trivially true, then this article should be accepted.
- If statement 1 were false, then its antecedent (“statement 1 in this argument is trivially true”) would be true, which means that statement 1 itself would be true, a contradiction. So statement 1 must be true.
- But that seems wrong, since Mind is a serious journal and shouldn’t publish trivial truths.
- That means statement 1 must be either false or a non-trivial truth. We know it can’t be false (#2), so it must be a non-trivial truth, and its antecedent (“statement 1 in this argument is trivially true”) is false.
- What then is the truth value of its consequent, “this article should be accepted”? If this were false then Mind shouldn’t publish the article; that can’t be right, since the article consists of a non-trivial truth and its justification.
- So the consequent must be true, and Mind should publish the article.

They published it. “*This is, I believe, the first article in the whole history of philosophy the content of which is concerned exclusively with its own self, or, in other words, which is totally self-referential*”, Laraudogoitia wrote. “*The reason why it is published is because in it there is a proof that it should not be rejected and that is all*”.

(*reblogged from Futility Closet*)

Catch *The Joy of Logic* by Dave Cliff on iPlayer before it disappears! Programme blurb:

A sharp, witty, mind-expanding and exuberant foray into the world of logic with computer scientist Professor Dave Cliff. Following in the footsteps of the award-winning ‘The Joy of Stats’ and its sequel, ‘Tails You Win — The Science of Chance’, this film takes viewers on a new rollercoaster ride through philosophy, maths, science and technology — all of which, under the bonnet, run on logic. Wielding the same wit and wisdom, animation and gleeful nerdery as its predecessors, this film journeys from Aristotle to Alice in Wonderland, sci-fi to supercomputers to tell the fascinating story of the quest for certainty and the fundamentals of sound reasoning itself.

Dave Cliff, professor of computer science and engineering at Bristol University, is no abstract theoretician. 15 years ago he combined logic and a bit of maths to write one of the first computer programs to outperform humans at trading stocks and shares. Giving away the software for free, he says, was not his most logical move…

With the help of 25 seven-year-olds, Professor Cliff creates, for the first time ever, a computer made entirely of children, running on nothing but logic. We also meet the world’s brainiest whizz-kids, competing at the International Olympiad of Informatics in Brisbane, Australia.

‘The Joy of Logic’ also hails logic’s all-time heroes: George Boole, who moved logic beyond philosophy to mathematics; Bertrand Russell, who took 360+ pages but heroically proved that 1 + 1 = 2; Kurt Gödel, who brought logic to its knees by demonstrating that some truths are unprovable; and Alan Turing, who, with what Cliff calls an ‘almost exquisite paradox’, was inspired by this huge setback to logic to conceive the computer.

Ultimately, the film asks, can humans really stay ahead? Could today’s generation of logical computing machines be smarter than us? What does that tell us about our own brains, and just how ‘logical’ we really are…?

(you might also like this *In Our Time* programme on the history of logic from 2010 or this BBC Science Café programme on logic I was a guest on in 2011)

Everyone likes algorithms, especially novel sorting algorithms. So the basis for this “sleep sort” is simple: take the first element of the array (of positive integers), fork a new process which sleeps that many seconds and then prints the number; repeat for each remaining element.

```bash
#!/bin/bash
function f() {
    sleep "$1"
    echo "$1"
}
while [ -n "$1" ]
do
    f "$1" &
    shift
done
wait
```

Calculation of the average (and worst-case) complexity of this algorithm is left as an exercise to the reader; you might also enjoy bogosort and stooge sort (as well as some dancing).

A useful video (start from c.2mins) for explaining Diffie-Hellman(-Merkle) key exchange: a method to allow two parties that have no prior knowledge of each other to jointly establish a shared secret key over an insecure communications channel. This key can then be used to encrypt subsequent communications using a symmetric key cipher.

It also highlights the cryptographic value of a one-way function and the difficulty of finding the discrete logarithm.
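The exchange itself fits in a few lines; the following Python sketch uses the classic textbook parameters $p = 23$, $g = 5$ (far too small for real use, where primes of 2048+ bits are standard) purely to illustrate the arithmetic:

```python
import random

# Toy Diffie-Hellman key exchange. The modulus p = 23 and generator g = 5
# are the standard textbook values; they are illustrative only.
p, g = 23, 5

a = random.randrange(2, p - 1)   # Alice's private exponent, never transmitted
b = random.randrange(2, p - 1)   # Bob's private exponent, never transmitted

A = pow(g, a, p)                 # Alice sends A over the insecure channel
B = pow(g, b, p)                 # Bob sends B over the insecure channel

# Each side combines its own secret with the other's public value:
# (g^b)^a = g^(ab) = (g^a)^b (mod p), so both arrive at the same key.
shared_alice = pow(B, a, p)
shared_bob = pow(A, b, p)

print(shared_alice == shared_bob)  # True
```

An eavesdropper sees only $p$, $g$, $A$ and $B$; recovering the shared key from these requires solving the discrete logarithm problem, which is exactly the one-way function the video highlights.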

(*N.B.* There is also some relevant content — albeit for a younger audience — in Chris Bishop’s 2008 Royal Institution Christmas Lectures)

The problem of distinguishing prime numbers from composite numbers and of resolving the latter into their prime factors is known to be one of the most important and useful in arithmetic. It has engaged the industry and wisdom of ancient and modern geometers to such an extent that it would be superfluous to discuss the problem at length… Further, the dignity of the science itself seems to require that every possible means be explored for the solution of a problem so elegant and so celebrated.

*Disquisitiones Arithmeticae* (1801)

Carl Friedrich Gauss (1777–1855)

Big whorls have little whorls

That feed on their velocity,

And little whorls have lesser whorls

And so on to viscosity.

*Weather Prediction by Numerical Process* (1922)

Lewis Fry Richardson (1881–1953)

The real trouble with this world of ours is not that it is an unreasonable world, nor even that it is a reasonable one. The commonest kind of trouble is that it is nearly reasonable, but not quite.

Life is not an illogicality; yet it is a trap for logicians. It looks just a little more mathematical and regular than it is; its exactitude is obvious, but its inexactitude is hidden; its wildness lies in wait.

Chapter VI: The Paradoxes of Christianity, *Orthodoxy* (1908)

G. K. Chesterton (1874–1936)