Robin Milner: Pioneering Computer Scientist#
Obituary#
by Michael Fourman
Published in The Independent, Wednesday, 14 April 2010
Robin Milner was a great man, an inspiring colleague, and a dear friend. He died suddenly of a heart attack while walking with his daughter to see her off at Cambridge station, just three days after the funeral of his much-loved wife, Lucy.
Milner already stood out at Eton; Tam Dalyell recalls him as "precociously clever, confident, but not insufferable – relaxed and nice." He rejected injustice and brutality and had a gift for mathematics. Milner's love of mathematics was fostered by his "modern tutor", Thomas Hutchinson-Smyth, a "most gifted teacher of mathematics and quite extraordinarily congenial to those as clever as Robin."
Milner thought deeply about people, as well as mathematics. Before the fall of the apartheid regime in South Africa, he led a debate in the Laboratory for Foundations of Computer Science through the moral maze surrounding academic interactions with individuals and institutions working under that regime. Just as in his technical work, he helped others to untangle difficult issues and to construct a common framework within which we could all think clearly.
It was at Cambridge, in 1956, that he first encountered computer programming. His response was characteristically decisive: "Programming was not a very beautiful thing. I resolved I would never go near a computer in my life."
However, Milner was interested in tools, as prosthetic devices that serve to extend our reach. He was intensely practical, as evidenced by the wooden geometric shapes on the light-strings, and the comprehensive collection of chisels, gimlets and awls ranged neatly in the utility room – a spotlessly clean workshop that also served as the laundry in his Cambridge home. Milner later became fascinated with computers as tools. He realised, very early, that they were not mere calculating devices, but could become "tools of the mind".
After military service in Suez, he spent three years at Cambridge, taking Part II in both Mathematics and Moral Sciences; then a year as a mathematics teacher at Marylebone High School. He joined Ferranti in 1960; there he engaged again with computers, contributing to the development of the early high-level programming languages, Algol and Atlas Autocode. In 1963 Milner moved to academia, first as a lecturer at the City University, then, from 1968, as a senior research assistant at Swansea.
At Swansea he began to publish. Milner had been thinking about "program schemes", an early operational model of computer programs. He went to Oxford to hear Dana Scott, who had been thinking about computable functionals – a more abstract model, which focused on the meaning, or semantics, of a program. He was then invited to Stanford, where he spent two years (1971-73) as a research associate with the Artificial Intelligence Project. Milner developed an implementation of Scott's logic of computable functions, LCF, a tool for the mind, to link Scott's theory to practice. He also started to worry about parallel programs, which, he felt, Scott's semantics didn't fully address.
In 1973 Milner returned to the UK to take up a lectureship at Edinburgh. He developed Edinburgh LCF as a tool for performing rigorous proofs about programs. He developed his ideas on parallel programs into a more general mathematical model of communicating agents, the Calculus of Communicating Systems (CCS), and then the pi-calculus, a bold extension intended to model the mobility of interactive systems.
But he found that the programming languages of the day were clumsy tools: they made it unnecessarily difficult to implement his ideas. Milner decided to design and implement a better programming language. He saw this as a tool-building tool: a meta-tool. Milner's language, ML, is a Meta-Language for programming. Like Milner's workshop, ML provides a small number of well-chosen tools, with which skilled hands can build almost anything – and it is a delight to use. It stands as a beacon, showing that programming can indeed be beautiful.
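A small taste of that economy, sketched here in OCaml, a modern descendant of Milner's ML (the example itself is illustrative, not drawn from Milner's own code): a datatype and a function defined by pattern matching, with every type inferred by the compiler rather than declared by the programmer.

```ocaml
(* A recursive datatype: a binary tree holding values of any type 'a. *)
type 'a tree =
  | Leaf
  | Node of 'a tree * 'a * 'a tree

(* Count the values stored in a tree. Its type, 'a tree -> int,
   is inferred by Milner's type-inference algorithm, not declared. *)
let rec size t =
  match t with
  | Leaf -> 0
  | Node (l, _, r) -> size l + 1 + size r

let () =
  let t = Node (Node (Leaf, 1, Leaf), 2, Node (Leaf, 3, Leaf)) in
  Printf.printf "%d\n" (size t)
```

The pattern match covers every shape a tree can take, and the compiler would warn if a case were missed – the kind of mathematical discipline, built into the tool itself, that the ML family made practical.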
For these early works – three strands: proof, concurrency, ML – Milner was awarded many honours, including, in 1991, the ACM A.M. Turing Award, computing's highest honour. It sat inconspicuously in his kitchen, between a vase and a bowl of fruit.
Milner used ML to good effect. In addition to its intellectual cohesion, his work had practical impact.
Edinburgh LCF showed how ML can be used to build "proof-assistants" that allow a human user to develop a complex proof in a "dialogue with a proof system" that keeps track of the details and ensures rigour. Some mathematical proofs are now too complex for pencil and paper. Milner's methods have been used to build machine-checked proofs – for example, a proof of the celebrated four-colour theorem.
The Concurrency Workbench, implementing CCS, demonstrated that we can effectively analyse the behaviours of complex distributed systems. It inspired the design of tools now used to ensure the correctness of control software for critical embedded applications such as nuclear plant controllers, and flight systems for aeroplanes such as the Airbus.
ML was way ahead of its time. The world is slowly catching up. ML has influenced many languages including Java, Scala, as used by Twitter, and Microsoft's recent F#. ML is built on clean and well-articulated mathematical ideas, so it is relatively easy to remix and reuse the ideas. No serious language designer can now ignore this example of good design.
In 1994 Milner helped to establish the Informatics Planning Unit at Edinburgh. This was a new enterprise, formed as a coalition of previously warring tribes, brought together around a vision of a new science of information. In 1995, having seeded this vision at Edinburgh, Milner moved to take up the newly established Chair of Computer Science at Cambridge.
In accepting his honorary doctorate from the University of Bologna in 1997, Milner reflected on his work:
"Every tool designed by man is a prosthetic device, and for every prosthetic device there is a means of control. Many tools are physical, and their control is manual. In contrast, computers are the most complex tools ever invented, and they are tools of the mind; the means of control is hardly muscular – it is primarily linguistic. Quite simply, the versatility of computers is exactly equal to the versatility of the languages by which we prescribe their behaviour, and this appears to be unbounded."
He then went on to articulate his vision of informatics:
"Computing is not only about a computer's internal action; it is about the way a computer behaves as part of a larger system. The terms in which the computer behaviour is prescribed must harmonise with the way we describe information flow in such systems. Thus computing expands into informatics, the science of information and communication... it can provide an exact view of the world which is not the province of any previously existing science."
In 2001, Milner retired, but he did not stop. The pi-calculus had been applied to new examples of interaction that he had not anticipated; but he now saw further examples. He started work on a new calculus, of bigraphs, that would account for all discrete interactive behaviour. Bigraphs model the concurrent and interactive behaviours of populations of mobile communicating agents. Applications, to date, include computational systems biology, where the agents include genes and proteins, as well as the internet, where the agents include people, computers, and programs.
In 2006 he was appointed to the prestigious Blaise Pascal International Research Chair in Paris, where he taught courses on bigraphs, and wrote the core of his last book: The Space and Motion of Communicating Agents. Milner returned (part-time) to Edinburgh in 2009, appointed to the Chair of Computer Science.
A new generation of researchers is already enthused by Milner's latest ideas. He was working with them on new projects to implement tools for bigraphs. He was interviewed as "Geek of the Week" just a few weeks before his death, and said:
"I would dearly love to follow this kind of work through to the front line of design and experience, but I don't think I'll be around for long enough. I'm making up for it by listening and talking to those who will be!"
Milner's listening and talking are sorely missed. He would always add wisdom and insight to any discussion. He ploughed his own furrow, but welcomed company and generously shared the fertile ground he uncovered. He was a most gifted teacher, a kind mentor and gentle critic, and quite extraordinarily congenial.
He is survived by his daughter, Chloë, and son, Barney.
Arthur John Robin Gorell Milner, computer scientist: born Yealmpton, Devon 13 January 1934; married Lucy (died 2010; one son, one daughter, one son deceased); died Cambridge 20 March 2010.