Monday, August 25, 2014

Reading Robert Macfarlane by the internet

There's an old-fashioned pleasure to reading on a wet August afternoon. Robert Macfarlane's The Old Ways takes you – in imagination – out into the wilder (or indeed not so wild) parts of Britain. What increases the pleasure and depth of the experience is reading with the internet by your side. With the internet we get so much further …

First, the maps – Ordnance Survey if you have paid for them, or Google Maps if not. Walking along the Broomway in (or rather off) Essex takes you to that footpath along the sands, right next to the "DANGER AREA" signs. In the Hebrides, we can find the islands on Google Maps – and satellite view – unnamed, but unmistakable from his descriptions. And then to Wikipedia to see what gannets look like and read about the peculiar anatomy that sustains their deep dives into the ocean, making the story of the gannet that pierced the hull of the boat but kept it plugged entirely believable.

And then on to Harris itself. Trying to trace the walk he makes, we again have topography and named lakes, but no hill names – yet hills cast shadows on the satellite picture, and photographs culled from somewhere (even Street View has been here) show the view from the ground. Then we can look at Steve Dilworth's art and read what Iain Sinclair says about him.

Awe-inspiring.

Cloud Refactoring

A draft review of “Cloud refactoring: automated transitioning to cloud-based services” by Young-Woo Kwon and Eli Tilevich, from Automated Software Engineering, (2014) 21:345–372, DOI 10.1007/s10515-013-0136-9; to appear in Computing Reviews.

Refactoring is the process of changing a program's internal structure without changing its observable behaviour. Even before the term was coined, it was practised by programmers as "program restructuring" from the early 1950s. Refactoring itself came to prominence with work by Opdyke, Griswold and Johnson in the early 1990s, and was popularised by Fowler's 1999 book. Refactoring is done to improve code for a range of reasons: to make it conform to coding standards, to make it easier to read, or to prepare the code for modification or enhancement.
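
To make the idea concrete, here is a minimal sketch of an "extract function" refactoring – my own illustration in Haskell, not an example taken from the paper under review. The behaviour of the program is unchanged, but the duplicated discount rule gains a name and a single definition:

    -- Before: the discount rule is written out inline wherever it is needed.
    totalBefore :: [Double] -> Double
    totalBefore prices = sum [ p - p * 0.1 | p <- prices ]

    -- After an "extract function" refactoring: the same behaviour,
    -- but the rule now has a name and a single point of change.
    discount :: Double -> Double
    discount p = p - p * 0.1

    totalAfter :: [Double] -> Double
    totalAfter prices = sum (map discount prices)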

Whatever the case, refactorings of large code bases can be infeasible without automated – or semi-automated – tools, and many IDEs incorporate refactorings for a variety of languages, although support is perhaps most developed in the IDEs for Java, including IntelliJ and Eclipse. Refactoring "in the small" is the preserve of the developer, and may be done as part of his or her day-to-day development process; larger-scale refactoring is often anticipated, but perhaps harder to justify as part of an aggressive release cycle, unless, of course, there is some concrete gain to be made. What better example could there be of this than migrating an existing system to the cloud?

Taking a system and moving it to the cloud must be the right thing to do: it provides scalability and resilience, and it fits the zeitgeist. However, as the authors make very clear, it is not without difficulties. It is important to preserve functionality – the system should not change what it does – but it should also maintain non-functional properties like efficiency (e.g. latency and throughput), and questions of resilience become more acute in a distributed, cloud-based setting. In common with many other refactoring tool builders, the authors propose a two-pronged approach to the problem: first, an analysis that recommends how a system might be refactored, and then a tool that implements the refactorings identified. The first phase combines static clustering analysis with runtime profiling to identify potential services that can be migrated; the second implements refactorings that make this change by introducing the appropriate interfaces and proxies, at the same time adding fault-handling facilities to deal with the additional risks introduced by moving to a more distributed application platform.
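
To give a flavour of what such a refactoring looks like in code – a sketch of my own in Haskell, not the authors' tool output, with hypothetical names throughout – the candidate service is hidden behind an interface, so that a remote proxy with fault handling can be substituted for the original in-process implementation:

    import Control.Exception (SomeException, try)

    -- A hypothetical service identified as a candidate for migration.
    -- Callers program against this interface, and no longer care whether
    -- the implementation is local or remote.
    newtype PriceService = PriceService { lookupPrice :: String -> IO Double }

    -- The original, in-process implementation.
    localService :: PriceService
    localService = PriceService (\_item -> pure 9.99)

    -- A remote proxy: the same interface, but the call now crosses the
    -- network, so the proxy adds a simple fault-handling policy (one retry).
    remoteProxy :: (String -> IO Double) -> PriceService
    remoteProxy remoteCall = PriceService $ \item -> do
      result <- try (remoteCall item) :: IO (Either SomeException Double)
      case result of
        Right price -> pure price
        Left _      -> remoteCall item

In the paper's Java setting the analogue of remoteCall would be the generated middleware for the migrated service; the point of the sketch is only that the interface is introduced first, and the fault handling lives in the proxy rather than in every caller.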

These refactorings are integrated into Eclipse and available through its standard refactoring workflow, and the work is demonstrated through two small examples and a larger case study performed for GE Research. One of the key questions faced by designers of a cloud refactoring tool and their users alike is how much of the refactoring workflow should be automated. In discussing a number of examples the authors say that "we selected appropriate classes for the reason of the performance, call-by-reference, and meaning of features", thus making clear the role of the domain-aware engineer in the process. It would have been interesting to hear more about the developers' own view of the results of the analysis: were the services identified meaningful to them? Would they have structured things differently from the tool's analysis?

In summary, this paper makes a clear and well-described contribution to what is a fast-moving field: Google Scholar, for example, gives over 1000 references matching the keywords "cloud" and "refactor", and this number can only grow as migrating to the cloud proves to be more of a challenge than its advocates suggest.

Saturday, July 19, 2014

In honour of Robert Wyatt

Picture the scene … you're having an evening in the pub, with the usual music in the background: guitars, 4-4 bass beat, when suddenly it changes: it's a song with more complicated rhythms and dissonant notes. What is more, it's sung in a lugubrious style by someone who sounds like a real person, rather than an identikit American drawl. The singer is Robert Wyatt, our honorary graduand today, and picking music to catch your attention like this has come to be called "Wyatting" in his honour.

Canterbury is known historically for the cathedral, and Chaucer’s Canterbury Tales, but for those of us who grew up in the 60s and 70s, it was also the home of the Canterbury scene – musicians whose careers started out in the city in the late 60s. The scene may have been no more than a group of boys whose families had eclectic record collections … record collections that included 20th-century classical – Hindemith but “nothing German … not long after the war” – and jazz – Max Roach and John Coltrane, perhaps – but those boys went on – jointly and separately – to be a key part of the creative explosion that was the 1960s, and Robert was there at its centre.

Right from the start, Robert was one of a very select group – others are Justin Bieber, Dave Clark, Karen Carpenter, Levon Helm and Ringo Starr – of singing drummers. As he tells it, drumming was something you could do while listening to a Max Roach record, and what's more you could get started by drumming with rolled up newspapers. You could also sing along, too, and teach yourself to be a musician. All of this meant that at school – the Simon Langton Boys School, here in Canterbury – Robert was rather left behind, and instead he became one of the Wilde Flowers, who played in Canterbury and further afield for a couple of years. That is “Wilde” with an “e”, in honour of Oscar – an example of the wordplay that “lets words collide” running through all Robert’s work.

In 1966 Robert helped to form Soft Machine, with David Allen (from Herne Bay, and introduced to Robert as “someone else who has grown their hair long”), Kevin Ayers and Mike Ratledge. Soft Machine never managed to break through to the rock or pop mainstream, but were hugely influential, first for their gentle English surrealism, and later for their cool, stripped-down jazz-rock fusion. They cut their teeth – and sharpened up their act – playing support to Jimi Hendrix on a long 1968 tour of the USA. Robert observes: when you're in front of 5,000 young Texans waiting to hear Hendrix, you don't mess about … it also helps if you don't have a guitarist in the band! This tour brought them American success, which was mirrored on the continent, where their jazz style found a sympathetic ear. In the UK, they became the first rock group to play at the Proms.

Soft Machine broke up and reformed to make their landmark “Third” album – a double LP with four side-long pieces – and finally Robert left them for good. He formed a larger group – Matching Mole – a bilingual pun on “machine molle” / soft machine (groan!), but not long after that, in an accidental fall from a window in 1973, he received injuries that mean he now uses a wheelchair. This took him – in his words – into just another way to be, and it meant that he had time to slow down and think about how to sing. His first record after that – Rock Bottom – sets the pattern for his work since: slow songs, striking tunes and rhythms, usually based on keyboard melodies. A reworking of the Neil Diamond song I'm a Believer – first sung by the Monkees – gave him his first hit, but that is only one side of his work: he's a great musical collaborator, and played with a huge range of people through the 1970s and 80s, from free jazzers like Carla Bley to avant-gardists and more traditional rock stars, often with a political message. His most remembered song from that time is his version of Elvis Costello's Shipbuilding: an elegiac meditation on the Falklands War and its effect on the parts of the country that had been hit hardest by the Thatcher government's policies.

Looking back over Robert's musical career – which started at around the same time as the University of Kent, and shares its 50th anniversary – it is hard to think of more than a handful of musicians who have been able to keep their music as vital and original as it was when they began. His recent work has seen collaborations with the Brodsky Quartet – also honorary Kent graduates – and Brian Eno, and a re-recording of a set of songs by the electronica group Hot Chip.

Recognition for his work has come in many forms: he has been a guest editor of Radio 4's Today programme, he is an honorary doctor of the University of Liège, and he is a petit fils ubu of the Collège de 'Pataphysique. Locally, he is celebrated in a life-size stencil by the street artist Stewey in Canterbury's Dover Street, on the site of the Beehive club where he played at the start of his career.

In its turn, the University of Kent would like to record its appreciation of Robert's work. For his musical achievements and influence over the last fifty years, most honourable deputy vice-chancellor, to you and to the whole university I present Robert Wyatt to be admitted to the Degree of Doctor of Music, honoris causa.

[Text of the oration for Robert Wyatt's honorary degree award, Canterbury, 18-7-14].

Thursday, July 3, 2014

Fifteen facts about EPSRC

Thanks very much to Alex Hulkes for his visit and presentation about EPSRC in general and their ICT programmes in particular. Here are fifteen things that I learned.
  1. EPSRC has about £2.5bn worth of live projects at any time.
  2. “We have to do applied research …” because it is part of EPSRC's Royal Charter.
  3. Terminology: EPSRC has two kinds of theme: “capabilities” correspond to specific research areas, while “challenges” are cross-cutting themes like energy or global uncertainty.
  4. Terminology (2): for EPSRC “interdisciplinary” means belonging to more than one of their research areas.
  5. 75% of PGR funding is DTP plus CASE, and so not subject to particular shaping or direction.
  6. Pathways to impact: either say how you will achieve impact, or say that it doesn’t make sense for your research to have impact (at this point). 
  7. It’s good if you can say how your research project fits in with what EPSRC are currently funding, as long as it’s not a straight duplication of work that’s already funded.
  8. Developing leaders (fellowships): while it’s important to have a good research project, that’s a necessary rather than a sufficient condition: you need to be able to convince that you are a research leader.
  9. These schemes are less popular than earlier fellowship schemes, perhaps because of the difficulty of getting evidence of leadership potential down on paper.
  10. In ICT, EPSRC wants to keep developing and encouraging new research areas. It also wants greater collaboration with other areas.
  11. It’s also keen to get ambitious proposals: of the funding for responsive – rather than directed – research, some 50% of the cash is in longer/larger grants.
  12. Proposals will get at least three reviews before going to panel.
  13. There's no correlation between the number of reviewers and the success/failure of the proposal.
  14. The PI response is important: the factual response has an effect when (and only when!) you can provide evidence that shows that an objection doesn't hold.
  15. Success rates (at least for ICT) are constant across areas set to grow, stay constant, or shrink. On the other hand, the “grow” area has received more applications.

Tuesday, June 17, 2014

Review of "Certified Programming with Dependent Types"


Chlipala's text Certified Programming with Dependent Types is an outstanding introduction to how programs can be guaranteed to be correct, by means of the Coq theorem prover – programs that are, in his terminology, “certified”. While machine-assisted proof has been possible for more than a quarter of a century, it is only in the last five years that a substantial body of fully formal proofs has been delivered. These include mathematical results – such as Gonthier's proof of the four colour theorem – and also results in the domain of theoretical computer science. Indeed, the “POPLmark challenge” (http://www.seas.upenn.edu/~plclub/poplmark/) has set a benchmark for proof mechanisation in language metatheory. So, this text is timely in providing an accessible introduction to the area; but what is it that makes it stand out?

Firstly, Chlipala gives an excellent introduction to the area, explaining the background of the different approaches to theorem proving in systems such as ACL2 and PVS, as well as the Coq system that is the subject of the book. Complementing this is a discussion of the logics implemented by the various systems. He argues cogently for a theory that supports dependent types, under which the result types of functions can depend upon the values of their inputs. Dependently typed systems support a dual interpretation: objects can be seen as values belonging to types, or equivalently as proofs of propositions – the so-called ‘Curry–Howard’ isomorphism. Allowing the interpretations to co-exist gives a type system that can express program pre-conditions, or alternatively a logic in which a functional programming language can be used to build proofs. Coq implements a powerful dependently typed theory that has a functional programming foundation, proof checking in a secure kernel (according to the ‘de Bruijn principle’), advanced proof automation through the tactic language Ltac, and a principle of reflection.
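
As a small, self-contained sketch of these two readings – written here in Lean 4, a dependently typed prover closely related to Coq, rather than in Coq's own Gallina, and intended only as an illustration of the ideas the book develops – a type can be indexed by a value, and a proof is simply a program inhabiting the type of its proposition:

    -- A length-indexed vector: the type of a vector mentions its length,
    -- so the result type of a function can depend on the value of an input.
    inductive Vec (α : Type) : Nat → Type where
      | nil  : Vec α 0
      | cons : {n : Nat} → α → Vec α n → Vec α (n + 1)

    -- head is defined only for non-empty vectors: the precondition lives
    -- in the type, so there is no run-time failure case to handle.
    def Vec.head {α : Type} {n : Nat} : Vec α (n + 1) → α
      | .cons x _ => x

    -- Curry–Howard: the proposition is a type, and the proof is a program.
    theorem and_swap (p q : Prop) : p ∧ q → q ∧ p :=
      fun ⟨hp, hq⟩ => ⟨hq, hp⟩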

Secondly, Chlipala opens the exposition in Chapter 2 with a worked set of examples that concentrate on language evaluators and compilers. Rather than covering all the necessary background first, he plunges into the exposition, giving a survey of what is possible, and saying that “it is expected that most readers will not understand what exactly is going on here”. Does this approach work? Anyone reading the chapter needs to understand a language like Haskell or ML, but with that knowledge it is possible to gain a good sense of how the system is used in practice, and so I would personally endorse it. After all, it's possible to skip on to Chapter 3 and follow a sequential approach if this proves too forbidding.

Thirdly, in contrast to some introductions, the book promises a “pragmatic” approach to proof construction or engineering. This is welcome, since Chlipala acknowledges that Coq is a large system that has grown in complexity over the last twenty years. Does he deliver on his promise? The book is divided into four main sections: the first two cover the fundamental technical material, namely “basic programming and proving” and “programming with dependent types”, in some 250 pages, but the remaining two sections (130pp in total) cover “proof engineering” and “the big picture”, and it is in these two that he is able to cover the pragmatics of proof. He covers not only common approaches, like logic programming in proof search, but also the problems of larger-scale proof, such as the evolvability and robustness of proofs. This is forcefully communicated through a set of “anti-patterns” that militate against well-structured and evolvable proofs, together with strategies for avoiding them.

The book doesn’t contain exercises, but these can be found on the book website, contributed by readers. The website also provides access to the full text of the book, the Coq code for all the chapters, and supporting libraries of code and tactics. It is very clearly written, and the author has a direct, approachable style. If you want to find out more about using the Coq system for building realistic, large-scale proofs, particularly for certifying programs, then I recommend this book highly.

Saturday, May 10, 2014

PROWESS mid-term workshop

I'm just back from a trip to Borås in Sweden, where we held a one-day workshop for the PROWESS project to showcase what it has achieved, just over half way through the project. The aim of PROWESS is to bring the benefits of property-based testing to the testing of web services. We're a consortium of universities from the UK (Sheffield and Kent), Sweden (Chalmers) and Spain (A Coruña and UPM); the Swedish research organisation SP; and SMEs from those three countries too: Erlang Solutions (UK), Quviq (Sweden) and Interoud (Spain).
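
For anyone who hasn't met property-based testing before: instead of writing individual test cases, you state a property that should hold for all inputs, and the tool generates random inputs looking for a counterexample. Here is a minimal sketch in Haskell QuickCheck – the Erlang QuickCheck used in PROWESS follows the same idea, extended to stateful models of systems such as web services:

    import Test.QuickCheck (quickCheck)

    -- A property stated over all lists of integers: reversing a list twice
    -- gives back the original list. QuickCheck generates random lists and
    -- reports a (shrunk) counterexample if the property ever fails.
    prop_reverseTwice :: [Int] -> Bool
    prop_reverseTwice xs = reverse (reverse xs) == xs

    main :: IO ()
    main = quickCheck prop_reverseTwice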

As well as attendees from the project we had more than twenty others from companies in Sweden, Denmark, the UK and the Netherlands and universities in Denmark, Sweden, the UK and Brazil. The day went very well, with project members like myself being pleased to see how the separate strands of the work are coming together, and that the interactions we're having informally are turning into practical tools.

More importantly, we got positive reports back from the external attendees too, who were able to give us really helpful suggestions about how we could extend and build on what we had done. The prize for the coolest talk has to go to Benjamin Vedder, whose demo of fault injection for a quadcopter using a QuickCheck model stole the show. [OK, the fault injection only happened in a simulation – to the relief of the front row – but it was still fun to see QuickCheck in Erlang used to inject faults into C++ embedded systems!]
[Photo: Benjamin's quadcopter]

The morning programme gave an overview of the project and of property-based testing in QuickCheck, as well as introducing the VoDKATV platform from Interoud, which provided a case study running through the more detailed presentations in the afternoon.
The afternoon gave participants a chance to talk in more detail about specific technical innovations in the project:
  • Inference of state machines from QuickCheck traces – Kirill Bogdanov, University of Sheffield.
  • Automating Property-based Testing of Evolving Web Services – Huiqing Li, University of Kent, and Laura Castro, University of A Coruña.
  • Fault injection – Benjamin Vedder, SP.
  • More-bugs – how not to find the same bug over and over again – Ulf Norell, Quviq.
  • A Property-based Load Testing Framework – Diana Corbacho, Erlang Solutions Ltd, and Clara Benac Earle, UPM, Madrid.
  • Smother: Extended code coverage metrics for Erlang – Ramsay Taylor, University of Sheffield.
  • Automatic complexity analysis – Nick Smallbone, Chalmers University.
The full programme for the day, including talk abstracts, is here; and the slides for the presentations are here.

I'd recommend this kind of open day event to anyone planning a complex research project, as it gives you all a chance to get some invaluable external insights into the work that you're doing. It's also a real fillip to the project to see the tangible progress that has been made, and also to enthuse project members to do even better in the second half of the project.


Tuesday, March 25, 2014

Writing your thesis and preparing for your viva

Writing your thesis


The first question to ask yourself is "who am I writing for?" OK, you're writing for yourself and your supervisor, but you and they know the work too well. Think instead of the examiners, and of others who will read it now and in the future.

Sticking with the examiners for the moment, you should realise that the examiners want to pass you: you make it simpler for them if they can find all the evidence they need as easily as possible. In fact, this is just as applicable when you're writing papers in general, and particularly when you're drafting a research grant application. So, here are some specific points:
  • write it as you would say it: that means keep the language straightforward … just because it's written, there's no need to use longer words, or more complicated constructions;
  • if in doubt, include it: it's not a problem to read something that you already know (within reason) but it is a problem to come across a TLA that you don't understand;
  • give a clear statement of the contribution of the work: what have you done that goes beyond the state of the art (which you will explain in a "related work" chapter)? It's also worth saying where to find those contributions in the thesis (by giving section cross-references);
  • if some of the work has already been published in jointly authored papers, it's very good for you to give a clear statement of your contribution to the papers;
  • make it easy to read the thesis by including the right signposting, which means that anyone reading the thesis knows precisely what you do, where; in particular, I'd advise you to include:
    • an overview of the whole thing in the introduction
    • a summary at the beginning of each chapter, section by section
    • and a conclusion at the end summarising the contribution of the chapter
To get another perspective on what to aim for in writing, you can look at the criteria for the CPHC/BCS Distinguished Dissertations competition, which aims to reward the best theses in CS in the UK. This prize looks for
  • whether the thesis makes a noteworthy contribution to the subject, and what that contribution is;
  • the standard of exposition;
  • how well it places its results in the context of computer science as a whole; and
  • how easily a computer scientist with significantly different interests would be able to grasp its essentials.
As you're writing, there are people who will give you feedback. At Kent someone from your supervision panel will help, as well as your supervisor; it's worth checking early on who from your panel will do this.

A cardinal rule in getting feedback is to ask for feedback only on things that are final, albeit final first drafts. There's nothing worse than giving feedback on something and then being told "oh yes, I was going to do that" … it should have been done (preferably), or at least there should be some note to the effect that it's still to do. Why do I say this? The first reading is always the most effective … when you read for a second time, it's inevitably going to be shaped by the results of the first read.

Finally, make sure that you've fixed the problems pointed out first time before you ask someone to read it again!

Preparing for your viva


First, make sure you check when it is, and keep in touch with the internal examiner and supervisor about what they need you to do specifically to prepare for the viva.

Candidates are often (usually?) asked to give a summary of the main contributions of the thesis at the start of the viva: this could be a 10-minute oral presentation (no slides) or (sometimes) one with slides … either way, it's worth being prepared, so check with your supervisor about this.

Read the thesis again in the days before the viva, so that you're familiar with what is where … it will probably be a few months since you submitted it, and so it will have gone out of your mind. While you're reading through, it makes sense to keep track of any typos that you find, so that you can fix them in the final version. You can go armed with errata lists, either personal or to share with the examiners, that you compile while reading it.

Will your supervisor be present? The Kent rules are that they are only there with your explicit permission, and this is entirely up to you. At other universities the rules might be different, and other people might be there – an independent chair of the proceedings, for example – or the viva might be recorded. It may be that you would like to invite your supervisor to be present after the viva is over, if there are corrections to be discussed once the decision has been made.

Another way of preparing is to give a seminar to your research group: that will help you think about what your contribution is, and also how the parts of the thesis fit together, something that may well be a question in the viva itself. I've heard of people having a mock viva, but I'm not sure that's such a good idea … each viva will be different, and preparing by reading and giving a seminar should be enough.

Wear something comfortable … and take some water with you.

Probably you'll have access to a whiteboard, but just in case not, it's useful to take along some paper in case you want to draw diagrams or otherwise explain things visually or symbolically.

Once the viva gets going,
  • think before you answer … it's no problem to consider what you want to say before you say it;
  • if you don't understand a question, ask for clarification … if it's not clear to you then it may well not be clear to the other examiner either, and
  • if things seem to get stuck in a discussion of a particular point, politely ask to move on.
Finally, some common questions that get asked,

  • as I said earlier, examiners often begin by asking you to give a summary of the contribution that your work makes,
  • and this might be followed by going into some more depth about how the bits of the thesis fit together – it is supposed to be a "body of work" and it's worthwhile thinking about how you would explain that in your case;
  • other questions might ask about your knowledge of work by others in this field,
  • or get you to talk in detail about "your argument on p34" or "your proof on p31" … it's not all high-level and context, but will engage with the details of what you have written, and how you justify it.
So, good luck with the writing and the viva itself!