Irish Computer Science Leaving Certificate Curriculum Consultation Update

Last Tuesday I attended a consultation session for the Leaving Certificate Computer Science Curriculum. This is Ireland’s shot at putting CS on the pre-university curriculum, specifically the Irish Senior Cycle – which leads right up to where secondary school and university meet. I am particularly interested in this as I teach, research, and am pretty much obsessed with CS1 – the first programming course that CS majors take at university. I am also teaching this year on a new programme at my university, University College Dublin (with support from Microsoft), that is one of the first (if not the first) teacher training programmes specifically for this new curriculum.

The event was hosted by the National Council for Curriculum and Assessment, and was addressed by the Irish Minister for Education and Skills, Richard Bruton. It was an engaging and lively day of discussion and it was really good to see so many different stakeholders in attendance. I was in one of (I believe) six or more focus groups; in my room we had university professors, industry leaders (including from Apple and Microsoft), current and former school teachers, and a member of the curriculum development team (and I am surely missing a few people here).

There is another consultation event on September 16 at Maynooth University, hosted by the Computers in Education Society Ireland (CESI). The consultation officially closes on September 22, and a final draft of the curriculum is expected soon thereafter.

‘Supercomputing’ in the curriculum

A recent article is calling for supercomputing to be put ‘in the curriculum’. In it, Tim Stitt, head of scientific computing at the Earlham Institute, a life science institute in Norwich, UK, says children should be learning supercomputing and data analysis concepts from a young age.

Although I agree in principle, the article doesn’t specify a particular curriculum, although it does seem to be aimed at pre-university ages. In the article, Stitt claims that current initiatives, such as the new computing curriculum introduced in the UK in 2014 (which makes it mandatory for children between the ages of five and 16 to be taught computational thinking), may “compound the issue”: children will be taught serial rather than parallel programming skills, making supercomputing concepts harder to learn later on. Again, I can agree in principle, but the extent to which parallel programming is actually harder to learn after ‘normal’ sequential programming is debatable, and will certainly vary considerably from student to student.

I have mixed feelings about the word supercomputing. I can imagine someone saying “Really? You are going to teach supercomputing to kids? Don’t you think that’s a bit much?” I couldn’t blame them for being skeptical. The word itself sounds, well, super. Personally I think that High Performance Computing (HPC) is more down to earth, but I concede that even that may still sound a little ‘super’. I have some experience with this. I am one of many who maintain the Irish Supercomputer List. That project didn’t start off as the Irish Supercomputer List, but we changed the name in order to, quite frankly, be more media ‘friendly’. (Side note – there is an interesting discussion on disseminating scientific work to the media here.) Additionally, the Indian and Russian lists also have the word supercomputing in their names and/or URLs. The Top500 list also used the word supercomputing before they rebranded a few years back. Anyway…

So, what we are really talking about is putting Parallel Computing (or parallel programming) in the curriculum, and thereby opening the door to supercomputing, as almost all HPC installations require parallel programming. In fact, the current Top500 Supercomputer List is composed entirely of clusters (86.2%) and Massively Parallel Processors (MPPs – 13.8%). Clusters are parallel computer systems comprising an integrated collection of independent nodes, each of which is a system in its own right, capable of independent operation and derived from products developed and marketed for other stand-alone purposes [1]. MPPs (such as the IBM Blue Gene), on the other hand, are more tightly integrated: individual nodes cannot run on their own, and they are frequently connected by custom high-performance networks. The key here is that in both cases memory is distributed (as are the cores), thus requiring parallel algorithms (and therefore parallel programming). Before switching gears I would like to return to the point I opened this paragraph with – we are talking about parallel programming, not necessarily supercomputing, although learning parallel programming is indeed the essential prerequisite to eventually programming supercomputers.
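To make the serial-versus-parallel distinction concrete, here is a minimal sketch (my own, purely for illustration – not from the article) of the same summation written serially and then split into per-worker chunks, the way a distributed-memory machine divides work across nodes, with Python’s standard multiprocessing module standing in for real message passing:

```python
from multiprocessing import Pool

def partial_sum(chunk):
    # Each worker independently sums its own chunk (its "local memory").
    return sum(chunk)

def serial_sum(data):
    # The sequential version every student learns first.
    return sum(data)

def parallel_sum(data, workers=4):
    # Split the data into one chunk per worker, as a cluster distributes
    # data across nodes, then combine the partial results at the end.
    chunks = [data[i::workers] for i in range(workers)]
    with Pool(workers) as pool:
        return sum(pool.map(partial_sum, chunks))

if __name__ == "__main__":
    data = list(range(1_000_000))
    assert serial_sum(data) == parallel_sum(data)
```

The interesting pedagogical step is the last line of `parallel_sum`: the combine (or ‘reduce’) step simply does not exist in the serial version, and it is exactly this kind of decompose-and-combine thinking that the article argues should be taught early.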

At the university level, there is certainly awareness of the issues at the core of the article I opened this post with. In particular, there are two conferences/workshops that directly address HPC education at university level:

  1. Workshop on Education for High-Performance Computing (EduHPC-16), held in conjunction with SC-16: The International Conference on High Performance Computing, Networking, Storage, and Analysis
  2. The Parallel and Distributed Computing Education for Undergraduate Students Workshop (Euro-EDUPAR 2016), held in conjunction with Euro-Par 2016, the 22nd International European Conference on Parallel and Distributed Computing.

[1] Dongarra, J., Sterling, T., Simon, H. and Strohmaier, E., 2005. High-performance computing: clusters, constellations, MPPs, and future directions. Computing in Science and Engineering, 7(2), pp. 51–59.

Computer Science: More than just programming and telescopes

James H. Morris, former dean of Carnegie Mellon University’s School of Computer Science, had a nice piece in Information Week 13 years ago now that is as true as it ever was.

In it he argues strongly that CS is more than programming, and that students need a strong sense of empiricism.

 The ability to discern a real phenomenon and distinguish it from myth is vital. Our students learn to measure the performance of people using technology.

I couldn’t agree more. As the former head of a small CS department I oversaw a degree program that was particularly strong on networking – in fact our degree was known for this and popular as a result. (It was technically an IT degree.) We had graduates go on to work for Oracle, IBM, and other companies on the strength of their networking experience. It is still the only CS/IT undergraduate program I know of with a networking course in each of its eight semesters. The program is also notable for third-year group projects and fourth-year individual projects; the fourth-year project was equivalent to half of the final-year workload! At the time the program had a conservative rank of 17 out of 225 for compliance with the ACM 2008 IT curriculum, a ranking done using the work of [1] as a guide. The biggest reason for this ranking was the sheer amount of project work – the ACM assigns 0.5 points per semester for a capstone experience (out of a possible 4). The inclusion of the fourth-year module Principles in Professional Practice and Strategic Business IT also increased the mark relative to many of the other programs.

Morris goes on to state that good CS programs include the liberal arts, mathematics and experimental science. I particularly agree with the first, since I enjoyed a liberal arts CS undergrad degree myself. Being at a liberal arts college really allowed me to flex my skills in a diverse environment. As a dual major (CS and physics) I completed my final-year physics project on variable stars, a project that was done almost entirely through a computer – I rarely held my eye to a telescope eyepiece. In fact the stars I was studying normally only became visible on the screen of a computer, too dim to see with the human eye even through a powerful telescope. But it was the support of, and interest in, my project from the community, many of whom were not in a science program, that made my experience so enjoyable.

I agree with the second point, particularly as I did an MSc in Computational Science where my mathematics skills were really put to the test. I also really put my mathematics skills to use in completing my PhD, which was largely done with a pencil and paper, drawing little squares over and over. My entire thesis, including code, would at the time have fit on a floppy disk (not that I ever tried).

I agree with the third point most in my current research lines of high performance computing and computer science education, both areas where (sometimes slightly different forms of) empiricism is absolutely critical.

Morris’s final quote was:

As programmed digital devices continue to shrink in size and cost, many of us predict that the computer per se will disappear just as the electric motor disappeared into hundreds of niches in our homes and automobiles. Then we will have a science named after an artifact no one sees. But the essence of the science will still be there, and those who study it will be rewarded not just with riches but with understanding.

This brings me back to astronomy – studying many things we will never see. And of course that brings me to the famous ‘computers and telescopes’ quote (mis?) attributed to Edsger W. Dijkstra:

Computer Science is no more about computers than astronomy is about telescopes.

Dijkstra did say something similar for sure though (EWD 1305):

I don’t need to waste my time with a computer just because I am a computer scientist. [after all, doctors don’t infect themselves with the diseases they study]

[1] B. M., Neupane, B., Hansen, A., & Ofori, R. (2012). Identifying and Evaluating Information Technology Bachelor’s Degree Programs. Proceedings of the 1st Annual Conference on Research in Information Technology (pp. 19–23). ACM.


Choose your own adventure in computational thinking

A recent post by Andy Ko (here) provided several interesting ideas on literacy and coding, all of which begin with “If learning to code were like learning to write…” This reminded Mark Guzdial (here) of Mike Horn’s work on computational sticker books (here). As Mark pointed out, Mike asks the question, “If computational literacy were integrated into our daily lives, how would parent and child do computation while reading a book at bedtime?” This made me think about Choose Your Own Adventure (CYOA) books.

The Computational Thinking (CT) possibilities introduced by CYOA books are many (from Wikipedia):

The stories are formatted so that, after a couple of pages of reading, the protagonist faces two or three options, each of which leads to more options, and then to one of many endings. The number of endings is not set, and varies from as many as 40 in the early titles, to as few as 12 in later adventures. Likewise, there is no clear pattern among the various titles regarding the number of pages per ending, the ratio of good to bad endings, or the reader’s progression backwards and forwards through the pages of the book. This allows for a realistic sense of unpredictability, and leads to the possibility of repeat readings, which is one of the distinguishing features of the books.

What other CT and mathematical concepts could be learned from CYOA books?

  • Algorithms
  • Determinism
  • Logic
  • Conditionals
  • Randomness
  • Flow of Control
  • Branching
  • Probability
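Several of these concepts fall out naturally if you model a CYOA book as a directed graph – pages are nodes, choices are edges, and pages with no choices are endings. Here is a toy sketch (the page numbers are invented, not from any real title) that uses a depth-first search to ask a very CT-flavoured question: which endings can a reader actually reach?

```python
# A toy CYOA book as a directed graph: each page maps to the pages its
# choices lead to; pages with no outgoing choices are endings.
# (Page numbers are invented purely for illustration.)
book = {
    1: [4, 7],   # "turn to page 4, or turn to page 7"
    4: [9, 12],
    7: [12, 15],
    9: [],       # an ending
    12: [15, 20],
    15: [],      # an ending
    20: [],      # an ending
}

def endings_reachable(book, start=1):
    """Depth-first search: the set of endings reachable from a start page."""
    seen, stack, endings = set(), [start], set()
    while stack:
        page = stack.pop()
        if page in seen:
            continue
        seen.add(page)
        choices = book[page]
        if not choices:          # no choices: this page is an ending
            endings.add(page)
        stack.extend(choices)    # branch into every choice
    return endings

print(sorted(endings_reachable(book)))  # → [9, 15, 20]
```

The same little graph supports the probability and determinism questions too: a reader flipping a fair coin at each branch is a random walk on this graph, and asking whether two readers making the same choices always reach the same ending is exactly a question about determinism.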

What about programming concepts? It wouldn’t take a significant deviation from the traditional CYOA device ‘if this, go to page x; if that, go to page y’ to move towards:

  • Loops, including infinite
  • Recursion
  • Nesting
  • Inheritance
  • Threading
  • Parallelism
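The ‘go to page x’ device is, after all, flow of control, and a page whose choices point back to an earlier page gives you a loop for free. A minimal sketch of a CYOA ‘interpreter’ makes this visible (the pages and choices here are invented for illustration):

```python
# Each page is its text plus labelled choices; following a choice is a
# goto. A choice pointing back to an earlier page creates a loop.
pages = {
    1: ("You wake in a forest.",          {"a": 2, "b": 3}),
    2: ("The path loops back on itself.", {"a": 1, "b": 3}),  # "a" loops to page 1!
    3: ("You find the way home. THE END", {}),
}

def read(pages, choices, start=1):
    """Play the book with a scripted list of choices; return pages visited."""
    page, visited = start, []
    script = iter(choices)
    while True:
        text, options = pages[page]
        visited.append(page)
        if not options:                   # no choices left: an ending
            return visited
        page = options[next(script)]      # the 'go to page x' goto

print(read(pages, ["a", "a", "b"]))  # → [1, 2, 1, 3]
```

A reader who keeps choosing “a” here is executing an infinite loop by hand, and a book-within-a-book (a page that says ‘read pages 10–20, then return here’) would be recursion with a return address – which is not far from how real machines do it.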

Indeed, CYOA books have already been related to a whole range of CS ideas. I really enjoyed a page on CYOA visualizations, which led me to the excellent work of Christian Swinehart on multiple data views of twelve CYOA books – a project that took 13 months to complete, and which relates CYOA books to hypertext, memory access, finite state machines, and even Easter eggs.

It so happens that Jeff Atwood, co-founder of Stack Overflow (and many other things), describes his own decision to leave his job and start Stack Overflow as choosing his own adventure.

Of course, in order to champion CYOA books as a device to instill computational thinking, purists would have to find a way to carefully handle the fact that CYOA books contain liberal amounts of goto statements!

If the above doesn’t convince you that in some respects CYOA books were ahead of their time, this might – they were consciously non-gender specific – over 30 years ago.

For anyone who wants to write their own CYOA, there’s an app for that, which makes many difficult aspects of CYOA authorship (such as tracking down loose ends) much easier.