Teaching a history of computing

Everything I think about or write about has been done to death, but so what. It's still a thing, so I still write.

This isn't directly about computing in schools, but it is, kind of. I've recently been thinking about how 'ICT' - goodness, I really don't like that term - is being taught, and particularly how it's being introduced. To a ten-year-old. One known to me.

The Cambridge IGCSE syllabus for ICT in 2015 - the examinations due in June and November of that year - makes for an interesting read. The examination is taken as three separate papers, totalling seven hours. Seven hours! If you worked an eight-hour day and took an hour for lunch, how much of the day could you fill writing what you know about 'ICT', let alone answering questions about it? That's two hours of written examination plus two separate practical tests.

It at least sounds comprehensive. Students are expected to have "a working knowledge of HTML code".

Interestingly, the documentation finds it necessary to note that "No marks will be awarded for using brand names of software packages or hardware." - and that "marks will be awarded for relevant answers which relate to new or emerging technology that has not been specified in the syllabus." I'm beginning to feel quite positive about all this.

For this 2015 examination, students will have to "describe recent developments in ICT"; know about "Sensors ... in control and measuring applications"; know the differences between types of CD media, DVD and Blu-ray; "describe a router and its purpose"; "describe different database structures such as flat files and relational tables including the use of relationships, primary keys and foreign keys"; "explain what is meant by software copyright" ... amongst many other requirements.
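
It's worth seeing how little that database requirement really asks for. Here's a minimal sketch in Python - the table names and data are invented purely for illustration - of a relational structure with a primary key and a foreign key, using the sqlite3 module from the standard library:

```python
import sqlite3

# Two related tables, held in memory for the sake of the sketch.
con = sqlite3.connect(":memory:")
con.execute("PRAGMA foreign_keys = ON")

con.execute("""
    CREATE TABLE pupils (
        pupil_id INTEGER PRIMARY KEY,  -- primary key: uniquely identifies a row
        name     TEXT NOT NULL
    )
""")
con.execute("""
    CREATE TABLE loans (
        loan_id  INTEGER PRIMARY KEY,
        book     TEXT NOT NULL,
        pupil_id INTEGER REFERENCES pupils(pupil_id)  -- foreign key: the relationship
    )
""")

con.execute("INSERT INTO pupils VALUES (1, 'Ada')")
con.execute("INSERT INTO loans VALUES (10, 'A history of computing', 1)")

# Join the two tables through the relationship.
for row in con.execute("""
    SELECT pupils.name, loans.book
    FROM loans JOIN pupils ON loans.pupil_id = pupils.pupil_id
"""):
    print(row)  # ('Ada', 'A history of computing')
```

A flat file, by contrast, would simply repeat the pupil's details on every loan row - which is exactly the difference the syllabus wants described.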

The two two-and-a-half-hour practical tests will award marks for - and this is just a small selection of the requirements - "refine searches using more advanced search techniques"; "include different formats of information from the internet"; proof-reading, page layout and formatting; "identify the structure of external data with different file types"; and "export data in common text formats".

Together with this - and remember these are practical tests, not written papers - students taking this examination will have to demonstrate use of external style sheets, links, tables, images and so on.
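
To make that concrete, here's roughly the sort of page I imagine the practical wanting: an external style sheet, a link, a table and an image, and nothing else. The file names are my own invention, and I've wrapped the HTML in a small Python script only to keep the sketch self-contained - a text editor would do equally well:

```python
# A sketch of the kind of page the practical seems to ask for.
page = """<!DOCTYPE html>
<html>
<head>
  <title>ICT practical</title>
  <link rel="stylesheet" href="style.css">  <!-- external style sheet -->
</head>
<body>
  <a href="https://example.org/">A link</a>
  <table>
    <tr><th>Medium</th><th>Capacity</th></tr>
    <tr><td>CD</td><td>700 MB</td></tr>
    <tr><td>DVD</td><td>4.7 GB</td></tr>
  </table>
  <img src="picture.png" alt="An image">
</body>
</html>
"""

with open("index.html", "w") as f:
    f.write(page)
```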

My only worry here is the time spent on formatting printed material and putting together presentations. I don't think it deserves quite the amount of attention it gets - and in 2015, these skills may be even less in demand than today. I'm not saying they won't be of any use at all, just that an 'ICT' examination might not be the place for them.

Let's wind back to the last year of junior school. Ten- and eleven-year-olds, for example. They're four to six years away from the GCSE as it currently exists.

Can we teach all that in four or five or even six years? I think so, yes. The obvious stuff: don't teach what's out of date - and do try to teach what they'll actually be examined on.

But what's troubled me is the underlying stuff about computing, rather than 'ICT'. No, I'm not thinking about data structures and algorithms - I don't think the GCSE is the place for those - but more about the history of computing. History might be the wrong word: foundation, maybe?

You might well ask: what's the point of knowing about the history of computing? Who's taking examinations in that? It's GCSE ICT, after all. History isn't in the syllabus.

I think it should be. For two reasons: it's interesting and useful in itself - and it might offer a better way into the subject than just sitting twenty kids in front of already dated computers and telling them to do bullet lists. The kids who aren't bored would be frightened off.

Let's imagine we have thirty half-hour lessons to fill. This is a class that will eventually head towards the ICT GCSE, but stay with me here - we're imagining. They're ten and eleven. What might those thirty lessons look like?

Let's have most of our lessons without a computer in the room. These kids most likely know what a computer is. They got told off last night for playing too much Minecraft.

Let's start with money, measurement and numbers. Record-keeping and accounts - just what they are, no great detail. Make sure we don't drift into maths too much: we want to stay on 'the use of numbers'. Keep everything about practical use. People buying and selling. People making and counting.

Then a bit on writing numbers: the Indian and Arabic numeral systems, paper and the abacus. Get the kids to make a basic abacus. No screens yet, remember! I imagine the kids are either bored to tears by now, or ... not, I suppose. This is also the place for the slide rule, but an abacus seems a bit easier to demonstrate.

Then early analogue computers, especially why they were used - astronomy, calendars and so on. Also, formal systems like grammars and laws: just the concepts of having written systems and rules to describe outcomes.

The mathematics of the medieval Islamic world, yes - but always driven by why it was being done. What did each advance or discovery give us, or allow us to make? The concepts of cryptography and automation: who was using them, and why.

I wouldn't get into theoretical computing at all. I'm not quite sure how or where that fits in. Maybe much later.

The Jacquard loom - and then we're into the easy bit: 'modern' analogue computers; Babbage and his engine - make a good story of it all, including the bits where it doesn't go well - and methods of storage and instruction. The big bit here is the invention of the general-purpose computer.

Next, Lovelace, and the general concepts of programming: the move from reference tables to programmes that generate the desired outputs. There needs to be a firm focus on applicability.
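
One tiny, anachronistic sketch makes the point - where an engineer once looked a value up in a printed table, a programme simply generates whatever value is asked for. In Python, purely as an illustration, not something for the classroom as-is:

```python
# The old way: a printed reference table, fixed at publication,
# here standing in as a small lookup dictionary of squares.
table_of_squares = {1: 1, 2: 4, 3: 9, 4: 16, 5: 25}

# The new way: a programme that generates the desired output on
# demand, for any input, rather than looking it up.
def square(n):
    return n * n

print(table_of_squares[4])  # 16 - limited to what was printed
print(square(123))          # 15129 - generated when needed
```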

The end bit is straightforward: Colossus, ENIAC, the Internet. Done.