I’ve always been good at exams. From a pretty early age, I was into quiz books and puzzles, and got used to reading the question and getting a good idea of what an answer would look like. I was even good at programming exams: in my first year at St Andrews, the only marks I dropped in CS1002 were because I forgot to staple my homework together and one of the pages got lost.

I’m currently helping several students with programming exams. Exams that take place on paper. Without computers. Without Google. Without testing or debugging. Where you solve the same tedious problem as everyone else. Where you get docked marks for syntax errors.

Now, I have an academic background. I worked as a researcher for nearly a decade, writing involved software suites. I then went to work for a proper computer firm and found that my experience of programming was all but useless in the real world. Some of the skills were useful - solving problems and swatting bugs - but I knew nothing of version control and very little about testing. Seven years of hands-on experience counted for virtually nothing, because it wasn’t how grown-ups program computers.

The programming exams my students are taking aren’t even how toddlers program computers. Nobody (or at least, nobody with any sense) has written code on paper since about 1980. You get hold of a computer, you hack something together, you see if it works. When you get an error, you fix it. When you get stuck, you look it up on Google.

In fact, that’s not even where real programmers start. Real programmers start with Google to see if anyone else has already solved the problem.

It’s no wonder students look at their programming modules with a mixture of dread and bafflement: they’re so far removed from how anyone actually programs as to be completely pointless.