Back to the future - The Ex-Communicator


January 22nd, 2012


10:12 am - Back to the future
People might be interested to read the new computing curriculum which I was talking about last week. It's online here. I am a bit of a connoisseur of computing curricula, because I have been on several working groups which developed them; the earliest was under the previous Tory government in the mid-1990s, before the Internet was significant in education. I have also worked with a range of curricula used in different countries. This one is very old-fashioned, and its stress is on what they call computational thinking.
Computational thinking: a mode of thought that goes well beyond software and hardware and that provides a framework within which to reason about systems and problems... computational thinking influences fields such as biology, chemistry, linguistics, psychology, economics and statistics.

What this means in practice is algorithm-based structured problem solving, with reusable modules or procedures, input and output. In my opinion this is a false model of human reasoning (underpinning Evo Psych, for example). So that is one issue: the poor model of problem solving. If you look at the PDF which I have linked to above, pages 5-10 are a rather brief overview of computing as a discipline, and pages 11-14 are a discussion of computational thinking: modularity, modelling and abstraction. Anyone who has worked in philosophy or psychology will appreciate the problems raised by this section. I could go on at length and I might another time.
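To make concrete what "reusable modules or procedures, input and output" means in a classroom, here is a minimal sketch of my own (this example is not taken from the curriculum document): a small problem decomposed into an input procedure, a reusable processing procedure, and an output procedure.

```python
# A minimal sketch of the "modules, input and output" style of
# problem decomposition the curriculum favours (my own illustration).

def read_marks(raw):
    """Input module: parse a comma-separated string into numbers."""
    return [int(x) for x in raw.split(",")]

def average(marks):
    """Processing module: a reusable procedure."""
    return sum(marks) / len(marks)

def report(avg):
    """Output module: format the result for the user."""
    return f"Class average: {avg:.1f}"

print(report(average(read_marks("60,70,80"))))  # Class average: 70.0
```

The point of the decomposition is that `average` can be reused on any list of numbers, regardless of where the input came from.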

Pages 15-19 are content, and 20-22 are level descriptions (that is, the hierarchy of learning over time). I have seen this all before. It is the typical computing curriculum you see in the developing world, and it was the curriculum in use in the UK in the 1980s: algorithms, truth tables, relational operators, binary numbers, two's complement, logic gates, and a bit of new stuff about protocols and packet switching. 'Write a flowchart to show the process involved in making a cup of tea'. Do me a favour! I have seen this dull exercise presented as a 'clever idea' by every blooming old man who has written a curriculum since the 1980s. After all that fanfare it is stodgy stuff. However, I can write to it very easily; it's really back to the future, putting aside all the uncomfortable modern stuff. Another issue, of course, is that this is not mandatory either in whole or in part, so schools can offer no use of technology at all.

(22 comments)

Comments:


From: wwhyte
Date: January 22nd, 2012 11:41 am (UTC)
Interesting critique. What would a good (or modern) curriculum look like?
From: communicator
Date: January 22nd, 2012 02:34 pm (UTC)
The challenge is to make something which is flexible enough to adapt to unforeseen changes, and yet prescriptive enough that it stretches teachers to go out of their comfort zone. And often those comfort zones are quite narrow. So expanding the horizons of teachers, as a lifelong project, has to accompany the development of curriculum. And there has to be some recognition that programming in C# (or whatever) is always going to be a minority interest, so the syllabus has to be flexible enough to allow some bright kid to stretch herself, but not leave the rest of the class with nothing to do. I know this isn't an answer though.
From: chickenfeet2003
Date: January 22nd, 2012 12:04 pm (UTC)
'Write a flowchart to show the process involved in making a cup of tea'

I remember doing that at school c.1972
From: communicator
Date: January 22nd, 2012 02:39 pm (UTC)
It comes up every damn time.
From: kerravonsen
Date: January 22nd, 2012 12:20 pm (UTC)
Oh heck, two's complement numbers, I remember those. An interesting exercise, I suppose, but completely useless unless one intends to design hardware.

Algorithm-based? Flow charts, for goodness' sake? I was at university in the '80s, and that sort of stuff was on its way out. At least we were taught that there was more than one way of approaching it; to start with the procedure (algorithm-based) or start with the data and what you do with it (object-oriented). And then there's functional languages, which are yet another approach, which some find rather brain-bendy until they can grasp the concept of recursion.
From: communicator
Date: January 22nd, 2012 02:41 pm (UTC)
It is hard to know where to start with computing, and I do think the fact that data is represented as on/off bits is important. But I think binary and hexadecimal maths takes up a lot of time in the classroom, for pretty limited rewards.
From: kerravonsen
Date: January 22nd, 2012 07:52 pm (UTC)
Simple binary maths is probably a good idea, to convey the idea of bits and how that works, but two's complement numbers is a waste of time, IMHO.
Hexadecimal probably isn't so much of a waste of time, because there are quite a few things that are represented by hexadecimal numbers - everything from colours in HTML to public keys in encryption.
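Both ideas in this thread fit in a few lines. The following is my own illustration (not from the curriculum): two's complement is just a rule for reading an n-bit pattern as a signed number, and the HTML-colour point is that a hex colour is three bytes written side by side.

```python
# Two's complement: for n bits, patterns from 2**(n-1) upward
# "wrap around" to negative values.
def twos_complement(value, bits=8):
    """Interpret an unsigned bit pattern as a signed integer."""
    if value >= 2 ** (bits - 1):
        value -= 2 ** bits
    return value

print(twos_complement(0b11111111))  # -1
print(twos_complement(0b10000000))  # -128

# Hexadecimal in practice: an HTML colour like #FF6600 is three
# bytes (red, green, blue) written in hex.
colour = 0xFF6600
r, g, b = (colour >> 16) & 0xFF, (colour >> 8) & 0xFF, colour & 0xFF
print(r, g, b)  # 255 102 0
```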
From: sugoll
Date: January 22nd, 2012 06:11 pm (UTC)
> Oh heck, two's complement numbers, I remember those. An interesting exercise, I suppose, but completely useless unless one intends to design hardware.

I beg to differ, although I grant that it is unbelievably dry, and during a course on computing with approximate numbers, I recall thinking "Why not just buy more bits?" And yet, I've spent many hours re-working the routines that emulate integer division and floating-point operations on processors that don't support them. It all depends on where you end up, of course: this stuff is vital to an embedded programmer (fixed-point fractional arithmetic! Joy!), but utterly irrelevant to someone working on the web with loose typing.
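For readers who haven't met it, the fixed-point fractional arithmetic mentioned here can be sketched in a few lines. This is my own illustration of the general technique (not the actual embedded routines being described): fractions are stored as scaled integers, so integer-only hardware can do fractional maths.

```python
# Fixed-point arithmetic sketch: Q8.8 format, i.e. 8 integer bits
# and 8 fractional bits, so the scale factor is 2**8 = 256.
SCALE = 256

def to_fixed(x):
    """Convert a float to its scaled-integer representation."""
    return round(x * SCALE)

def fixed_mul(a, b):
    # Multiplying two scaled values carries the scale factor twice,
    # so divide one factor of SCALE back out.
    return (a * b) // SCALE

a = to_fixed(1.5)    # stored as 384
b = to_fixed(2.25)   # stored as 576
print(fixed_mul(a, b) / SCALE)  # 3.375
```

Everything in the hot path is integer multiplication and shifting, which is why the technique matters on processors with no floating-point unit.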

But that's just a knee-jerk reaction to the "...hardware" comment. I do think it's worth pupils understanding that computers have limited precision, and what that means when errors crop up. And I really can't imagine anyone getting anywhere in computing without understanding binary. Is twos-complement that much further a step?
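The limited-precision point above is easy to demonstrate to a pupil in two lines (my own illustration):

```python
# Floating-point numbers have limited precision, so "obvious"
# decimal arithmetic accumulates small errors.
total = 0.0
for _ in range(10):
    total += 0.1
print(total == 1.0)  # False: the sum is very close to 1.0, but not equal

# The same limit shows up in a single addition:
print(0.1 + 0.2)     # 0.30000000000000004
```

This is exactly the kind of error-cropping-up behaviour worth showing pupils, since it explains why real programs compare floats with a tolerance rather than with equality.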
From: kerravonsen
Date: January 22nd, 2012 08:00 pm (UTC)
My apologies: design hardware and use assembly language.

I do think it's worth pupils understanding that computers have limited precision, and what that means when errors crop up. And I really can't imagine anyone getting anywhere in computing without understanding binary.
Oh, I agree about binary, and you have a very good point about limited precision.

Is twos-complement that much further a step?
I think so, because there are other, more useful/interesting/relevant things they could be learning instead, in the limited time they have to learn stuff.
From: muuranker
Date: January 22nd, 2012 01:27 pm (UTC)
I honestly hadn't read this when I made my comment about flowcharting!

The trouble with flowcharting is that it is not like Lego. With Lego, you do a bit of work, and you end up with a turtle (or a robot) that does stuff. It's fun. Flowchart making a cup of tea, and you, er, don't even have a cup of tea. Fun it is not.

Flowcharting gets to be fun when you have a process to set up, or to evaluate and re-design, that you can't do without a tool like flowcharting. Children, in my experience, do not set up production lines, and would have no interest in doing so.
From: happytune
Date: January 22nd, 2012 02:05 pm (UTC)
[Splutters into drink laughing]. 'Flowchart making a cup of tea, and you, er, don't even have a cup of tea.'

From: happytune
Date: January 22nd, 2012 02:06 pm (UTC)
Oo - good question above. What's your fantasy technology curriculum?
From: communicator
Date: January 22nd, 2012 02:53 pm (UTC)
Hey I wrote it in 2009, and it got chucked in the bin by Gove. No, that's not true, that was a general use-of-ICT curriculum. I think programming should be a specialist skill, not something imposed on everyone, and young people should be able to choose to develop themselves heavily in any specialism, through a more flexible curriculum.
From: Steve Davies
Date: January 22nd, 2012 04:00 pm (UTC)
My son has just begun GCSE Computing - it's a pilot, and doesn't look incredibly different from this, so I suspect this idea's been knocking about the exam boards/civil service for a wee while. Before this, the incredibly nerdy kids did AS Computing and the rest did "ICT". So this looks like a good route for the kids who are a bit nerdy like my son. It may not be incredibly exciting, but my son finds it *so* much preferable to "make a powerpoint presentation about (insert something boring)" and "make sure everything is filed away neatly", which is the lot of those poor wretches on the ICT course.
From: Steve Davies
Date: January 22nd, 2012 04:04 pm (UTC)
There's also music technology for those with a different kind of geekiness
From: communicator
Date: January 22nd, 2012 04:17 pm (UTC)
I taught GCSE computing in 1987 and it was exactly like this, so there has been no change. It's not that anyone anticipated this, but that this is something very shop-worn.
From: sugoll
Date: January 22nd, 2012 06:17 pm (UTC)
A cup of tea is never a good example, because it's too vague. My two preferred examples for demonstrating algorithmic thinking are:

a) Shuffle a pack of cards. Deal them out in a row, face down. Write out the instructions for sorting the cards into order. You may only look at at most two cards at a time. Someone else will be following your instructions, and they won't be told what the instructions are supposed to achieve.

b) An event occurs. Some time later, another event occurs. You need to determine how many seconds elapsed between the two events. How do you do this?
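Exercise (a) has many valid answers; one that honours the "at most two cards at a time" rule is repeated passes over adjacent pairs, swapping any pair that is out of order (a bubble sort). This sketch is my own illustration of one possible answer, not the only intended one:

```python
# One possible answer to (a): keep making passes over the row,
# looking at exactly two adjacent cards at a time and swapping
# them if they are out of order (a bubble sort).
def sort_cards(cards):
    cards = list(cards)  # don't disturb the caller's row
    n = len(cards)
    for _ in range(n - 1):          # after n-1 passes the row is sorted
        for i in range(n - 1):
            # Turn over cards i and i+1; swap if out of order.
            if cards[i] > cards[i + 1]:
                cards[i], cards[i + 1] = cards[i + 1], cards[i]
    return cards

print(sort_cards([7, 2, 9, 4]))  # [2, 4, 7, 9]
```

The instructions never mention what the goal is, which is the pedagogical point: a person (or machine) following them mechanically still ends up with a sorted row.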

(I now expect communicator to tell me I'd make a terrible, out-moded computing teacher. :-> )
From: communicator
Date: January 22nd, 2012 06:35 pm (UTC)
Sorry, I probably would, because you are operating at a level of conceptual abstraction which is beyond most adults, let alone kids. I don't think exercises like that will help most school pupils to make better use of computers in their lives. I think the basic ideas matter more: how software comes to exist and how it gets onto your computer; where the content on the Internet comes from and how we can see it; when and why you have to connect things together with wires; how information gets from A to B, and in what form it exists.

A young person who leaps over all this and is ready for sort algorithms - great - we need to have better provision for the talented high flyer too.

PS, so I'm not saying you are outmoded but you would be a teacher for kids who are gifted and interested, but not for the mass of 8 year olds or 12 year olds

Edited at 2012-01-22 06:37 pm (UTC)
From: sugoll
Date: January 22nd, 2012 11:06 pm (UTC)
I wasn't suggesting that (a) would be used to teach, say, quicksort. It's more that the concept of "put these cards in order" is simple enough to understand the objective quickly, and that the instructions can be written down in steps. "Make a cup of tea" is too arbitrary; computers are more limited, with a fixed set of operations.

I have to admit, I get confused about the age group you're addressing. I was thinking secondary school.

Mind you, I got my core information from Fred Learns About Computers...
From: sugoll
Date: January 22nd, 2012 11:10 pm (UTC)
Oh, and I really would be a terrible school teacher...
From: espresso_addict
Date: January 22nd, 2012 08:13 pm (UTC)
I don't even understand the question for (b).
From: sugoll
Date: January 22nd, 2012 11:09 pm (UTC)
Take two events, A and B. Doesn't matter what they are. They occur at times T1 and T2 respectively. What's the expression for the number of seconds between T1 and T2?
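In code the answer is just the difference of the two timestamps. A sketch using Python's standard `time` module (my choice of API, not part of the original exercise):

```python
import time

t1 = time.monotonic()   # event A occurs: record its time
time.sleep(0.1)         # ... some time passes ...
t2 = time.monotonic()   # event B occurs: record its time

elapsed = t2 - t1       # the expression: T2 - T1, in seconds
print(f"about {elapsed:.1f} seconds")
```

`time.monotonic()` is used rather than the wall clock so the answer can't go negative if the system clock is adjusted between the two events, which is the sort of edge case the exercise is fishing for.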
