"Children are being forced to learn how to use applications, rather than to make them. They are becoming slaves to the user interface and are totally bored by it"
That's a remarkably smart thing for a politician to say about computing education.
Sadly, this will all be reversed in a couple of years by the next minister who will believe that "Children should be taught the essential workplace skills that employers are demanding in the 21st century. Driving schools don't teach their students how to build a car, but how to use one; and schools shouldn't teach their pupils how to build software, but how to use it."
[note: I don't believe this myself, just that the next minister will.]
But could anyone seriously claim that children are leaving school and not able to use computers?
I'd guess that the average teenager would be more capable with e-mail and office applications than students of 5 or 10 years ago. Even if they aren't using them regularly, they're at least familiar with most IT concepts.
You need to be very careful about things like this; you're making big assumptions that could have huge consequences.
What do you think the computer literacy rate is for children who don't have computers at home? It's easy to assume that computers are ubiquitous and everyone uses them all the time, but that's far from true.
The children most likely to be computer illiterate are also the ones from the most financially and educationally disadvantaged backgrounds. If you cut them off from getting an IT education at school, what do you think that does to their employment and social mobility prospects?
You can't just design educational systems aimed at the "average" student; you need to cater to every student.
> I'd guess that the average teenager would be more capable with e-mail ... than students of 5 or 10 years ago.
I suspect the average teenager today hardly uses e-mail at all. Newer, more efficient methods of communication have developed, and e-mail is an anachronism to most young people in the era of texting, Facebook messaging, and tweets.
Whether swapping a powerful, flexible infrastructure for viewing the world in 140 characters or fewer is progress in a healthy direction is a different question.
Then again, we haven't managed to get secure e-mail standardised, solve the spam problem, or reliably prevent malware transmission over e-mail, despite having several decades to work on these problems. I don't think we in the older generations are in much of a position to lecture anyone on how to communicate effectively.
I'm hoping that if this is a genuine effort to teach some real computing skills to younger people, part of it will include some exposure to these sorts of issues, so perhaps they can do a better job of it than we have.
I suspect that you're wrong. My experience seems to be that pupils of secondary age and onwards are well accustomed to using multiple media for various use cases. For talking to friends or family? Facebook or text. That work assignment that their teacher is distributing to everyone? They'll probably send an email.
I suspect the average teenager today hardly uses e-mail at all.
Yes -- I was surprised (and somewhat weirded out) recently to discover that my younger sister, who's in her early twenties, thinks of email exclusively as a formal communications channel, and never emails her friends (she texts or Facebook-messages them instead).
I'd never heard it in quite those terms before, but that echoes the way that my generation might consider e-mail to be for talking to friends and physical letters to be "formal".
This is true; there are a lot of differences between a driving school and a normal school.
One is supposed to set you up with life skills, whereas the other teaches you one specific skill well enough that you probably aren't going to kill anyone within a week of getting behind the wheel.
I would argue a school course focused around cars for example would contain a component on basic mechanical skills as well as the workings of a combustion engine.
I wouldn't advocate not teaching any applications software at all but this could be taught alongside other work such as writing and communications skills.
I also wouldn't teach "software development" as such, at least not in the sense of having exams on J2EE or Android development. I would try to focus on teaching that computer software is malleable; many people today see software as a magic black box that only some sort of sorcerer can create, even with the web. When I first got online, all of the internet-savvy kids in class knew at least basic HTML and built their own websites by copying chunks of script from various places. Now many people's idea of expressing themselves online is using the discrete boxes provided by Facebook and Twitter.
I would teach basic Python (starting with the REPL, and perhaps using PyGame or Panda3D in the advanced classes, although possibly through a simplified wrapper library). I would also teach HTML/CSS plus simple JS and show students how to create a website where people could download their programs.
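To give a flavour of what I mean by "basic Python starting with the REPL", here's a minimal sketch of the sort of first exercise I have in mind (assuming only Python 3 and its standard library; the game and the names are just placeholders, not a proposed syllabus):

    # A classic first program: guess-the-number. Every line can be tried
    # at the REPL first, then saved as a script once the ideas click.
    import random

    def guessing_game(upper=100):
        """Pick a secret number and let the player guess until they find it."""
        secret = random.randint(1, upper)
        guesses = 0
        while True:
            guess = int(input("Guess a number between 1 and %d: " % upper))
            guesses += 1
            if guess < secret:
                print("Too low!")
            elif guess > secret:
                print("Too high!")
            else:
                print("Got it in %d guesses." % guesses)
                return guesses

    if __name__ == "__main__":
        guessing_game()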
Databases would be introduced later as a means of persisting data from other programs they had previously created.
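To make the persistence step concrete, this is roughly what I'd picture (a sketch only, using the sqlite3 module from Python's standard library so there's no server to set up; the table and the example scores are invented for illustration):

    # Persisting results from an earlier program (e.g. the guessing game)
    # in a local SQLite database file.
    import sqlite3

    def save_score(name, guesses, db_path="scores.db"):
        """Record one play of the game."""
        with sqlite3.connect(db_path) as conn:
            conn.execute(
                "CREATE TABLE IF NOT EXISTS scores (name TEXT, guesses INTEGER)"
            )
            conn.execute("INSERT INTO scores VALUES (?, ?)", (name, guesses))

    def best_scores(db_path="scores.db", limit=5):
        """Return the lowest guess counts recorded so far."""
        with sqlite3.connect(db_path) as conn:
            return conn.execute(
                "SELECT name, guesses FROM scores ORDER BY guesses LIMIT ?",
                (limit,),
            ).fetchall()

    if __name__ == "__main__":
        save_score("Alice", 7)
        print(best_scores())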
Even if we did not create many professional programmers, I would hope that students would be able to more easily identify problems which could be efficiently solved by computers, and also identify appropriate tools to solve them with.
Unfortunately, I really agree with this analogy. I can see this move to programming being beneficial for England's IT sector but I don't think it will be popular when implemented in schools. I hope it is though! Many students will really love learning to code but I can see a far greater proportion of others taking a real dislike to it. Overall, I guess it depends on the methods in which it is taught.
That's a remarkably smart thing for a politician to say about computing education.
I had to check, but that quote was actually from Ian Livingstone.
I'm very pleased he is so heavily involved. In fact, I don't think any of this would be happening without the backing of industry leaders and their complaints. Kids and teachers themselves have been complaining at least since 2005 when I was exposed to "IT education" in school but that obviously wasn't sufficient reason to take action.
It certainly is. Not that I think it's derivative, but to some extent it is what Douglas Rushkoff argues in his book Program or Be Programmed (2010):
"In a digital age, we must learn how to make the software, or risk becoming the software. It is not too difficult or too late to learn the code behind the things we use — or at least to understand that there is code behind their interfaces. Otherwise, we are at the mercy of those who do the programming, the people paying them, or even the technology itself."
I don't agree with everything in the book but it's nevertheless a great read.
I'm from England. I left school in 2008 with 3 GCSEs (which is considered failing high school here), and the absolute biggest problem with education here in England is that it's not education, it's just doing work and remembering information -- I'm not sure if this affects other countries; I would assume so, but I've not experienced any other education system.
I had an absolutely fantastic IT teacher. He really cared about technology and teaching, and he was the best teacher I've ever had, but he was very restricted by the curriculum. This was a teacher who enjoyed programming and could teach it; he could teach complex (relative to the class's intelligence level) computing things in a way we could understand, but the material he was forced to teach was awful, absolutely awful. For my entire last year of school, my required work was (if I remember correctly -- and this is how awful it was, I do not remember it well today) to write an explanation/brief about a Microsoft Access database system. The extent of our freedom was to choose the subject (DVD store, grocery store etc.), and this was an entire year's worth of work.
What we need is two things: teachers who really care about teaching and providing a real education, and a good curriculum that promotes learning, not consumption (and subsequent regurgitation) of information. In the last 4 years of high school I really did not learn anything that is useful to me today; everything I need to know today is stuff I was taught early on. The last 4 years were being given information to consume, information that was absolutely worthless as soon as I left the classroom.
The IT teacher I had left the school after a year to teach in Switzerland, I think that's pretty telling. The system we have pushes good teachers away and the shitty ones who are just content reciting material stick around, I would also assume relevant to this point is a teacher I had of Spanish is still with the school, but he is such a bad teacher they re-assigned him after many years to "cooking". Someone the school deemed qualified and able enough to teach Spanish for many years is now using his full power to teach cooking?
Most of what you say is spot on, but I take issue with your suggestion that "a good curriculum" will solve the problem. What is needed is /no/ set curriculum; it is in the nature of prescriptive documents to become fussier and more prescriptive over time (instruction creep), and it is only a matter of time before even the best-written curriculum devolves into a nightmare document that specifies exactly what lesson must be taught on exactly what day of the year, and hell mend any teacher who dares ignore it. (This is one of the things about the system which pushes the good teachers away.)
A good approach is that used by the Schedules of lecture courses at Cambridge: they are 'minimal for lecturing and maximal for examining'; they are also very short (about two decent-sized paragraphs for a 24-hour course). Obviously university lecturers can work out more for themselves than school teachers can - but that may well be because school teachers are not /trained/ to work anything out for themselves.
It may also be a good idea to get rid of standardised examinations; the exam-board system is a mess, with boards competing (essentially) on who can make their exams the easiest and thus attract more business from schools who want to move up the league tables. Employers know how to compare, say, a 2-2 at Cambridge to a First at Essex; why should the same not apply to school-leavers' results?
My IT teacher was similarly skilled and equally wanted to teach usefully and interestingly, and he generally managed to get round the curriculum.
Part of this was that he managed to bend the curriculum to be more challenging and interesting (my GCSE project consisted of researching and then writing up methods of bypassing the school's internet filter system - those who preferred to could indeed do stuff like MS Access projects), and part was finding ways to achieve good grades quickly leaving plenty of free time for other areas.
Possibly the reason he was able to do this was that I went to a private school, I don't know if state schools would allow teachers this much flexibility? That said, my school was/is ranked very highly in GCSE/A-Level rankings tables, and took that very seriously, so as far as grades were concerned he was still on a very tight leash.
He was a very unusual (in a good way) teacher though, like none I ever had before or have heard of since. In his office (connected to the IT rooms) he had a fully stocked minifridge, originally for himself, but he also sold to his students at cost. He was formerly a programmer for the Canadian air force, as well as a DJ, and would generally play techno music through classes (often loud enough for teachers above to take issue), and was regularly getting in trouble with the headmaster for wearing an earring. Unconventional, but the best teacher any of us ever had.
Let me compare with the 1980s in England, because that's what I know ...
I sat 'O' levels (now replaced by GCSEs) in 1981 and 'A' levels in 1983. There was no IT teaching in my school at that time. A computer lab was installed in 1980, and they began running the 'O' level computer science exams in 1983 (too late for me), although I got to use the lab's Apple IIs and Systime 525 minicomputer during my 'A' level in general studies. Teaching focused on programming; there were no office apps on those computers. The 'O' level in CS was recognizably about computer science, albeit at an embryonic level, in those days.
In 1987 I began studying for an 'A' level in computer science at evening classes (I'd been at university from 1983-86). The focus was fairly hardcore programming, with some assembly language plus BBC Basic, plus boolean logic, binary and hexadecimal arithmetic, basic data structures (linked lists, trees, hash tables). Applications barely got a look in.
I bailed on the CS 'A' level as too pedestrian and instead went into a graduate-entry conversion degree in 1989. The then CS 'A' level was, I will concede, useful preparation for a CS degree back then.
The point is, the 1980s syllabi had zero content on application usage -- it was all about data processing and programming and the structure of computing machinery. The trend towards teaching IT seems to have cut in some time after 1988, and gone way too far ...
Our Design/Technology teachers took a similar approach - we had a checklist of "How to get 100% in your coursework by ticking the boxes" parts to rush through, and then a real focus on building something cool and learning DT properly. Again, a private school.
My school was the complete opposite to yours then. It's in a poor area, and the year before I sat my GCSEs the 5 A–C pass rate for the school was under 20%; the only 2 lower schools in the county were special education schools. I noticed that last year they were celebrating that they'd achieved something like an 80% pass rate, but I suspect they're just focusing on getting people to pass now (the year I did GCSEs they offered a single test -- well, one maths, one English -- that was the equivalent of 1 GCSE, I assume to boost averages).
This is really great news. Lots of people in the UK have been pummeling on Michael Gove to make this change and it's good to see that it's happening. With Raspberry Pi and it being Alan Turing Year we can be hopeful about the future of computer education in the UK.
This makes me slightly too happy. I lost less than 1% in Computing A-Level and am on track for a 1st at university, yet I barely scraped a B at GCSE ICT and have memories of writing roughly a sentence an hour. Hopefully this will raise university standards as well; the ability to program in the slightest is not a requirement at most UK universities because not every school even offers Computing A-Level.
I consider ICT a remnant of times when not everyone had their own computer, and teaching them just to use one was a genuine skill. And if I ever hear of hierarchical marking again I will murder someone (you can't achieve the hard marks without first achieving the low ones).
Not everyone does have a computer in the UK, many poorer families will have a gaming console and possibly some mobile phones.
Even in families with computers, it is possible that a few years into the future most home computers as we know them will be replaced by ones that do not have any professional productivity software and are not user programmable.
Assuming you're studying computer science at university, then programming is also not a requirement because it's simply not necessary to know before you start the degree.
It isn't currently a requirement, but I assume that this is largely because there isn't a significant supply of potential students who know the fundamentals at the moment.
Increase the supply, and some courses/universities have the option of making it a requirement.
Performance at A-Level mathematics is a better predictor of CS success than performance at A-Level Computing (which does involve programming), hence it's unlikely that universities will make it a requirement.
I see it as a failing of A-Level Computing more than anything. But I found about half of the first year was wasted on me. To break this down:
COM1001 - Software Engineering Crossover Project
UML, requirements, analysis, design etc.
All covered in A-level computing.
COM1002 - Foundations of Computer Science
Maths, not covered at A-level in any subject.
COM1003 - Java Programming
I struggle to remember which concepts were covered at A-level and which I taught myself, but sitting through lectures which taught what a variable is, then what classes and objects are, was painful. There were some golden parts that were enlightening, but these were few and far between.
COM1004 - Web and Internet Technology:
Cryptography - not taught at A-level.
Ethics, law, piracy - taught at A-level.
COM1005 - Machines and Intelligence:
The first semester of this was writing about AI because we 'didn't know how to program'.
The second semester was learning systems which was very enjoyable and quite complicated.
COM1006 - Devices and Networks:
Anything that was taught at A-Level (boolean logic etc.) was extended upon, and there was little time spent on basic stuff.
If they had made Computing A-level a requirement I'd have gained just shy of 6 months' education, I reckon (£4500 for future years), and if Computing A-level were extended and improved, they'd be able to teach even more interesting topics. I think the requirement of Mathematics (which was required at my university, but not all) just shows a general aptitude in the appropriate areas, and it only had direct relevance to two modules (one in first year and one in second year).
Any course with too many applicants for its spaces (the top Universities) could require Computing A-Level if it became more mainstream and I believe this would increase the standard of education possible.
EDIT:
To bring my point back round to the article: this news makes me happy because a move from ICT towards Computing at GCSE would likely increase the numbers taking Computing at A-level and definitely increase the quality of the A-Level. The number of people taking it and the quality of teaching at that level would hopefully remove some of the simpler concepts that universities are having to teach.
The problem with making something like computing (or any non-standard) A level a requirement is that it immediately limits people who couldn't study it. Not all schools offer computing (or other less popular A levels), and by making it a requirement you're penalising those who had to attend them. In practice, that means rural and less-well-funded schools.
There obviously have to be some prerequisites. Perhaps the solution would be a summer school for students without A-Level Computing. Most universities offer a foundation year for people who want to study engineering or maths-related subjects but don't have the A-levels required. However, given that you only need to cover under one semester of work, you'd be pushed to convince people that because they didn't study a Computing A-level they have to add an entire year to their degree.
Probably the best option would just be to make it a strongly recommended subject, and cover the basics in less time. If you don't have Computing A-level, you need to work harder.
The reason that Maths A-Level is used is that it measures abstract thinking ability which becomes important for more advanced CS.
What my university (Bristol) did is that the introductory units had two variants -- one which assumed programming experience and one which assumed none -- so that students could self-select based upon experience, which is probably a better solution.
Most degrees actually have a period where they're bringing everyone up to the same level (be it Physics, Psychology or Economics) because there's a lot of variation in school level teaching and that's always going to be there unless the government starts micromanaging every school lesson.
This is true. Additionally, however, at my university I was told by the department admissions head that it isn't a requirement because it limits your set of applicants. The university is interested in people who haven't had the exposure to programming, but are keen and interested. A fair amount of my friends in first year weren't very well versed in programming, but found discrete mathematics very enjoyable.
Let's not forget that programming ability isn't the end goal of a computer science degree.
The school I went to in Scotland only offered computing up to a level known as "Intermediate 2", which was somewhere between a GCSE and an AS level. We did some Java programming, and it was quite an interesting class that provided enough of a basis in basic programming concepts that I was able to continue learning, but that was it.
Higher computing? You had to get a 30-minute bus to a local adult-education type college. Advanced Higher computer science? I wouldn't be surprised if fewer than 100 people a year actually sit that exam. I did Advanced Higher maths, and only 400 people sit that exam a year.
The Scottish curriculum is often quite good (AH maths taught me the first year and a half of my engineering degree's maths modules), but it suffers from a lack of students taking it. It's poorly recognised outside Scotland (AHs are generally harder than A-levels, yet are seen as being on the same level by most English universities), and the private schools that would give it some clout almost all use the English curriculum.
Absolutely glad to see this; however, there are some serious concerns, which are addressed at the bottom of the article:
"There are, of course, significant challenges to overcome, specifically with the immediate shortage of computer science teachers."
When I was an ICT teacher, I was the only teacher I met who had any programming experience whatsoever. A large number of these teachers would even struggle with GCSE-level maths, and I find it difficult to see how the government could train all these teachers to a level necessary to teach any programming skills. Unfortunately, those people who are qualified very rarely enter teaching.
Anyone who's studied Maths/Physics/Engineering will have had some programming experience, so I'm guessing most Maths teachers will. Whether they want to teach it is another question though.
In case anyone from outside the UK is thinking "Wow, those Brits are doing really well here, I wish my country could take such a sensible approach to computing education" -- well, you're kind of right, it's an enormous improvement, but I feel it needs to be made clear just how bad the situation is right now, before the changes come into effect.
I'm a geek who now has a degree in CS, and at school IT lessons were pretty much my least favourite lesson apart from Games. We didn't learn anything. That's not hyperbole; I really doubt anyone learned anything in those lessons. And it wasn't that I was so amazing I knew it all already -- everyone knew it all already. I've been in lessons where I already knew the content, and what happens is people get stuck and you can help them out. That never happened; everyone already knew what they were doing, because we already wrote our essays in Word, did our class presentations in PowerPoint and our experimental data collation in Excel. We knew it already, and the exercises were just asinine.
Everything I knew about CS before I started my degree, I taught myself.
Also, the school I was at was officially an "IT/Maths specialist school" or something like that (edit: https://en.wikipedia.org/wiki/Mathematics_and_Computing_Coll...), and in the year I graduated it won The Sunday Times' State School of the Year award.
I was in the first intake of the "new" style A-Level system, where 4 or 5 AS-Levels were taken in year one, then 3 A-Levels taken forward to year 2.
I dropped CompSci at AS-Level after seeing what a joke it was (took English, Maths and Physics instead), although I had never really done any programming at all. Just HTML.
One of the best/worst things about the "new" system (I say new, as it was new when I did it, but it's now a decade old, of course) is that it introduced "Key Skills" classes, where you had to learn literacy, numeracy and IT.
Apparently, while English and Maths got me out of the first two, CS wasn't good enough to get me out of doing mail merges. So I didn't go anyway, and my tutor would be sent nastygrams every so often saying I was failing by not turning up. My tutor's response? "Oh, I put them in the bin."
Before I dropped ICT, a project in year 9 involved designing a database. It's usually taught in Access; I did it in MySQL. I received low marks because of that.
Yes, and I assume it was to be assessed with what the teacher, and the curriculum, were instructed to assess.
Just because you feel MySQL is better doesn't mean you should get bonus points.
I agree using Access to teach database design is stupid, but that's hopefully something that this will address. Going against the curriculum because you feel you're above the instructed way is a fantastic way to get low marks.
Fortunately when it comes to university there's usually flexibility if you want to go above and beyond the minimum, but you still have to tick the boxes of being able to meet their assessment criteria.
Edit: And what happened in that article is what should've happened, in my opinion. You don't get bonus points for being a smart arse. IT teaching is woefully poor, there's no way that teacher could assess a proper app when the original assignment proposed a mock-app in PowerPoint.
All I can say is, why has it taken them so long?
Both the problem and the solution have been obvious for /years/.
However, I'm glad the gov't have finally obtained some clue on this issue.
I suspect that there are plenty of people qualified to teach programming, it's just that most people currently teaching ICT are not qualified to teach anything (if the idiots I was taught ICT by are at all representative).
If the government dumped less shit on the head of teachers in general, perhaps they might be able to attract competent programmers away from industry and into the teaching rôle. (They would also attract competent people in general, instead of the current situation where about 5% of teachers are competent people who love teaching and haven't quite been driven away yet, and the rest are "those who can't, teach")
There are certainly people willing & able to teach programming, but most of them are not currently teachers, and many IT teachers would probably struggle to teach programming effectively.
The article suggests that they want to make this change in the next academic year, so where are the teachers going to come from?
They could let those people become teachers without having to go through the rigmarole of largely useless training. The next academic year is nearly eight months away; that's /bags/ of time.
The barriers to entry to teaching are just another reason why so few good people go into teaching.
It's a bit unfashionable to point this out, but... the current govt has been around for a year and a half. The previous govt had thirteen years to do something about it.
Oh, I agree that the previous govt were dreadful. But I can understand /Labour/ being bereft of clue. I expect better from the Tories. Then again, I suppose they've been a bit busy dealing with economic and European issues.
I worked at a publicly funded inner-city technology center in one of the larger UK cities, whose mandate was to integrate technology into the curriculum of the surrounding schools. I cannot tell you how much potential was wasted. Huge computer labs essentially used as annex classrooms. Spyware installed to make sure kids weren't "googling bad things" or "speaking ill of their teachers in email". There were attempts to teach programming, but the few kids that were interested had to come after hours and use Notepad to write JavaScript to run in Internet Explorer from the desktop (not a bad way to learn just how easy it really is; I'm just saying there was no official support or classes for such programs).
So in short: I really hope this is a sign of change but I also know that money thrown at these programs is often just squandered on projectors and digital white boards.
edit: upon reflection, I think I feel guilty that I did not do more to make things the way I envisioned. I could easily have created a "programming class" image that we flashed the systems with to let them have Linux, learn Emacs, Vim, Apache etc., but I think I was too inexperienced to really be sure of myself. I'd love the opportunity now, as I could offer a Scratch/Squeak course, a Scheme course, web apps, Lua/LÖVE for game programming. So many awesome free tools for every type of aptitude.
I've found that ICT has had such a damaging effect on my generation's perception of what CS is and what the people that do it must be like. I'm really glad to see it go. It's great to know that people will get a chance to have some insight into a world which a lot of people would never encounter.
I'd like to think that this was a result of the e-petition started a few months ago too.
I feel the same! It's probably because we have a vested interest in this area, but putting that aside it's nice to see something coming from the government that I think is a genuinely good thing (for a change?).
I think this is a great move. Although it would be interesting to see how many people actually take the new subject.
The problem at the moment is, as the article says, that pupils just get used to Microsoft GUIs and don't know anything behind them. And what happens is people do other things in the lesson because they get bored, or they find it too easy so they don't even bother. The whole MS Access stuff can be very dull.
They could even bring sections of web development into the new course, i.e. setting up a LAMP environment and learning PHP -- and not just learning to create table-based (shudders) rubbish in Dreamweaver or the like.
Or on the comp sci/software dev side of things learn Java programming for Android. Although I would think that iOS dev would be a step too far at this stage - not sure?
Or maybe start teaching them a high-level language such as Python or Perl to help give them a good understanding of programming practices etc.
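If Python were the vehicle, the kind of small, self-checking exercise that teaches good practices might look like this (a rough sketch, assuming Python 3 only; the function and data are made up for illustration):

    # A tiny, testable function: the sort of exercise that teaches naming,
    # decomposition and checking your own work, rather than memorising menus.
    def mean_temperature(readings):
        """Return the average of a list of temperature readings.

        >>> mean_temperature([12.0, 15.0, 18.0])
        15.0
        """
        if not readings:
            raise ValueError("need at least one reading")
        return sum(readings) / len(readings)

    if __name__ == "__main__":
        import doctest
        doctest.testmod()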
This is great news. However just think of how hard it is to hire software developers. Now how the hell are schools (with fixed salary caps and a work environment filled with children) going to compete?
They can't, so just throw some mathy teacher in there and hopefully the students will more or less teach themselves, and if you're really lucky, each other. That's how every high school CS class I've ever taken has worked. It's not ideal, but it's better than no CS at all.
ICT is learning how to use computers efficiently -- how to do a non-computer job like accounting on computers, basically how to use Office. Computer Science is different, and now that home computers are common enough, it could easily be done at the same time (school years) as ICT. This divide between using and creating goes on forever; my options of courses for next year are Business IT (supporting those people who are just using Office) or Programming (making programs, though possibly only in Java).
Back in 1993 I was doing GCSE Computer Studies, which included everything from desktop publishing to writing simple BASIC programs. But when I wanted to continue this into A-level, I had to personally convince the computer studies teacher to let me do it on my own, since the school didn't offer it as an option. In the end, I did a mostly self-taught AS Computer Studies with some guidance from the teacher.
I was extremely lucky to have a well-resourced school and a flexible teacher.
I have worked as the designated IT tech at schools across several counties in the UK. In some junior schools the role of IT co-ordinator is just given to the newest, youngest, most unsuspecting teacher. Usually they have no specific ICT knowledge or interest in the subject.
There is a vast difference in the quality of ICT teaching at the younger ages, so it doesn't provide a standard foundation for this new curriculum to sit upon.
Students arriving at high school at 11 have such different experiences of ICT. The curriculum going forward needs the flexibility to teach what's individually challenging & to help those above average to excel & not get bored.
The schools that I've seen integrate ICT successfully separate the day-to-day 'use' of IT from the teaching of the history, theory & application of the subject.
For example groups in the class using laptops to type up a project vs sitting in the ICT suite to learn to write instructions to control robots/traffic light sequences etc.
It's not fair to count word processing, e-mail, or desktop publishing as learning ICT anymore. It's like providing a lesson on using the telephone; it's not needed now that they're a universal method of communication.
Now sitting at a computer to do general work doesn't constitute ICT learning.
Integrating ICT into the classroom works well. You have no idea how long it takes to take 30+ excited pupils down to the ICT suite, to keep them calm, get them logged on with their own username & password and to start a lesson. The lesson is half over by then.
Had I been exposed to the world of programming in a structured way I would have jumped into it feet first & would no doubt have many years experience by now.
I can vividly remember being around 12 and wishing to know how I could build websites & learn how to code, and not knowing where to start or who to ask. I assumed that it was for college/university level. College focused on building relational databases & the Data Protection Act. Then at university I was thrown in with already-proficient programmers.
Even though I had 16 years within the academic system, I feel the majority of what resonates with me has been self-taught in the few years since leaving university.
The new curriculum needs to show pupils exactly what's possible with the power & scope of CS skills and that it's accessible from a young age.
Simon has been doing a ton of work behind the scenes getting a decent curriculum in place for computing in the UK, playing a major role in a group of educators tackling the issue. I would not be surprised if Michael's words were lifted directly from one of his reports :-)
I loved the picture's caption accompanying the article: "Schools will be free to use teaching resources that will equip pupils for the 21st Century". The picture then shows young pupils using aging CRTs. It made me smile :)
The problem is that Google and a lot of newer companies fetishise CS, i.e. computer science, to an insane degree.
What industry really wants and needs is someone who can take a problem and produce a suitable solution, which is much more computer engineering.
What you don't want is more ivory-tower geeks writing in Lisp who obsess about algorithmic purity and produce noddy systems that work for some cases but break badly in the real world.
One example from the '80s: I was helping build billing systems for BT, and I also had to run the system and make sure it all worked properly. The first time we hit £1,000,000 in a month (about $5,000,000 in today's money) we had a small celebration, and I recall the CTO (one of Vint's reports, I believe) nudging me and saying "this had better be right or we are both out of a job".
It's like saying "we need more engineers like Ross Brawn -- I know, let's train more physicists"; 10 years later you have 200 Brian Coxes, and not 195 Ross Brawns and 5 Brian Coxes working at CERN.
I don't see why CS graduates can't "take a problem and produce a suitable solution." Understanding the underlying theory (complexity theory, etc.) is essential to solving these problems.
Furthermore, not all Computer Science degrees are purely theoretical courses where all you do is esoteric logics and category theory. If anything, I think there's too little theory in the UK's CS courses these days.
But most real-world problems aren't "implement a bubble sort" pure CS problems; 99% of the time you need domain knowledge to point out to the customer, "err, you have used the wrong equation for that design."