This is a response to Ryan Walmsley’s recent post, in which he concluded the answer to the above question to be no.
Ryan brings up a number of interesting points on why he believes MIT Scratch should only be used with kids up to the age of 11. I strongly disagree with this. Scratch is an amazing platform for a beginner of any age! I have taught Scratch to kids as young as 5 and to adults as old as 70. I really don’t think age has anything to do with it: a beginner is a beginner.
Like many universities, the Open University is faced with the issue that over 90% of its students have never coded before. Although I don’t believe the module should last a full year, I suppose it depends on the number of hours expected each week (is it a full-time or part-time course, etc.).
The Open University is far from alone in teaching with Scratch. Many universities in the United States use Scratch to teach their first-year students, and Code.org has produced a great video explaining why.
In the first few months, it is in my opinion more important to teach students the constructs and problem solving associated with computer science than to teach syntax. Syntax is different for every language. Scratch lets you move on to any language afterwards, whereas jumping straight into a text-based language may lock a particular syntax into students’ heads as being, for example, what an if statement is.
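To illustrate the point, here is a minimal sketch (in Python, purely as an example language; the function name and values are my own invention) of how a simple Scratch script built from a loop block and an if/else block maps directly onto textual syntax. The constructs are identical; only the surface syntax changes.

```python
def classify(numbers):
    """Mirror a simple Scratch script: repeat over a list,
    and inside the loop run an if/else check on each item."""
    labels = []
    for n in numbers:          # Scratch: a "repeat" / "for each" block
        if n > 5:              # Scratch: "if <n > 5> then"
            labels.append("big")
        else:                  # Scratch: the "else" arm of the block
            labels.append("small")
    return labels

print(classify([3, 7, 10]))   # prints ['small', 'big', 'big']
```

A student who understands the loop-plus-conditional idea from Scratch blocks only has to learn the new notation here, not the concept.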
Although I will admit Scratch 1.4 is a tad limiting for university students after more than a few weeks, Scratch 2.0 adds clones and a number of other features, making it much more suitable. Even better, though, would be Snap! or BYOB.
Should we discourage students from learning programming before the course?
This, though, is the part I personally disagree with most. Ryan concludes that we shouldn’t be teaching kids to code.
Computer Science already has the highest dropout rate of any subject in UK universities (9.8%). Is this because it is just too hard for 9.8% of students? I don’t think so; I believe the high dropout rate is due to students being misinformed about what Computer Science actually is. At school you already get the chance to study modern languages, art, business studies, history, etc. All these subjects have direct follow-on courses at university and much lower dropout rates. The students picking them have a rough idea of what to expect, given they may have been studying them for seven years already at secondary school. For the majority of students, however, Computer Science is not an offered subject. How are they meant to have a clue what to expect if they don’t get a chance to try it out beforehand in school?
Many students pick Computer Science believing it will follow on from A-level ICT, something they then discover is very far from the truth.
Chicken and egg
It is a chicken-and-egg scenario, though. If enough students studied Computer Science in school, universities could add it to their entry requirements. But given it is statistically seen as a harder A-level, students won’t take it unless they are either really interested in the subject or need it for university.
Some universities have taken the first step. I have to applaud my own university, Queen’s University Belfast, which offers lower entry requirements to students studying A-level Computing. It is currently in the minority in doing this, but more are starting to follow suit.
Pick a course that is right for you
It is ultimately up to the student, though, to pick the university that is right for them. It is their job to read the fine details of their course and decide whether the content suits them. If it doesn’t, then (especially in England) it is worth considering a different university, given there are quite a few and most charge similar fees.
To conclude, do I think students should be given the opportunity to code from KS2 onwards? Yes: it gives them a better grounding on which to base a decision about a university course.
Do I believe MIT Scratch is worth using in universities? Yes, although I would recommend Snap! over Scratch due to its greater functionality.