Clarifying the Dewar and Schonberg Article

There has been quite a bit of discussion about the article by Dewar and Schonberg
http://www.stsc.hill.af.mil/CrossTalk/2008/01/0801DewarSchonberg.html
claiming that:
“It is our view that Computer Science (CS) education is neglecting basic skills, in particular in the areas of programming and formal methods. We consider that the general adoption of Java as a first programming language is in part responsible for this decline.”
In http://itmanagement.earthweb.com/career/article.php/3722876 Dewar clarifies that it isn’t Java itself that he blames so much as the way the “use of Java’s graphical libraries lets students cobble together code without understanding the underlying source code.”
The only evidence offered for these claims is a perceived decline in student performance in their systems and architecture classes. They also report trouble recruiting applicants who have the right foundational skills for their Ada programming company, which develops mission-critical software.
The biggest flaw in the article is the lack of evidence supporting its claims. How many students fail the systems and architecture classes now compared to when C++ or C was the introductory language? Is this a problem just at their schools, or nationwide? If the introductory courses switched to Java and the follow-on courses were never revised to introduce the concepts no longer covered in the introduction (like pointers), then of course more students will fail. It is just as likely that using C or C++ in the intro course simply caused more students to fail and quit after the first course instead of later. Or perhaps the systems and architecture courses are being taught poorly. At Georgia Tech we found that student performance improved in low-level systems courses when we taught them in the context of programming a Game Boy. Students today don’t find low-level systems programming as interesting as students did 20 years ago, when computers weren’t capable of much.
I am not surprised that they have trouble finding people who know Ada; its popularity peaked many years ago. I also don’t find it compelling that they want people to have more low-level skills, since the biggest job growth is in positions that require higher-level skills, such as software engineering.
One of the reasons Java is a popular language in industry is because you don’t have to build everything from scratch. Good software engineers need to know how to reuse existing classes and how to design classes that can be reused. Why should students have to build their own graphics primitives instead of using the Java graphics classes? What learning do they miss out on by not doing this?
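To make the reuse point concrete, here is a minimal sketch (the class and method names are mine, not from the article) of leaning on the standard Java 2D library to draw a line into an image, rather than hand-coding a rasterizer:

```java
import java.awt.Color;
import java.awt.Graphics2D;
import java.awt.image.BufferedImage;

public class ReuseDemo {
    // Draw a diagonal line using the built-in Java 2D API; the library
    // handles the rasterization, so the student's code stays at the
    // level of the design problem, not the pixel arithmetic.
    static BufferedImage diagonal(int size) {
        BufferedImage img = new BufferedImage(size, size, BufferedImage.TYPE_INT_RGB);
        Graphics2D g = img.createGraphics();
        g.setColor(Color.WHITE);
        g.drawLine(0, 0, size - 1, size - 1); // one call replaces a whole line-drawing algorithm
        g.dispose();
        return img;
    }
}
```

The student still has to understand coordinates, color models, and resource handling (`dispose()`), which is exactly the kind of reuse skill industry expects.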
When I first took a 3D graphics course, we had to develop an algorithm for drawing a line. As students we found this a boring and tedious task, since even then every graphics package already included line-drawing routines. I very much doubt that this is required in current 3D graphics courses. Yet the field of 3D graphics has made huge advances since then. In part, we made those advances by not reinventing everything.
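For readers curious what that assignment involved: the classic version is Bresenham’s line algorithm, which rasterizes a line using only integer arithmetic. A minimal sketch (class name is mine):

```java
import java.util.ArrayList;
import java.util.List;

public class Bresenham {
    // Returns the grid points on the line from (x0,y0) to (x1,y1),
    // using the integer-only error term of Bresenham's algorithm.
    static List<int[]> line(int x0, int y0, int x1, int y1) {
        List<int[]> pts = new ArrayList<>();
        int dx = Math.abs(x1 - x0), dy = -Math.abs(y1 - y0);
        int sx = x0 < x1 ? 1 : -1, sy = y0 < y1 ? 1 : -1;
        int err = dx + dy; // accumulated error between the ideal line and the grid
        while (true) {
            pts.add(new int[]{x0, y0});
            if (x0 == x1 && y0 == y1) break;
            int e2 = 2 * err;
            if (e2 >= dy) { err += dy; x0 += sx; } // step in x
            if (e2 <= dx) { err += dx; y0 += sy; } // step in y
        }
        return pts;
    }
}
```

It is a tidy piece of reasoning, but today it is a library detail, not something every graphics student needs to reimplement.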
Dewar in particular claims that the introductory curriculum has been “dumbed down” to make it more fun and appealing. Again, what evidence does he give for this claim? He says that students are not learning formal methods for proving program correctness, but my understanding is that this field, which was popular in the 1980s, has not had much practical success since. He also claims that students don’t have enough knowledge of floating point computation. Again, what proof does he give of the need for this? Students certainly need to be aware of the pitfalls of floating point computation, but very few will go on to do mission-critical low-level work.
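The level of floating point awareness students do need can be shown in a few lines: binary doubles cannot represent 0.1 or 0.2 exactly, so testing computed values with `==` is unreliable. A small illustrative demo (class name is mine):

```java
public class FloatDemo {
    public static void main(String[] args) {
        double sum = 0.1 + 0.2;                 // neither operand is exact in binary
        System.out.println(sum == 0.3);          // false
        System.out.println(sum);                 // 0.30000000000000004
        // The standard fix: compare with a tolerance instead of ==
        System.out.println(Math.abs(sum - 0.3) < 1e-9); // true
    }
}
```

That awareness fits comfortably in any introductory course; it does not require a full numerical-analysis treatment.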
Our research on learning computing in a context, whether it be Media Computation, MATLAB, or robotics, has shown that it improves student success and retention. We have the evidence to back this up, not just at Georgia Tech but at several other universities and colleges. Just because you make something fun or interesting doesn’t mean you have “dumbed it down” or that students aren’t learning what they need in order to be successful in a computing career.
Barb Ericson
CSTA Teacher Education Representative