
How Should You Teach Software Development?

Posted by flozano on January 5, 2006 at 7:33 AM PST

First of all, I have to admit that teaching in college was not the pleasant experience I expected it to be. I love teaching. But I found most students had no interest in what I was trying to teach them. I know I am not a bad teacher, because when I do training I receive good feedback from my customers. And I am proud that a handful of my students actually became respected software developers and openly state this happened thanks to me.



Having worked in IT for more than 10 years, I have run into the same problems other bloggers complain about when recruiting people, especially recent graduates. And I feel the problem gets worse each year; that is, people manage to get a computer-related degree with less and less competence to write computer programs. But I was surprised to learn this was not just a problem in Brazil, where the educational system is deteriorating at all levels, but a problem hirers and project leaders also have in the US and Europe.



At first I blamed the then-new generation of easy-to-use RAD tools. People started to think software development was about dragging and dropping components from a visual palette, and no one wanted to learn about coding. I remember that since the early days of Clipper there were lots of tools promising "you don't need to be a programmer to write applications using this tool". Well, decision-makers and potential software developers believed this fallacy, and now we are feeling the consequences.



You can't blame the universities or students alone for the low quality of today's undergraduates. Who told the students to take the easy way and not to care about complex algorithms, data structures and CS theory? Today's would-be software developers are at best computer power users; they are not old-time hackers in the sense of people addicted to computers who want to learn how they work and find new uses for them.



While people discuss whether they should start by teaching Java or adopt Scheme, they are missing the point. A really good software developer will change from one language to another as easily as he changes his shirt. He has to. Popular computer languages (especially for Information Systems) have a very short life span. Before the course ends, either the language will be virtually dead (as Clipper is) or it will have become, for any practical purpose, a new and different language (as VB.NET has today). Even if you argue that Java has stayed more or less the same for 10 years, once you take into account the proliferation of new APIs and frameworks, you realize that a Java developer who has stayed current since its creation has studied enough to earn another college degree, or maybe a lato sensu graduate certificate.



So the language you use for teaching does not matter. What matters is what you are teaching. It's a fact that most students need employment, so they'll chase whatever they see in job ads. That's the reason so many universities teach Java early on. You'll never have interested students if you teach using an "underground" language and spend most of the time on boring examples like sorting algorithms. Nor will you get results if you require students to learn complex math like queueing theory when at work (or in their internships) all they do is CRUD applications.



It doesn't matter either whether functional programming, lambda calculus and finite automata are important for solving complex problems if you can't show the students real-world problems they can relate to what they believe will be their daily work routine. You also have to start with problems most of them are able to solve. Otherwise they'll lose interest. Computers are perceived as being "user-friendly"; it won't work to focus only on the hard parts of the job.



Both educators and IT pros have to face the hard truth: the industry needs a lot of software developers. If you work as a teacher, it's your job to prepare as many people as you can to fill those jobs. You are not meeting your goals if you think you should only pass the computer hackers. Conversely, IT people have to start being honest about what they require from new professionals. If they continue to send misleading messages to newcomers, they'll continue to get lame new pros.



Teachers really bear a great part of the blame. I found very few of them who would teach real-world OO practices and effective use of relational databases. Most of them still use Turbo Pascal or whatever they learned when they were undergraduate students. And when some of them are forced to adopt new things, like Java, they prove to be lame developers with those tools. A teacher will never be respected by his students if he teaches Java (even if only as part of an OO design course) and cannot answer questions about Struts, Hibernate, JSF and other things the students hear about.



So we have universities striving to meet their customers' (the students') perceived needs, which are driven by an IT industry that has lied for many years about what it takes to be a real developer, and teachers who are frozen in academia and have no real-world experience. A sure recipe for failure.



The problem is made worse by the fact that no school exercise, which you have at most a semester to complete (alongside five or six other assignments running in parallel for other courses), will ever force students to deal with the real-world needs of maintenance, productivity, security and reliability. A college assignment ends when it's presented to the teacher, while a real software project has only just started when users get the first release.



I think students should be required to complete something like a medical residency before they get their degree. They need to spend some months working full-time on a real-world software project that requires teamwork to actually learn what it takes to create good software. And their involvement with this project should not end with release 1.0. Maybe they should be required to work on a maintenance or enhancement project instead of developing a new application.



But I see in Brazil (is it the same in the US and other countries?) a strong tendency to shorten computer-related college courses to meet the high demand for new IT pros. Instead of spending four to five years studying computers, you can now spend just two to three years. How can the curriculum be compressed into half the time? Of course students won't learn many of the things they are supposed to learn.



I see computing as a very deep field, one that needs experts with proper training in different specialties, just like medicine. How long does it take to become a good DBA? To become a network-security expert? Or to become an operating-system-level developer who builds device drivers? The problem is that neither industry nor academia recognizes this need. Both believe better software tools and faster computers will make up for the missing knowledge of software developers. And it's much easier for students to believe that than to do the hard work that will make them good software developers.



Whoever wants to be a good software developer today is on his own; he won't receive much help or guidance from either academia or industry. :-(
