Tuesday, February 13, 2018

The Battle Against Compartmentalisation - Curriculum Design Challenges

Recently there has been a lot of discussion, for obvious reasons, about the need for ethics courses as part of Computer Science / Software Engineering (or really, any form of engineering) degrees. While I don't dispute the need for such courses (and indeed, I firmly applaud and welcome their introduction as integral parts of the curriculum), experience suggests that we do need to think carefully about how we're presenting such material to students.

Specifically, I have doubts about whether the standard model of "let's just include a course in there to tick that requirement off" is actually the most effective way of doing it. Examples of such courses include "Programming in Matlab", "Ethics", "Security", and, to a lesser degree, parts of HCI/Usability/UX. Many of these also carry a bit of added baggage in that they are often designated as "required" courses for a particular degree (more on this later).

From personal experience (and from observing students over many years), topics like these cannot easily be "compartmentalised" into a "tidy little thing that you think about separately from other things". That is, you can't really say, "Here is the body of knowledge you need to know. Memorise it, and pull it out of a hat when you need it". Instead, true understanding and mastery of such material is only achieved by adopting the "fundamental mindset" involved. For example, a few crude examples of fundamental mindsets for the aforementioned fields are:
     * Security - Trust no inputs - Treat everything that the "user" inputs as potentially compromised, and as a potential attempted attack (see the sketch after this list).
      - Alternative mindset: How can I hack/crack this?

     * Ethics - How could this go wrong in the worst case? Who could get hurt/harmed, how bad would that be, and are there better alternatives that won't cause anyone that sort of trouble?
      -  Alternative Summary 1: Don't be a jerk
      -  Alternative Summary 2: Would you subject yourself and your loved ones to this? Your mother? Your children/future-children/grandkids/great-grandkids?

     * Usability - Humans are clumsy and stupid creatures of habit (with limited memories, limited attention spans, limited physical capabilities, and a whole bunch of other handicaps). The problem therefore becomes: how do we reduce confusion and/or the potential for things to go wrong, so that the bumbling apes can still get their jobs done?
       - Alternative Summary: Could I still use this thing when drunk, sick, injured, or all of the above, and couldn't look up the code/docs to check what's going on?

     * Programming - Computers are idiots - They can perform complex operations, lots of them, very very fast. But you have to precisely tell them what to do, when to do it, and how to do it first.
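
(As a quick aside, to make the "trust no inputs" mindset concrete, here's a minimal Python sketch of the difference between naively trusting input and treating it as potentially hostile. The function names and the 0-130 age range are purely illustrative assumptions, not from any particular codebase.)

    # Naive version - blindly trusts whatever the "user" typed in.
    def naive_get_age(raw):
        return int(raw)  # crashes on "abc"; happily accepts -5 or 10**9

    # Defensive version - assumes the input may be malformed or malicious.
    def safe_get_age(raw, lo=0, hi=130):
        try:
            age = int(str(raw).strip())
        except (TypeError, ValueError):
            raise ValueError("age must be a whole number")
        if not (lo <= age <= hi):
            raise ValueError("age must be between %d and %d" % (lo, hi))
        return age

    print(safe_get_age("42"))    # -> 42
    print(safe_get_age("-999"))  # -> raises ValueError

The point isn't the specific checks, but the default stance: assume every input is out to get you until proven otherwise.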

However, you can only adopt/absorb a fundamental mindset if you spend time developing a new set of skills and/or adopting new ways of thinking about problems.



Thus, there are really 2 main problems with teaching such material as a "one-off" course unit:
     1) Incomplete Mastery / Aborted Development of Expertise - Things like designing user interfaces and understanding how to successfully gather (useful) requirements from users are tricky skills to master.

Sure, everyone could potentially "do" these jobs (assuming a very low threshold for what you'd consider acceptable). But to do a good job of them (e.g. maximising value provided versus time wasted fixing/redoing things), you often need a lot more time to hone your skills and, importantly, to develop a "sixth-sense" intuition/experience about which approaches will tend to work better.

Those can only be developed by gaining experience over longer periods of time. By providing only a single opportunity (or maybe two, tops) during a short window - not even the full 6-12 weeks of a course, but realistically a few hours a week over about 4 of those weeks, as the other 6 are likely quite overloaded and harried - graduates are in fact woefully underprepared. Yet many may also be lulled into a sense of complacency that they now "know everything there is to know about the subject".

(I'm all too aware of how big this gap can be ;)  If you do find yourself in that position, remember that it helps to carefully review the fundamentals from time to time - you'll often be surprised at how something that once seemed like a simple, obvious "throwaway" nugget can suddenly turn into a gift that keeps on giving when dealing with something you've been struggling with recently.

       2) Compartmentalisation / Isolated-and-Detached Thinking / Forgetting-over-Time - By keeping such topics confined to a particular course, there is the risk that students view the material as an isolated and detached body of knowledge that they only deal with when in a certain mental box/zone. For example, consider students taking a programming course at school (or as part of their degree). Sure, they may "learn to code", and be able to do it to the point where they can pass the course.

But there's still a massive gap between simply being able to write code when a coding problem is placed in front of them to solve (e.g. the processing script they got from their co-worker/predecessor to carry out some important part of their job broke, and they need to fix it), and coding becoming a "fundamental aspect" of their approach to problems they may face outside of the computing classroom. Here, "fundamental aspect" means that when faced with some routine/repetitive, well-defined problem (e.g. "I need to grab all the data from websites A, B, C and combine all that to generate insights/report X"), they may be more inclined to realise that this is something that could be solved with code instead of continuing to do the job manually (see the sketch below). They may still not necessarily know everything needed to do it all themselves (especially if parts of the task go beyond their skill level), but at the very least they can begin to formulate an approach and/or be able to seek appropriate assistance in doing so.
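
(To illustrate, here's a rough sketch of what "solving it with code" might look like for that kind of task. This is purely hypothetical Python - the URLs are placeholders, it assumes each site serves a plain CSV file, and "report X" is reduced to a trivial row count - but it shows the general shape of the approach.)

    import csv
    import io
    import urllib.request

    # Hypothetical stand-ins for "websites A, B, C".
    SOURCES = [
        "https://example.com/a.csv",
        "https://example.com/b.csv",
        "https://example.com/c.csv",
    ]

    def fetch_rows(url):
        """Download one CSV file and return its rows as dictionaries."""
        with urllib.request.urlopen(url) as response:
            text = response.read().decode("utf-8")
        return list(csv.DictReader(io.StringIO(text)))

    # Combine everything, then "generate report X"
    # (here reduced to a trivial per-source row count).
    all_rows = []
    for url in SOURCES:
        rows = fetch_rows(url)
        all_rows.extend(rows)
        print("%s: %d rows" % (url, len(rows)))
    print("total: %d rows" % len(all_rows))

Even a crude, half-finished version of something like this is often enough to turn "I'll spend every Friday copy-pasting" into "the machine does the boring bit".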

There is a very real risk in most cases that students consider material they learned in such courses as "throwaway" knowledge. Indeed, a lot of material you do learn in a course may prove to be completely useless in your future life. (I can safely attest that these days, I only have hazy recollections of solving Quantum Mechanics equations, the complex web of organic compound reactions, and the inner workings of a ribosome).

This would be all fine and well, were it not for the fact that we sometimes make such courses "required" for a degree because later courses need students to know how to do this stuff - e.g. programming in Matlab, or UI design (as most software has a UI of some sort). The problem is that students are often introduced to this material in an earlier course, then don't see it again for 2 years, and then suddenly we need them to be able to do it again, having provided no opportunities for practice (aka remembering what the hell it was all about) and/or development of skills in the meantime.

Sure, there's only so much we can fit into the curriculum, but there is a real problem here: students have often completely forgotten material that they came to consider as a "one-off" - almost mentally boxing it up and consigning it to the mental "trash can of useless trivia" that one acquires. For example, our department used to run a required course for 3rd year engineering students, and we'd often have to spend the first few weeks of labs just re-teaching them everything about how to program in Matlab, as they had already completely forgotten everything about it despite having spent an entire semester learning it in first year. I don't know about you, but this strikes me as the very antithesis of "effective teaching and learning".

~~~

While we're talking about "required courses", I'd like to bring up another problem with "forcing" students to take certain courses so that they are exposed to particular material: in cases like this, you're starting off on the wrong foot, since the students don't want to be there in the first place (in fact, some may be quite disgruntled that they are being "forced" to take a course they view as worthless/unwanted, or simply because "some higher power said that they must do it"). In other words, they're already less receptive to the concepts than they otherwise would have been, so you're going to have an uphill battle to get many to even seriously consider and interact with the material - precisely the opposite of what we actually need when it comes to many of these topics.

~~~

What solutions (if any) are there to these problems?
To be honest, I'm not claiming that I have any foolproof solutions to these problems. What I am proposing instead are some ideas that may be worth investigating and exploring, and that might (hopefully) lead to better outcomes.

So, what could we do?
1) Integrate such material pervasively throughout the curriculum - Instead of having this material only feature in a dedicated course, it needs to appear/reappear in a few other (often seemingly unrelated) courses, at different times over the entire degree programme. The idea here is that different lecturers should try to point out how these kinds of fundamental concepts apply to and/or affect the material they are teaching. Of course there are practicality concerns (i.e. how do we fit this in without overwhelming students who are already trying to grapple with the intricacies of some technically difficult problems?).

2) Extra-curricular activities/challenges for students at all stages/abilities to develop and apply those skills - If it isn't possible to fit any more coverage of the material into other courses (I'd be surprised if that's actually true in some cases), this is a prime example of where extra-curriculars can come into play (e.g. regular projects/challenges outside of class, collaborations with the community/industry, etc.). The downside, though, is that not all students will be exposed to these (or want/be able to take part), thus reducing the coverage to only the students who participated. That may limit it to the ones who already cared about these issues in the first place, meaning that the others still don't benefit.

3) Rebalance recurring "software engineering" courses - focus less on "Object-Oriented" design (and the structural/procedural issues linked to it), so that recurring attention can also be put on these issues?


4) ...?

