Well, that’s quite a mouthful, I agree – ontogeny recapitulates phylogeny. What it reflects is rather interesting: the phases in the development of an individual organism mimic, to an extent, its ancestral forms. For example, the human embryo develops gill bars which are then modified for other purposes. But only “to an extent”. Biologists point out that the so-called recapitulation occurs at a very coarse level, bereft of details [The Sciences of the Artificial, Third Edition by Herbert A. Simon, MIT Press, 1996].
But this serves as an exciting analogy for something similar that happens while we are learning a subject. The student’s progression seems to resemble the progression of the discipline from childhood to maturity.
Take, for example, software engineering. First there was waterfall, then the iterative and incremental methods, and then agile. The streamlined regularity of waterfall is most useful for the tyro software engineer; there are copious milestones and checkpoints, so (s)he has enough guidance to get along. The essence of iterative and incremental development is appreciated with experience, when the power of feedback becomes manifest. Finally, agile in all its facility marks the transition from instruction to interaction. The software engineer now has sense enough not to need micro-management; (s)he can leverage the typical loose coupling of an agile team. (Indeed, a criticism of agile has been its over-dependence on “premium people” for success.) So the progression of a software engineer from neophyte to seasoned pro is marked by a progressive embrace of the more mature development methodologies.
What is the implication of this trend for how we teach software engineering to students? Do we introduce waterfall, then iterative and incremental development, then agile, in sequence? Or should we indoctrinate students in the most mature methodology of the day and touch upon the others as historical footnotes? In the conventional sciences the dogma of the day is “taught”, while its ancestors may (or may not) be mentioned – for example, ether in physics, or phlogiston in chemistry. But unlike ether and phlogiston, waterfall is not merely of historical import; systems of significant scope are successfully built using waterfall, very much in the present.
So must software engineers learn and then unlearn (or at least learn to supersede) each methodology before moving on to the next? Or is there a more organic advancement?