Interesting evolution is always unexpected. The evolution of species plays out over eons; it is only through the long lens of hindsight that we see it. But software evolution can be quick, yet interesting. It is not uncommon for a component to have changed its spots over just a few iterations. Software’s eponymous softness -- its malleability to change -- helps it evolve quickly, yet significantly. But other artifacts of our lives, even those hard to touch, evolve too.
I have seen a mirror morph into a photo frame. It is not one of those full-length mirrors where one can see oneself, whole and more. The mirror stands where it has stood for the last thirty years, on a chest of drawers, resting against the wall. It is a table mirror, roughly the size of an A4 sheet, with borders brown, but not baroque. The brown has become more distinguished with age. With age, the borders have also gathered pictures. Pictures from an age when photography was an occasion, and photos were few and precious. The pictures are of different sizes, and of different people. They capture those people as they were, and as they will be. Starting from the rim, the photos have eaten inwards into the mirror. There is hardly room for fresh reflection.
I have also seen photo frames that have not morphed into mirrors. Usually, they can hold countless pictures, can change them in sequence, turn them bright or dim, and even play music per the picture’s mood. They are digital photo frames. They just need to be turned on.
Today, with technology, we have space for infinite recall, and time for little recollection. But memory has its own ways. In the fullness of time, it flattens the peaks and flatters the plateaus. Looking into a mirror, we see the now, not what was or will be. We need a photo frame for that.
I see the morphing mirror as a fine metaphor for the changing needs that drive evolution. With progressing life, reflection and recollection have blended into one; the mirror has become a photo frame. In the life of software, as operating circumstances change, a component tunes to its newer needs. The more unexpected an evolution, the deeper the signature of Life.
Tuesday, August 2, 2011
Saturday, April 23, 2011
Software Development: Levels of Languages
Languages of software development operate at several levels: helping stakeholders share their expectations of a software system (specification languages), capturing analysis and design artifacts through a common set of idioms and notations (modeling languages), and implementing the system in executable code (programming languages). Each of these levels represents a particular layer of abstraction.
As with natural languages, languages of software development also offer many layers of detail; only those relevant to the problem at hand need to be considered in a given situation. Some of the factors that lead to the popularity of a programming language are power of expression, ease of learning by beginners, ease of running on a wide range of architectures, and tool support. The choice of a particular programming language for developing a system depends on the kind of system it is; each programming language has a definite focus area (C, for example, is well suited to systems programming). How high-level a programming language is depends on the degree to which it hides the complexity of the physical machine that executes the instructions the human programmer writes in that language.
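To make the idea of levels concrete, here is a small illustrative sketch in Python (a language chosen purely for illustration, not one discussed above): a single high-level statement hides the storage, iteration, and accumulation that a lower-level rendering must spell out.

```python
# A minimal illustration of abstraction levels: one high-level line
# hides the bookkeeping a lower-level program would have to spell out.

prices = [19.99, 5.49, 3.25]

# High level: state the intent only -- "add these up".
total = sum(prices)

# Lower level (still Python, but closer to what the machine does):
# explicit storage, explicit iteration, explicit accumulation.
total_explicit = 0.0
index = 0
while index < len(prices):
    total_explicit = total_explicit + prices[index]
    index = index + 1

assert abs(total - total_explicit) < 1e-9  # both renderings agree
```

The further down we go -- registers, addressing modes, machine instructions -- the more of this bookkeeping the programmer must own; the higher we go, the more of it the language hides.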
A model is based on some assumptions; it highlights some aspects of interest in a system, while hiding others. Challenges to modeling software arise to a large extent from its inherent ‘invisibility’ and ‘unvisualizability’. The Unified Modeling Language (UML) is a language for visualizing, specifying, constructing, and documenting the artifacts of a software system. UML consists of three kinds of building blocks: things, relationships, and diagrams. UML diagrams are classified as structural and behavioral—the former capture the static aspects of a system while the latter capture the dynamic aspects. Although very popular, UML is not universally accepted in the software engineering community: a common criticism is that it is a product of “design by committee”.
Specification languages such as Z (invented by Jean-Raymond Abrial and named after the mathematician Ernst Friedrich Ferdinand Zermelo) broadly relate to the formal methods of software development, using mathematical concepts and constructs to describe the behavior of a software system. The use of formal methods increases time and cost early in the development life cycle, vis-à-vis the use of traditional, non-formal methods; this is one of the reasons why formal methods are not widely used for commercial software development, where time to market is often given higher importance than quality. The quest for exactitude and completeness is paramount in a specification written in a language such as Z. Specification languages and formal methods, when used in combination with traditional approaches, can facilitate the construction of quality software.
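To give a flavour of what such exactitude looks like, here is a minimal Z-style schema sketched in plain LaTeX math, modeled on Spivey's well-known BirthdayBook example; the notation is approximated (Z's partial-function arrow needs a dedicated package), so treat it as an illustration rather than tool-ready Z.

```latex
% A Z-style schema sketch (after Spivey's BirthdayBook example).
% Z's partial-function arrow is approximated here with \rightarrow.
\[
\begin{array}{|l}
\textit{BirthdayBook} \\
\hline
known : \mathbb{P}\,\mathit{NAME} \\
birthday : \mathit{NAME} \rightarrow \mathit{DATE} \quad \text{(a partial function)} \\
\hline
known = \mathrm{dom}\, birthday
\end{array}
\]
```

The predicate below the second rule states an invariant: the set of known names is exactly the set of names for which a birthday has been recorded, and every operation specified on the system must preserve it.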
Labels: formal methods, languages, software engineering, subhajit datta, Z, Zed
Software Architecture: In Spirit and Letter
“You employ stone, wood and concrete, and with these materials you build houses and palaces. That is Construction. Ingenuity is at work. But suddenly you touch my heart, you do me good, I am happy and I say ‘This is beautiful.’ That is Architecture.” – Le Corbusier (1923), quoted in Architecture: From Prehistory to Post-Modernism.
The above quotation captures the spirit of architecture. We recognize good architecture when we see it, but it is difficult to define what good architecture is. Seeing the Taj Mahal, one of the greatest architectural wonders of the world, one is struck by its beauty, symmetry, poise, and proportion. But one is also aware that there is something nameless beyond these obvious attributes that endows the Taj with its architectural majesty. This sense is imparted to us by the underlying architecture. Whenever we deal with complex structures, we need to explore architectural ideas to simplify the task as well as to make the end product beautiful and resilient.
If architecture is difficult to define, software architecture is more so. One of the first organized attempts at studying software architecture can be traced back to the November/December 1995 issue of IEEE Software magazine (Volume 12, Issue 6). Many of the articles published in that issue introduced foundational notions of software architecture, such as Kruchten’s 4+1 view model.
Rumbaugh et al. capture the essence of software architecture as “the organizational structure of a system, including its decomposition into parts, their connectivity, interaction mechanisms, and the guiding principles that inform the design of a system.” Key themes of software architecture revolve around taking some fundamental decisions about the structure of the subsystems, components, and interactions that together make up the entire system, and around reusing some of the wisdom that others have gathered in working with similar systems.
Architectural patterns facilitate such reuse. As Alexander defines them, “Each pattern is a three-part rule, which expresses a relation between a certain context, a problem, and a solution.” Relation is a key term in the above definition—patterns connect problems, solutions, and contexts through relationships. The model-view-controller (MVC) pattern is among the most widely used architectural patterns in Web development, where we have a presentation layer, often called the graphical user interface (GUI), a middle layer to handle business logic for fetching and manipulating data, and a back-end database for storing data.
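The division of labour the pattern prescribes can be sketched in a few lines of Python; the names used here (TaskModel, render_task_list, TaskController) are invented purely for illustration and do not come from any particular web framework.

```python
# A minimal, framework-free sketch of the model-view-controller split.
# All names are illustrative only.

class TaskModel:
    """Model: owns the data and the business logic for fetching and changing it."""
    def __init__(self):
        self._tasks = ["write spec", "review design"]

    def all_tasks(self):
        return list(self._tasks)

    def add_task(self, title):
        self._tasks.append(title)


def render_task_list(tasks):
    """View: turns data into something presentable (here, plain text)."""
    return "\n".join("- " + t for t in tasks)


class TaskController:
    """Controller: receives a 'request', updates the model, and picks a view."""
    def __init__(self, model):
        self._model = model

    def handle(self, action, payload=None):
        if action == "add":
            self._model.add_task(payload)
        return render_task_list(self._model.all_tasks())


if __name__ == "__main__":
    controller = TaskController(TaskModel())
    print(controller.handle("add", "fix login bug"))
```

In a real web application the view would emit HTML rather than plain text and the controller would be wired to HTTP routes, but the separation of concerns is the same: the model knows nothing about presentation, and the view knows nothing about storage.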
Software architecture as a discipline can also be seen from different angles: as a strategy for organizing a system’s constituents, as a high-level abstraction of a system’s design, and as a vehicle for communicating fundamental ideas about a system’s development. Software architecture is not software design, although there is no clear line where the former ends and the latter starts. Design relates to choices about how specific functionality is to be delivered; architecture is more about decisions regarding the system’s structure. In the future, with software systems becoming larger and more complex, software architecture ideas will need to address deeper issues around delivering more sound and durable solutions to user needs.
Labels: architecture, patterns, software engineering, subhajit datta
Saturday, February 19, 2011
The Laws of Software Engineering; or the Lack Thereof
The first law of software engineering is that there are no laws of software engineering, at least not yet! As the recently deceased software engineering pioneer Watts S. Humphrey put it so succinctly, “Physicists and engineers make approximations to simplify their work. These approximations are based on known physical laws and verified engineering principles. The software engineer has no Kirchhoff’s law or Ohm’s law and no grand concepts like Newtonian mechanics or the theory of relativity.”
Where would the laws of software – when we do discover them – be placed in the pantheon of factors that influence software development? Booch has identified the following levels in the limits of software: influence of politics, impact of economics, importance of organizations, problems of functionality, problems of design, difficulty of distribution, challenges of algorithms, laws of software, and laws of physics.
When we compare the state of the art in software engineering with that of other engineering disciplines, the following lines of divergence can be noted:
Predictable outcomes – Engineers focus on ensuring the lowest possible surprise in the production process, so that its outcome can be accurately predicted. On the other hand, software engineers reared in the art of programming and/or computer science research often regard surprises and unpredictability of outcome as key motivators for software development.
Design metrics – Other engineering disciplines routinely work with design metrics for allowable stresses, tolerances, performance ranges, structural complexity, failure probabilities, and so on. Software engineers have no real counterparts. Commonly used retrospective measures of software, such as lines of code or performance ranges, may present an oversimplified view of the problem domain (a rough sketch of one such retrospective measure follows this list).
Failure tolerance – The fundamental aim of any engineering effort is to prevent failure; and when it unfortunately occurs, to learn from it. Software systems fail often and sometimes spectacularly. Yet software engineers do not have an established framework to document and share the lessons from such failures.
Separation of design from implementation – In other engineering disciplines, designers design and hand over the ‘blueprints’ to the construction specialists. Software design and implementation are too closely tied conceptually to enable such a separation. The same software engineer usually has to do both, often simultaneously.
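Picking up the Design metrics point above, the sketch below approximates one retrospective structural measure software does have, McCabe's cyclomatic complexity, by counting decision points in Python source. It is a deliberately rough approximation, not a substitute for the calibrated tolerances other disciplines rely on.

```python
# A rough, illustrative cyclomatic-complexity counter for Python source.
# It approximates McCabe's measure as 1 + the number of decision points.
import ast

DECISION_NODES = (ast.If, ast.For, ast.While, ast.ExceptHandler, ast.BoolOp)

def approximate_complexity(source: str) -> int:
    tree = ast.parse(source)
    decisions = sum(isinstance(node, DECISION_NODES) for node in ast.walk(tree))
    return 1 + decisions

sample = """
def classify(x):
    if x < 0:
        return "negative"
    for _ in range(3):
        if x > 10 and x % 2 == 0:
            return "big even"
    return "other"
"""

print(approximate_complexity(sample))  # two ifs, one for, one 'and' -> 5
```

Even this number says little about allowable ‘stress’: it tells us a function has many paths, but not how many it can safely bear.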
It is wondrous and a little confusing how software engineers manage to create systems of such complexity and criticality in spite of such divergences and without an established set of basic ideas and first principles (enshrined as laws) to fall back upon.
But then, wondrous yes, but confusing, maybe not: structures of immense grandeur and longevity were built when humans had no inkling of the laws that underpinned such feats of engineering. The pyramids of Egypt are just one example.
Saturday, February 5, 2011
Software Engineering’s Evolving Journey
For as young a field as software engineering, history is not just something to do with the past. It is also very much what is unfolding in the present. Computing is becoming faster, cheaper, and more ubiquitous. This speed often leaves us cold and confused: How do I know whether what I learn today will have any value tomorrow? Addressing this question lies at the crux of learning a little bit of software engineering’s history. A basic historical awareness can make us ready for change and what it means for individual careers, as well as the collective profession.
Some of the trends that have influenced software engineering and continue to do so can be summarized as:
Programming to software engineering – Programming plays a significant role in software development. This role has been redefined continuously with the evolution of software engineering. From a pre-eminently central position, programming has become just one of several concerns for the software developer.
Hardware-software: from coupling to congress – In the early days of computing, software was hardly recognized as something different from hardware. Then, much of the instruction needed to run a computer was hard-wired into the hardware. Additional instructions that came via software were specific to the hardware. When hardware was sold, software came with it; almost no one sold or bought software by itself. Compare that with the present: the operating system is probably the only piece of software that still comes with the hardware we buy. Almost every other piece of general-use application software—the software that lets us do all those fun and useful things—is acquired and installed separately, often downloaded from the Web for free. The erstwhile coupling between hardware and software can be said to have been replaced by something of a congress.
Advent of High-Level Languages – What has changed profoundly over the past few decades is how we communicate with computers. What has brought about this change is the vehicle of communication: language. Very simply, the so-called high-level languages are programming languages whose syntax and idiom are closer to those of English. High-level programming languages let one be a programmer without being able to communicate in machine or assembly language. This caused an explosion in the number of programmers, which widened the scope of the software engineering profession.
Advent of the Personal Computer – It is estimated that more than one billion personal computers (PCs) are in use in the world now, a far cry from the mere handful they were expected to sell in the early 1980s. Computers have gone from being isolated, redoubtable behemoths to portable, friendly devices sitting on our desks or being carried around in handbags. Concomitantly, there has been exponential growth in the amount, variety, scope, and power of the software that needs to support what we do with those ‘personal’ computing machines today. With the coming of the PC and its many subsequent variants, software engineers have become ever busier.
Global Software Development – As the Web’s presence increased throughout the 1990s and then into the new millennium, software engineering became a truly global enterprise. It is customary now for the customers, users, developers and managers of a software project to be dispersed across continents, time zones, and cultures. This makes software development something of a unique enterprise. In today’s typical software projects, development proceeds round-the-clock, through the collaboration of offshore and onsite team members. Global software development has deep economic as well as social implications within and without the software engineering profession.
Return of Open Source – Very simply, open source is about sharing the source code (as well as executables) of software developed by a particular group or individual for free, so that others can use or modify the code without paying any royalties. Open source was nothing special in the initial era of computing. Companies sold hardware, and the software came with it, for free. Academics shared code just as they shared ideas. When software became a commercial commodity in its own right, large corporations made every effort to stifle the free flow of code. The coming of the Web in the early 1990s gave open source aficionados the fast, reliable, and cheap media they needed to connect and build exciting and useful software. And then give it away for free. This is how open source returned, after being in exile for many years. Other than being an affirmation of intellectual freedom, the open source paradigm led to the re-examination of many fundamental principles of software development. Corporations large and small have come to recognize the power of open source and are keen to leverage it. Open source has fundamentally changed software development, and the change is here to stay.
The interplay of these trends – many of which overlap and influence one another – has shaped software engineering’s evolving journey. These are living trends, and they beget newer trends.
Labels: evolution, software engineering, subhajit datta, trends
Saturday, January 22, 2011
Change and Complexity: The Software Engineering Response
The central challenges for software engineering are complexity and change. Software engineering’s response to these challenges comes in two parts: breaking down the problem of building a software system into smaller, more manageable ‘chunks’ to confront complexity; and setting regular checkpoints during the process of building a software system to address the effects of change.
The breaking down results in workflows and the checkpointing leads to phases; together they constitute the software development life cycle (SDLC). The SDLC lies at the heart of software engineering.
Workflows represent sets of activities starting from understanding what users want from a software system (Requirements), to translating the language of the problem into the language of the solution (Analysis), to expressing the solution constructs in the language of development (Design), to building the system using programming resources (Implementation), and finally, verifying whether the system matches the stated requirements (Testing).
Phases, on the other hand, are focused on monitoring and managing change. During Inception we ask, what do the users want from the system? During Elaboration, we are interested in knowing if the system is feasible. Next comes the question: How do we build the system? This is the concern of Construction. Finally, during Transition, we enquire, how do we transfer the system from the developer domain to the user domain? In a particular development life cycle, we may not know the answers to these questions when we ask them. But based on our experience and understanding, we have an expectation of what the answers are likely to be. When expectations are not met, it serves as a reality check: a change, not budgeted for, must have occurred. This makes us aware of the need to find out what changed and what all that change might affect.
On the face of it, software engineering’s response to the problems of change and complexity seems adequate. But it admits certain challenges.
An element of linearity is implicit in the way software engineering seeks to address change and complexity. Customers come with requirements, which are analysed, followed by the design of the system, its implementation and testing. If the right questions are asked at each point, Inception, Elaboration, Construction, and Transition seemingly follow one another in harmony.
But reality is much messier. Answers are seldom ready when questions are asked, customers and users change their minds all the time, and technology and business environments change. In the real world of software development, it becomes imperative to go back and forth across workflows and phases several times, driven by a variety of reasons. Life is inherently non-linear, and software engineering is no exception. But just as in life, in software engineering too, we build our case on assumptions of linearity. And then hope to tackle non-linearity on a case-by-case basis.
The key challenge with software engineering’s response to change and complexity boils down to being able to monitor, control, and leverage the myriad feedback paths that exist in the real-world software development life cycle.
Feedback is one of the most fundamental techniques of engineering. In simple terms, feedback involves controlling an activity by regulating the input based on the output. Feedback exists at many levels, practical as well as perceptual. For example, an exception handler is a simple feedback loop. It monitors the execution of a piece of code and takes appropriate action if the outcome is not as expected. On the other hand, modifying a system’s design based on user response is also an instance of feedback. In software engineering, the difficulty usually lies in integrating feedback of different forms and at various levels into a consistent and repeatable development model.
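The exception-handler example can be made concrete in a few lines of Python. The flaky_sensor function below is invented purely for illustration; the point is only that the output of each attempt (success or failure) regulates the input to the next one.

```python
# A minimal feedback loop in code: the outcome of each attempt (the output)
# decides whether and how the next attempt is made (the input).
import random

def flaky_sensor():
    """Stand-in for any operation that sometimes fails (illustrative only)."""
    if random.random() < 0.5:
        raise RuntimeError("sensor read failed")
    return 42

def read_with_feedback(max_attempts=5):
    for attempt in range(1, max_attempts + 1):
        try:
            value = flaky_sensor()
        except RuntimeError:
            # Feedback: a failed outcome feeds back into the decision to retry.
            continue
        # Feedback: a successful outcome terminates the loop.
        return value, attempt
    raise RuntimeError("gave up after {} attempts".format(max_attempts))

print(read_with_feedback())
```

Feedback at the design level, by contrast, cannot be written as a handful of lines; weaving both kinds into one consistent, repeatable development model is where the difficulty lies.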
This is the central challenge with software engineering’s responses to the problems of change and complexity.
Labels: change, complexity, feedback, linearity, non-linearity, software engineering, subhajit datta
Sunday, January 16, 2011
Software Engineering’s Challenges: Change and Complexity
Every engineering discipline starts off in response to some pressing problem. Software engineering confronts the problems of change and complexity.
The very nature of software—its plasticity—makes it amenable to a continuous cycle of change. It seems rather easy to accomplish. After all, tweaking one statement in a software program can radically alter the program’s behaviour. But such tweaking—little by itself, but considerable in conjunction—can end up changing the intent of the program’s design in fundamental ways.
It is absurd to expect a car to fly or float. But very often a software system built for one context is expected to function in drastically different contexts, with the same grace and efficiency. These expectations can be traced to our wide cognitive gap with the use of software. Decades of using cars, centuries of using bridges, and millennia of using houses have ingrained in our minds what cars, bridges, and houses can and cannot do. Accordingly, we tune our expectations as well as environmental factors to set the context for these systems to function. In comparison, the use of software amongst a large community of lay users has just begun. Our understanding of how and to what extent software can serve our needs is not yet complete. As a result, the problem of change for software comes primarily from changing user expectations, and also from changes in the environment—technological and social.
Complexity is a complex word, and there is no one definition to cover its ken; even reaching a definition is fraught with difficulties. But we need to care about it in life as well as in software engineering, because complexity arises out of simplicity, at times suddenly and surreptitiously. Think of a simple computer program of five lines of code. It is straightforward; by carefully reviewing each line, we can hope to have complete knowledge of the program’s structure and behaviour. Now what if a loop is introduced into the program—a simple construct that executes a set of statements repetitively as long as a condition holds? The number of execution paths through the program increases significantly, and it becomes far more difficult to know for sure what happens in each step when the program runs. This example is just a watered-down instance of the combinatorial complexity software systems customarily face. Then there are even more involved issues, such as complexity of the problem domain, and complexity in the interaction of the various forces—technological, commercial, political—that a software system has to balance to be successful.
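A few lines of Python make the point concrete: a straight-line program has exactly one execution path, while adding a single loop containing a branch multiplies the paths, even though the program's text barely grows.

```python
# Straight-line version: exactly one execution path.
a, b = 3, 4
total = a + b
print(total)  # 7

# Add one loop with one branch: each iteration can go two ways,
# so n iterations admit on the order of 2**n distinct paths.
values = [3, 4, -1, 7]
total = 0
for v in values:
    if v >= 0:
        total += v   # path taken for non-negative values
    else:
        total -= v   # path taken for negative values (fold in the magnitude)
print(total)         # 3 + 4 + 1 + 7 = 15
```

Four iterations of this tiny loop already admit sixteen combinations of branch outcomes; real programs nest and chain such constructs thousands of times over.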
A common feature of complex systems is that they are greater than the sum of their parts. Anyone who has worked on a course project to build a piece of software stretching across several files can appreciate the sense of this statement: a piece of software is made of individual files, but it delivers something that merely bunching the files together will not achieve. Now scale up to a real-world system—with hundreds, if not thousands, of files, and thousands, if not millions, of interfaces between them; perhaps simple by themselves, but certainly complex when functioning together. And this is just one, relatively less significant, facet of software complexity.
A central theme of software engineering is about framing an adequate response to the problems of change and complexity.
Labels: change, complexity, plasticity, software engineering, subhajit datta