Object Oriented Programming (OOP) is the dominant approach to programming today, and I'm not going to spend much time defending it. The fundamental strength of OOP is that it brings the structure of applications closer to the structure of the real world, and we understand the real world a lot better than we understand software, so OOP makes our software easier to specify, design, implement and maintain.
With the growth of OOP, it has become possible to do software engineering. Prior to OOP, simply expressing software designs in a language suitable for clear communication of the structure of an application was an unsolved problem. Now the Unified Modeling Language (UML) has at last given software engineers a means of expressing their designs simply and elegantly. Language limits the thoughts we can express, so when we need to express new thoughts we need to invent new languages. Language creation goes on all the time, from the drift in meaning of old words, through the invention of new words, to the evolution of entirely new, purpose-built languages like UML.
Although an engineer can use UML with no tools more sophisticated than a pen and paper, there has been a growth industry in software to aid the design of applications. These "meta-apps" -- apps for building apps -- have a number of things in common. Their learning curve is extremely steep. The code they generate is frequently hard to compile and difficult to modify, and rarely contains any methods that actually do anything, despite the wealth of information the models contain about application behavior. And they often cost more than the compiler the generated code is destined for.
Method and Process
OOP has grown up hand in hand with analysis and design methodologies that are intended to help software engineers create software that fulfills users' needs as quickly and cheaply as possible. Many of these methodologies are extremely heavy-weight, to the extent that using them would require a substantial re-organization of a company's software development group. While it is certainly true that some companies could benefit from such a re-organization, that kind of undertaking is extremely high-risk, and the benefits are at best theoretical. This has led to a perception amongst software engineers, and their managers, that while object oriented design and analysis (OOD/A) is a worthy goal, it should be approached gradually. Some developers have gone so far as to say that OOD/A methodologies are positively detrimental to the development process. These people say that OOD/A leads to endless debates about design until deadlines force developers to plunge into code-like-hell development just like they always have.
Large organizations delivering systems that constitute millions of lines of code can certainly benefit from the systematic adoption of OOD/A. But most software development is not done on such a large scale, and many smaller companies that would like to do a better job methodologically are shut out by the high cost of tools and training, and the high risk involved in organizational changes. And it is clear that without organizational changes -- without formal analysis and design phases that developers have clear room in their schedule to undertake -- the benefits of OOD/A and the use of the associated tools are less than clear.
Barriers to Methodology Adoption
Existing development organizations tend toward two extremes in terms of their development process: at the one end there are smaller organizations that are dominated by code-and-fix approaches, and at the other there are larger organizations that are dominated by highly structured methodologies in reproducible processes. Many companies in the first state would dearly love to get into something more like the second, but they are put off for several reasons.
The first is the risk of reorganization on the scale required to benefit from traditional OOD/A. The second is that the resources to do so are often simply not available -- the whole point of wanting to make such a move is that your organization is currently not very efficient, so your people are not keeping up with the demands on them. Adding the costs of total re-organization on top of that is just not realistic for many of the companies that could most benefit from OOD/A methodologies.
Methodology and process are two sides of the same coin: for a methodology to be useful, the development process has to be structured around it. There is no point in trying to use a RAD methodology that emphasizes close consultation with end users if your process does not have time available in the development schedule for meetings between developers and customers. In many organizations, individual developers would like to adopt OOD/A methods, but are prevented from doing so by a process that expects early deliverables in the form of code rather than specifications or designs.
An Ideal Methodology
Based on the foregoing considerations, an ideal methodology is one that individual developers can adopt, experiment with and use without committing the entire organization to it. It will also facilitate the early generation of code, and by implication support a tightly iterative development cycle, so that goals for early deliverables can be met without compromising future development. Many developers have had bad experiences with wizards and frameworks that make the first stages of application development breezily fast while ultimately creating a situation where the developer has created a great deal of code that is not well understood and depends on implicit, poorly documented framework behavior. Such code is hard to debug and impossible to change, leading to longer and longer delays in the later stages of the development cycle, and an ongoing struggle for maintenance programmers. Often, jettisoning the framework and porting customized code to new applications by hand is the only way out of this sort of mess, and any application development tool should be designed with avoiding this kind of situation in mind.
The rest of this chapter covers the things I think should go into an ideal methodology -- things like reusable specifications, useful code generated from specifications, and ease of testing.
Reuse and Other Myths
Reuse is one of the promised benefits of OOD/A, and yet it has proven to be one of the most difficult benefits to realize. There are several reasons for this: OOD/A tools have not tended to make it easy to split parts of designs into separate files so that they can be shared by other applications. Also, the tools themselves typically only weakly support the notion of OO frameworks, so ensuring that all developers are working within the same framework and using it in the same way is a very high maintenance task. Although many tools have excellent scripting capabilities, someone has to design, develop and maintain those scripts, and once again the resource limitations that motivate the use of such tools to begin with turn up as a barrier to adopting them.
Reuse is always easiest at the highest level. Interface descriptions, for instance, are better shared as IDL than the implementation language, simply because the higher-level description tends to be more portable. It can support different target languages, different platforms and different applications more easily than low-level representations can.
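The portability of a high-level description can be made concrete with a small sketch. Here a hypothetical interface description (the dictionary format and the `Logger` interface are invented for illustration, not taken from any real IDL) is rendered into stubs for two different target languages from the same source:

```python
# A sketch of why a high-level interface description is easier to reuse
# than an implementation: one description, several target languages.
# The description format and the Logger interface are hypothetical.

interface = {
    "name": "Logger",
    "methods": [("log", ["message"]), ("flush", [])],
}

def python_stub(desc):
    """Render the description as a Python class skeleton."""
    lines = [f"class {desc['name']}:"]
    for name, params in desc["methods"]:
        args = ", ".join(["self"] + params)
        lines.append(f"    def {name}({args}): ...")
    return "\n".join(lines)

def java_stub(desc):
    """Render the same description as a Java interface."""
    lines = [f"interface {desc['name']} {{"]
    for name, params in desc["methods"]:
        args = ", ".join(f"String {p}" for p in params)
        lines.append(f"    void {name}({args});")
    lines.append("}")
    return "\n".join(lines)

print(python_stub(interface))
print(java_stub(interface))
```

A low-level representation -- the generated Java, say -- could not be turned back into the Python stub nearly so easily, which is the sense in which reuse is easiest at the highest level.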
The disadvantage of making the highest-level representation available to developers is that they will have an urge to change it. No amount of technology can quite eliminate the need for policy, and this is one place where policy needs to be made and enforced pretty strongly. But the advantages of high-level reuse far outweigh the disadvantages, and therefore the ideal methodology will be supported by tools that let developers modularize the design so that it is easy to manage and straightforward to reuse. This will help propagate the methodology through the organization, so that once a developer becomes confident in a particular design, it can be shared with other developers. These developers will need to start using the same tools, at least to parse the design into code, and will see their benefits up front, without requiring management policy to change in the least.
Automation of Routine Tasks
To be really useful, a methodological tool should automate some of the uglier tasks that developers have to deal with, and should ease some of the strain of application life cycle management. As mentioned above, some tools that make the going very easy to begin with can have very high costs later in the application life-cycle. This is true not just for developers, but for users as well. And things that cause users pain and suffering are generally considered to be quite bad.
High on the list of painful things for both developers and users is the problem of serialization. Almost every application of any interest has at least one file type associated with it. Applications of any size have several: configuration files, data files, help files, documents, you name it. These files tend to have formats that are developed on-the-fly, as needed. When a developer thinks of something that just has to be added to a particular configuration file, it gets added, and e-mail goes around to everyone saying, "Throw away your old XYZ files, the new format looks like this..." For users, the outcome of this process is the commercial equivalent of a ransom note that arrives with every new release: "Pay us a bundle of money, or never see your data again."
For the development organization, this phenomenon translates directly into lost sales. Customers have two basic responses. The first is to delay upgrading for as long as possible, because if anyone in their organization upgrades everyone will have to, which is a very large cost that has to wait until at least the next budget cycle. The second customer response is to investigate competitive programs, which may well support your old file format and offer a few new bells and whistles besides. Introducing incompatible file formats is very much like introducing a new application, which gives your customers a good reason to go back and re-evaluate their original purchasing decision.
If you are selling into the home computer retail market this is not such a big deal, although with more and more multi-computer families it will start to have an impact there as well.
The ideal methodology will therefore support tools that generate serialization code for the classes described by the design, and the serialization format will be FORWARD compatible. That is, Version 1 of an application will be able to read Version 2's files, edit or otherwise interact with the bits it recognizes, and write out the modified file with all the Version 2 information intact. This would take a big load off developers, and would keep your customers upgrading incrementally as they needed new features, not drive them away by demanding that everyone upgrade at once.
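One way to get forward compatibility is simply to preserve what you do not understand: an older reader keeps any unrecognized fields intact and writes them back out unchanged. The following is a minimal sketch of that idea, assuming a JSON-based file format; the field names ("title", "body", "footnotes") are hypothetical, not from any real application:

```python
import json

# Fields that "Version 1" of our hypothetical application understands.
KNOWN_FIELDS = {"version", "title", "body"}

def load_document(text):
    """Parse a document, separating known fields from unrecognized ones."""
    data = json.loads(text)
    known = {k: v for k, v in data.items() if k in KNOWN_FIELDS}
    # Keep everything a future version may have added, untouched.
    unknown = {k: v for k, v in data.items() if k not in KNOWN_FIELDS}
    return known, unknown

def save_document(known, unknown):
    """Write the document back, round-tripping the unrecognized fields."""
    return json.dumps({**known, **unknown}, sort_keys=True)

# A hypothetical "Version 2" file containing a field Version 1 doesn't know:
v2_file = '{"version": 2, "title": "Notes", "body": "hi", "footnotes": ["a"]}'
known, unknown = load_document(v2_file)
known["title"] = "Edited Notes"        # edit the bits we recognize
round_tripped = save_document(known, unknown)
assert "footnotes" in round_tripped    # the V2 data survives a V1 edit
```

A code generator working from the class descriptions in the design could emit exactly this kind of keep-what-you-don't-recognize serialization code mechanically, which is what takes the load off developers.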
Ease of Use
Ease of use has become the watchword of modern development practice, and the tools that developers use should be easy to use too. There is something wrong with a development tool that supports a "best practices" methodology but is hard to learn, painful to use and falls over frequently. OOD/A methodologies are hard to learn: even experienced developers with Ph.D.s in fairly difficult subjects have been known to take quite a long time to learn how to do OOD/A well. The most difficult part of the process is analysis: the turning of abstract, fuzzy concepts into concrete terminology that can be used as the language of the design. And the length of the analysis and design cycle means that errors in analysis are often very hard to recover from, because by the time the first working code reveals them the development process is too far along to change very much.
The creation of new concepts is hard, yet we ask developers to do it routinely in application analysis and design. Translation of real-world, domain concepts into software equivalents very often involves concept creation: the new concept will live partly in the domain, but will be mixed with implementation details as well. Although we would like to keep these separated, the fact is that they always become a bit mixed, and we have to live with that, and ease the process by which it happens as best we can.
The ideal methodology will therefore be as easy as possible to learn -- it shouldn't require a week-long course, although such a thing might help. The analysis process should be as concrete as possible: it should provide tools that let developers think about things in concrete rather than abstract terms. One very effective tool for doing this is the Use Case, which is a story or narrative about the way the user interacts with the application: it describes what the user does and how the system responds. The ideal methodology would generate code from use cases -- or something very closely related to them. Use cases can be understood by both developers and users, which gives an added level of risk-reduction when it comes to verifying the analysis.
What one would like to do is generate a test plan and test scripts from the specification, so that at the end of the day it would be possible to automatically verify that the application conforms to the specification. Of course, because important parts of the application would be generated from the same spec as the test scripts, the confidence in this happening would be pretty high, but in any real application there will be a large amount of custom code that needs to be tested. This is particularly true for desktop applications that involve a lot of UI if the specification language does not capture UI very well.
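The mechanics of turning a use case into a test script can be sketched very simply: treat the narrative as a sequence of (user action, expected response) steps and replay it against the application. Everything here is hypothetical -- the `TodoApp` stand-in, its `handle` method, and the command vocabulary are invented for illustration:

```python
# A sketch of deriving an executable test from a use-case narrative.
# The application and its command vocabulary are hypothetical.

class TodoApp:
    """Stand-in application under test."""
    def __init__(self):
        self.items = []

    def handle(self, action):
        verb, _, arg = action.partition(" ")
        if verb == "add":
            self.items.append(arg)
            return f"added {arg}"
        if verb == "list":
            return ", ".join(self.items)
        return "unknown action"

# The use case as data: what the user does, and how the system responds.
use_case = [
    ("add milk", "added milk"),
    ("add eggs", "added eggs"),
    ("list",     "milk, eggs"),
]

def run_use_case(app, steps):
    """Replay each narrative step and verify the system's response."""
    for action, expected in steps:
        actual = app.handle(action)
        assert actual == expected, f"{action!r}: got {actual!r}"

run_use_case(TodoApp(), use_case)
```

Because the use case is data rather than prose, the same list of steps could drive both the code generator and the test runner, which is where the confidence that the application conforms to its specification would come from.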
Summary of An Ideal Methodology
In summary, the features we are looking for in the ideal methodology look like this:

- It can be adopted by individual developers without committing the entire organization to it.
- It facilitates early code generation and supports a tightly iterative development cycle.
- Its designs are modular, so specifications can be reused across applications.
- Its tools automate routine tasks, including forward-compatible serialization.
- It is easy to learn, with analysis grounded in concrete use cases rather than abstractions.
- Test plans and test scripts can be generated from the specification.
There are a few things that are not on this list: the big one missing is anything about UI. The reason for this is simple: GUI builders are a solved problem. The remaining problems with GUIs are esthetic, not technological. The growth in concern for usability in recent years is strong evidence of this: usability is not about technology, it is about layout, about look and feel, about intuitiveness and affordances. Software development methodologies are fundamentally about problems of representation, and usability is not a problem of representation but of appearance. There is no ambiguity about the concept of "button" that has to be made clear and explicit -- there is instead a question of how the user knows what objects on the screen are buttons, what those buttons do, and how the user's expectations can be shaped about where certain buttons are, and the like. These are important problems, and worthy of attention. But they are not problems that an OO analysis and design methodology will solve.
This chapter has sketched the ideal OO analysis and design methodology, and given some features that its tools should have. The next chapter introduces Narrative Programming, the methodology of the future, and answers the question, "Why are English majors so good at software development?"