Wethern’s Law of Suspended Judgement states (in a rather tongue-in-cheek fashion) that “Assumption is the mother of all screw-ups.” A more popular take on this would be, “Don’t assume – it makes an ‘ass’ of ‘u’ and ‘me’.” But when you are dealing with assumptions that could cost thousands, if not millions, of dollars, it’s not always a laughing matter.
It is a best practice in software architecture to document the rationale behind each decision that is made, especially when that decision involves a trade-off (performance versus maintainability, cost versus time-to-market, and so on). In more formal approaches, it is common to record, along with each decision, its context, including the “factors” that contributed to the final judgement. Factors may be functional or non-functional requirements, but they may also simply be “facts” (or factoids…) that the decision-makers found important (technology constraints, available skill sets, the political environment, etc.).
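To make the practice concrete, here is a minimal sketch of what such a decision record might look like, in Python for illustration, though any format will do. The field names, and the idea of keeping assumptions in a list of their own, are illustrative choices rather than a formal standard:

```python
from dataclasses import dataclass, field
from datetime import date

@dataclass
class DecisionRecord:
    """A lightweight architecture decision record (illustrative sketch, not a standard schema)."""
    title: str
    decision: str
    rationale: str                                         # why this option won, including trade-offs
    factors: list[str] = field(default_factory=list)       # requirements, constraints, plain "facts"
    assumptions: list[str] = field(default_factory=list)   # unvalidated beliefs, flagged explicitly
    decided_on: date = field(default_factory=date.today)

# Example usage (all values are hypothetical):
record = DecisionRecord(
    title="Database for the reporting module",
    decision="Use the relational database we already license",
    rationale="Ad-hoc reporting favors SQL; trade-off: higher licensing cost, faster time-to-market.",
    factors=["reporting workload is read-heavy", "existing DBA skill set"],
    assumptions=["customers would never accept a page that takes 5 seconds to load"],
)
```

Keeping assumptions separate from established facts makes them easy to find when the time comes to re-evaluate the decision.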
This practice is valuable because listing these factors helps surface the assumptions the architects hold that are shaping important decisions about the software being designed. Very often these assumptions are based on “historical reasons”, opinion, developer lore, FUD, or even “something I heard in the hallway”:
- “Open source is not reliable”
- “Bitmap indexes are more trouble than they’re worth”
- “The customer would NEVER accept a page that takes 5 seconds to load”
- “The CIO would reject anything that isn’t sold by a major vendor”
It is important to make these assumptions visible and explicit, both for the sake of posterity and for future re-evaluation. It is even more critical, however, to make sure that any assumption not based on relevant empirical evidence (or, for political factors, confirmation from the people involved) is validated before a decision is finalized. What if customers can wait 5 seconds for critical reports, provided you show them a progress counter? Exactly which open source project is unreliable, and in exactly what way? Have you tested bitmap indexes on your data, using your application’s transactions and queries? A quick measurement, like the sketch below, often settles the question.
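Validating a performance assumption can be as simple as timing the competing options on realistic data. The harness below is a hypothetical sketch: `run_query`, `cursor`, and `REPORT_QUERY` are placeholders for your own database driver code and queries:

```python
import statistics
import time

def measure(run_query, label, runs=10):
    """Time a query callable over several runs and report the median.

    run_query is a hypothetical placeholder: swap in a callable that
    executes one real query through your own database driver.
    """
    timings = []
    for _ in range(runs):
        start = time.perf_counter()
        run_query()
        timings.append(time.perf_counter() - start)
    median_s = statistics.median(timings)
    print(f"{label}: median {median_s * 1000:.1f} ms over {runs} runs")
    return median_s

# Hypothetical comparison: the same report query against a bitmap index
# and a b-tree index, run on your real data with your real predicates.
# measure(lambda: cursor.execute(REPORT_QUERY), "bitmap index")
# measure(lambda: cursor.execute(REPORT_QUERY), "b-tree index")
```

Ten minutes of measurement like this is often enough to turn a hallway rumor into a documented factor.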
And don’t overlook the word “relevant.” Something that was true in an older version of your software may not be true today. The performance of bitmap indexes in one version of Oracle may not be the same as in another. An older version of a library may have had security holes that have already been fixed. Your old reliable software vendor may be on their last legs financially. The technology landscape changes every day, and so do people. Who knows? Maybe your CIO has become a closet fan of Linux.
Facts and assumptions are the pillars on which your software will be built. Whatever they are, make sure the foundations are solid.