Regardless of the process (or lack thereof) you use for creating software, there has to be some idea of requirements. These requirements tell us, in whatever form they are expressed, what it is that we’re trying to do. Most people who have worked on software projects for a while have faced impossible or highly unlikely requirements foisted on the project by people who do not understand the technology and are prone to wishful thinking. (Simply saying that something will happen doesn’t make it so.) Clearly unreasonable requirements based on wishful thinking are not good. But that’s not what I want to talk about here. Rather, I’d like to concentrate on requirements that are actually reasonable in some local context, but in the overall project context are just as bad as, if not worse than, wishful requirements. These are toxic requirements.
Fifteen years ago, Gerald Weinberg discussed how considerations around software quality are ultimately political: someone has to decide what “quality” means in a given case, and that decision inevitably favors one person’s or group’s interests over another’s. Implicit in Weinberg’s work is that requirements are political as well: statements about software quality are expressed as requirements, and by extension all decisions about what to do (and hence what not to do) are made by designated people or groups balancing competing interests.
How does this apply to open source, and how is it an antipattern? As I discussed in an earlier post, tight binding between open source and product work can cause problems. Toxic requirements emerge in this context when product requirements are foisted on the open source community. Of course, many product requirements are also community requirements, and in those cases there is no imposition. But there are also times when the decision makers (committers) use their position to impose requirements whose broad applicability beyond specific product needs has not been demonstrated. In some situations the case for broad applicability can be made (crucially, often by modifying the requirement), but it is more expedient to use committer authority to push the change through.

In such cases, committers will often invoke the “I do, therefore I decide” principle of open source to justify the decision. But this rationalization avoids the real issue: if your idea of open source includes principles like “openness,” “transparency,” and “community,” then making decisions in the narrow scope of product requirements is poison. Instead, you must be willing either to build community consensus (or at least an absence of objections) or simply not to implement the requirement in the open source project. This is very hard to do, especially when your sponsors (the ones who “pay the bills”) are demanding fast action in a direction decided by a closed process.