Although the Broken windows theory originated in criminology and sociology, many other fields have adopted the analogy. In a broader sense it illustrates the negative effects of temporarily lowered norms. In software development specifically, it is used to emphasize the long-term benefits of a clean codebase and to suggest that it's best to leave every work item in a better state than you found it, as minimal short-term investments can save much trouble in the future.
I would argue that it's not a good idea to treat this practice as a rule of thumb and apply it in every case.
The Broken windows theory
According to Wikipedia, the first forerunner of the theory was a 1969 experiment in which researchers parked a car in California with no license plates and its hood up. Although it clearly seemed abandoned, people ignored its presence. After a week had passed, the social scientists who conducted the experiment deliberately broke one of its windows with a sledgehammer. Soon the same people who had indifferently passed by before started to vandalize the car. The moral of the story is that such things can happen even in an otherwise civilized community once norms are lowered.
A bad analogy for software development
It's easy to map the above example to someone's life as a citizen. If I live in a clean and nice neighborhood, chances are that I will cut my grass more frequently. When the streets are littered, I might be tempted to throw my garbage away too, as it doesn't really make a difference.
So far so good, but if we are to compare developers with citizens and codebases with cities, we should go one step further.
Cities are maintained and developed by a lot of people, from the mayor through various professionals to everyone who lives there. They have very different skills and responsibilities, but they work together toward a common goal.
It might occur, but it's rare for people to contribute to a city in many different roles. There are plumbers, firefighters, electricians, and bus drivers, but they usually stick to their primary profession. Also, the boundaries of responsibility are clearer. It's relatively uncommon that
- a bystander starts fixing power cables on the street
- a passenger decides that the public transport could use a facelift, so they quickly order the newest bus models or sign a contract with a different transportation company
- someone goes to the grocery store, hears the door creak as it opens, and immediately starts pouring oil on the rusty hinges
But unlike people in real cities, developers take on various tasks at the same time on a project: browsing the codebase looking for something, maintaining modules they may not know well, designing and building new components, or fixing bugs and security issues.
While doing these activities, we wear one of many different hats, considering only one aspect of the problem. This can lead to tunnel vision, so if we are alone, even common sense cannot stop us from doing crazy things, like - merely driven by good intentions - non-electricians fixing power cables.
In real cities, change is difficult, but the power to change almost anything code-related in a given project is constantly at our fingertips, and we use this power all the time in our daily jobs. To make things worse, we can - and most of the time have to - apply changes quickly to the codebase to fix a bug or meet new business criteria.
And because in software everything is a bit blurry and subjective, many times we don't even know whether a window is broken.
Or whether there are any windows at all.
Maybe the windows would never break, but we polish them every day, because we think that if they ever get dirty, someone will come around and smash them. We fear that all it takes is one broken window to let chaos loose, with no turning back.
Many times this leads to overcompensation and overengineering.
I think this approach is as contagious as broken windows, because if the dev culture adopts the "always aim for as nice and general a solution as possible, because it surely pays off in the future" mentality, it will surely produce biased decisions. Creating unnecessarily general or polished solutions does not always pay off, as predicting the future is really hard.
Responsible Boy Scout
Don't get me wrong. I generally agree that we should aim for nice and clean solutions. But I don't like that the definitions of "nice" and "clean" are ambiguous and are many times decided individually rather than by the team, when they shouldn't be.
I think the Boy Scout Rule is a great thing to follow: leave the camp cleaner than you found it. But there is no need to tell a real scout not to rearrange the whole forest in the process, because they can't.
We need rules that keep us on the right track, and plenty of information on which the team can base its decisions.
Some key points I would like to emphasize:
- The team should discuss and agree on rules about what clean code is. If possible, use automated tools and code reviews to enforce these rules.
- Measure code quality and performance to aid team decisions. For example, SonarQube is a powerful tool to track code quality changes.
- Clarify the boundaries of responsibility related to tasks and code quality.
- Adhere to the Boy Scout Rule, but don't wander too far and try to clean the next camp too.
- Don't refactor untested code.
- Don't refactor unrelated code.
- Don't refactor code that works well and hasn't changed much lately.
- When changing dependencies or code that is likely to affect other modules, be extremely careful. Are those modules tested?
- If possible, don't decide alone to avoid biases.
- Write tests to enable further refactorings and enhancements.
- Prefer simple solutions, but plan carefully to leave room for later improvement and refactoring.
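The point about tests deserves a concrete illustration: before refactoring, a characterization test pins down the code's current behavior so the team can prove the cleanup changed nothing. Here is a minimal sketch in Python; the `slugify` function is a hypothetical stand-in for any legacy code you want to clean up.

```python
# Hypothetical legacy helper we would like to refactor one day.
def slugify(title):
    """Turn a title into a URL-friendly slug (current behavior)."""
    return "-".join(title.lower().split())

# Characterization tests: capture what the code does today, so a
# later refactoring can prove it changed nothing.
def test_basic_title():
    assert slugify("Hello World") == "hello-world"

def test_extra_whitespace_is_collapsed():
    assert slugify("  Broken   Windows ") == "broken-windows"

# Run with pytest, or call the test functions directly.
test_basic_title()
test_extra_whitespace_is_collapsed()
```

Only once such tests are green does refactoring `slugify` become a safe, reviewable team decision rather than a leap of faith.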
Problems breed problems. Trying to prevent all of them can easily create more. It seems more feasible to measure, plan, and even let some windows break.