BSC West 2015 Keynote: Better Thinking for Better Software: Thinking Critically about Software Development


Software developer Laurent Bossavit delivered the second keynote presentation, about why we need to think more critically about software development. He began by saying his intention was to make the audience question what they know—or what they think they know.

To do this, Laurent posed some questions of his own, the first from Richard Hamming, whose contributions to computer science include the Hamming code: “What are the most important problems in your field?” Hamming observed that the people who gave good, thoughtful answers to that question ended up being more successful. Laurent said there’s an obvious answer for software professionals: bugs. He said it seems silly to think of bugs as things—even the word “bugs” implies they creep into our programs discreetly while our backs are turned. In reality, of course, we as programmers introduce them (albeit unintentionally).

Laurent invited the audience to notice what debugging involves. When people test, they usually rely on past instances, mental models, and beliefs or hypotheses about causes and scenarios—in short, a great many thought processes are running in the brain. However, Laurent said, the brain itself is a “buggy” piece of “hardware.” He pointed out confirmation bias, the tendency to seek evidence that confirms your beliefs and dismiss evidence to the contrary. This bias is one reason developers should never be the only ones to test their own code: they want to believe it has no bugs.
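As a hedged illustration of how that bias plays out in testing, consider the Python sketch below (the average() helper and its tests are hypothetical): the author’s own tests only exercise inputs they already expect to pass, while a disconfirming test actively hunts for a failing case.

```python
# Hypothetical average() helper a developer might write and then test themselves.
def average(values):
    return sum(values) / len(values)  # hidden bug: fails on an empty list

# Confirmation-biased tests: every input is one the author expects to pass.
def test_happy_paths():
    assert average([2, 4]) == 3
    assert average([5]) == 5

# A disconfirming test deliberately seeks evidence that the code is wrong.
def test_empty_input():
    try:
        average([])
    except ZeroDivisionError:
        print("Bug found: average([]) divides by zero")

test_happy_paths()
test_empty_input()
```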

Few people base their knowledge about software engineering on science, heuristics, and an awareness of well-documented cognitive biases. Instead, they rely on first impressions and deep-seated assumptions. Laurent explained the “backfire effect”: When someone believes a claim and you try to use impartial evidence to debunk it, the person can come away more convinced of the claim than before. People like holding onto their preconceived notions, and this applies to creating and testing code, too.

A concrete example of where this can be dangerous in software engineering is the “cost of fixing defects” curve. You’ve probably seen it: a smooth arc on a graph depicting how little it costs to fix defects when they’re caught early in a development cycle and how that cost grows exponentially the later in the lifecycle they’re found. In reality, Laurent showed, the measured cost goes up and down in peaks and valleys. Blind trust that locating defects earlier is always better leads people to spend too much time testing early on and not test as stringently later.

On a related note, Laurent had some advice: If you want to get people to believe something, no matter how stupid, stick a number on it. Something about the perception that somebody, somewhere took the time to quantify a statement makes it much more believable to the average person.

For instance, a popular statistic in the software world is that 56 percent of all bugs identified in projects originate in the requirements phase. After some digging, Laurent discovered that this figure traces back to a single study. There is no evidence that it holds everywhere, yet it keeps getting repeated. Instead of taking it as law, developers and testers should focus on what is true for their specific projects and address each phase with fresh eyes and open minds.
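As a hedged sketch of what measuring your own project might look like, the Python snippet below tallies where tracked defects were introduced; the defects.csv file and its phase_introduced column are hypothetical stand-ins for an export from your own bug tracker.

```python
import csv
from collections import Counter

# Hypothetical bug-tracker export: each row records the lifecycle phase
# in which a defect was introduced (requirements, design, coding, ...).
with open("defects.csv", newline="") as f:
    origins = Counter(row["phase_introduced"] for row in csv.DictReader(f))

# Report your project's actual distribution instead of a borrowed statistic.
total = sum(origins.values())
for phase, count in origins.most_common():
    print(f"{phase}: {count} defects ({100 * count / total:.0f}% of total)")
```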

So, how can you take a more systematic and disciplined approach to your software projects? Laurent recommended the PDCA cycle: plan, do, check, act. Unfortunately, he said, thanks to confirmation bias those letters more often stand for promise, delay, cancel, apologize! Sometimes we fancy ourselves so clever that we dig ourselves into a hole, and you know the saying: If you find yourself in a hole, stop digging. Laurent illustrated this with a quote attributed to Albert Einstein: “We cannot solve our problems with the same level of thinking that created them.” (True to form, Laurent looked up the remark; it turns out to be a paraphrase of something Einstein said about war, not software development.)
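To make the cycle concrete, here is a minimal sketch of one PDCA turn applied to a suspected bug; the pdca_step function and its callable arguments are hypothetical, not part of any real tool.

```python
def pdca_step(hypothesis, apply_change, check_passed, revert_change):
    """One turn of the plan-do-check-act wheel for a suspected bug.
    All four arguments are hypothetical stand-ins for your own tooling."""
    print(f"Plan: {hypothesis}")   # plan: state what you expect and why
    apply_change()                 # do: make the change
    if check_passed():             # check: let evidence decide, not belief
        print("Act: keep the change and record what was learned")
    else:
        print("Act: hypothesis falsified; revert and re-plan")
        revert_change()

# Example turn of the wheel with trivial stand-in callables.
pdca_step(
    "an off-by-one in pagination drops the last row",
    apply_change=lambda: None,
    check_passed=lambda: True,
    revert_change=lambda: None,
)
```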

While it’s useful to know when information is false, debunking ideas isn’t enough. You should be constructive as well as skeptical, and treat every bug as an opportunity to learn.

The overarching message of Laurent’s keynote can be summed up by a parable he related. A rich man had been pestering a Zen master to take him on as an apprentice. He invited the Zen master to his house for tea, and the whole time he showed off his large collection of books and bragged about his extensive learning. When they sat down for tea, the Zen master began pouring a cup for the man. The tea reached the brim, but the Zen master kept pouring until it spilled over. “Why did you do that?” the rich man cried. “Your mind is like this cup,” the Zen master said. “It is so full that there is no room for anything else.”

Don’t let your minds be so full of previous knowledge that you reject new perceptions for want of space, Laurent said. Leave yourself open to new ideas and be a thinker as well as a doer, and you will be able to approach software engineering in a more effective way.
