No test - inconceivable. Not Technical Debt.

Technical Debt

The software term "technical debt" is getting a lot of play on the airwaves.  But I do not think we are using it the way Ward did when he invented the term (a metaphor) to explain to his business team why creating software fast to get feedback was a good thing - provided they were willing and able to sustain a pace of repayment on the debt of doing just-good-enough design to get product feedback.  Their form of repayment was constant refactoring: always keeping the software model moving toward the best possible business model, one that modeled the real world, and using many XP practices to enable the repayment plan of the debt they were consciously assuming.

In this vein, technical debt does not cover writing bad code, producing poor design, or skipping steps (such as testing).  Those behaviors amount to incompetent design and implementation.  They result not in debt at all but in a breach of the inherent contract between the development team and the business: the warranty of merchantability.


The warranty of merchantability is implied, unless expressly disclaimed by name, or the sale is identified with the phrase "as is" or "with all faults." To be "merchantable", the goods must reasonably conform to an ordinary buyer's expectations, i.e., they are what they say they are. For example, a fruit that looks and smells good but has hidden defects would violate the implied warranty of merchantability if its quality does not meet the standards for such fruit "as passes ordinarily in the trade".

This industry (software development) is deceiving its customers, passing off bad products as if they just carry a little bit of debt to be repaid - as if the product were a 3-year-old car with a 5-year loan: one purchases the car and the debt.  But I suggest that is not the same thing as purchasing software that has been created so poorly that it has no unit tests or acceptance tests, and little if any way to become the application it appears to be.  A car on the outside, but with a faulty electrical system, a blown engine, and a bad transmission - and really good tires and paint.

A case in point: a team wishes to upgrade the compiler that produces their application.  They have few to no tests for the application - it just works.  They then ask a test group to provide the effort required to prove that the compiler upgrade doesn't cause any bugs.  I do not think one can pass the warranty of merchantability off to the test team and expect good things to come of it - that is hot-potato behavior.  This is not technical debt.  It is incompetence.  It is inconceivable.
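If such a team wanted to start honoring that warranty themselves before the upgrade, one modest option is a characterization (golden master) test: record what the current build actually produces for a handful of representative inputs, assert exactly that, and rerun the same suite under the new compiler.  The JUnit 4 sketch below only illustrates the pattern; the legacyTotal method and its pricing rule are hypothetical stand-ins for whatever untested routine the team really owns.

// A minimal characterization ("golden master") test sketch in JUnit 4.
// legacyTotal is a hypothetical stand-in for the team's untested legacy
// routine; the pattern is what matters: capture what the CURRENT build
// produces, assert exactly that, then rerun the same suite after the
// compiler upgrade and compare.
import org.junit.Test;
import static org.junit.Assert.assertEquals;

public class CompilerUpgradeCharacterizationTest {

    // Stand-in for the real legacy code (an assumption for this example).
    static double legacyTotal(int quantity, double unitPrice) {
        double total = quantity * unitPrice;
        return quantity >= 10 ? total * 0.9 : total;  // placeholder rule
    }

    // Input/output pairs recorded from the existing binary.  The expected
    // values are whatever the code does today, right or wrong - they are
    // the "golden master" the upgrade must not disturb.
    private static final double[][] RECORDED = {
        // {quantity, unitPrice, observedTotal}
        {1,  9.99,  9.99},
        {10, 9.99, 89.91},
        {0,  9.99,  0.00},
    };

    @Test
    public void behaviourIsUnchangedAfterCompilerUpgrade() {
        for (double[] c : RECORDED) {
            assertEquals("quantity=" + (int) c[0],
                         c[2], legacyTotal((int) c[0], c[1]), 1e-9);
        }
    }
}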

"You keep using that word. I do not think it means what you think it means."
-- Inigo Montoya


Ward Cunningham on the creation of the "Technical Debt" metaphor.


I think we keep using the phrase technical debt - but it doesn't mean what we think it means.  Technical debt means a conscious decision to defer up-front design and research during product development in order to get to market with a model that is capable of becoming the desired solution, capable of eliciting the customer feedback that proves the product is evolving in the proper direction, and capable of providing a return on investment earlier.
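To make that kind of conscious deferral concrete, here is a small JUnit 4 sketch.  The ShippingQuote class and its flat-rate rule are invented for the example: the rule is knowingly too simple (that is the deferred design work), while the test pins the current business expectation so the rule can be refactored safely - the repayment - once real customer feedback arrives.

// A sketch of debt assumed on purpose, in the sense described above.
// ShippingQuote and its flat-rate rule are invented for this example.
import org.junit.Test;
import static org.junit.Assert.assertEquals;

public class ShippingQuoteTest {

    // Deliberate simplification: one flat rate for every order.
    // Repayment plan: replace with zone- and weight-based rates after the
    // first round of customer feedback, behind this same test seam.
    static class ShippingQuote {
        double costFor(double orderWeightKg) {
            return 7.50;  // good-enough model, consciously deferred design
        }
    }

    @Test
    public void flatRateAppliesToAnyWeightForNow() {
        ShippingQuote quote = new ShippingQuote();
        assertEquals(7.50, quote.costFor(0.5), 1e-9);
        assertEquals(7.50, quote.costFor(25.0), 1e-9);
    }
}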

See Also:

A Technical Debit - Collateralized Debt Obligation you should not invest in

The SQALE method is particularly devoted to managing the Technical Debt (or Design Debt) of software developments. It allows:
  • To define clearly what creates the technical debt
  • To estimate this debt correctly
  • To analyse this debt from both technical and business perspectives
  • To offer different prioritisation strategies for establishing an optimised payback plan
Managing Software Debt - by Chris Sterling
