I found the blog of Bredex, a small company in the north of Germany where I worked ten years ago. They started blogging last year and have quite a few nice posts. One post discusses whether feature driven development is a double-edged sword. Alex wrote that in one project they experienced that focusing on feature delivery caused problems with software quality. Refactorings that must be done to keep the software maintainable are not done, because the team wants to deliver features. Yes - that is the main goal of agile software development: delivering high business value quickly.

The recent financial crises should teach us another aspect: we can live a high-quality life at the expense of the future. Some like to overspend and increase their debt. You can find the same in software projects. If you try to deliver as many features as you can, you will increase your technical debt. Even without much pressure from the business department, a team can be overcommitted and will deliver features at the cost of quality. The bad quality can be visible to the customer, due to too many defects in the delivery, or it can be invisible, due to an awkward design. The latter is technical debt. The former - high defect rates - is usually tackled with conventional quality assurance. Technical debt can be tackled with continuous refactoring.

But how can you measure your technical debt? With automated tools that output software quality metrics. One of these tools is dependometer - Valtech's open source solution. Another very good tool is Sonar from Codehaus.

If you are planning a sprint, you have delivered 15 story points, and your technical debt is not increasing, then you are running at a reasonable velocity. But if the quality metrics show an increase in technical debt, then you are going too fast: put fewer story points into your next sprint and plan tasks to refactor and clean up the code.
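This planning rule can be sketched in a few lines of code. Everything here is a hypothetical illustration: the `plan_next_sprint` function, the debt numbers, and the 10% reduction are my assumptions, not the output of any metrics tool.

```python
# Sketch: adjust the next sprint's commitment based on the technical-debt
# trend reported by a quality-metrics tool. The function name, the sample
# debt values, and the 10% reduction rule are illustrative assumptions.

def plan_next_sprint(last_velocity, debt_before, debt_after):
    """Return (story_points, refactoring_planned) for the next sprint."""
    if debt_after > debt_before:
        # Debt grew: the team was too fast. Commit fewer points and
        # reserve room for refactoring and clean-up tasks.
        return int(last_velocity * 0.9), True
    # Debt stable or falling: the current pace is sustainable.
    return last_velocity, False

points, refactor = plan_next_sprint(last_velocity=15,
                                    debt_before=120.0,
                                    debt_after=135.0)
print(points, refactor)  # 13 True
```

The point is not the exact formula but that the debt trend, not the raw velocity, decides whether the commitment goes up or down.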

Some questions pop into my mind now: How do I explain this to the customer? If I report the velocity, they will see it drop. So better not to report the velocity at all? Or start at a low pace, assuming that you can get faster in later sprints? Maybe we can report the quality metrics too - defect rates and technical debt. It really depends on the customer's nature.

I have just uploaded the source code of my Motif interface builder to SourceForge. Anyone who is interested can find it here.

In the current project the customer wants a simple Performance Compare Test. The customer wants to be sure that the changes we made in the software will not worsen its performance. Since a full-blown load and performance test takes a lot of effort, this Performance Compare Test should be very simple.

The solution I found is the following: we use Selenium IDE to record and play a script. The script is run on both the old and the new version of the software. Because we only want to see whether our changes have worsened the performance, this is acceptable. Selenium IDE has a button "Play with Selenium TestRunner". In this mode Selenium logs timestamps, which we use to measure how long it takes to click through the application. The following image shows the log from Selenium.

Selenium Log
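Extracting the elapsed time from such a log can be automated. A minimal sketch, assuming log lines prefixed with an HH:MM:SS timestamp - the exact log format varies between Selenium versions, so the regular expression and the sample lines below are assumptions to adapt:

```python
# Sketch: compute the elapsed time of a recorded run from Selenium log
# output. Assumption: each relevant line starts with an HH:MM:SS
# timestamp; adjust the regex to the format your Selenium version emits.
import re
from datetime import datetime, timedelta

TIMESTAMP = re.compile(r"^(\d{2}:\d{2}:\d{2})")

def run_duration(log_lines):
    """Return the time between the first and last timestamped line."""
    stamps = []
    for line in log_lines:
        m = TIMESTAMP.match(line)
        if m:
            stamps.append(datetime.strptime(m.group(1), "%H:%M:%S"))
    if len(stamps) < 2:
        return timedelta(0)
    return stamps[-1] - stamps[0]

log = [
    "10:15:02 [info] Executing: |open|/login||",
    "10:15:04 [info] Executing: |type|username|admin|",
    "10:15:21 [info] Test complete",
]
print(run_duration(log))  # 0:00:19
```

Running the same script against the old and the new version and comparing the two durations gives the simple before/after comparison the customer asked for.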

The advantages of this:

  • Simple to install and easy to use, even on a developer's machine
  • The rendering of HTML and the JavaScript execution are measured as well

You still have to plan the tests, e.g. what test data to use and how to click through the application. And of course - this will not replace a full-blown load and performance test.

After reading "Getting Things Done" (GTD) by David Allen, I realized that one can merge the ideas of GTD with agile software development processes. In GTD, David Allen proposes several goal levels: the vision of life leads to long-term goals, which lead to projects, weekly goals, and so forth. I've tried to match this idea to a process like Scrum, starting from the bottom:

  1. During the daily scrum meeting the team defines a daily goal. This goal is influenced by the goal of the sprint.
  2. In the sprint planning the team fixes the goal of the iteration. If the iteration length is a week, it can be compared to the weekly planning in GTD.
  3. The project goal is formed by the product owner. The goal is not well defined; it's more a vision that is refined during the iterations.
  4. So - what's the next level? It's the strategic level. It's defined by the business vision and influences the IT strategy and the activities in enterprise architecture.
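The hierarchy above is just a chain of goals, each refining the one above it. A minimal sketch of that structure - the class, the field names, and the example descriptions are all illustrative assumptions:

```python
# Sketch of the goal hierarchy described above, from the daily goal up
# to the strategic level. Names and descriptions are illustrative.
from dataclasses import dataclass

@dataclass
class Goal:
    level: str                      # "daily", "sprint", "project", "strategic"
    description: str
    parent: "Goal | None" = None    # the goal one level up that this one refines

strategy = Goal("strategic", "Business vision driving the IT strategy")
project  = Goal("project", "Product owner's vision, refined each iteration", strategy)
sprint   = Goal("sprint", "Iteration goal fixed in sprint planning", project)
daily    = Goal("daily", "Team's goal from the daily scrum", sprint)

# Walk upward from the daily goal to show how each level is
# influenced by the level above it.
g = daily
while g:
    print(g.level, "->", g.description)
    g = g.parent
```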

If we introduce an agile process, we should address all goal levels. I think it falls short if the agile process is introduced only up to the project level. But how does the strategic level influence the project level? Do we need a strategic backlog, like a product backlog?