Thursday, November 26, 2009

Sprint Planning Insights

We typically start Sprint planning by moving user stories from the Product Backlog to the Sprint Backlog. However, we need to take into consideration that:
  1. User stories can be split; in fact, they should be split so that each fits in a three- to five-day time frame
  2. User stories in the Sprint Backlog must include all necessary work: development, infodev, and QE (Quality Engineering) tasks all need to be considered
  3. User stories should be delivered on a weekly basis; we should avoid at all costs delivering all user stories at the end of the Sprint. On the contrary, our target should be to deliver small increments of implemented and tested functionality each week (see the bar chart below)


A collateral benefit of weekly deliverables is that they help distribute work evenly during the Sprint (the sustainable pace concept), avoiding peaks at the end of the Sprint that usually leave user stories implemented but not tested.



I think the key for this to happen is good planning: good in the sense of team involvement, not in the sense of describing tasks to the last detail. It's very common for planning to be more of an individual effort focused on estimating hours as accurately as experience and guesswork permit.

The Scrum approach to this is quite different. First, planning is a team activity where everybody has a stake. Second, it's divided into two parts: one for understanding what has to be done for each user story, and the other for providing time estimations. Both sessions combined should not take more than eight hours; it's recommended that the team invest four hours tops in each part.

The first part is also a good opportunity for developers and QEs to communicate their understanding of what needs to be done; feedback from the whole team generally increases everyone's understanding of what will be built and how it will be tested.

For the part where time estimations are provided, I've been working on an approach that might work. It's no magic recipe that can be extrapolated to all teams, but its essence can certainly be adapted to a team's particular needs.

Let's start by recognizing that estimations will always be inaccurate, no matter how much effort and time you invest. Further, estimating only hours is by far the most inaccurate way to estimate task duration.

An alternative is to estimate size and complexity. My preferred technique for this is a vote by show of hands. Once size and complexity are defined, next comes priority: again, the team needs to decide what to tackle first. This goes hand in hand with Sprint Backlog prioritization.

Practical experience shows that having a target date for implementation helps QE prepare to receive implemented user stories. It's also a good idea to estimate this date.

Hours come next, and again the team should vote to assign them to tasks. This vote takes into consideration the parameters already estimated and hopefully leads to more educated guessing. Again, accuracy is not what we're looking for.

Lastly, team members can start to volunteer for the work they want to do in the Sprint. Work balancing should be oriented toward balancing complexity and size rather than just hours. Below is a spreadsheet that summarizes this approach:
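To make this concrete, here is a minimal sketch of what such a spreadsheet could look like as a data structure; the column names (size, complexity, priority, target week, hours, owner) are my own illustration of the parameters discussed above, not a prescribed format:

```python
# A minimal sketch of the estimation "spreadsheet" as a data structure.
# Field names are illustrative assumptions based on the parameters above.
from collections import defaultdict
from dataclasses import dataclass

@dataclass
class TaskEstimate:
    story: str
    task: str
    size: int          # relative size, voted by show of hands (e.g., 1-5)
    complexity: int    # relative complexity, also voted (e.g., 1-5)
    priority: int      # 1 = tackle first; aligned with Sprint Backlog order
    target_week: int   # Sprint week in which the story should be delivered
    hours: int         # educated guess, informed by size and complexity
    owner: str = ""    # filled in when somebody volunteers for the task

def workload_by_owner(tasks):
    """Summarize size, complexity, and hours per person so the team can
    balance complexity and size rather than hours alone."""
    totals = defaultdict(lambda: {"size": 0, "complexity": 0, "hours": 0})
    for t in tasks:
        if t.owner:
            totals[t.owner]["size"] += t.size
            totals[t.owner]["complexity"] += t.complexity
            totals[t.owner]["hours"] += t.hours
    return dict(totals)

sprint_backlog = [
    TaskEstimate("Login story", "Implement form", 3, 2, 1, 1, 8, "Ana"),
    TaskEstimate("Login story", "Automated tests", 2, 3, 1, 1, 6, "Luis"),
    TaskEstimate("Reports story", "Export to CSV", 4, 4, 2, 2, 12, "Ana"),
]

print(workload_by_owner(sprint_backlog))
```

A plain spreadsheet works just as well, of course; the point is that size, complexity, priority, and a target delivery week get captured alongside the hours, so volunteering and work balancing can use all of them.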

Wednesday, November 25, 2009

And Who Will Guard the Guardians?

Another interesting question that I heard the other day: does the code produced by the automation guys have to be code reviewed and unit tested?

Again, some analysis before answering. Automation code runs against a product and automatically executes test cases that would otherwise have to be executed manually. There's an investment in building automated testing suites, but it greatly pays off when automation can be executed repeatedly, in a fraction of the time and cost it would have taken to do it manually.

Automation can be used as part of a Build Verification Test that quickly smoke tests builds to check whether anything has been broken. Automation can also be extended to test product functionality, and here's where the problem starts: what if automation is not catching errors?

Many times errors are not caught because the original test case was not well designed or is outdated, but on many more occasions automation fails because it has its own bugs. In past projects I've heard several times about errors that were easily reproduced manually yet passed the automated tests. This was of course a huge alarm sign, pointing to automated suite code that had been poorly tested, if tested at all.

Thus, automation can be hindered if good development practices are not applied to all the code that the automation team produces. After all, the automation guys are still developers producing code that needs to be tested by a third party. Going back to the question, this is my twofold answer: first, the automation guys should unit test and code review the code they produce; second, Quality Engineers should test automated test cases and suites just like they test other software.
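As a small, hypothetical illustration of the first point: helpers inside an automation suite can (and should) get their own unit tests, just like product code. The parse_build_status helper and the log format below are invented for the example.

```python
# Hypothetical helper from an automation suite: it parses a status line from a
# made-up build log so the suite can decide whether to run the full checks.
def parse_build_status(line: str) -> dict:
    name, status = [part.strip() for part in line.split(":", 1)]
    return {"build": name, "passed": status.upper() == "PASS"}

# Unit tests for the helper itself: the automation code is treated as
# production code (pytest-style test functions with plain asserts).
def test_passing_build_is_detected():
    assert parse_build_status("nightly-42: PASS")["passed"] is True

def test_failing_build_is_detected():
    assert parse_build_status("nightly-43: fail")["passed"] is False
```

The second half of the answer, QEs testing the automated suites themselves, works the same way as testing any other software: run the suite against builds with known, manually reproduced bugs and check that it actually flags them.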

Tuesday, November 24, 2009

And Who Appoints The Product Owner?

This is a very interesting question that I think deserves some analysis before answering it. Let's start by saying that the Product Owner is usually appointed by the organization and not by the team.

Even though Scrum literature does say that the Scrum Team is self-governed and can appoint its own Scrum Master, it doesn't say much about who appoints the Product Owner or who could decide to remove him if necessary. My recommended rule of thumb would be this: the Product Owner, originally appointed by his organization, could be removed if the team agrees and presents a business case that justifies the decision. Upper management endorsement would be required, though.

In my experience, Product Owners have a great deal of talent and experience dealing with clients; they're part of the sales force, or more precisely, the advance team with clients. They have a great ability to convince people, both clients and in-house, and are in essence business-driven individuals.

So, as you can guess, these types of individuals are not abundant. Besides, Product Owners require deep technical knowledge, not only good selling skills.

I guess the question has some triggers: why would a team be thinking of changing its Product Owner? Is it because of a lack of communication? Is it because the Scrum Master is not fulfilling his role? Before even thinking about changing somebody, my advice would be to drill down to the root cause of the problem.

Monday, November 23, 2009

The Scrum Dream Team

I was thinking of some of the characteristics (I guess the word qualifications doesn't apply in this context) of a great Scrum team, and I came up with a list:
  1. It has to be self-directed, meaning that it has to be capable of getting into trouble and creating its own ways to straighten things out
  2. It has to be self-governed, meaning that it shouldn't wait for managers to tell it what to do
  3. It works based on agreement more than on imposition
  4. It respects everybody's personalities
  5. It really cares about everybody's well-being
  6. It reaches consensus because its team members don't have an "I win, you lose" mentality
  7. It creates a friendly environment. A note here: I come from a Latin culture where friendly really means friendly, as in people eating, jogging, going out, joking, and drinking together. It's not uncommon to see teammates doing things together on weekends and after office hours; this creates close bonds that survive years after they finish a project
  8. It has to have team members ready to make concessions and reach agreements rather than go into a quarrel
  9. It has to have a good buffer (a.k.a. the Scrum Master) against external interference
  10. It has to have a track record of continuous improvement
  11. It has to be willing to learn new things

Friday, November 20, 2009

Scrum Masters and Referees

Watching a good football game (soccer in American sports jargon) the other day, I realized how much fun you can have watching a couple of excellent teams play. I chanted a lot and the team I was supporting won (of course I had no power to influence the score, but there was a lot of fun screaming, jumping, and singing).

After the match I started to think of some analogies between football and Scrum teams. Both are self-organized teams that work together in spite of their individual stars. Both have roles, or specialties I should say, like the goalkeeper or the inside forward. But in the end anyone can score and change the course of the match in a split second.

But more importantly, there is this one character that nobody likes that much: the referee. And here I found some interesting analogies with the Scrum Master:
  1. The less you see of the referee, the better the match is being played (unless of course you have an incompetent referee afraid of doing his job)
  2. The fewer interventions a referee is forced to make, the better the teams are self-governing
  3. Referees solve complicated situations that can easily escalate into crises
  4. They use a mix of gentle and strong hands to keep peace on the playing field
  5. They don't appear very often, but when they have to, they're not afraid to show some muscle to enforce their decisions
  6. They are not the stars of the show; the players are

In the end you might ask: why does a match need a referee? To rule the game or to facilitate it? See the similarities with the Scrum Master role.

Friday, November 13, 2009

Metrics and Process Improvement

It's not uncommon to hear that managers are interested in improving productivity in their teams; metrics-wise, this means that the indicators they're looking at should start to show bigger and better numbers.
Metrics are commonly applied to measure processes, which in turn reflect internal team organization policies and practices. Following this rationale, managers should start by improving processes if they want better metrics. But what if processes are something you can't measure and, consequently, can't improve?
Process Definition
According to Wikipedia a process in engineering is "engineering which is collaborative and concerned with completing a project as a whole; or, in general, a set of transformations of input elements into products with specific properties, characterized by transformation parameters".
Processes in Manufacturing
This definition stresses the point that a process implies a set of transformations; further, a process in manufacturing has some interesting characteristics, to name a few:

  1. It's standardized, meaning that it has a predefined set of steps that have to be followed by workers. Conversely, deviation from the standard produces defects

  2. It's repetitive. This is key in factories and on shop floors, where journeymen perform the same tasks again and again. Perfection comes from repetition, and seniority is based on the number of times a journeyman has executed a task. Repetitive tasks are also great for passing knowledge from journeymen to apprentices, and of course also great for foremen supervising work on shop floors. However, repetition cuts off creativity and innovation.

  3. It can be extrapolated: the X process in the Y factory can be documented in recipe format and then sold for use in the Z factory. This rationale of course doesn't consider that the same process in the Y and Z factories will be implemented by different teams. Again, the focus is on the process, disregarding the human factor.
Processes in Services
Processes in service industries like fast food franchises or banks involve humans following rules and procedures to provide services to other humans; however, some characteristics of this interaction can be similar to what happens in manufacturing:
  1. It's standardized: there are standards for food preparation, quality, and service time. Franchises are especially prone to setting standards, but even the old mom-and-pop type of restaurant has standards that its employees have to meet.
  2. It's repetitive: unless you decide to go to a very expensive restaurant, you'll almost always get the same food (good or bad, no matter) that comes out of the same process the restaurant employees followed to prepare it.
  3. It can be extrapolated: otherwise there wouldn't be fast food franchises that can offer the same product with the same quality (again, good or bad) in different parts of the world.
The Process Dilemma in the Software Industry
By this time you might have realized that manufacturing or service processes can't be directly translated to the software industry; some reasons are:
  1. The software development industry requires a high degree of innovation and creativity. This introduces a lot of variability into processes; for instance, a test case can be executed by two different quality engineers with different results: one might execute the test case in two hours and find no bugs, whereas the other might execute it in half the time and report high-severity bugs. Is the first quality engineer less effective than the second? Of course not; many factors like expertise, product familiarity, or technology knowledge make the difference. More importantly, the human factor plays a crucial role here.
  2. Work is essentially not repetitive; maybe some parts of testing are, but development certainly is not. Programming languages have a few well-defined structures like repetition, branching, and comments. However, these are building blocks that you can use in an uncountable number of ways to achieve the same results. Of course there are good practices and rules to follow, but human creativity is certainly more valuable than repetition, at least in this line of business.
  3. Recipes don't work anymore: you can't extrapolate processes from one development team to another because there are too many people-related variables that you can't control. For instance, you can't expect the same degree of motivation and commitment in two different teams; again, soft factors like those are what prevent process extrapolation and, consequently, standardization.
So, it seems like old project management techniques based on process standardization are not valid for this industry, eh? Even more importantly, if you can't standardize the process, how could you improve the metrics? I guess the question should be: is it really worth trying to look for processes and metrics applicable to the software industry? My short answer would be no; my long answer would be that we could look for a different type of metric, one that originates from agile processes.
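One example of such an agile-originated metric (my own illustration; the post doesn't prescribe a specific one) is velocity: the story points a team actually completes per Sprint, used to forecast what the team can take on next rather than to judge productivity.

```python
# Velocity as a simple agile-originated metric: story points completed per
# Sprint, averaged over recent Sprints to forecast the next commitment.
# The numbers below are made up for the example.
completed_points = [21, 18, 24, 20]  # last four Sprints

velocity = sum(completed_points) / len(completed_points)
print(f"average velocity: {velocity:.1f} points per Sprint")

# Useful for planning the next Sprint, not for comparing teams or individuals.
print(f"suggested commitment for next Sprint: ~{round(velocity)} points")
```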
Scrum Coming to the Rescue
What can Scrum bring to help solve the dilemma? For starters, it doesn't prescribe a process/metrics measurement approach, which, as I've described before, simply isn't applicable in this industry.
Secondly, Scrum brings a shift in focus from control to empowerment, from contract to collaboration, and from documentation to code. This highly pragmatic paradigm requires many things to work, things like self-discipline and upper-management support, but undoubtedly empowering teams is the cornerstone.
Just imagine having teams that can create their own processes based on their needs, that can discard or modify processes that are no longer applicable, and that can achieve impressive results with no formal process in place. For some this might sound like a project manager's worst nightmare, but for a Scrum purist this could be heaven.
Self-directed teams are essentially highly committed and innovative (they have to be, otherwise they wouldn't have chosen the Scrum way), work at a sustainable pace, and create on the fly whatever process they need. Of course, processes come from consensus, and this in turn generates internal commitment and motivation. In time this produces hyper-productive teams with competitive individuals looking for challenging stuff to work on in the next Sprint.

Wednesday, November 11, 2009

What a Burndown Chart is Not

A burndown chart is not:
  • a fortune teller's crystal ball that you can learn to use to predict how things will occur in the future
  • a time machine that will allow you to travel back and forth in time to fix things or foresee pitfalls
  • a single chart that concentrates all the necessary information for you to see project and team status at first glance
  • a chart that is magically drawn by somebody every day before you come to the office
  • an accurate and non-misleading indicator
  • a complicated tool that takes too much effort to understand and maintain
  • a converging-point tool that the team will use to agree on something
  • a bargaining chip that managers can use to negotiate with the team
  • the ultimate indicator for measuring a team's productivity
Logically, the question would be: what is a burndown chart, truly? I'll save the answer to that question for another post; stay tuned.

Wednesday, November 4, 2009

Sprint Planning Meetings

One of the biggest confusions in the Agile community is about the need for Sprint planning. Some people tend to think that since Scrum, for instance, is all about adaptation, no planning is required.

In order to clear up this confusion, it is necessary to say that planning is not the same as detailed planning. The big difference is that in detailed planning we look for concrete data about what will be done, how, and by whom during the Sprint. On the contrary, when we do Agile planning, we're just interested in understanding what needs to be done but not exactly how.

Planning sessions are also a great opportunity to question the Product Owner, who in turn should look for the answers with the end user or customer.

During the second part of the planning stage, it is necessary for the team to estimate complexity for all user stories that will be moved into the Sprint Backlog. Estimating complexity is a fun exercise and an excellent opportunity for team members to make their understanding of the work public. One of the great benefits of this part of the planning meeting is that it fosters communication among team members; explaining what one thinks one will try to build in the Sprint is a perfect vehicle for understanding it better before even starting to work on it.

Finally, planning meetings should include QA work for testing and validating fixes. One common pitfall is to just plan for implementation but not for testing.

Tuesday, November 3, 2009

Filling QA's Plate


Many people tend to believe that having QA overbooked for the whole Sprint is a good thing; in reality, it's a Scrum smell. From a capacity standpoint it might make some sense, but from an Agile perspective this approach has several limitations, to name a few:
  • Overbooked personnel have no breathing room for creativity
  • No creativity implies repetitive work, and repetitive work is the enemy of reflection
  • No reflection cuts off continuous improvement
Further, Lean-wise, having QA working at full capacity all the time creates waste: work piles up waiting to be processed, and processed work is blocked from passing to the next server in the line because that server is also fully booked.
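Here is a toy simulation (my own, not from Lean literature) of that effect: when a single QA "server" is booked at roughly 100% of average demand, normal day-to-day variability makes the pile of implemented-but-untested work grow, while a little slack keeps it near zero.

```python
# Toy queue: items get implemented every day and wait for QA to test them.
# With capacity equal to average demand, variability alone makes the backlog
# of untested work pile up; a bit of spare capacity keeps it in check.
import random

def simulate(days, avg_arrivals, capacity):
    random.seed(1)  # fixed seed so the illustration is reproducible
    waiting, history = 0, []
    for _ in range(days):
        waiting += random.randint(avg_arrivals - 1, avg_arrivals + 1)
        waiting = max(0, waiting - capacity)
        history.append(waiting)
    return history

print("fully booked:", simulate(days=20, avg_arrivals=3, capacity=3))
print("with slack  :", simulate(days=20, avg_arrivals=3, capacity=4))
```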

Lastly, having QA at full capacity will break the sustainable pace principle, because engineers will simply quit the project if we make them work at this killing pace during every Sprint. The ultimate goal should not be making full use of resources from a capacity perspective; on the contrary, nobler goals like teamwork and team empowerment should be pursued.

Monday, November 2, 2009

What Does QA Do During the Sprint?

This has been an interesting question and debate topic. Let's start by saying that in Scrum, teams are not divided into development and QA; everybody is part of the whole team and takes an active part in the Sprint. Further, QA's involvement goes through all Scrum meetings, artifacts, and roles. For instance, a Quality Engineer could be appointed Scrum Master or even Product Owner.

If we think about planning meetings, QA should participate in user story definition and in understanding what is required. Moreover, QA has to create the acceptance criteria for all tasks. Also during the planning meetings, QA participation is crucial when estimating task complexity and duration. One important note here: Quality Engineers have to vote during planning poker meetings.
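As a small, hypothetical sketch of what such a planning poker round might look like (the card deck and the "discuss the outliers and re-vote" rule are common conventions, not something this post prescribes):

```python
# Toy planning poker round: developers and Quality Engineers all vote with a
# card; if the votes are spread too far apart, the outliers explain their
# reasoning and the team re-votes. Names and numbers are invented.
CARDS = [1, 2, 3, 5, 8, 13]  # a common Fibonacci-like deck

def poker_round(votes):
    """Return the agreed estimate, or None if the team needs to discuss."""
    values = list(votes.values())
    low, high = min(values), max(values)
    if CARDS.index(high) - CARDS.index(low) <= 1:
        return max(values, key=values.count)  # close enough: most common vote
    return None  # too much spread: low and high voters explain, then re-vote

print(poker_round({"dev-ana": 5, "dev-luis": 5, "qe-maria": 8}))   # -> 5
print(poker_round({"dev-ana": 2, "dev-luis": 3, "qe-maria": 13}))  # -> None
```

The important detail from the paragraph above is simply that the QE's card counts like everyone else's; a big gap between development and QE votes is exactly the kind of discussion the planning meeting is for.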

Remember that what we're estimating is the complexity and time of tasks that have to be completed during the Sprint, and when I say completed I'm referring to the Definition of Done that has to be agreed upon by the team. One very common mistake is to include only development's estimations for tasks; as a result, tasks get implemented but not tested, or tested with bugs still open.

During the Sprint, QA has to participate in the Daily Scrum Meeting to report its status and blockers. One common Scrum smell is QA reporting that it doesn't have anything to test. Typically QA has almost nothing to do during the first weeks of the Sprint, but in the last days it's suddenly overwhelmed. Again, this is a consequence of bad planning and a violation of the "Sustainable Pace" principle. Workload should be evenly distributed for both development and QA during the Sprint, avoiding killing hours of work at the end just for QA.

Also, at the end of the Sprint, QA participates in preparing and presenting the demo to clients. Further, during the Sprint Retrospective, QA has to reflect on its work and its interaction with development and make the necessary adjustments for the next Sprint. One word of caution though: don't wait for the Sprint Retrospective to adapt; adapt as soon as you feel the need.

Why Is Scrum so Popular?

I was thinking the other day about why Scrum is such a worldwide phenomenon in terms of popularity. I mean, why is it Scrum that everybody seems so eager to learn about?

One or two steps back: other frameworks and thinking tools like XP or Lean are tremendously good; their principles are so down to earth that you should absolutely follow them in whatever project you start. Further, there's certainly a lot of overlap among Agile, Lean, XP, Scrum, and even Kanban, but at the end of the day Scrum seems to be gaining more popularity than the other Agile flavors.

If you like books and read Beck, Jeffries, the Poppendiecks, Schwaber, and others, you'll find that all are excellent, but somehow Scrum succeeds again in attracting more visibility.

So, ready to unravel the mystery? My theory, and please take it as just a personal belief without hard data behind it, is that what has propelled Scrum to the top is the certification program. If you think about it, you'll see that there are around a hundred Certified Scrum Trainers around the world who have to pay a $7,000 annual fee to maintain their certification status and still make a profit out of it. This small army of CSTs is intensely active organizing courses around the globe, and as a result we currently have around 60,000 Certified Scrum Masters.

CSMs pay close to a grand in the US for a two-day course. Of course that is well-invested money, but certifications are not just for show; they have to have a practical use. CSMs do their best to introduce Scrum in their teams and organizations, but soon enough they discover that almost nobody beyond them has heard of Scrum or is willing to adopt it. What do they do? Convince more people to take the CSM course and keep preaching the Scrum word in their organization.

As a result we have a growing community that spreads very quickly; some people have even called it "Pandemic Scrum". Of course there's a business model and people making a profit from it, but what the heck, Scrum is a great framework that is making IT people happier and more productive.