Debugging QA on an Agile Team

Continuing my discussion on all things Agile with John Borys, in this episode we talk about QA, the oft-overlooked and frequently maligned aspect of the agile process.

As developers (and QA!) we are constantly improving (or at least we should be). The problems John and I discuss today aren’t uncommon. In fact, many organizations face challenges implementing Agile and lean principles, and find it even more challenging to scale those principles. That’s why we created ScaledConf, a conference focused on learning and scaling the tools and processes for high-performing teams.

Full Transcript

[background music]

Michael Carducci:

You’re listening to the "No Fluff Just Stuff" podcast. The JVM Conference series with dozens of dates around the country. There’ll be one near you. Check out our tour dates at nofluffjuststuff.com.

Hey, everybody. This is your host, Michael Carducci, speaking. I am continuing my discussion with John Borys. Specifically, we’ve been reflecting on agile and some of the challenges that come with that, challenges for the team making that transition, challenges dealing with the organization.

This week we’re going to talk about QA and how it relates to agile. What are some of the things that we typically do wrong? I know that’s been one of the biggest pain points for me and a lot of teams that I’ve worked on over the years.

Exactly how to get that QA piece right so that the tickets coming back don’t derail the sprint or the next sprint. It’s a very common sticking point. If this is an area that interests you at all, I hope you get a lot out of the conversation, a lot out of John’s experience and wisdom in this area.

I know there are a lot of challenges in applying some of these principles. There are just simple truths in software development. This is why I’m such a big fan of agile. It was the first methodology, the first approach, to acknowledge those realities and attempt to deal with them.

Attempt to respond to them instead of pretending they didn’t exist. This idea, for example, that the very act of delivering software changes the requirements. That we can no longer just say, "We’re going to build this entire product. And then rebuild 80 percent of it as soon as people see it, and we start getting feedback."

It’s an untenable approach. That’s why in my current business I practice Lean Startup principles, Lean development, because it allows me to build and refine these hypotheses. Basically, follow the scientific method to develop an app very quickly, being very responsive to users’ requirements.

This is easier for me because I have a very small company, but there’s a whole separate set of challenges taking these ideas and scaling them into the enterprise. Agile has historically been a difficult thing to scale up. Lean principles have been a difficult thing to scale up.

That’s actually why, on the No Fluff Just Stuff tour, we’ve created a conference dedicated to this area. Dedicated to taking these principles, learning how to scale them up, learning leadership principles, learning the skills you need to adopt these technologies, adopt the latest high-performance approaches, tools, and techniques.

If you’re interested at all in this area, if you’re facing any of these challenges, I would highly recommend going to scaledconf.com. Take a look. We’ve got an amazing list of speakers lined up, some really great content. Check it out. You’ll get a lot out of it. For now, let’s begin with John Borys.

Michael:

I’m here in New York City with John Borys. We’re talking about agile methodologies, practicing agile. Maybe one of the most maligned things is QA in agile. This is where I’ve seen so many projects go off the rails.

That we commit to a sprint and at the end of the sprint, we have everything coded. Then we start our next sprint while the last sprint is being QA’d. Then, bug reports start coming in. Now, we’re over-committed.

John Borys:

What do they say about, "Well, how do I track these?" Not how do you track the bugs, but, "How do I account for them in our velocity?" Then, actually, a ScrumMaster is going to say, "Well, you already had those stories in the last sprint."

Those bugs are things that…those belong to the previous stories. I’m not going to give you more stories to fix bugs that you should have had last time. There’s a couple of things going on there. That, by the way, is very common: to have QA be a sprint behind. Or other teams, I’ve heard, will just cut their development time in half.

Say, "For the first week of the sprint, we’re going to do development. Then, we’re going to stop, and we’re going to give it to QA for the second week. We’re going to make our stories really small so that there’s only five days’ worth of development and then five days of QA."

I say, "Well, then what do you do for the next five days?" They say, "Well, then we start on the stories for the next sprint." What?

Michael:

[laughs]

John:

Wait a second. You’re starting on stories for the next sprint? Once you get into the second sprint, guess what? You’re spending 10 days of coding, and QA’s 10 days behind. The first sprint, you thought you were good because you only did five days of coding.

Then, the next sprint, you’re starting five days earlier and you’re coding for the first five days of that sprint, so you’re doing 10 days of coding. Then you’re handing the stuff over the wall. No matter how you slice it, QA’s behind.

You rationalize something to yourself and say, "We’re doing…" Even if somehow that were possible, forget the issue that you’re starting on the sprint ahead of time. You’re not helping the problem by just saying we’re only going to code for the first half of the sprint and QA for the second half.

You’re still doing that waterfall hand-off. You’ve just shortened the iteration loop. You’ve just shortened the time between when I develop and when I make that hand-off, whether it’s at five days or 10 days.
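To make the arithmetic John is describing a bit more concrete, here is one way that schedule plays out. This is a hypothetical two-sprint timeline built from the five-days-of-development, five-days-of-QA split he mentions; the day-by-day breakdown is an illustration, not something taken from the conversation itself.

Sprint 1, days 1-5: developers code the sprint 1 stories; QA has nothing to test yet.
Sprint 1, days 6-10: QA tests the sprint 1 stories; developers start pulling sprint 2 stories early.
Sprint 2, days 1-5: developers keep coding those stories, so there are now 10 straight days of untested work; QA is still dealing with sprint 1 defects.
Sprint 2, days 6-10: QA finally gets the sprint 2 work. Each hand-off is still a mini waterfall, just on a shorter loop.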

Michael:

I would go so far as to call this an anti-pattern.

John:

Well, it’s waterfall. Code first, and then give it to QA.

Michael:

What’s the alternative? How do you functionally get this to work?

John:

I have to ask then, where’s QA? Where do they come from? How did they get there? Why are you using them?

Michael:

To give you an actual example, a lot of companies don’t have dedicated QA. A lot of places don’t. Should we be testing our own code? Should we actually have some responsibility there?

John:

I was thinking more of the other side of the coin where they do have…When you give me that scenario and say, "QA’s a sprint behind." That implies to me that…

Michael:

There is a QA.

John:

…there is a QA, and they’re a separate entity, probably managed by somebody else.

Michael:

It’s a different team, different…

John:

They’ll open tickets. The only communication you have with them is usually via that defect tracking system. Even if it isn’t, even if you see them every day in the hallway, it doesn’t matter. The point is they’re a separate team.

Michael:

That’s the first mistake.

John:

Because a separate team implies a handoff.

Michael:

QA’s not the only area where this happens. You see this with DBAs, you see this with UX. I know when I first started reading about implementing agile and implementing Scrum, the message was that all the roles are full-time roles.

I know that’s nice in theory, but in practice is that possible? How do you deal with situations where you may not need a full-time DBA or you may not need a full-time UX person on the agile team?

John:

You need generalized specialists. The DBA person, the QA person, which is what we originally started talking about, those people have to evolve. If I have to evolve, as a developer, to maybe learn some more JVM languages, learn functional programming because of Java 8, do all the things that I do.

At one point I had to learn SQL or whatever, but my specialty is Java. My specialty is Angular, doing front-end stuff, whatever. I need to do all of those other things to be, in my opinion, a well-rounded programmer.

When I was a developer — I can’t believe I just said that phrase by the way, "When I was a developer," — I wanted to do full stack, and I’ve always done full stack. I’ve been in a couple small engagements where I was relegated to the service center or relegated to the front end because that was the role that I was filling.

Full stack is always where I wanted to be. I thought it gave me the most value. It was the most interesting and challenging. We’re expected as developers to evolve. Why shouldn’t a DBA or a QA person evolve as well?

I want that DBA person full-time on my team. I want that QA person full-time on my Scrum team, but I don’t have enough work for them to do in those particular specialized roles.

I need them to be able to fill other responsibilities and help us complete stories and deliver functional software, sometimes in a sprint that might not involve their specialty at all.

With QA, it’s definitely going to involve their specialty every time, but with that DBA person I want them to be able to write other code as well. They have to be able to maybe write Java, maybe do JavaScript, maybe testing, whatever. They need other skills. They need to develop and expand that way.

The QA though, to answer your original question, the QA person…I have this talk on the No Fluff Just Stuff tour "QA in the Agile World." We talk about that. It’s got to be baked in from the beginning. I want to tear down the wall between ourselves and QA. I don’t want to throw stuff over the wall.

What happens is there’s a very adversarial relationship traditionally between QA and developers. We don’t like them. They’re a headache. "My code’s perfect, man. Don’t tell me there’s something wrong with it."

Automatically, there’s defensiveness there. But if I bring the QA person into the Scrum team, and they sit next to me and they pair program with me and offer their insights from the get-go. If they sit next to me, and ask questions, and help me uncover details in a story conversation with the product owner.

If they sit through a design session and raise concerns that might otherwise not be seen, things the developer’s kind of blind to, because they’ve got a QA perspective. All those things deliver value to the team.

Michael:

Absolutely.

John:

You might say, "I can’t pair program with a QA person."

Michael:

[indecipherable 10:48].

John:

Maybe they’re not driving, but they can certainly tell you when you don’t have enough unit tests or you’ve missed a corner case.

Michael:

I found doing exactly that, pair programming with a QA person, to be hugely beneficial, because anybody who has ever put code into staging or into the QA environment and had it insta-break knows the feeling.

Whenever that’s happened for me, I’ve learned that there are just basic gaps in my knowledge and my approach to testing my own code. Sitting with somebody and just asking these questions, "Oh, we’ll try putting this in here," even simple things like that.

What if somebody doesn’t fill this field in? What if the validation’s turned off? What if they put special characters in here? That’s a very simplified example, but this type of thinking, which is very different to our type of thinking…

As a programmer, I think A to B. I need a thing that does this, OK, it does that. Well, what about when people do stupid things, which is always? That’s a very simple example of how powerful sitting next to a QA can be.
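To ground Michael’s point in something concrete, here is a minimal JUnit sketch of the kind of corner cases a QA pairing partner tends to raise. The UsernameValidatorTest class, the isValidUsername helper, and the validation rules are all hypothetical, invented purely for illustration; they are not from the conversation.

```java
import org.junit.Test;
import static org.junit.Assert.*;

public class UsernameValidatorTest {

    // Happy path: the case the developer already thought of.
    @Test
    public void acceptsOrdinaryUsername() {
        assertTrue(isValidUsername("mcarducci"));
    }

    // The QA-style questions from the conversation above:

    // "What if somebody doesn't fill this field in?"
    @Test
    public void rejectsEmptyField() {
        assertFalse(isValidUsername(""));
    }

    // "What if the validation's turned off?" (e.g. client-side checks bypassed)
    @Test
    public void rejectsMissingValue() {
        assertFalse(isValidUsername(null));
    }

    // "What if they put special characters in here?"
    @Test
    public void rejectsSpecialCharacters() {
        assertFalse(isValidUsername("rob'); DROP TABLE users;--"));
    }

    // Hypothetical stand-in implementation so the sketch compiles on its own.
    static boolean isValidUsername(String s) {
        return s != null && !s.isEmpty() && s.matches("[A-Za-z0-9_]{3,32}");
    }
}
```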

John:

Then, they can still do the other QA things, but they do them continuously with you. If there’s a particular thing where it’s me and you pairing and the QA’s got some time or there’s somebody else available, they can do the pairing with that person and work on automated testing, BDD scenarios.

They can help you plan out your steps for your step classes in behavior-driven development with Cucumber and [indecipherable 12:23]. There’s a lot of other things that they can help with along those lines.
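For anyone who hasn’t worked with it, here is roughly what a step class looks like: a minimal, hypothetical Cucumber-JVM sketch. The Gherkin scenario, the ShoppingCart class, and the step wording are invented for illustration; recent Cucumber-JVM releases use the io.cucumber.java.en annotations shown here, while older releases used cucumber.api.java.en.

```java
// Gherkin scenario (would live in a .feature file the QA person helps write):
//   Scenario: An empty cart cannot be checked out
//     Given an empty shopping cart
//     When I attempt to check out
//     Then I should see the error "Cart is empty"

import io.cucumber.java.en.Given;
import io.cucumber.java.en.When;
import io.cucumber.java.en.Then;
import static org.junit.Assert.assertEquals;

public class CheckoutSteps {

    private final ShoppingCart cart = new ShoppingCart();
    private String error;

    @Given("an empty shopping cart")
    public void anEmptyShoppingCart() {
        cart.clear();
    }

    @When("I attempt to check out")
    public void iAttemptToCheckOut() {
        try {
            cart.checkout();
        } catch (IllegalStateException e) {
            error = e.getMessage();
        }
    }

    @Then("I should see the error {string}")
    public void iShouldSeeTheError(String expected) {
        assertEquals(expected, error);
    }

    // Hypothetical minimal domain class so the sketch stands alone.
    static class ShoppingCart {
        private int items = 0;
        void clear() { items = 0; }
        void checkout() {
            if (items == 0) {
                throw new IllegalStateException("Cart is empty");
            }
        }
    }
}
```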

Michael:

That really was the big question that I wanted to get to. You’ve answered it, and you’ve nailed it. The big challenge with having a dedicated QA on the team is there’s not always something to test. I might not have a testable artifact until very late in the sprint. The whole team might not have a "testable" artifact until the very end of the sprint.

John:

You’re testing continuously as a developer. There’s always stuff to test, and if you pair with that QA person…

Michael:

Then you’re getting that value all the way through.

John:

Yes, and it’s a big deal.

Michael:

You’re also preempting a lot of these tickets that, if you got them two days before the end of the sprint, would be a problem, but sitting with the QA…This has actually been hugely eye-opening to me, because I haven’t had the opportunity to sit in that talk yet. This is an area where every "agile" team that I’ve ever been on or been part of has struggled.

John:

Glad I could help.

Michael:

That’s huge. We all have a shelf of programming books or shelves of programming books.

John:

My wife threw them all out.

Michael:

Oh, yeah?

John:

It’s all eBooks now. Kindle.

Michael:

I have a few. I got rid of almost all of mine. They do go out of date after a while anyway, but there’s a handful that I’ve kept. There’s even a smaller number that I can count on one hand that I go back to again and again. One of them is "The Agile Samurai."

John:

Wow. That blows me away.

Michael:

Really?

John:

How old is that?

Michael:

It’s not that old, I don’t think.

John:

I don’t think so either because I just bought it day before yesterday.

Michael:

Are you serious?

John:

I’m not kidding at all. It’s on my Kindle.

Michael:

I highly recommend it.

John:

Not my Kindle, it’s on my iPad.

Michael:

It’s a light read, not a heavy read. It’s not a very weighty tome. It’s actually entertaining to read, too. You read through it, and there are so many scenarios that ring true.

One of the ideas that Jonathan Rasmusson, I think I’m pronouncing that right, espouses is this idea of an Agile Inception Deck. It’s an exercise to get everybody on the same page, where you answer a bunch of basic questions.

The deck is something like 11 slides, and one of them is who’s on the team. It talks about QA, and analysts, and UX, and it says, "Who’s involved here?" In the past, on teams, we’ve glossed over that. "Well, so-and-so will do that, I guess. We’ll send the QA to the QA department."

John:

Don’t throw it over the wall, dude.

Michael:

That’s the takeaway. I wish there was more about this to talk about. If you’re going to commit to being an agile team, then commit to being an agile team, not a bunch of agile programmers.

John:

No, generalized specialists. Then, everybody is considered a dev member. The QA person is a developer. We give everybody the same level, but there are different times where different people with different specialties are going to move to the front.

Michael:

They’re going to shine.

John:

That’s why it’s a self-organized, self-managed team. Hey, let’s hit Central Park before I have to fly out of here.

[background music]

Michael:

All right. Thank you so much for your time and attention. I’ve got John here. We’re in New York City. We’ll be on the road. We hope to see you at one of the shows soon.

John:

I’m going to get some [inaudible 15:39].

Michael:

Or at least a hot dog or deli sandwiches.

At No Fluff Just Stuff, we bring the best technologists to you on a roadshow format. Early-bird discounts are available for the 2016 season. Check out the entire show lineup and tour dates at nofluffjuststuff.com. I’m your host, Michael Carducci. Thanks for listening and stay subscribed.
