Summary
In this episode, Rob and Michael discuss the importance of a clear definition of done in Agile development. They share real-world examples of how an unclear "done" can lead to delays and rework, and how a strong definition of done can help teams stay on track and meet deadlines.
Detailed Notes
The main topic of this episode was the importance of a clear definition of done in Agile development. Rob and Michael shared their experiences with an unclear "done" and how it can lead to delays and rework. They discussed the components of a good definition of done, including code review, automated testing, documentation updates, and verified deployment to staging and production. They also emphasized the importance of holding oneself and the team accountable for building the definition of done into the workflow.
Highlights
- A clear definition of done is critical for teams to avoid scope creep and ensure that requirements are met.
- An ambiguous "done" can lead to unfinished work, hidden bugs, and endless tweaking.
- A strong definition of done can help teams stay on track and meet deadlines.
- Code review, automated testing, documentation updates, and verified deployment to staging and production are all important components of a good definition of done.
- Holding yourself and the team accountable is key to building the definition of done into the workflow.
Practical Lessons
- Develop a clear definition of done with your team to avoid scope creep and ensure that requirements are met.
- Implement code review, automated testing, documentation updates, and verified deployment to staging and production as part of your definition of done.
- Hold yourself and your team accountable for building the definition of done into the workflow.
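The checklist-style definition of done described in these lessons can be sketched in code. This is a minimal illustration (not anything from the episode): the criterion names and the `Ticket` class are hypothetical placeholders, meant to show the core idea that a ticket is only "done" when every agreed criterion is met, not just when the code is written.

```python
from dataclasses import dataclass, field

# Hypothetical criteria -- adapt this list to your own team's standards.
DEFINITION_OF_DONE = [
    "code reviewed",
    "automated tests passing",
    "documentation updated",
    "deployed to staging and verified",
    "acceptance criteria signed off",
]

@dataclass
class Ticket:
    title: str
    completed: set = field(default_factory=set)

    def mark(self, criterion: str) -> None:
        # Only criteria the team agreed on can be checked off.
        if criterion not in DEFINITION_OF_DONE:
            raise ValueError(f"unknown criterion: {criterion}")
        self.completed.add(criterion)

    def is_done(self) -> bool:
        # "Done" means every criterion is met -- not just "I wrote the code."
        return set(DEFINITION_OF_DONE) <= self.completed

ticket = Ticket("Add login screen")
ticket.mark("code reviewed")
ticket.mark("automated tests passing")
print(ticket.is_done())  # False: docs, deployment, and sign-off are still missing
```

The point of the sketch is that the checklist lives in one shared, visible place, and anything a teammate tries to check off that the team never agreed on is rejected outright, which is exactly how the hosts describe avoiding ambiguous "done."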
Blog Post Angles
- The importance of a clear definition of done in Agile development: a case study
- How a strong definition of done can help teams avoid scope creep and meet deadlines
- The benefits of implementing definition of done into your workflow
- Real-world examples of the importance of a clear definition of done
- How to develop a clear definition of done with your team
Keywords
- Agile development
- Definition of done
- Scope creep
- Code review
- Automated testing
- Documentation update
- Deployment
Transcript Text
Welcome to Building Better Developers, the Develpreneur Podcast, where we work on getting better step by step, professionally and personally. Let's get started. Hello and welcome back to Building Better Developers, the Develpreneur Podcast. I am Rob Broadhead, one of the founders of Develpreneur, and also the founder of RB Consulting. More about that in a second. First, I want to talk about this season, this series, this episode. This season we are doing Building Better Developers with AI. We're going back two seasons ago, I think it is, grabbing a topic, throwing it into AI, and saying, what would you suggest for a podcast? Then we're basically analyzing that, and it's giving us some great things to talk about. That's where we're looking at this episode. Our title for this one is going to be Defining Done in Agile: How to Stay on Track and Avoid Scope Creep. Now, back to RB Consulting. We are a company that helps others figure out the best way to use technology. That's the best way to look at it. Just like you can do a financial audit or a security audit, you can also do a technical assessment, which is very similar to a technical audit. We're going to sit down and help you figure out what you have and what your current situation is. We're also going to sit down and talk about your business, because the most important part of using technology is how to leverage it to do what you do. We're going to help you walk through your processes. What is it that you do, in detail? Sometimes we get too much in our heads. Think about how you would explain to somebody how to tie a shoe. There are probably business things you do along that same line, where you just know how to do them. But explaining them to somebody else, which means explaining them to a computer or to technology, can be a bit of a challenge. So we're going to help you bridge that gap.
We're going to help you understand what's out there, because there's a lot out there, and we've spent a lot of time on it. We are technology agnostic, so we're going to find ways to help you take your technology junk drawer and clean it up. Through integration, simplification, automation, and innovation, we're going to find the best approach for you, that custom recipe for success, so you can have a roadmap you can execute on, or we can help you with that as well. Good thing, bad thing. This is going to be one of the goofiest ones we've had so far. The good thing today: I was sitting there eating lunch and something got stuck between my teeth, and I was like, okay, I've got to get that thing out. And it came free. The bad thing was, when it came free, part of my tooth came free with it. So I have a cracked tooth that somehow had lost its strength. It's not painful yet. I can drink hot and cold liquids; it's not causing my head to explode or anything. But it's enough that I'm going to have to go find a dentist very quickly and get all of that repaired. Sometimes the simple things turn into not-very-simple things, which is sort of the story of my life right now, much like Michael's, which he has regaled us with in recent episodes. Let's see how it's going this time as we check in with Michael. Hey, everyone, my name is Michael Meloche. I'm one of the co-founders of Develpreneur and Building Better Developers. I'm also the founder and owner of EnvisionQA, where we help startups and growing companies build better software faster with fewer problems. Our services cover software development, quality assurance, test automation, and release support. Companies come to us when they want to avoid delays, reduce bugs, and launch with confidence, whether you're building your first MVP or scaling your business.
We make sure that your software is reliable, efficient, and ready for growth. You can learn more at EnvisionQA.com. Let's see, good thing, bad thing. Last time I talked about the water issue, and that's been resolved. Good thing: we now get to enjoy the new toilets we had installed a month ago. Now that the water is working again, we can finally enjoy all the upgrades we did in the house, which we weren't able to do last time because we had no water. As far as bad things go, I've got a project that's kind of dragging out and dragging me down a little bit. But the weather's getting nice, so I'm not going to let it get me down. Yes, the weather has definitely been getting nicer. It's been awesome enough that I've actually had the windows open on a couple of mornings without dying of heat exhaustion, so it's always good. So, I dove right in this time. I followed up from a prior post, so it didn't give me any, you know, excellent idea or anything like that. I just said, hey, how about doing this? And it said, absolutely, here's a detailed breakdown. It gives us the same kind of thing we've had in the past: a suggested episode structure, with item one and some bullet points. We'll dive right in. What does done really mean in Agile? Explain the Agile principle of a definition of done, D-O-D. Contrast it with just finished coding. Why clear done criteria are critical for teams. I really want to jump to the end there: why clear done criteria are critical for teams. Because this is what I was saying: sometimes when we start a project and we say we need to make sure that one of the first things we do is define what done is, people look at us like we've got three heads or something. The thing about done is that there are varying understandings of what done, in a software project particularly, means.
Does done mean that you just wrote some code? Does it mean that you wrote unit tests with that code? Does it mean it's gone through full QA? Does it mean that it's been deployed? Does it mean that the user is using it? There are a lot of different ways you can look at done. And within a development project, done may include things like: have the unit tests tied to it been written? Has it been properly commented or documented? Has it been committed to version control? Has it been merged into a branch, or something of that nature? Has the ticket that originated that task been moved through its process and moved to complete, so that it is done? Has it been signed off on? Things like that are very much part of your development process and standards, your team or even your corporate process and standards, that need to be taken into consideration when you consider what done is. In some places, done may mean that it has to actually go through a code review and a security analysis review and all of these other things that are way, way more than done in the "hey, I wrote the code and I tried it on my local machine and it works" sense. And I'm using air quotes everywhere here, for those who can't see it, because that's sort of how it is. What really is done? We need to make sure we define that, because that is the target for whatever we're doing. If we ask somebody, is it done, we're not going to accept "well, sort of," or "yeah, but," or "it's kind of." You need to know: is it done or is it not? Because that's going to be a key part of scope creep and estimation and things like that. So where do you want to pick this one up? So, yeah, you touched on a lot of things.
I'm going to go with contrasting it with just finished coding, because one of my biggest pet peeves is when a developer does a lot of work and says they're done. They push the code up, and then it gets to testing, and the tester sits there reading the ticket and asks, what is done here? What did you do? It's not clear in the requirements what they were supposed to do. So what did you work on? When you're working on the requirements, the definition of done needs to be clear for everyone who reads the ticket, because if you're working on the ticket, working on this change, you want to make sure the change is what the ticket implies. There have been times where I have made mistakes: I read the ticket one way, someone else reads it another way, and what gets implemented is not what the requirement for the definition of done was. You run into these situations where the requirements may be clear, but not clear enough to really define the definition of done. Case in point: I'll just pick on the login screen, because login screens are just about everywhere. You could have a situation where I have a login screen, and you were basically told, hey, set it up so that a registered user can log in with a username and password. Cool. I write the code, I can log in. Now it gets to the tester, and they're going to read that as, okay, so I can log in with a username and password. It does not specify things like case sensitivity, special characters, things of that nature. So if they go to test the login the way typical login security works, which has been around for a while, they're going to break things. They're going to think, why is this not working as expected? So you need to make sure that within the requirements, the definition of done spells out some of what done actually is.
So done would be: a user can log in using any username and any password, or if there are other requirements, you need to lay those out: hey, the username can only be lowercase, the username can be camel case, the username can be any case as long as it matches a user. These are not just requirements; these are what essentially needs to be the story for testing, so that you know it's done. Someone picks this up, or a user goes to test it, and they know specifically how to test it and see how it works. Now, if it's a backend change, it's a little more difficult; you're going to have to have another developer test it. But this is, to me, from a test-driven developer approach, what definition of done means, because if I can essentially lay out how this works, then I can code it. If there are ambiguities in what I need to do, then it is not a clear definition of done. This goes right into the next point: why ambiguous done leads to scope creep. When done means different things to different people, it leads to unfinished work, hidden bugs, or endless tweaking, and creates mismatched expectations between devs, QA, and clients. Which is really what we just talked about; it's that back and forth. I'm going to go right into the next one, since Michael sort of stole this one, and let him talk about the next couple of items, but I'll touch on this real quickly to get my thoughts in. Really, it becomes very frustrating when you don't know what done is, and it very much affects the developers, QA, and the customer. You will have stuff that, for example, goes to QA, and to them it's not done; it hasn't covered the requirements they think it needs to. They kick it back to the developers, and they're like, why is the developer not getting the work done? The developer is like, why is QA on my butt all the time? Why do they keep changing stuff? Why can't they just accept it? Of course, the same thing happens with the customers.
It will go all the way through to the customer, and the customer is like, this isn't what I wanted; this isn't how I needed it. People get frustrated, and it does lead to scope creep. Really, the scope creep tends to be that people start expanding what they want to talk about, or adding to the requirements, to try to make sure it actually gets done. It's almost shoot for the stars so you fail and hit the moon, that kind of stuff. It's just a bad situation to be in. Real-world examples: stories from teams where unclear done led to delays or rework; how a strong definition of done saved another team from project chaos. I'm going to throw that one to you. Yes, I'll run with this one, because the company I've worked for over the last year is in a transition: we were acquired by another company. Before we were acquired, we had clear requirements. We knew what needed to be done. Everything we had, we had a definition of done for. Our tickets were being completed on time. We met expectations. Yes, there was occasionally some rework, because, like Rob said, when you deal with reports, you run into, oh, that's a simple change. But outside of reporting, almost everything we did was completed on time and on task, and we knew what we were doing and could test it. In the transition to the new company, as we were pulled in, almost every ticket I have had feels like a monolithic spike. Every single ticket I have is ambiguous. It is basically, make this work inside of this ecosystem, hell or high water. Just make it work. The problem is that this is such a monolithic application that you have no idea where to go within it. There are multiple teams working on this project. And unfortunately, even though we are in the process of transitioning into this new ecosystem, we're still making changes in the old ecosystem. So you could have one piece, you get it working, then you go back and pull the latest change. What?
You just redid this, or, oh, you changed this, and now it doesn't work. It is so frustrating. Having clear guidelines and a definition of done really avoids that and can hopefully get you across projects and meeting your deadlines. Excellent. Good examples. I'm going to dive into the next one, because we're going to try to get through a couple of these points this time. Components of a good definition of done: code complete and reviewed, automated tests passing, documentation updated, deployment to staging and production verified, acceptance criteria met and signed off. I think that's a really good start, and I want to touch on each of these real quick. Code complete and reviewed: it's something I think we should do on a regular basis. There is very much a value to reviewing code. I have worked on projects that range from very strong code reviews all the way to not doing them at all, and honestly, the stronger the better. Yes, it takes time. There's effort. It can be frustrating, because you get something kicked back to you: hey, you need to make this conform. But it does pay off in the long run. And this is from somebody who has been frustrated with code review more than a few times, especially the static code analysis stuff I do all the time. I'll get frustrated with something that gets kicked back saying, you should do this, and I'm like, I don't really want to. There's always that temptation, and sometimes I fail, I fall for it, and just say, you know what, I'm going to pass it anyway and move on. But there is also value in doing it. Automated tests passing: I've been on projects where it's like, okay, I'm creating tests for everything, and I've been in situations where it's like, all right, I'm going to whip a couple of tests out, we're going to pass it, we're going to move on.
Yes, going through and writing those tests can be time-consuming, but getting those automated tests built will help you in the long run. And yes, sometimes they fail because requirements change or something like that. But it also gives you extra leverage to not change stuff. I have used this before: you can say, look, if we have to change this, the change itself is not that big a deal, but we have to retest all of this stuff, or we have to update all of these tests. And suddenly that thing that was, yes, "a little change," in air quotes, is actually something that does not have a small impact, and we have to think about that. And you could say, well, just skip the testing. But wait: anywhere it's being tested, if one of those tests fails, we would have to go find the problem anyway. So you're going to have to keep doing it. Documentation updated: we skip this all the time. I know everybody does, but it really should be something we build into our processes. Part of done is that wherever we need to update documentation, we do so. The deployment piece is getting better with CI/CD, pipelines, and those kinds of things, but I think we still don't do it enough. It's very good to deploy it and run it through its tests on the new site and make sure everything goes. And of course, actual done is that it's been signed off on. We probably have a done during a sprint, or a done for a certain step, but that is not done for the feature, because it's not done until we can go all the way through and somebody can actually use it. Thoughts on those? Yeah, I want to briefly touch on that, and then I'm going to go right into the next one. One of the things Rob touched on, the automated testing, going back and fixing those tests: make sure you don't let your tests get stale, and don't just delete tests that are failing.
In a lot of situations, if you're rushing to get to the end, I've seen developers not maintain tests; they just modify the test enough to make it pass, without really meeting the requirement the test is verifying. So make sure you keep your tests fresh to the requirements as they change. I'm going to jump into item five: who creates and maintains the definition of done. Project owners, scrum masters, and the dev teams collaborate, and the DOD evolves as the project matures. I'll take that first one: who creates and maintains the definition of done? The team: your project owners, the scrum masters, the dev teams. If you are working as developers, chances are that within your team itself, you need to sit down at least quarterly and agree on what your team wants for a definition of done. Everyone should be on the same page, so that there is no ambiguity and no confusion when you scope out tickets. You flesh out the requirements so that when you pick a ticket and say, hey, I'm going to get it done in this amount of time, you actually get it done in that amount of time. This does require working with the project owners and the scrum masters. At the beginning, it's going to be difficult, but in the long run it's going to save you a lot of time, headache, and hassle. What are your thoughts, Rob? Yeah, and I think that's the whole point: if you have problems with it early on, if the scrum master, the product owner, or even the dev team don't have a good definition of done, that should show up in your retrospective. That should be something that gets flagged, something you correct as you move forward, because part of the whole idea is that agile should be getting better as you go.
And honestly, part of the reason I know it's important to define done is that we have had this come up in sprints as we've gone through an agile project. We've gotten to a point where it's like, you know what, we need to do a better job of done. Maybe we need to add something, change something, tweak something. We've gotten away from one of our steps, and now we're not doing it right, so let's go back to it. Or, more than a few times, it's the code review process: we need to adjust the code review process, bring more people in, bring fewer people in, provide a different format of feedback, less feedback, more feedback, smaller chunks of work so they're easier to review. There's a lot of that kind of stuff that goes on. We're cruising right along. How to implement the definition of done in your workflow: incorporate it into user stories and sprint planning, use checklists or tools like Jira, GitHub, and Notion, and make the definition of done visible and agreed upon by all stakeholders. This really is just: once you've defined done, you should document it. It should be in your team documentation, your development processes, your project processes, that this is what done looks like. These are the steps, the bullet points, that all have to be completed in order for us to actually be done. And then, especially if you're using Jira or one of those kinds of tools, Trello or the like, when a ticket goes into the done column, we know that all those things have actually been completed. And it's not bad in some of those tools to have the columns, the swim lanes that you're moving your ticket through, be all of the things that define done.
So maybe you start out, and then it's being coded; once coding is done, it goes to unit testing; then it goes to QA review; then it goes to code review, and not necessarily in that order, but in your swim lanes you can document all of the things that need to be done, and the ticket should move through them. You can even have logic around that: it can only go from this swim lane to that swim lane, and only this person can move it between them. Things like that can really help you be more efficient with your definition of done and how you move your tasks through it. Thoughts on that one? So the last thing I'll touch on with this: holding yourself and your team accountable is one of the best ways to implement the definition of done in your workflow. It should be a personal practice, because a lot of teams, and some companies, don't even do this, which is bad. Personally, if you want to be a good developer, to go from coding to becoming a developer, to really keep growing and improving and being the best developer you can be, you need to hold yourself accountable and make sure that every task you work on, you look at with the mindset of: what is the definition of done? What am I trying to complete with this? And how does this fit into not just what I'm doing, but the bigger picture? Because sometimes you could be told, hey, build this, but in the bigger scope of things, that's not what's needed; it's actually something else that kind of got lost in perspective. The best example I can think of is that tree swing picture that's all over the Internet for software development. It starts out with what was pitched: a tree swing. You go through multiple iterations: roller coasters, tree, no tree.
And all the customer really wanted was a rope and a tire. They wanted a tire swing. Defining your definition of done helps you avoid scope creep, but it also helps you ensure that the requirements stay on task and get you the right product at the end of the development cycle. Yeah, we talk a lot about knowing your why, and your definition of done is your why for each individual task. It really is. It's the thing that keeps guardrails on your work and makes sure that you stay on task and stay focused. That being said, it is time to wrap this one up. As always, just email us at info@develpreneur.com if you've got suggestions for topic ideas, product ideas, or anything like that. Any of those things, we'll be happy to hear from you. We want your feedback, because we're here for you so we can build better developers. You can help us build a better podcast by letting us know what your thoughts are and where you want to go: future topics, areas of interest, things like that. I know there are some things we haven't spent a lot of time on, so we can always go back to those. Also, you can check us out on X, or you can go to develpreneur.com, where we have plenty of places for you to leave us feedback. We've got a contact-us form where you can leave stuff. We have a Develpreneur Facebook page, and you can definitely put stuff out there. However you want to get hold of us, we're happy to get that feedback and incorporate it into building a better solution, a better bit of content for you. You can also check us out on YouTube on the Develpreneur channel. And if for some reason you're tired of seeing our faces and you want to just listen to us on audio, wherever you listen to podcasts, you can find Building Better Developers, the Develpreneur podcast. As always, I appreciate your time.
I appreciate you hanging out with us for a while, and I appreciate you putting up with my very lame introduction in Spanish. I'll try to clean that stuff up. Go out there and have yourself a great day, a great week, and we will talk to you next time. Thank you for listening to Building Better Developers, the Develpreneur podcast. You can subscribe on Apple Podcasts, Stitcher, Amazon, anywhere you find podcasts; we are there. And remember, just a little bit of effort every day adds up into great momentum and great success.