Detailed Notes
In part two of our Building Better Developers interview with Greg Lind, founder of Buildly and OpenBuild, we dive deep into automating quality in software development through AI, automation, and continuous testing.
Greg shares how his team integrates QA into every stage of the development process—from developer-led unit tests to AI-driven analysis of pull requests. Learn how automation ensures accountability, speeds delivery, and keeps software quality consistent across the pipeline.
🎯 In this episode, we discuss:
• How to automate testing for better QA coverage
• Using AI to improve code reviews and transparency
• Why quality must start with the developer
• Continuous improvement through data and feedback
• The future of Agile in an AI-powered world
“QA often gets left until the end. But it has to start from the developer.” — Greg Lind
📘 Read more insights on our blog: https://develpreneur.com/automating-quality-in-software-development-greg-lind-p2/
🔗 Learn about Greg: https://www.linkedin.com/in/greglind/
🌐 Listen to more episodes: https://www.develpreneur.com
#BuildingBetterDevelopers #GregLind #SoftwareDevelopment #Automation #AI #Testing #QualityAssurance #ContinuousImprovement #BuildingBetterFoundations
Transcript Text
[music] Well, hello and welcome back. We are in part two of our interview, continuing this season of Building Better Developers. This is the Building Better Developers podcast, but this season it is Building Better Foundations. And funny enough, we thought the foundations were going to be about AI, but we really haven't touched on it, and as far as you know, we might not touch on it this episode. But we are touching on another area, because we're really talking about the foundation of the team itself and how it interacts, with the CEO of Buildly (B U I L D L Y; go to buildly.com to check that out, plus there are going to be links in the show notes). We're going to carry this forward. First, I need to introduce myself. My name is Rob Broadhead. I'm one of the founders of Develpreneur, and also the founder of RB Consulting, where we are a boutique consulting firm. We help you leverage technology and create roadmaps to be more successful. Good thing and bad thing: the good thing is that getting into the fall is always interesting, as I get into the season where on a given week there's not a ton of work, but by the time I even get started in the week, work fills things up very quickly. So I have now overbooked myself essentially every week, because I have all these plans of things that I need to get done and want to get done, working on my business instead of in my business. And then there are rabbit holes everywhere, because sometimes the business steps in the way, and sometimes working on my business I find new rabbit holes, like, oh, I need to go deal with this or explore that. So it is the good and the bad of so many things that are out there: so many opportunities, so many things that I'm chasing. But the bad side is that I'm overwhelmed; there are too many things to chase. So I'm going to simplify it down by passing the introduction over. >> Hey everyone, my name is Mike Meloche. I'm one of the co-founders of Develpreneur.
I'm also the founder of Envision QA, where we help businesses with their software problems, be it custom software, cookie-cutter software, or just any type of software that you need to help run your business. If you need it customized or you're having problems with delivery, give us a call. We can help you build custom software or build tools to test it yourself. Check us out at envisionqa.com. Good thing, bad thing: good thing is we're past Halloween and heading towards Thanksgiving, so I'm getting kind of the vibe for some turkey, some pumpkin pie, and those pumpkin spice lattes at your coffee shops. Bad thing, I'm really into my seasonal allergies now. I'm waiting for all the leaves, well, they're pretty, I'm waiting for them to fall off the trees so I can get back to normal. And the normal that we're going to get back to is continuing our conversation with Gregory, picking up right where we left off. >> Yeah, I see that an awful lot, especially the cherry-picking. That's kind of where they go through the backlog and cherry-pick. >> Yeah. >> So, I want to pivot just slightly with your processes. You've touched very heavily on the agile approach with the project managers, development teams, and that. I'm interested in how you go about approaching the QA side of things. You've touched on it, but I'm a little more QA-biased because I like test-driven development, so kind of walk me through how you handle QA through this process. How do you simplify this, and how do you show the communication going on between the development, the project managers, and where things are? >> Yeah, I think, Michael, you and I will have a very interesting conversation at some point about test-driven development. But I agree that QA often gets left until the end. Same with security, right? And unfortunately, QA sort of becomes security in a lot of organizations as well.
But it is part of that overall security process, making sure you have well-tested software. And it has to start from the developer. I remember the first job I ever had where there was an actual QA team. I moved from a web design organization, essentially, that was doing brochure websites, into a real software organization, and I was shocked at the fact that there were six developers and two QA team members. I remember thinking, how do they have enough to do if they're just testing the software that I've built? I document my own code; there aren't going to be any issues. And the number of times they came back to me that first week with, you need to fix this, let me show you what's happening here, I was completely shocked. But that's to say that what I learned at that point, and kept learning down the road, is that I need to think like a QA person from the very beginning, right? I need to write my own unit tests so that I'm actually testing my software internally. I need to do that in a smart way. I don't need to go overboard with my unit tests, because then I'm maintaining unit tests, not maintaining code. And then I also need to make sure that a QA developer understands what those requirements are. You know, the old agile flip side of the card, where you would write the test on the back: I can't remember a single place I've been where anybody actually did that in the planning poker phase of agile, until a QA person came in and said, I don't see a test on this; what am I testing for? And I think the aspect that I'm still struggling with for us is at what point an automated framework, like, let's say, the Robot Framework, comes in.
So a lot of the testers I've worked with in the past have been using things like Robot Framework and Selenium to automate front-end testing as well as back-end testing. So that combination of unit tests, to regression tests, to front-end and back-end tests, in, let's say, a blue-green deployment process: for that part of the process, you need to have QA involved from the very beginning, because they need to know what they're building. And that's product management, QA, developers, design, all needing to be involved as early as possible in that process so that you can write those out. So what we do in our process is essentially that the developers write their unit tests after they've finished a block of code. They're not writing it before. And I know that doesn't fit with the test-driven development process, but I will come back to that, because I do want to talk to you, Michael, about that. But what I like about that approach is we write minimal tests. In other words, we're not having to maintain the tests; we're just making sure that once that test is written, that functionality works. So I think of it more as, if you're an object-oriented programmer, for example, you're testing the object for essentially the CRUD operations. You're not doing field-level testing, because then every time you're going back through and rewriting tests. So I like to see that happen early in the process, so that every time you commit your code, you're committing your unit tests as well, and then QA gets involved at the integration tests, and especially at the front end. And I have a lot of friends that are amazing front-end developers, but I think front end is maybe the worst, and I know I'm going to get in trouble for this, at writing tests.
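The minimal, CRUD-level unit testing Greg describes (exercise the object's create, read, update, and delete behavior once, rather than asserting on every field) could look something like this sketch. The `UserStore` class and the test are invented for illustration; they are not Buildly's actual code:

```python
# A sketch of CRUD-level unit testing: one roundtrip test per
# object, written right after the code block is finished, rather
# than a fragile assertion on every individual field.
# `UserStore` is a hypothetical in-memory store for illustration.

class UserStore:
    def __init__(self):
        self._users = {}
        self._next_id = 1

    def create(self, name):
        user_id = self._next_id
        self._users[user_id] = {"id": user_id, "name": name}
        self._next_id += 1
        return user_id

    def read(self, user_id):
        return self._users[user_id]

    def update(self, user_id, name):
        self._users[user_id]["name"] = name

    def delete(self, user_id):
        del self._users[user_id]


# Minimal pytest-style test: each CRUD operation is exercised once.
def test_crud_roundtrip():
    store = UserStore()
    uid = store.create("Ada")                  # create
    assert store.read(uid)["name"] == "Ada"    # read
    store.update(uid, "Grace")                 # update
    assert store.read(uid)["name"] == "Grace"
    store.delete(uid)                          # delete
    assert uid not in store._users
```

Because the test covers each operation once, renaming or adding a field rarely forces a test rewrite, which is the maintenance trade-off Greg is pointing at.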
They hate writing tests, and they almost never do. I've had so many times where I've asked to see tests, and they will tell me, oh, it's all automated, I don't have to write a test because of this JavaScript framework. No, no, that's not how it works. So we have to go back and review that. And I think the Robot Framework has helped me be more accepting of the fact that a front-end developer, and even a back-end developer, can be resistant to tests up to a certain point, to where then QA, and I can already see the face of one of my old QA [clears throat] leads sort of frowning at me right now as I say this, is covering for, let's call it, the laziness of the software developer on the other side. And then they're going back and making sure that those tests are accommodated for in the code with the developer. I don't necessarily like that process, but I've yet to come up with a better process that keeps developers moving forward without using, and this is something that I'm starting to explore as well, without using AI to write the QA tests. And I don't mean to put that out there as a replacement for QA in any way, right? We know that we need people looking at all of this, and developers need to be in the process. I think of QA as developers as well. To me, as a QA developer, you're still a software developer; you're just helping to ensure quality. And I think they need to be involved up front. They need to be involved with the product discovery and the requirements. Oftentimes, one of the best QA leads I ever worked with was helping with API models and saying, oh no, you need to adjust the model to handle this occasion or this use case. So I think having more eyes and ears on a problem is never a bad thing, especially when you're building out in the early stages. So yeah, involve QA.
Maybe don't go too far, especially in the early stages, but make sure that at every step QA is in the loop in all of those pieces. And then I think writing the test should be up to the QA developer, not the software developer. So I've never liked the flip-the-card approach, because I think developers should be focused on solving the problem, not on what problems might jump up from it; they should understand what the connections are. But, and I come from the object-oriented world too, if it's all essentially encapsulated in one set of code, you shouldn't be worried about breaking downstream things, right? Because it should all be encapsulated. But sometimes it does break, and that's when QA comes and gets involved. And they understand that more than most software developers, I think. >> So, with your whole process, because in the first half of this you talked about the transparency, getting the project owners and the developers involved: with these APIs and tools you've been putting together, how do you show the transparency, or the level of test that is being added as you build these projects out, so you know what's being tested, what's not, and what needs to be covered? And also, so that when you go to production, you know what you can smoke test when it goes out there, and how much manual testing has to be done. How do you manage that transparency through this process of yours? >> Yeah. So part of the issue for us is about documentation inside of the repository. We have a set of tools and standards that we follow; every Docker container essentially has to have all of this already documented in it. So if we know that we're using, let's say, FastAPI, and we want to use pytest to run all of the unit tests there, we need to make sure that there is a certain number of pytest tests, essentially, for each function that we've written.
And we need to make sure that those tests have been executed, with hooks, essentially, before you check into Git. You have to make sure that these tests have already been written. If there is a test, it runs it for you; if it doesn't see a test, it pulls the commit back and says, make sure that you have your test in before you check it in. So hooks, pre-commit hooks into GitHub, are one way to manage that, and that's the way that we do it. But there are a lot of other ways to look at that as well. The other thing that we like, like I said, I'm sort of a fan of the Robot Framework, is making sure that's built into your CI/CD tool as well, right? So that it runs those tests, and the person responsible for that is always the developer checking in the code. But if they don't know how to write those tests, or don't know how to use Robot Framework, which is more often than not an excuse rather than the reality, they just don't want to write the tests, then they go to QA and say, can you help me write these tests, can you get me involved in writing this test, or, more often than not, could you write these tests for me? And that's really where I think that first level is, and that's where the CI/CD should be running those tests, making sure that's happened before you push out code even to your development environment, much less your integration environment or your production environment. So it should have been tested on your local machine and in CI/CD before it goes to development, and then again before it goes to production, at the very least. And then the last step, and this is the part I still have trouble with to this day, getting developers, and myself even, I forget to do this more often than not, is to test the code that you just deployed in production to make sure that it works.
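The pre-commit hook idea Greg describes (pull a check-in back when a source file has no corresponding test) could be sketched like this. The `tests/test_<module>.py` naming convention and the whole script are assumptions for illustration, not Buildly's actual tooling:

```python
# Hypothetical pre-commit hook sketch: refuse a commit when a
# staged Python source file has no matching test file on disk.
# The tests/test_<module>.py convention is an assumption made for
# this example, not a standard from the interview.
import subprocess
from pathlib import Path

def staged_python_files():
    """Ask Git for the Python files staged in the current commit."""
    out = subprocess.run(
        ["git", "diff", "--cached", "--name-only", "--diff-filter=ACM"],
        capture_output=True, text=True, check=True,
    ).stdout
    return [Path(p) for p in out.splitlines() if p.endswith(".py")]

def missing_tests(staged):
    """Return (source, expected_test) pairs that have no test file."""
    gaps = []
    for src in staged:
        # Test files themselves are exempt from the check.
        if src.name.startswith("test_") or src.parts[:1] == ("tests",):
            continue
        expected = Path("tests") / f"test_{src.name}"
        if not expected.exists():
            gaps.append((src, expected))
    return gaps

def main():
    gaps = missing_tests(staged_python_files())
    for src, expected in gaps:
        print(f"No test found for {src} (expected {expected})")
    # A non-zero exit status from a pre-commit hook aborts the
    # commit, which is the "pulls it back" behavior Greg describes.
    return 1 if gaps else 0
```

Installed as `.git/hooks/pre-commit` with `sys.exit(main())` at the bottom, a non-zero return blocks the check-in until the test exists; the same check can run again in CI/CD, as Greg suggests.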
And so, follow up not just with QA at each step, but also then get QA involved in production testing, and make sure that you're aware of where they're testing and what they're testing, and that when they follow up with an issue, it shouldn't be something that could have been discovered in a development environment or an integration environment. Oftentimes it is, but that doesn't change the fact that you should be following up and making sure that [clears throat] those tests happened, and that somebody actually monitored that, or that you're doing it yourself in production as well. >> So you mainly use some of the back-end tools and the hooks that you're talking about. You don't really have that in this dashboard kind of integration you have, so that the PMs can see it and say, oh, we need to think about this for testing as you're building out the tickets and that? >> It does, in the sense that, for the GitHub repository, we have an AI tool that looks through the history of the commits for the day and then creates a report. So rather than having a developer write their standup report, which again can feel like a huge waste of time sometimes, but is very useful for the rest of the team, we have the AI actually write that for you. It goes back through, looks at your commit messages, and writes it up from that. If it doesn't see anything about testing, and it doesn't see a test in there, then it flags it and says, no tests were created for this. And that is usually where a QA person can then get involved and say, hey. And to me, the first job of a QA person is, again, as a software developer, but also as somewhat of a software manager, a side manager if you will, that's reviewing code, making sure that the standards are met, that the lint testing all went through, that all of those lints are in place, and that the linters are not being overridden in the CI/CD tool or anything like that, which is a common thing; I've done that myself, just because I needed to get something pushed out. So yeah, they like reviewing that process, as well as being involved in the code review process and the pull request process. And that's really, to me, the thing that we're still missing, that I want to add: bringing the pull request process into our tool. So that it's not just that you're seeing the notification in GitHub that there's a pull request assigned to you; it gets reported into your dashboard, so that you then know, oh, I've got a pull request I need to review. You run the pull request, and any comments that you bring into it get assigned directly to that developer or to that team, so that you're then following up with it and managing it, but also seeing the comments that were made in the pull request and being able to go through them. So that, again, you can use an AI to review that process and say, you know, a number of pull requests are failing because the lint tests weren't run beforehand, or they weren't running their unit tests before it was run, or they overrode this because they really wanted to push something through. So, being able to follow up. And I'm sure there are all kinds of other areas that you could review and look at just from the pull requests and the comments that happen in there. And I always get stuck on this: I try not to make it about blame. It's not who wrote this bad test or who didn't follow the test. It's a learning process, right? Going back through and seeing those pull requests and what came up constantly, you know, maybe a lead developer was always saying, do this, do this, do this.
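The AI-written standup report Greg describes (scan the day's commits, summarize them per developer, and flag work that mentions no tests) might be approximated like this. A real implementation would hand the commit messages to an LLM; the keyword heuristic below merely stands in for that analysis, and all names are invented:

```python
# Sketch of an automated standup report: summarize each author's
# commits for the day and flag authors whose commit messages never
# mention testing. The keyword list is a crude stand-in for the AI
# analysis described in the interview; everything here is invented
# for illustration.
TEST_HINTS = ("test", "pytest", "unit test", "robot")

def daily_standup(commits):
    """commits: list of (author, message) tuples for the day."""
    report = {}
    for author, message in commits:
        entry = report.setdefault(author, {"messages": [], "tested": False})
        entry["messages"].append(message)
        if any(hint in message.lower() for hint in TEST_HINTS):
            entry["tested"] = True
    lines = []
    for author, entry in report.items():
        lines.append(f"{author}: {len(entry['messages'])} commit(s)")
        if not entry["tested"]:
            # This is the flag a QA person would follow up on.
            lines.append(f"  warning: no tests mentioned in {author}'s commits")
    return "\n".join(lines)
```

Fed from `git log --since=midnight` (or the GitHub API) and posted to the team dashboard, the flag surfaces untested work without anyone writing a standup by hand.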
And maybe he was right and we weren't doing it, or maybe that was an unnecessary step that we could remove [clears throat] at some point, and it helped to again speed up our velocity a little bit. >> So before I hand this back to you, Rob, just one follow-up to that. One thing to think about and consider is that a lot of your languages and tools now have test profilers that will actually scan the code and tell you what code is covered by tests and what isn't, which is something you can really utilize with AI today. So if you're not looking at something like that, it's something you might want to consider with your processes. And with that, it's back to you. >> That's a good point, though. I think that, again, as a developer, sometimes the QA process goes to the back of your mind, even when we're looking at the product backlog, where we're still going back through and it's always like, oh yeah, what do we do about testing here, or what do we do about QA here. So yeah, it's a good point: where we can automate things, we absolutely should be, and where we need human eyes on it is essentially in following up with each step of that automation. And I want to go back a little bit to the idea of, you know, I don't know how many times we've been in the same boat, where you've overridden a linter tool or just pushed something through, because sometimes the process, and maybe that's just me being a little lazy as a developer, but sometimes the process gets in the way, and it's like, look, we need to get this done.
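Michael's point about test profilers: coverage tools report exactly which lines a test run exercised, and that report is easy for an AI (or a QA lead) to consume. This sketch uses only Python's standard-library `trace` module so it runs anywhere; real projects would more likely use coverage.py (for example via `pytest --cov`), and the functions here are invented:

```python
# Sketch of a coverage profiler run: count which lines execute
# while the tests run, so untested branches can be spotted.
# Uses the stdlib `trace` module; coverage.py is the usual choice.
import trace

def classify(n):
    if n < 0:
        return "negative"      # this branch may go untested
    return "non-negative"

def run_tests():
    # Deliberately never passes a negative number, so the
    # "negative" branch above is left uncovered.
    assert classify(5) == "non-negative"
    assert classify(0) == "non-negative"

# Count executed lines while the tests run.
tracer = trace.Trace(count=True, trace=False)
tracer.runfunc(run_tests)

# counts maps (filename, lineno) -> execution count; any line of
# classify() that never appears in it was not covered by run_tests().
counts = tracer.results().counts
executed_lines = {lineno for (_filename, lineno) in counts}
```

Diffing the executed lines against the source is exactly the "what code is covered and what isn't" report Michael describes feeding to an AI or a QA process.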
The process does not push the ball forward, and it is a little bit of the agile thing: the process should hold you accountable, but not hostage, basically. That's the challenge, [clears throat] >> to not just knee-jerk do that. I mean, it's one thing if you do it once and say, okay, I'm going to flag this, it's an exception, and we're going to move forward, versus it becoming the norm. And then the next thing you know, because those have bitten me, and I've seen it in other places too, this check was here, and it just got ignored, ignored, ignored, and that was fine until it wasn't. It goes back, maybe, to communicating why this check is here. And technical debt, everybody's favorite thing to just let keep piling up: okay, I pushed it through; however, we need to go back and fix that, address that, clean that up now rather than later. And that's a little bit of me getting on my soapbox, I know, but it's also because we're a little bit out of time. Obviously, we could do this for days. There are a lot of great conversations here, a lot [clears throat] of directions we could go, and I know listeners >> are thinking, this guy knows his stuff, this guy's got a lot of cool stuff, I might want to go read that book. So what is the best way for people to get a hold of you, reach out, and learn a little bit more about you and the process in your book? >> Yeah, for sure. So Buildly.io is usually the best place to go. Everything that I do is sort of linked through there. But beyond that, I have a page for my book, as well as a blog that I do; it's called radicaltherapy.dev. Almost everything goes through there, and LinkedIn, of course, is where everything gets shared as well.
So those are the best ways to get in touch and to follow up with these things. But yeah, I especially appreciate anyone that goes through. We have a pretty simple process to sign up for a 30-day free trial, and you can see our book and everything else that we've done; it's all in our process and our tools. So you can see how we work and how we do things. And we're always open to feedback. That's the point of agile, right? That's the point of software development: you're not building things in a vacuum for yourself. You're building things for other people, right? And so the more feedback we can get from users, the better. Especially seasoned developers, but also junior developers. I think the most undervalued part of your team is the new person coming in and looking at it with fresh eyes. And I'd love to get new developers involved, junior developers, people that are just starting to look at it, who can say, oh, this seems upside down to me; why don't you do it this way? Right? So to me, that's the best way: if you've got fresh eyes and want to look at some of the problems we're working on, I'd love to hear that. >> Yeah. A lot of times, those new people that come in are the ones that solve my favorite problem, the, you know, that's-the-way-we've-always-done-it kind of response.
[laughter] And sometimes, I know it's frustrating; all of us were new at some point, and when you're frustrated you're like, okay, well, that doesn't really help me understand this. And it's even worse when later you realize, oh yeah, actually they shouldn't have done it that way. And as I've, you know, tried to keep that pain from when it was me, to be on the other side of it and think about it, it's about keeping that open mind of, okay, maybe we do need to revisit this, review it, because stuff changes, things evolve. AI, for example, which we did almost no talking about in this conversation other than a mention of it, is obviously changing a lot of things that are out there. And so, whatever we did, even now: the Agile Manifesto is, I don't know, about 25-ish years old now, something like that, and I remember when it first came out it was the latest thing. And then, in the same line, to me it was all in with patterns and agile, and this is the future of software development. And all that stuff has grown and evolved, and everything that you did 20 years ago, probably 15 years ago, was already passé, and there were better ways to do it, and then 10, and then 5. And what we're doing today, I'm sure people five years from now will be like, I can't believe people were still using this archaic approach. So, you know, it is always evolve, or, you know, evolve or die, I guess, is a part of this. >> I want to thank you so much for your time. I appreciate you, and the crowd, I'm sure there is a standing ovation; I'm being drowned out by the applause in the background here. This was a great conversation.
This is the kind of stuff we'd love to have, where we can go a little deeper and talk about some things that, actually, I didn't even think we were going to get into a lot of these areas, but I think they're very critical for moving forward. It really does go back to: okay, you say you're doing agile. What really is agile? And if what you're doing is broken, and you've given a perfect example of that with Buildly, then embrace your inner laziness and your desire to get through these things, automate the things that stink, and make [clears throat] some changes. Go out there and suggest some differences and, you know, start to evolve. You don't have to be a slave to what you think the process is. And particularly agile: it literally tells you, don't be a slave to that; adjust it for your team and your projects. >> And too often, I think, >> what I have seen now with scrum and some of that kind of stuff, and sprints, is that it has too much been equated with agile, and that's how you have to do it. And there are too many examples I know of projects that really don't work with that, as you have obviously found out as well. >> Absolutely. Yeah. And I think, with the laziness gene that's in all of us as developers, the AI tools, if we ever get another chance to talk about AI tools and how to use them to automate away those things, it's ridiculously easy now to build something to do a make-your-own-backlog sort of approach if you need to, or to get those things out of the way. The artifacts that get generated, even though agile is an anti-artifact sort of approach to software development, still get generated, and they [clears throat] still have to be updated. And usually it's a product manager that needs a document here, or a CEO that needs something there. Yeah. So AI, using that to manage that, that's where it shines.
Simple, boring, repetitive tasks, that's where it's good. >> Yeah. I have found that that is an area, very early on with AI when I started using it, that I have embraced. When really you just need the artifact, the rest of the stuff is there, and it's really just, okay, I've got to go do the grunt work of generating the artifact. Whether it is a simple document, or sometimes it is something where there are other tools for it, things like having an API document, like the things that Swagger and some of those tools do so well. And yeah, they require a little bit of forethought to do it in a way that, and I think that's where it's going from here, is doing it in a way that AI can then easily consume it and do the stuff that we really don't want slowing us down. Like you said, you want the developers writing the code and the functionality; you don't want them stuck trying to figure out how to create tests and then fix the tests for what they're coding. It's about finding a way to get the things done, >> and then you use AI to sort of walk behind us and clean up the mess a little bit and say, okay, well, fine, I didn't get that done; go do this for me. >> Yeah, exactly. I think that's the primary use case now, and will be for a long time, for any AI tools, development tools: it's going to be just about automating those boring things away, and education, teaching you how to do better, and then following up, right? You know, and again, I don't mean to put your job, Michael, in the line of AI, but I do think reviewing tests and writing tests and then making sure that code can be cleaned up, that's where AI and a good QA developer could write something really well with an AI tool and do it even faster.
And I think, to me, it's more about building up your velocity and creating a team that works better together. That's where the AI tools can really help. >> 100%. Well, that will wrap this one up. Some nice little bonus material there, as well as a couple of little suggestions at the end. We turned this around pretty quickly; I think this will probably show up as early as Tuesday or Thursday of next week. We drop on Tuesdays and Thursdays. If not, it'll be the week after. I will send links out when we do, so feel free to share them wherever you want, edit them however you want. We will post them. We do a blog article on our site, it's out on YouTube, and it's out on all the different places that you can get podcasts, things like that. So feel free to share as much as you like. I've really enjoyed this, and like I said, I may reach out again sometime in the future, a few months from now, and say, all right, let's have some other conversations, because we left a lot on the table that we could have gone into. And I have a feeling that we will all have different opinions on those three, six, and, you know, twelve months out. >> Yeah, absolutely, for sure. [clears throat] >> All right, we'll let you go, and thanks a lot. Thank you for your time, appreciate it, and we'll be in touch again. >> Yeah, no, thanks. It was fun. Hopefully we will get a chance to hit those things later on. >> Definitely. Michael's working on his test-driven development notes as we speak. >> Oh, good. Yeah, I need some defensive test-driven development. All right. >> Thanks, guys. Talk later. >> Talk to you later. Take care. >> All right. So, bonus material there, because we did cut that off on the audio right before we got into the AI bonus.
So we'll wrap this one up. I want to thank Greg again for his time. I really appreciate it. This was one of those that's funny sometimes: I had not spoken with him before this. You look at some of the things that people do, what their background is, and what their focus is, and you think, cool, this is where it's going to go. And, you know, 15 seconds into the conversation, it goes a completely different direction. But honestly, this was one of my favorite conversations that I've been able to record in quite a while. If you ever get a chance, check out Lunchclub, although I think it's almost impossible to get into it now, or things like that; there are one-on-ones and such if you can find them. Just find a way to sit down with people like this and talk tech for a while. If you work with people that you can talk to, great. If you don't, find the people, especially now with people remote everywhere. Once again, if you got a tenth out of this that I got out of it, it was more than worth your while. Thank you so much for your time. Appreciate you hanging out with us. As always, check us out at develpreneur.com, on Facebook, and on the Develpreneur channel out on YouTube. We've got content aplenty through all of those. Feel free to leave us feedback wherever you see any of those things. We will be happy to work with you. Even if it's something we did years ago, we'll do our best to update it and let you know where we went with it or how it is going today. As always, go out there and have yourself a great day, a great week, and we will talk to you next time.
Transcript Segments
[music]
Well, hello and welcome back. We are in
part two of our interview. We are
continuing this season of building
better developers. It's just this is the
developer podcast, but it is building
better foundations. And funny enough, we
thought the foundations were going to be
about AI, but we really haven't touched
on it. And as far as you know, we might
not touch on it this episode, but we are
touching along upon a another uh area as
we're really talking about the
foundation of the team itself and how it
interacts. Uh, with the CEO of Buildly,
B-U-I-L-D-L-Y dot com, to check that out,
plus there are going to be links in the show
notes. Um, we're going to carry this
forward. First, I need to introduce
myself. My name is Rob Broadhead. I'm
one of the founders of Develpreneur. Also
the founder of RB Consulting, where we
are a boutique consulting firm. We help
you leverage technology and create
roadmaps to be more successful. Good thing
and bad thing. Uh, good thing is, uh,
getting into the fall is always
interesting, as I get into the season of:
there is, not like on a given week, not
a ton of work, but by the time I even get
started in the week, work fills up
very quickly. So I have now overbooked
myself essentially every week, because
I have all these plans of things
that I need to get done, want to get done,
working on my business instead of in my
business. Um, and then there are rabbit
holes everywhere, because sometimes
the business steps in the way; sometimes,
working on my business, I find new rabbit
holes of, like, oh, I need to go deal with
this, explore that. Uh, so it is the good
and the bad of so many things that are out
there, so many opportunities, so many
things that I'm chasing. But the bad side
is it's just, I'm overwhelmed; there's too
many things to chase. So I'm going to
simplify it down to passing the
introduction over to
Hey everyone, my name is Michael Meloche.
I'm one of the co-founders of Develpreneur.
I'm also the founder of Envision QA
where we help businesses with their
software problems, be it custom
software, cookie cutter software, or
just any type of software that you need
to help run your business. If you need
it customized or you're having problems
with delivery, give us a call. We can
help you build custom software or build
tools to test yourself. Uh check us out
at envisionqa.com.
Uh good thing, bad thing. Uh good thing
is we're getting we're past Halloween.
We're heading towards Thanksgiving, so
I'm getting uh kind of the vibe for some
turkey uh and some pumpkin pie and those
pumpkin spice lattes at your coffee
shops. Uh bad thing, I'm
really into my seasonal allergies now.
I'm waiting for all the leaves. Well,
they're pretty. I'm waiting for them to
fall off the trees so I can get back to
normal.
And the normal that we're going to get
back to is we're going to continue our
conversation with Gregory and pick up
right where we left off.
>> Yeah, I see that an awful lot,
especially like the cherry-picking.
That's kind of like where they go
through the backlog and cherry pick. Um,
>> yeah.
>> So, I kind of want to pivot just
slightly with your processes. So, you've
touched very heavily on um, you know,
the agile approach with the project
managers, development teams, and that.
I'm interested in how you go about
approaching the QA side of things.
You've touched on it, but um I'm a
little more QA biased because I like
test driven development, but kind of
walk me through how you handle QA
through this process. How do you
simplify this and how do you show the
communication going on uh you know
between the development, the project
managers, and where things are?
>> Yeah, I think Michael, you and I will
have a very interesting conversation at
some point about test-driven
development. Um but I I I agree that I
think QA often gets left until the end.
Same with security, right? So and and
unfortunately QA sort of becomes
security in a lot of organizations as
well. Um but it is part of that overall
security process is making sure you have
well tested software. Uh but it has to
start from the developer. I
remember the first job I
ever had where there was an actual QA
team. I moved from a web design
organization
uh essentially and that was doing you
know brochure websites into a real
software organization and I was shocked
at the fact that they had two QA. There
were six developers and two QA, um,
team members at the same time. And I
remember thinking how can they how do
they have enough to do if they're just
testing the software that I've built
there's not going to be you know I I
document my own code I there's not going
to be any issues. And the amount of
times they came back to me that that
first week with you need to fix this.
Let's let me show you what's happening
here. I was completely shocked. Um but
that to say that I I what I learned at
that point and then I started to learn
down the road is I need to think like a
QA person from the very beginning,
right? I need to write my own unit tests
so that I'm actually testing my software
internally. I need to do that in a smart
way. I don't need to go overboard with
my unit tests cuz then I'm maintaining
unit tests, not maintaining code. Um,
and then I also need to make sure that
the a QA developer understands what
those requirements are. You know, the
the old agile flip side of the card
where you would write the test on the
back. Um, I
can't remember a single place I've been
where anybody actually did that um in
the planning poker phase of agile until
a until a QA person came in and said I
don't see a test on this. What am I what
am I testing for? Um, and I think that's
the aspect for us that I'm
still struggling with: at what point
does automation come in, like, let's say,
the Robot Framework. Um, so a lot of the
testers that I've been working with in
the past have been using things like
Robot and Selenium to automate front-end
testing as well as backend testing. So
that combination of unit tests to then
um regression tests to then front end
and backend tests, um, in, let's
say, a blue-green deployment process, um,
that that part of the process you need
to have the QA involved from the very
beginning because they need to know what
they're building what they're and that's
product management QA developers design
all need to be involved as early as
possible in that process so that you can
write those out. So what we we do in our
in our process is essentially the
developers are writing their unit tests
after they've finished a block of code.
They're not writing it before. And I
know that doesn't fit with the
test-driven development process, but I
I will come back to that cuz I do want
to talk to you, Michael, about that. But
I I what I like about that approach is
we write minimal tests, right? So we in
other words we're not having to maintain
the tests we're just making sure that
once that test is written that
functionality works. So I think of it
more as uh if you're an object-oriented
programmer for example you're testing
the object for, essentially, the CRUD
operations. You're not doing field-level
testing, cuz every time then you're
you're going back through and you're
rewriting tests. So I like to see that
happen early on in the process so that
you're every time you commit your code
you're committing your te your unit
tests as well and then QA gets involved
at the integration tests and especially
at the front end cuz and I I have a lot
of friends that are amazing front-end
developers but I think front end is
maybe the worst and and I know I'm going
to get in trouble for this at writing
tests. Um, they hate writing tests, and
they almost never do. And I've had
so many times where I've asked to
see tests, and they will tell me, oh,
yeah, it's all automated, I don't
have to write a test cuz of this
JavaScript framework. No, no, that's not
how it works. Um, so we have to go back
and review that and I I think I think
the Robot Framework has helped me be
more accepting of that fact that a a
front-end developer and even a back-end
developer um can be resistant to tests
um, up to a certain point, to where then
a QA... and I can already see the face of
one of my, uh, old QA, uh, [clears throat]
leads already sort of frowning at me
right now as I say this, but they're
covering for let's call the the laziness
of the software developer on the other
side. Um, and then they're going back
and making sure that those tests are are
accommodated for in the code with the
developer. And I I I don't necessarily
like that process, but I've yet to come
up with a a a better process that keeps
developers moving forward without using,
and this is something that I'm starting
to explore as well, is without using AI
to write the QA tests. And I don't mean
to put that out there as a replacement
for QA in any way, right? And we know
that we need people looking at all of
this. Um, and and developers need to be
in the process. And I think of QA as
developers as well. Um, and I think
that's the to me as a QA developer,
you're still a software developer.
You're just helping to ensure quality.
Um, and I think that they need to be
involved upfront. They need to be
involved with the the product discovery,
the the requirements, and often times I
one of the best QA leads I ever worked
with was helping with API models and
saying, "Oh, no, you need to adjust
the model to handle this occasion or
this use case." And so I think having
more eyes and ears on a problem is never
a bad thing and especially when you're
when you're building out in in the early
stages. So yeah, involve QA.
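Greg's "test the object's CRUD operations, not every field" idea could be sketched roughly like this. `TaskStore` and its tests are hypothetical stand-ins, not Buildly's actual code: the point is one cheap test per behavior, not a test per field that breaks on every schema change.

```python
"""Minimal, CRUD-level unit tests in the spirit Greg describes.
`TaskStore` is a hypothetical in-memory stand-in for a real model."""

class TaskStore:
    def __init__(self):
        self._tasks, self._next_id = {}, 1

    def create(self, title):
        task = {"id": self._next_id, "title": title}
        self._tasks[self._next_id] = task
        self._next_id += 1
        return task

    def read(self, task_id):
        return self._tasks.get(task_id)

    def update(self, task_id, **fields):
        self._tasks[task_id].update(fields)
        return self._tasks[task_id]

    def delete(self, task_id):
        return self._tasks.pop(task_id, None) is not None

# One test per CRUD operation: enough to catch broken behavior,
# cheap enough that the tests don't become the maintenance burden.
def test_crud_roundtrip():
    store = TaskStore()
    task = store.create("write unit tests")
    assert store.read(task["id"])["title"] == "write unit tests"
    assert store.update(task["id"], title="done")["title"] == "done"
    assert store.delete(task["id"]) and store.read(task["id"]) is None
```

A pytest run would pick up `test_crud_roundtrip` automatically; committing this file alongside the code is the "commit your unit tests with your code" habit described above.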
Maybe don't go too far, especially in the
early stages, and then make sure that
at every step QA is in the loop
in all of those pieces. And then I think
writing the test should be up to the QA
developer, not the software developer. So
I've never liked the flip-the-card
approach, because I think
developers should be focused on solving
the problem, not what problems might
jump up from that. They should understand
what, uh, the connections are. But if
it's and I come from the object-oriented
world too, where if it's all
essentially encapsulated in one set
of code, you shouldn't be worried about
breaking downstream things, right? Cuz
it should all be um encapsulated, but
sometimes it does. And so that's when QA
comes and gets involved. And they they
understand that more than most software
developers, I think.
>> So with your whole
process because uh in the first half of
this you talked about you know uh the
transparency getting the project owners
the developers involved. So with this
APIs and tools you've been putting
together how do you show the
transparency or the level of test that
is uh being added as you build these
projects out so you know what's being
tested, what's not, what needs to be
covered. Uh and so also so you know when
you go to production, you know what can
you smoke test when it goes out there?
how much manual testing has to be done,
you know, how do you kind of manage that
transparency through this process of
yours?
>> Yeah. So, part of the issue for us is
about documentation inside of the
repository. So, we have a set of tools
and standards that we follow for every
every docker container essentially has
to have all of this already documented
in it. So if we know that we're using uh
let's say FastAPI, and we want to use
pytest to run all of the unit tests
there, we need to make sure that there
are a certain number of pytest tests,
essentially, for each function that we've
written. And and we need to make sure
that essentially that those tests have
been executed with uh hooks essentially
before you check into git. Uh you have
to make sure that that these tests have
already been written. If there isn't a
test, it runs it for you. It doesn't
see a test, and then it pulls it
back and says: make sure that you have
your test in before you check it in. So
hooks, pre-commit hooks into GitHub is
one way to to manage that and that's the
way that we do it. Um, but there's a lot
of other ways to to look at that as
well. The other thing that that we like
is, like I said, I'm sort of a fan of the
Robot test framework. Um, making sure
that that's built into your CI/CD tool
as well, right? so that it runs those
tests and the person responsible for
that is is always the developer that's
checking in the code. But he if he
doesn't know how to write those tests or
doesn't know how to use Robot Framework,
which is more often than not an excuse,
not really um the reality. They just
don't want to write the tests. Um then
they go to QA and say, can you help me
write these tests or can you get me
involved in writing this test? um or
more often than not, could you write
these tests for me? Um and that's that's
really where I think the that first
level is and and that's where the the
CI/CD should be running those tests,
making sure that that's happened before
you push out code to to even to your
development environment, much less your
integration environment or your um
production environment for sure. So it
should have been tested
on your local machine in CI/CD before it
goes to development and then again
before it goes to production at the very
least. Um and then that's the the the
last step and this is the part I still
to this day have trouble with getting
developers and myself even I I forget to
do this more often than not is um to
test the code that you just deployed in
production to make sure that it works.
Um and so following up not just with
with QA at at each step, but also then
getting QA involved in production
testing and making sure that you're
aware where they're testing and what
they're testing and that they can when
they follow up with an issue, it
shouldn't be something that uh could
have been discovered in a development
environment or or a integration
environment. often times it is uh but
that doesn't change the fact that you
should be following up and making sure
that [clears throat] those tests
happened and that somebody actually
monitored that or you're doing that
yourself in production as well.
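The pre-commit gate Greg describes could look something like the sketch below. This is an illustrative shape, not Buildly's actual tooling; the `tests/test_<module>.py` naming convention is an assumption made for the example.

```python
#!/usr/bin/env python3
"""Sketch of a git pre-commit hook that refuses a commit when a staged
source module has no matching test file (assumed layout: app code as
*.py files, tests in tests/test_<module>.py)."""
import subprocess
import sys
from pathlib import Path


def missing_tests(staged_files, existing_tests):
    """Return staged .py source files with no tests/test_<name>.py
    counterpart, either already on disk or staged in the same commit."""
    known = set(existing_tests) | set(staged_files)
    gaps = []
    for path in staged_files:
        p = Path(path)
        if p.suffix == ".py" and not p.name.startswith("test_"):
            if f"tests/test_{p.name}" not in known:
                gaps.append(path)
    return gaps


def main():
    # Ask git which files are staged for this commit.
    out = subprocess.run(
        ["git", "diff", "--cached", "--name-only"],
        capture_output=True, text=True, check=True,
    ).stdout
    staged = [line for line in out.splitlines() if line]
    on_disk = [str(p) for p in Path("tests").glob("test_*.py")]
    gaps = missing_tests(staged, on_disk)
    if gaps:
        print("Commit blocked, no tests found for: " + ", ".join(gaps))
        sys.exit(1)  # non-zero exit makes git abort the commit


# Installed as .git/hooks/pre-commit the script would just call main();
# guarded here so importing this sketch doesn't shell out to git.
if __name__ == "__main__" and "--as-hook" in sys.argv:
    main()
```

A CI/CD job can run the same check (plus the full test suite) server-side, so the gate holds even when someone skips local hooks with `git commit --no-verify`.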
>> So you mainly use some of the backend
tools and the hooks and that you're
talking about. you don't really have
that in this dashboard kind of
integration you have so that like the
PMs can see it and say oh you need we
need to think about this for testing as
you're building out the tickets and that
>> It does, it does, in the sense that, um,
the GitHub repository... so we have a, um,
an AI tool that looks through, um, the
history of the commits for the day and
then creates a report. And so, rather than
having a developer write
their standup report, which again can
feel like a huge waste of time sometimes
but it is very useful for the rest of
the team is we have the the AI actually
write that for you. It goes back through
looks at your commit messages and and
writes through that. If it doesn't see
anything about testing and it doesn't
see a test in there then it flags it and
it says there's no no tests were created
for this. And that is usually where a QA
person can then get involved and say,
"Hey, and to me a the first job of a QA
um person is again as a software
developer, but also as a somewhat of a
software manager, a side manager if you
will, um that's reviewing code, making
sure that the the standards are met, the
lint testing all went through, that all
of those lints are in place, the
linters are not being overridden
in the CI/CD tool or anything like
that. Which is common; I've done that
myself. Um, just because I needed to get
something pushed out. Um, so yeah, they
like reviewing that process as well as
being involved in the the code review
process and the pull request process. So
that's really to me the thing that we're
still missing that I want to add is the
pull request process bringing that into
our tool so that it's not you're you're
you're seeing the notification in GitHub
that there's a pull request that's been
assigned to you and that get gets
reported into your dashboard so that you
then know oh I've got a pull request I
need to review. you run the pull
request, run and and anything and any
comments that you bring into that get
assigned directly to that developer or
to that team so that you're then
following up with it and and managing
it, but also then seeing the comments
that were made in the pull request and
being able to then go through that so
that again you can use an AI to review
that process and say, you know, a number
of pull requests are failing because the
the the lint tests weren't run
beforehand or they weren't running their
unit tests before it was run or they
overrode this because they really wanted
to push something through. So, being
able to follow up and I'm sure there's
all kinds of other areas that you could
review and look at from just just from
looking at pull requests and the
comments that have and that happen in
there. It's a it's not just a and I
always I always get stuck on this is I
try not to make it about blame like it's
not who who wrote this bad test or who
didn't follow the test. It's just it's a
learning process, right? And going back
through and seeing those pull requests
and what was constantly, you know, this
maybe a lead developer was always saying
do this, do this, do this. And maybe he
was right and we weren't doing it or
maybe that was an unnecessary step that
we could remove [clears throat] at some
point and it helped to again speed up
our velocity a little bit.
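The automated standup pass Greg describes, summarizing the day's commits and flagging ones with no tests, might look roughly like this. The data shape and the flagging rules here are illustrative assumptions, not Buildly's real tool.

```python
"""Sketch of an automated standup report: summarize a day's commits and
flag any commit that touches source code but includes no test changes
(and doesn't mention tests in its message). Purely illustrative."""


def flag_untested_commits(commits):
    """commits: iterable of {'message': str, 'files': [str, ...]}.
    Return messages of commits that changed .py source files without
    touching any test file or mentioning tests in the message."""
    flagged = []
    for c in commits:
        touches_source = any(
            f.endswith(".py") and "test" not in f for f in c["files"])
        has_tests = (
            any("test" in f for f in c["files"])
            or "test" in c["message"].lower())
        if touches_source and not has_tests:
            flagged.append(c["message"])
    return flagged


def daily_report(commits):
    """Render the standup summary: one line per commit, then warnings."""
    lines = [f"- {c['message']}" for c in commits]
    lines += [f"! no tests found for: {m}"
              for m in flag_untested_commits(commits)]
    return "\n".join(lines)
```

In a real pipeline the `commits` list would come from the git log (or the GitHub API), and the report or its warnings would land in the team dashboard instead of a string.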
>> So before I hand this back to you, Rob,
just one followup to that. Um, one thing
to think about and considering is a lot
of your languages and tools now have uh
like test profilers that will actually
scan the code and tell you what test or
what code is covered by test and what
aren't. Uh, which is something you can
really utilize with AI today. So, just
if you're not looking at something like
that, that's something you might want to
consider with your processes. And with
that, that's up to you.
>> That's a good point though. I I think
that again as a developer sometimes the
QA process kind of goes into back of
mind um and even when we're looking at
the product backlog where you know we're
still going back through and it's always
like oh yeah what do we do about testing
here or what do we do about QA here um
and and so yeah it's a good point to
like where we can automate things we
absolutely should be and then where we
where we need human eyes on that uh is
essentially you know following up with
each step of that automation
And I I want to go back to a little bit
of the idea of uh you know I don't know
how many times, same boat, that we've, you
know, overridden a linter tool, or, you
know, just pushed something through because
[gasps] you know because the sometimes
the process is I mean maybe that's just
me being a little lazy and being a
developer but sometimes the process gets
in the way and it's like look we need to
get this we need to get this done. the
process does not push the ball forward
and it is a little bit of the agile
thing where it's like the process should
not hold your you know hold you
accountable I mean it should hold you
accountable but not hostage basically
that's the the challenge [clears throat]
is to like
>> to not just kneejerk do that I mean it's
like it's one thing if you do it once
and like okay I'm going to flag this and
it's an exception and we're going to
move forward versus it becomes a norm
and then the next thing you know, cuz
those have bit me, and I've seen
it in other places too, where it's like
this was here, this check was there and
it just got ignored, ignored, ignored
and that was fine until it wasn't. It's
like it goes back to maybe communication
of like this is why this is here. So,
and it and technical debt, everybody's
favorite thing to like just let it keep
piling up is the okay, I pushed it
through. However, we need to go back and
fix that, address that, clean that up
now rather than later. And that's a
little bit I know it's a little bit me
getting on my soap box, but it's also
because we're a little bit out of time.
And obviously we could do this for days.
Uh there's a lot of of great
conversations here, a lot
[clears throat] of directions we could
go and I know listening
>> is like this this guy knows his stuff.
This guy's got a lot of cool stuff. I
might want to go read that book. Um so
how what is the best way for people to
get a hold of you and and reach out and
get to learn a little bit more about you
and and the process in your book?
Yeah, for sure. Uh, so Buildly.io is
usually the best place to go. Everything
that I do is sort of linked through
there. Um, but beyond that, I
have a sort of a page for my book, as
well as a blog that I do through; it's
called radicaltherapy.dev.
Um, and almost everything goes through
there and and LinkedIn of course is
where everything gets uh shared as well.
So, those are those are the best ways to
get in touch and and to follow up with
these things. But, yeah, I I I
especially appreciate anyone that goes
through. We have we have a pretty simple
process to sign up for a 30-day free
trial. And you can sort of see our our
book and everything else that we've done
is all in our process and our tools. And
so, you can see how we work and and how
we do things. And we're always open to
feedback. That's the I mean, that's the
point of agile, right? That's the point
of software development is you're not
building things in a vacuum for
yourself. You're building things for for
other people, right? And so the more
feedback we can get from users, the
better. Especially seasoned developers,
but also junior developers, I think, are
the the most undervalued
part of your team is the new person
that's coming in and looking at it with
fresh eyes. Um, and I I'd love to get
new new developers involved, junior
developers, people that are just
starting to look at it and just say,
"Oh, this seems upside down to me. Why
don't you do it this way?" Right? Uh,
and so I that to me that's the best way
to if you've got fresh eyes and want to
look at some of the problems that we're
doing. Love to hear that.
>> Yeah. A lot of times,
those new people that come in are the
ones that solve my favorite problem:
the, you know, "that's the way we've
always done it" kind of response.
[laughter]
Um, and sometimes, I know it's
frustrating; like, all of
us were new at some point. When you're
frustrated, you're like, okay, well, that
doesn't really help me understand this.
And it's even worse when later you
realize, it's like, oh yeah, actually they
shouldn't have done it that way. And
that's that's as I've you know trying to
keep that pain when when it was me to be
on the other side of it and think about
it and keeping that open mind of like
okay maybe we do need to re you know
revisit this review it because stuff
change you know things change things
evolve you know AI for example which we
did almost no talking about it at this
conversation other than mention of it is
obviously changing a lot of things that
are out there and so whatever we did uh
even now like agile like agile manifesto
is about I don't know now about 25ish
years old something like that and I
remember when it first came out it was
the latest thing in the game, and then, um,
you know in the same line was like to me
it was all in with like patterns and
agile and these are the thing this is
future software development and all that
stuff has grown and evolved and
everything that you did 20 years ago
probably 15 years ago was already, like,
passé, and there's better ways to do it, and
then 10 and then five and what we're
doing today I'm sure people five years
from now I'll be like I can't believe
people are still using this archaic
approach so you know it is always evolve
or you know evolve or die I guess is a
part of this
>> I want to thank you so much for your
your time I appreciate you and the crowd
I'm sure there is a standing ovation I'm
being drowned out by the the applause in
the background here. Uh this is a great
conversation. This was uh this is the
kind of stuff we'd love to have where we
can go a little deeper and talk about
some things that actually a lot of these
areas I didn't even think we were going
to get into u but I think are very I
think they're very critical for moving
forward. It it really does go back to
like okay you say you're doing agile.
What really is agile? And if it's if
what you're doing is broken, and you've
given a perfect example of that with
Buildly, is like if what you're doing is
broken, then like let's embrace your
inner laziness and your desire to get
through these things and automate the
things that stink and
make [clears throat] some changes. You
go out there and suggest some
differences and and you know, start to
evolve. You don't have to be a slave to
what you think the process is. Uh and
particularly agile, like it literally
tells you don't be a slave to that. like
adjust it as your team and your projects
>> and too often I think the
>> uh what I have seen now with scrum and
some of that kind of stuff and sprints
that it's it's too much been equated to
agile and that's how you have to do it
and there are too many examples I know
of projects that really don't work with
that as you have obviously found out as
well.
>> Absolutely. Yeah. And I I think the the
laziness gene that's in all of us as
developers, the AI tools, if we ever get
another chance to talk about AI tools
and how to use them to automate away
those those things, it's it's
ridiculously easy now to build something
to make your own backlog sort of
approach if you need to or or or to get
those things out of the way. the the
artifacts that get generated even though
agile is an anti-artifact sort of
approach to to software development,
they still get generated and they
[clears throat] still have to be
updated. And so usually it's a it's a
product manager that needs a document
here or CEO that needs something there.
Yeah. So AI like using that to to manage
that's that's where it shines. Simple,
boring, repetitive tasks, that's where
it's good.
>> Yeah. I have found that that is um that
is an area that was very very early on
with AI when I started using it that I
have I've embraced and it's u when you
when really you just need the artifact
the rest of the stuff's there and it's
really just okay I've got to go about
the you know the grunt work or whatever
of generating the artifact whether it is
a uh I mean and sometimes it's a simple
document sometimes it is something that
you know that this is there are other
tools for it but things like having a
you know an API document like the things
that like swagger and some of those
tools do so do so well and yeah they
require us a little bit of forethought
to make sure we do it in a way that and
I think that's where it's going to be
from here is like doing it a way that AI
can then easily consume it and do the
stuff that we we really don't want to
slow us down like you said where you
don't want the you want the developers
writing the code and the functionality
you don't want them stuck trying to
figure out how to create tests and then
fix the test for what they're coding
it's like finding a way to get the
things done
>> and then you a use AI to sort of walk
behind us and clean up the mess a little
bit and say, "Okay, well, fine. I didn't
get that done. Go do this for me."
>> Yeah, exactly. I think that's the that's
the primary use case now and will be for
a long time for for any AI tools,
development tools, is is going to be
just about automating those those boring
things away and education just teaching
you how to how to do better and then
following up, right? You know, and
again, I don't mean to put,
Michael, your job in
the line of AI, but I do think reviewing
tests and writing tests and then making
sure that that code can be cleaned up.
That's where AI and a good QA
developer could write something really
well with an AI tool and do it even
faster. And I think that's the it's more
about to me it's it's about building up
your velocity and and creating a team
that works better together. That's where
the AI tools can really help.
>> 100%.
Well, um that will wrap this one up.
Um some nice little bonus material there
as well as we got a couple little, you
know, suggestions at the end. Uh we will
we ran this uh we we turn this around
pretty quick. I think this probably will
show up as early as Tuesday and Thursday
of next week. We drop on Tuesdays and
Thursdays. If not, it'll be the week
after. Um I will send links out uh when
we do so. So you feel free to share them
wherever wherever you want. Edit them
out however you want. Um we will post
them. We have a we do a blog article on
our site. It's out on YouTube. It's out
on um out on all the different places
that you can get podcast things like
that. So feel free to share as as as
much as you like. uh really have enjoyed
this uh and you like like I said we may
reach I may reach out again sometime in
the the future a few months from now and
say all right let's let's have some
other conversations because there's we
left a lot on the table that we could
have gone into uh and I have a feeling
that we will all have different opinions
of those three six and you know 12
months out yeah absolutely for sure yeah
[clears throat]
>> all right we'll let you go and thanks
you a lot thank you for your time
appreciate it and we'll be in touch
again
>> yeah know thanks It was it was fun. Hope
hopefully we will get a chance to to hit
those things later on.
>> Definitely. Michael's working
on his test-driven
development notes as we speak.
>> Oh, good. Yeah, I need some defensive
test driven development. All right.
>> Thanks, guys. Talk later, guys.
>> Talk to you later. Take care.
>> All right. So, uh, bonus material there,
uh, because we did cut it off as far as
you know, we cut that off on the audio
right before we got into the AI bonus.
Um, so we'll wrap this one up. I want to
thank Greg again for his time. Um,
really appreciate this was this was
really one of those that it's it's funny
sometimes. I I had not spoken with him
before this. You look at some of the
things that people do and what their
background is and what their focus is
and you're like, "Cool. This is where
it's going to go." And you know, 15
seconds into the conversation, it goes a
completely different direction. But
honestly, this was one of my favorite
like conversations I've had that I've
been able to record in in quite a while.
Um, you guys ever get a chance to check
out Lunchclub, although I think it's
almost impossible to get into it now.
Um, or things like that. There's like
one-on-ones and stuff like that, if you
can find them is like just find a way to
sit down with people like this and talk
tech for a while. Um, even if it, you
know, if you work with people that you
can talk to, great. If you don't find
the people, especially now with people
remote everywhere. Um, this is like once
again, if you got a tenth out of this
that I got out of it, it was more than
worth your while. Thank you so much for
your time. Appreciate you guys hanging
out with us. Uh, as always, check us out
at develpreneur.com; Facebook.com, the
Develpreneur channel; uh, out on YouTube.
We've got content of plenty through all
of those. Feel free to leave us leave us
feedback wherever you see any of those
things. We will be happy to work with
you. Even if something we did years ago,
we'll do our best to update it and let
you know where we went with that or what
went on with it or how it is going on
today.
As always, go out there and have
yourself a great day, a great week, and
we will talk to you next time.