Basic Agile: OODA Loops
Updated: Feb 24, 2022
What I hope you'll learn from this post:
What problems agile process is intended to solve
The one agile tactic to introduce first and why
"Here we go again," you think, "another bad release."
Another rollback, another set of arguments, another bunch of pointed fingers. It didn't use to be this way. What happened?
You've been the team lead here for a few months. It started as a small team, and you were brought in to grow it. At first, it felt like working with rock stars, but progress has slowed to a halt.
It seems like you can't get a break. As the team gets larger, you're going slower, not faster. More mistakes pile up, and more people get frustrated with each other.
Now, this situation could be due to a variety of reasons. But let's assume you practiced Big OH hiring, and you're pretty sure all of your new team members' values are aligned with yours. No one is "the imposter."
Instead, you think people seem to be working from a completely different set of assumptions about how work should get done. What "done" means to some engineers isn't what "done" means to others. And you keep learning too late that things don't go smoothly when these assumptions are violated.
Your longer-tenured members are still working from the assumptions they formed when the company was smaller. They already had an understanding of who does what and what that means. Newer senior people are all working under the beliefs of their former companies, which had different workflows than yours. Your junior hires either have no idea what anything means or are working from textbook definitions.
You need a shared set of assumptions across the team.
Introducing Process: A Shared Set of Assumptions
You decide to introduce some process.
Process is a four-letter word. Some people roll their eyes; others check out. They don't see how process will solve this problem.
Boiled down, when a team adopts a process, it is recognizing a shared set of assumptions. It's deciding what "requirements" means, what "design" means, what the quality standards are, and who's responsible for what.
As we saw before, things go awry quickly when everyone has different assumptions. And in many cases, people will misidentify the problem, assuming that their way of working is the "correct" way, and everyone else is just sorely mistaken.
The issue is that the team must decide its own way to work. There is no right way to work; instead, every team must figure out how to use the process to fill in its weaknesses.
This is a new team, and it hasn't yet decided how it will work together. So process is needed.
Still, it's going to be a hard sell.
Some people come to you - a mix of senior and junior folks - regaling you with tales from former employers or the next big thing from a book they read. They say, "What we need is Scrum" or "Kanban" or "SAFe," except they pronounce it "saf-ee," as if it were French for some reason, and this leaves you confused.
Others tell you that process is why they left their old companies. It "gets in the way of genius," or "the red tape slows everyone down." You're making a mistake, they say, by even considering it at all.
It's crucial to resolve these disputes. While as a manager, you may need to use your formal/hard power just this once to get things going, real process improvement and culture change require buy-in. Otherwise, you're just practicing process in name only, and people aren't going to actually adopt it in their day-to-day work lives.
Use an Off-the-Shelf Solution
Is process like a library? Should you just decide which library or framework the team will use and call it a day?
First, you consider the off-the-shelf solutions. Do you just roll out Scrum?
This, unfortunately, violates Gall's Law, which states:
A complex system that works is invariably found to have evolved from a simple system that worked. A complex system designed from scratch never works and cannot be patched up to make it work.
All processes that work must descend from simpler processes that also work. It's like programming: it's far easier to make one small change to the code and get it committed, reviewed, tested, and linted than to do all those things for a much more significant change.
We call this approach "recursive agile" - you need to use agile principles of iteration and incrementalism on your agile rollout. A process must start small and straightforward and then be grown.
Rolling out an off-the-shelf system wholesale won't work - or at least, it almost never works, if Gall is to be believed.
Often, this is what people who complain about process point to when they say, "We tried to roll out Scrum at our last job, and it failed miserably." Now you know why. You have to start smaller, more concrete, with that "base case," and then iterate from there.
Use Nothing at All
What if it's not about process? What if some of the 'geniuses' are to be believed? Maybe this is a hiring problem.
"If we just hired engineers who knew how to code," they argue, "we wouldn't need process."
Process is "constraining" and "slows things down."
As you hear this, you think back to Big OH values. These people don't sound very humble or open to being wrong. It's their way or the highway. Still, let's assume they want what's best and are just reacting to some trauma in their own past.
How do you find the people who "know how to code?" You are already using code challenges, looking for testing and other quality metrics. You have a pretty excellent hiring process. You ask them what they'd change about it, and other than "let me interview every candidate!" they don't have any ideas.
Instead, you believe you have a group of pretty talented engineers in your team. While you see where they disagree, they all have great strengths, and they complement each other very well. If only you could get them working together!
How can you get these holdouts invested in process, though? They seem scared of slowing down; to them, the process is worse than the problem you're facing.
Again, this points to starting small. Do an experiment and insist on measurables. Get buy-in from these holdouts by asking them to set the standards by which the experiment will be judged. And get them to commit to this: "If the experiment works out, I'll be on board."
But still, what is this "base case" of Recursive Agile? What is that small first step?
What to Actually Do: OODA Loop
You should start with the OODA Loop.
OODA stands for "observe, orient, decide, act" and comes from US Air Force colonel John Boyd's work about 70 years ago. It's not a new idea.
Let's talk about what it is and what problems it solves, then go into an easy way to roll it out.
Observe means gather data. The critical thing to remember here is that all data is valid, though some data is more valuable than the rest. You don't want to get stuck in "analysis paralysis."
Do you need hard metrics on everything, weekly surveys on engagement, and other intangibles? No - although having such data might be nice.
Data can start with just a regular discussion about what's going well and what's not. People's feelings are data. They're biased and can be logically flawed, but they're better than nothing and usually pretty cheap to get.
The OODA loop is excellent because it feeds back on itself. You can take data, work through the OODA loop, and decide you need better data. You can then develop an action plan for how to get it!
Orientation, also called sense-making, is the hardest part.
You have to interpret the data to tell you a story about the world. Drawing conclusions from data is challenging. For example, do your repeated rollbacks imply you have:
A problem with communication?
A faulty CI/CD?
A quality issue?
A combination of all three?
Something else you haven't considered?
You've got to ultimately make sense of the raw data and turn it into a story or narrative that people can understand.
Again, the OODA loop is self-correcting here - in future iterations, you may "observe" that your last "orientation" was wrong and self-correct! So don't get stuck too long here. You need to turn your observations into stories, into understanding.
Decisions can only act on narratives or stories; they can't act on data.
Or, to put it another way, decisions shouldn't act on data.
Let's say you have some raw data about defects getting released. You attempt to target the data rather than the stories. You say, "Increase test coverage by 10%!" and track this.
Over time, more and more code is added that isn't related to the requirements and is very easy to test. This makes the proportion of tested code higher, leaving the untested portion - the portion that actually 'does the thing' - still untested. But the metrics look great!
Maybe you wouldn't do this, and no one starts out wanting to do this. But maybe there's a time crunch on a release one time, and someone sneaks in the tactic. Others notice, and the next time they're on a time crunch, they use it too. Before long, people add debug code or whole extra libraries that do nothing you need because they tested it to hit your metrics.
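To make this concrete, here's a contrived Python sketch of the pattern. All the function names are hypothetical, not from any real codebase: a risky function stays untested while trivial padding and its easy tests inflate the coverage percentage.

```python
# The code that actually "does the thing" -- awkward to test, so it stays untested.
def apply_discount(order_total, customer):
    # ...imagine complex business logic and external calls here...
    return order_total * 0.9 if customer.get("loyal") else order_total

# Padding sneaked in under deadline pressure: trivial helpers nobody needed,
# each with an easy test. They drive the covered-line count up.
def add(a, b):
    return a + b

def identity(x):
    return x

def test_add():
    assert add(2, 3) == 5

def test_identity():
    assert identity("x") == "x"

# A coverage tool now reports a higher percentage, but the risky
# apply_discount path is exactly as untested as before.
```

The metric moved; the quality didn't.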
The story - the result of orientation - is that "quality is down." Now, you might decide with your team that "to improve quality, we should increase test coverage." This probably seems obvious to you.
But tying the decision to a story makes it easier for a peer reviewer to ask, "These tests don't appear to increase quality, which I thought was our goal here."
The decision process needs buy-in; otherwise, you won't have any hope of enforcing it on a self-directing team. Ultimately, the easiest decisions have a straightforward story that precedes them, three or four brainstormed options for dealing with the story, and a consensus around which option was chosen.
So, if the story is "Quality is suffering" based on the rollback data as well as the general feelings of the team, then the brainstormed options might be:
Increase test coverage
Add more peer reviewers to each peer review
Allow Bob to personally review everyone else's code
The team may agree to target test coverage. Except Bob. What a jerk.
But did it work? Does targeting test coverage increase quality?
Act - Follow Through
We're going to find out - by doing an experiment.
The decision made above is an experiment; we've built a hypothesis that increasing test coverage will "increase quality" and reduce defects released.
The only way you can try this experiment is to, well, try it. Everyone attempts to follow the guidance for some timebox, say two weeks or a month.
Then you check back in during the next loop of OODA.
Observe: did our target measurable move? Did test coverage go up?
Orient: did quality go up, though?
Perhaps it worked, and you decide to focus on something else for your next process improvement. Or maybe it didn't, and you decide to try a different tactic next OODA loop.
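The act-observe-orient-decide cycle just described can be sketched as a loop. This is purely illustrative Python with made-up names; in real life, "measuring quality" is a team discussion plus whatever metrics you have, not a single function call.

```python
def run_experiment(apply_tactic, measure_quality, baseline):
    """One pass of the experiment: act, then observe and orient on the result.

    Illustrative sketch only -- the timebox is enforced by the calendar,
    and quality is judged by the team, not a single number.
    """
    apply_tactic()                   # Act: everyone follows the guidance for the timebox
    result = measure_quality()       # Observe: did the target measurable move?
    improved = result > baseline     # Orient: did quality actually go up?
    # Decide: keep the tactic if it worked; otherwise try something else next loop.
    return "adopt" if improved else "try another tactic"
```

For example, `run_experiment(lambda: None, lambda: 0.9, baseline=0.7)` returns `"adopt"`, while a measurement below the baseline returns `"try another tactic"` and feeds the next OODA loop.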
If a tactic works - say, increasing test coverage did improve quality - then following through just means agreeing: "Alright, we need to keep test coverage high going forward." And the process sticks.
It's usually easy to make a process like this stick because:
Everyone was around for the information gathering. It was transparent.
Everyone got asked their opinion on the sense-making. They felt heard.
Everyone agreed to work with the process for a short time while the experiment unfolded, and
Everyone agreed to follow the new process for the longer term if it worked.
What This Looks Like: A Retrospective
The above may seem complicated, but it's pretty easy to do. Some may have recognized that the above is encapsulated in Scrum's "Retrospective," but there are a variety of techniques you can google to find a style of OODA that might fit your team:
The Army's "hot wash"
Various "post mortem" templates for diagnosing project failures
Lean's "Kaizen" events
The sprint-based "Retrospective," as well as "Heartbeat Retrospectives" for teams that don't use sprints
You don't need to institute all of these, obviously. And whatever type of OODA discussion you implement can borrow from all of them.
We find it best if some sort of "retrospective" is done according to the following:
Regular cadence (two weeks to a month)
Talk about what is going well and why (Observe and Orient)
Talk about what is going poorly and why (Observe and Orient)
Talk about how to repeat what went well (Decide)
Talk about how to avoid what went poorly (Decide)
Create and agree to at least one concrete action the team can take to improve, and check in on the next retro (Act)
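As a minimal sketch, the concrete action from the last step could be recorded alongside its check-in date for the next retro. The structure and names here are hypothetical, purely for illustration, not a prescribed tool:

```python
from dataclasses import dataclass
from datetime import date, timedelta

@dataclass
class RetroAction:
    """One concrete improvement the team agreed to try (the Act step)."""
    description: str
    owner: str
    check_in: date  # revisit at the next retro (Observe again)

def next_retro(today: date, cadence_days: int = 14) -> date:
    # Regular cadence: two weeks to a month.
    return today + timedelta(days=cadence_days)

action = RetroAction(
    description="Raise test coverage on the billing module",
    owner="whole team",
    check_in=next_retro(date(2022, 2, 24)),
)
```

A shared doc or ticket works just as well; the point is that every action has an owner and a date on which the team will observe whether it worked.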
You'll organically grow a process using what works for your team if you do this regularly.
Process is a shared set of assumptions.
Complex processes evolve from simple ones - incrementally and iteratively.
Therefore, any scaled agile must have evolved from basic agile iteratively.
OODA loops allow you to iteratively improve your process.
The first and most basic agile process is instituting some sort of OODA loop, with the Heartbeat Retrospective being one of the most common and easy-to-implement forms.
If you need help, don't hesitate to ask!
Want to learn more about Guildmaster's approach to agile and process?