OKRs and Lots of Bad Advice
One of the most frustrating aspects of being an OKR consultant is the terrible advice I see shared. Like most business topics, there is no canon for the framework. As a result, every Tammy, John and Angela has shared their view, and as a novice you can't tell the good advice from the bad. At best, following bad advice wastes your time. At worst, it damages your business.
Some of the bad advice comes from tool vendors or from people who proclaim themselves OKR experts.
Much of the bad advice is seductive because it promises an easy route. It ignores the rigour and feedback loops that underpin great implementations. Thinking in terms of outcomes is tricky, but it's essential.
Here are some examples of the bad advice I see:
Treating initiatives as part of the framework. A central tenet of OKR is that teams agree on a goal but are free to decide how to achieve it. If we list initiatives alongside the goals, we take that freedom away: teams commit to a plan up front instead of acting on short feedback loops. The framework is OKR, not OKRI.
Advocating the framework for task or project management. OKRs are about outcomes, and we break that link when we start tracking tasks, outputs, or projects. If you complete a task and it doesn't move the needle on your outcome, you've wasted time. There are tools for tracking projects. Use those instead.
Making everything an OKR. If we want to focus, we have to decide what to focus on. When we try to connect every piece of work to OKRs, we inherently say that everything is important or, to flip it, that nothing is.
Bad examples aren't exactly advice, but they're just as damaging: people learn from examples and re-apply them.
Here are some of the worst examples I've seen:
Objective: Save time wasted in update meetings. KR: Reduce the average number of meeting attendees
Why is this example from an 'OKR expert' terrible?
It's divorced from a customer outcome.
How do you know the time in meetings is wasted? You could reduce the number of attendees, and the quality of decisions may fall. Is this still a good outcome?
Objective: Our development process is now faster. KR: Increase story point delivery to x each sprint
This example is from a tool provider.
Velocity is a terrible metric and easy to game.
It's a measure of output. Even if the team genuinely delivers more, how does that connect to business outcomes?
Driving outcomes requires discovery work that could slow down delivery. Is that a bad outcome?
Objective: Improve our NPS by 20%. Key Result: Produce three resources to help customers get more value from our product.
This example is also from a tool provider.
Objectives should be qualitative; the KRs tell us how we measure success. Even the quantitative form is weak: does improving our NPS by 20% mean a relative improvement (20 to 24) or a 20-point jump (20 to 40)?
Producing resources is output; in this case, it's a prescriptive output. Key results should be outcomes. What if you create three low-quality resources and they frustrate people? Is that a great outcome? What if you focus on one great resource? What if you let the team focus on improving the likelihood that customers refer you?
If you want some reliable sources for OKR insights, follow these three accounts on Twitter.
OKRs aren't a piece of cake. If somebody makes them seem that way, their advice is probably terrible!