Choosing a 360° Assessment: The Right Tool for the Job
A few months ago, I was putting together a desk at home. You know the kind: flat-pack, deceptively simple, and accompanied by a diagram that looks like it was drawn by someone who has never actually built anything.
I grabbed what I thought was the right tool: a ratchet wrench. Solid, reliable, familiar. I’ve used it a hundred times. But this time it kept slipping, then stripping. And then the inevitable realization that I wasn’t using the right socket; I was using one that looked right. Close enough to get started … and close enough to do some damage. Not close enough to do the job properly.
That experience comes back to me a lot, especially when I see organizations selecting a 360° assessment tool. They find one that looks right, feels right, and has all the right language and features, so they go with it, and then wonder why the results aren’t what they expected.
The Problem Isn’t the Tool, It’s the Fit
There are a lot of 360° assessments out there. Most of them will "work" in the sense that they'll produce data, but the real question is whether the tool fits your purpose, your audience, and your context.
When the fit is off, you tend to get data that's technically correct but not meaningful, feedback that people don't trust or act on, and a process that creates more resistance than insight. It’s the stripped-bolt problem: you can keep turning, but you won’t get where you’re trying to go.
What to Look for When Choosing a 360° Assessment
Here are five practical questions to ask before you choose a tool; they matter far more than any feature list.
1. What are you trying to do?
This sounds obvious, but it’s the step that gets skipped most often. Before you start comparing tools, get clear on what you want to achieve. It might be building self-awareness, supporting development planning, reinforcing a leadership model, diagnosing cultural patterns, informing succession decisions, or something else entirely.
Different tools are built for different purposes. A developmental instrument may be excellent for coaching but weak for succession planning decisions. A competency-based tool tied to your leadership framework may be great for alignment but less useful for personal insight.
If you don’t get clear on purpose first, you’ll default to whatever looks polished, and “polished” isn’t the same as “right for us.”
2. Will the report help people make sense of the feedback?
This is where a lot of tools fail; they generate plenty of information, but the results are hard to interpret for leaders who aren't accustomed to reading assessment reports.
When you review a sample report, pay attention to:
How ratings are aggregated and summarized
Whether rater groups are clearly separated (manager/peers/direct reports, etc.)
How gaps are shown (self vs. others) and what guidance is provided
Whether the report highlights patterns and priorities, or simply “dumps data”
A good report guides insight without overwhelming the reader. If someone needs a specialist in the room just to understand what they're looking at, the tool may be creating more confusion than clarity.
3. What level of confidentiality can you realistically promise?
This one matters a lot. If participants aren't confident their feedback is anonymous and handled with integrity, they won't give you their most honest responses; they'll give you their safest ones. Filtered, politically careful feedback leads to unreliable data, and then you're making decisions based on noise.
Collaborating with a third-party partner like Cenera to administer the assessment process and aggregate results can make a meaningful difference by increasing trust in the process from the start.
4. How easy is it for people to participate?
Participation sounds like a logistical detail, but it directly affects the quality of results. If the assessment itself is too long, confusing, or clunky to navigate, completion rates drop, rater fatigue increases, and the feedback becomes less reliable.
Look for a process that’s clear, efficient, and respectful of people’s time.
5. Will it translate into action?
A 360° assessment shouldn’t stand alone. Its value comes from how well it connects to your leadership programs, competency model, coaching approach, and day-to-day development conversations.
Just as important is the support around the assessment process itself – who debriefs results, how leaders make sense of the feedback, and what follow-up (coaching, development planning, manager involvement) is in place.
Without integration and support, even the best tool becomes a one-time event. With it, feedback can translate into real goals, practice, and sustained behaviour change.
Final Thought
When a 360° assessment process fails, it’s rarely because the feedback “doesn’t work”; it’s usually because the instrument, and the experience around it, didn’t match the goals you set out to accomplish.
If you’re selecting a tool right now, take a little more time up front to get clear on purpose, ask questions about fit, and think about what will make the feedback usable, not just how sophisticated the report looks. It’ll save you from stripping the bolts before you even get started. And if you’ve ever had to drill a bolt out and start over, you know exactly why that matters.
Cenera has a team of experts who will work closely with you to choose the right tool, administer it for participants, and provide leaders with the support they need to make it a worthwhile investment. www.cenera.ca/contact-us