Calibrations in 15Five enable People Ops and HR leaders to visualize the distribution of review ratings across different managers and easily adjust those ratings in real time, ensuring that reviews are conducted in a fair and unbiased way across teams.
In this article, you will learn...
- What are calibrations, and why are they important?
- The experience: what calibrations look like in a review cycle
- Best practices
- Calibration process timeline
- Additional resources
What are calibrations, and why are they important?
Performance calibration meetings help to ensure that the standards used to rate performance are balanced across an organization. Typically, calibration meetings create a common, consistent language from which to clearly communicate performance management standards. The following happen in a calibration session:
- Discuss the most important criteria to differentiate top performers
- Review the proposed ratings of employees
- Determine the alignment with the criteria identified for top performers
- Make adjustments as appropriate
In theory, this helps drive greater consistency in the rating process.
Issues with the traditional process
In our Reviewing the Performance Review Report, we found that the number one concern HR leaders expressed surrounding performance reviews was ensuring that the review process is fair and equitable. Reviews are perceived as fair when they are consistent, accurate, unbiased, and open to voice and input. When a review is perceived as fair, people will be more willing to accept the outcomes, even if they are undesirable.
Thus, it is alarming that while 79% of HR leaders and 70% of managers said that the performance review process at their organization is fair and equitable, only half (50%) of individual contributors agreed.
Furthermore, we found that HR leaders (75%) are more likely than managers (66%) and individual contributors (43%) to think that the performance review process provides an accurate representation of employee performance.
It’s clear that, while fairness and equity are highly valued in the performance review process, many organizations are still not succeeding in designing and scaling a process that actually feels fair and equitable to all members of the organization. A main barrier to fairness and equity is lack of clarity and alignment around what a particular performance rating actually means.
For example, let’s say there are two employees with two different managers. Both employees are performing what is considered “at level” for their role. When it comes time for performance reviews, Manager A gives her employee 5 out of 5, because she thinks the employee is meeting the expectations of the job description. Manager B, on the other hand, gives her employee 4 out of 5, because she thinks her employee is meeting the expectations of the job description but still has room to grow.
A simple difference in what a 4 vs. a 5 means, or in which number on the scale correlates to “at level,” can cause a lot of discrepancy between different managers and their teams. In the example above, both teams may be performing at level overall, but Manager A’s team will appear to be performing better than Manager B’s team simply because of how the two managers assigned ratings.
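As a toy illustration (the numbers and teams here are hypothetical, not 15Five data), the arithmetic behind this discrepancy looks like this: two teams performing identically end up with different average ratings purely because their managers map "meets expectations" to different points on the scale.

```python
# Hypothetical example: both teams are performing "at level,"
# but each manager interprets the 5-point scale differently.
team_a = [5, 5, 5]  # Manager A: "meets expectations" = 5
team_b = [4, 4, 4]  # Manager B: "meets expectations" = 4, leaving room to grow

avg_a = sum(team_a) / len(team_a)
avg_b = sum(team_b) / len(team_b)

# Identical performance, different averages: 5.0 vs. 4.0
print(f"Team A average: {avg_a}, Team B average: {avg_b}")
```

A calibration session is where this gap would surface and be corrected, so that the same performance maps to the same rating regardless of manager.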
When you get multiple managers who all think about ratings a bit differently, plus a combination of everyone’s implicit biases, it makes creating a fair and objective review process difficult. This becomes even more important when things like pay raises, promotions, or terminations are on the line.
There needs to be a step of the performance review process where performance ratings are reviewed and adjusted to eliminate bias. That’s where Calibrations come in.
The 15Five way
Our calibrations solution helps ensure that reviews are being done in a fair and unbiased way: quickly, collaboratively, and within the context of your work.
Calibrations in 15Five occur after manager reviews have been submitted in a review cycle. As manager reviews are submitted, answers populate the calibration session, and once all manager reviews are submitted, calibration sessions can be held.
During a calibration session, contributors can visualize and discuss the distribution of employee ratings across teams and managers. From there, they can make adjustments to ratings to ensure that employees are being rated in a fair and unbiased way.
Contributors can calibrate answers given in any section of manager reviews: the Private Manager Assessment (PMA), company value ratings, competency ratings, and/or objective ratings.
To reduce idiosyncratic rater bias, our Private Manager Assessment questions ask managers what they would do with each team member, rather than what they think of that individual. This grounds managers' answers in as many objective measures of behaviors and results as possible, making for a more objective and fair assessment of performance over time.
The experience: what calibrations look like in a review cycle
Review cycle settings
To include calibrations in a review cycle, the following criteria must be met:
- The review cycle includes manager reviews (since manager review answers are what's calibrated in a review cycle)
- During review cycle creation, the review admin checks the box next to "Include calibration sessions in this review cycle" and sets a 'Calibrations due by' milestone (this date should fall after the manager review milestone and before the 'Start sharing on' milestone)
- If you decide after launching a review cycle that you want to include calibrations, you can turn calibrations on up until the 'Start sharing on' milestone. If you try to enable calibrations in order to calibrate the PMA section but your question template has no PMA questions, you will receive an error message. If you decide that you no longer wish to include calibrations in a review cycle, you can turn calibrations off up until the point at which at least one calibration session has been locked. These actions can be taken by editing review cycle settings, and only review admins can make these changes.
- Review admins and cycle collaborators can create calibration sessions, view all calibration sessions (even if they're not appointed as a contributor), and lock calibration sessions.
- Contributors can view calibration sessions they're a contributor in, as well as change manager review answers within a calibration session.
In the review cycle
In a review cycle that includes calibrations, review admins, cycle collaborators, and anyone who is designated as a contributor will see a 'Calibrations' tab in the top navigation. Click this tab to open the Calibration Dashboard.
Check out our "Hold a calibration session" article for a full walkthrough of what sections are included in a calibration session.
The following calibration-specific notifications may be sent in a review cycle. All of these notifications can be managed in individual notification settings or company-wide in company notification settings.
| Notification | Delivery method | Details |
| --- | --- | --- |
| Invited as a contributor to a calibration session | Email & in-app notification | Sent to all contributors (except the person who created the session) upon session creation |
| Calibration session locked | Email & in-app notification | Sent to all session contributors (except the person locking the session) when the session is locked |
| Calibration sessions are due in 3 days | Email & in-app notification | Sent to all calibration session contributors 3 days before the 'Calibrations due by' date |
| Calibration session deleted | Email & in-app notification | Sent to all contributors and review admins (except the person who deleted the session) upon session deletion |
| Mentioned in activity feed comment | Email & in-app notification | Sent to anyone who is mentioned in an activity feed comment, immediately upon the mention |
| Calibrate reviews in the [Session Name] session. Due in [x] days. | To-do shown in Highlights and Cycle overview page | Shows for all contributors once they’ve been added as a contributor to a calibration session |
Best practices
To increase fairness up-front, we recommend creating standard performance rubrics, setting clear performance expectations, and writing out definitions of scale answers.
- These conversations should happen long before Best-Self Review® time, and ideally during initial role clarity conversations.
- Set clear process expectations for managers and participants alike—having a clear process agreed upon before the review process, including calibrations, will create a fair and transparent environment.
The Private Manager Assessment questions in 15Five lay the foundation for an unbiased review. The Private Manager Assessment results, when combined with Calibrations, add another layer of fairness and consistency.
- To reduce the idiosyncratic rater bias, 15Five's Private Manager Assessment questions ask managers what they would do with each team member, rather than what they think of that individual.
- Unlike Private Manager Assessments, calibration sessions rely on people other than the manager to provide input around performance. The combination of these methods increases consistency and can reduce bias.
Ensure that calibration conversations are grounded in Private Manager Assessment data, in addition to as many measures of behaviors and results as possible (e.g., Competencies and Objectives). Data has objectivity and when combined with Private Manager Assessment results creates a holistic performance picture.
- In addition to the Private Manager Assessment answers, you can include answers from other sections of manager reviews in a calibration session, including company value ratings, competency ratings, and objective ratings. If you choose not to include these additional sections in calibration sessions, you should review Competency results and Objectives progress when calibrating.
TIP: Have reports on Competencies, Objectives progress, etc., ready to reference during the calibration session.
- Have your scale labels and definitions handy for the session, as well as performance agreements, company values, and competencies.
- To increase transparency and traceability, include points of emphasis behind the decision to calibrate in the activity feed of the calibration session.
Who? How? What? When? Let’s talk logistics.
- Who: With 15Five calibrations, you can choose whose review results you want to calibrate and separate employees into distinct calibration sessions. You can include as many calibration sessions in a review cycle as you want.
Within each calibration session, you are asked to appoint contributors. Contributors are the people responsible for holding a calibration session: discussing results and making changes as needed.
We suggest that you aim for the same or a similar ratio of managers to leadership to People Ops in each session, and that you appoint a moderator (People Ops/HR) to ensure everyone in the session has an equal opportunity to share their insight.
- How: How you lay out your sessions is up to you. A couple of common breakdowns are by department, by level, and by role. Base this on your company/department dynamics, psychological safety, and the logistics of holding multiple sessions.
- What: You can select which sections of manager reviews you want to include in a calibration session: Private Manager Assessment answers, company values ratings, competency ratings, and objectives ratings. Answers from both manager reviews and additional manager reviews are included in calibration sessions.
If a contributor changes a PMA answer from either the direct manager's review or an additional manager review during a calibration session, all answers to that question for the participant will change to the new answer.
Have reports on Competencies, Objectives progress, etc., ready to reference during the calibration session. Have your scale labels and definitions handy, as well as performance agreements, company values, and competencies. Include any notes or points of emphasis behind the decisions in the notes section of the calibration session. This not only increases transparency, but can also be referenced at any time to spot trends.
- When: Calibrations should be used as often as feasible. If you’re able to calibrate every review cycle where PMA questions are included, that is ideal. If not, we strongly suggest including calibrations in any cycle where compensation or promotions are on the line.
Calibration sessions are held after managers have submitted their manager reviews and before the sharing and/or finalizing dates.
Calibration process timeline
- Enable calibration sessions. Calibrations can only be enabled in review cycles that include manager reviews. When enabled, calibrations fall between the 'Manager reviews due by' date and the 'Start sharing on' date. A review admin must check the "Include calibration sessions in this review cycle" box when creating the review cycle for calibrations to be included.
- Create calibration sessions. Before you reach the manager review due date, you should create calibration sessions. Creating sessions and assigning contributors before the due date will give contributors more time to hold calibration sessions before the 'Start sharing on' date.
- Manager reviews are submitted. As manager reviews are submitted, calibration sessions will populate. If additional manager reviews are submitted in your review cycle, their answers will also populate the calibration session.
- Hold calibration sessions. Contributors hold calibration sessions to change manager review answers as needed. Changed answers will not be visible to managers until you lock the calibration session.
- Lock calibration sessions. At this point, answers are updated in the manager reviews and summary to reflect calibrations.
- Managers share, discuss, and finalize review results. Once the 'Start sharing on' date arrives, managers can share results, hold final meetings, and finalize the results with their direct reports.
Check out these additional resources ⬇️
- Help Center article 💡: Create a calibration session
- Help Center article 💡: Hold a calibration session
- Help Center article 💡: Lock a calibration session
- Blog post ✍️: Ensure Fairness and Consistency During Performance Reviews with Calibrations
- Webinar 👩💻: Minimizing Bias in Your Performance Reviews