How to Use ONA Data for Performance Development in Confirm
Most performance data tells you what happened last quarter. ONA tells you something different: who your organization actually depends on to function.
Organizational Network Analysis maps collaboration patterns across your company: who communicates with whom, who gets pulled into decisions, who people go to when they need help. In Confirm, this data appears alongside traditional performance metrics, giving managers a second axis of information that manager ratings alone can't capture.
The result: you find the people your ratings miss, catch the managers who hoard credit, and make development investments in the right people, not just the most visible ones.
This playbook is a recipe for reading Confirm's ONA data and turning it into specific development decisions.
The Recipe at a Glance
Outcome you're trying to achieve: A clear picture of each employee's organizational influence and collaboration patterns, cross-referenced with their performance rating, so you can make development decisions grounded in how the org actually works, not just what managers report.
Ingredients:
- ONA enabled in your Confirm account (requires integration with at least one collaboration tool: Slack, Teams, email, or calendar)
- At least 6–8 weeks of ONA data collected (patterns need time to stabilize)
- A completed or in-progress performance cycle (so you have ratings to compare)
- 30 minutes with a manager or HR partner to interpret the data together
Time required: 30 minutes to configure and run your first ONA report; 1–2 hours to review and act on findings.
When to use this: Before and after performance cycles, during succession planning, when identifying who to invest development resources in, or when you suspect your ratings are missing someone important.
When NOT to use this: As a primary evaluation metric. ONA is context, not a rating system. High network influence doesn't mean high performance. It means the org relies on this person. That's worth investigating. It's not a conclusion.
Step 1: Connect Your Collaboration Data
Confirm's ONA engine runs on collaboration signals from your existing tools. To start:
Go to Settings → Integrations → Collaboration.
Confirm integrates with:
| Tool | Data type | Setup time |
|---|---|---|
| Slack | Message volume, channel participation, DM patterns | 10–15 minutes |
| Microsoft Teams | Same as Slack | 10–15 minutes |
| Google Workspace (Calendar) | Meeting patterns, co-attendance, 1:1 frequency | 5 minutes |
| Microsoft 365 (Calendar) | Same as Google | 5 minutes |
| Email (metadata only) | Thread volume, response time, cross-team contact | Varies; requires admin access |
You don't need all of these; one is enough to get started. Calendar data tends to produce the cleanest ONA signal for most mid-market orgs. Slack/Teams data adds richness but requires more volume to be statistically meaningful.
Once connected, Confirm begins collecting data. Come back in 6–8 weeks, when there will be enough signal to see patterns clearly.
Step 2: Read Your ONA Dashboard
In Confirm, go to People Analytics → ONA Dashboard.
The dashboard shows three key visualizations:
Network influence score: A percentile score (0–100) representing how central each employee is to organizational collaboration. A score of 85 means this person is more influential than 85% of employees.
Collaboration reach: How many unique people this person collaborates with across teams, compared to others at their level.
Influence-to-visibility ratio: The comparison that matters most. This plots network influence against performance rating. Employees in the top-right are both highly rated and highly influential. Employees in the top-left are highly influential but lower-rated. That's where to look first.
The network graph (toggle it on with the Show Network button) shows each employee as a node, with connection lines representing collaboration frequency. Thick lines mean frequent, two-way communication. Thin lines mean occasional contact.
You're looking for:
- Connectors: People with lines to many nodes across departments
- Isolated nodes: People with minimal connections, especially if they're in roles that should require collaboration
- Clusters: Teams that only talk to themselves; this is a collaboration gap worth investigating
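To make the percentile score and node patterns above concrete, here is a toy sketch. This is not Confirm's actual algorithm (its metric is internal); the sketch uses plain weighted degree centrality over made-up collaboration edges, which is enough to show how a 0–100 influence percentile falls out of a network:

```python
from collections import defaultdict

# Toy collaboration edges: (person_a, person_b, weekly interactions).
# Names and weights are illustrative, not real Confirm data.
edges = [
    ("ana", "ben", 12), ("ana", "cho", 9), ("ana", "dev", 7),
    ("ben", "cho", 3), ("dev", "eli", 1), ("ana", "eli", 5),
]

# Weighted degree: total interaction volume per person.
degree = defaultdict(int)
for a, b, w in edges:
    degree[a] += w
    degree[b] += w

people = sorted(degree)

def influence_percentile(name):
    """Share of peers with lower weighted degree, as a 0-100 score."""
    others = [p for p in people if p != name]
    below = sum(1 for p in others if degree[p] < degree[name])
    return round(100 * below / len(others))

for p in people:
    print(p, degree[p], influence_percentile(p))
```

In this toy graph, "ana" is the connector (edges into every other node) and "eli" is the near-isolated node. Production ONA engines typically use richer centrality measures (betweenness, eigenvector) and account for direction and recency, but the percentile framing is the same.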
Step 3: Find the People Your Ratings Are Missing
The most valuable ONA analysis is the influence-performance gap.
In the ONA dashboard, filter to Show: High Influence, Low-Mid Rating.
This view shows employees with significant organizational pull (people the network depends on) who are rated average or below. Four things typically explain this pattern:
- Manager blind spot: The manager has limited visibility into how much this person does for others across teams
- Visibility problem: The employee is highly collaborative but doesn't advocate for their own work; their contributions flow through others
- Misaligned criteria: The employee is rated on individual output, but their highest-value work is organizational: enabling others
- Legitimate performance gap: High influence, genuinely underperforming on their actual scope; a coaching conversation is needed
You won't know which is true from the ONA data alone. What you have is a list of names worth a 20-minute conversation with their manager.
The conversation is simple: "We're seeing high network activity for [Name]: a lot of other people appear to collaborate with and rely on them. Does that match what you observe? Is that reflected in their current rating?"
That conversation changes ratings and development plans more often than you'd expect.
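The dashboard filter above does this work for you, but the underlying logic is simple enough to sketch, for example against a CSV export. Field names, the 1–5 rating scale, and the thresholds here are assumptions for illustration, not Confirm's internals:

```python
# Illustrative records, as if exported from a dashboard.
employees = [
    {"name": "Ana", "influence": 91, "rating": 3},
    {"name": "Ben", "influence": 88, "rating": 5},
    {"name": "Cho", "influence": 34, "rating": 4},
    {"name": "Dev", "influence": 79, "rating": 2},
]

# "High influence, low-mid rating": thresholds chosen for illustration.
gap_list = [
    e["name"] for e in employees
    if e["influence"] >= 75 and e["rating"] <= 3
]
print(gap_list)  # Ana and Dev warrant a manager conversation
```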
Step 4: Use ONA Signals to Target Development Investments
ONA data tells you more than who's underrated. It tells you what kind of development would make the most impact.
| ONA pattern | What it suggests | Development action |
|---|---|---|
| High influence, single department | Strong within-team reputation; limited cross-functional reach | Assign stretch project with cross-team stakeholders |
| Low influence, high rating | Strong individual performer; limited collaborative contribution | Explore whether role expectations match reality; consider connector opportunities |
| Rising influence trajectory | Influence score increasing quarter-over-quarter | Monitor for high-potential signals; consider accelerated development |
| Influence concentrated in one relationship | Most collaboration goes through one person (often their manager) | Investigate dependency; broaden exposure |
| High outbound, low inbound | This person collaborates out, but others don't pull them in | Explore fit, role clarity, or visibility issues |
In Confirm, you can click any employee's node in the network graph to see their collaboration trend over time. A rising line is a development signal. A flat or declining line in someone previously influential is worth a conversation.
Step 5: Build ONA Findings Into Development Conversations
ONA data is a conversation starter, not a verdict.
When you sit down with a manager to discuss a development plan for someone with a notable ONA pattern, bring two things:
- The ONA dashboard view showing the employee's influence score and trend
- A specific observation from the network graph: "She's one of three people that engineering, product, and sales all have active collaboration lines to. That's unusual at her level."
Then ask questions rather than making claims:
- "Does this match what you see from your vantage point?"
- "Is this level of cross-functional collaboration intentional, or is it just how she works?"
- "If she's doing this much enabling work, is that captured in her current rating and development plan?"
The manager's answers will shape what comes next. Sometimes the answer is "yes, she's exceptional and we've been discussing a promotion." Sometimes it's "I didn't realize how much she was doing across teams; I need to adjust how I think about her contributions."
Both are useful. Both required looking at the ONA data first.
Step 6: Use ONA for Flight Risk and Succession Planning
Two more use cases where ONA data in Confirm produces immediate value:
Flight risk identification
A sudden drop in someone's network influence (fewer collaboration signals, shrinking reach) is often a leading indicator of disengagement before it shows up in attitude or productivity. In Confirm, the ONA dashboard flags employees whose influence scores have declined more than 20% quarter-over-quarter.
These are worth a stay interview: "You've been here two years and you're someone the company relies on. I want to make sure we're investing in you the right way. Is there anything about your role or trajectory that you're uncertain about?"
You're not diagnosing flight risk. You're opening a door.
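The 20% quarter-over-quarter decline flag can be sketched in a few lines. The data here is invented and the threshold comes from the text above, not from Confirm's implementation:

```python
# Quarterly influence scores per person, oldest to newest (toy data).
history = {
    "Ana": [88, 90, 87],
    "Ben": [72, 70, 52],   # ~26% drop in the latest quarter
    "Cho": [40, 44, 41],
}

DECLINE_THRESHOLD = 0.20  # flag drops of more than 20% QoQ

def flagged(history):
    """Return names whose latest score fell >20% vs. the prior quarter."""
    out = []
    for name, scores in history.items():
        prev, latest = scores[-2], scores[-1]
        if prev > 0 and (prev - latest) / prev > DECLINE_THRESHOLD:
            out.append(name)
    return out

print(flagged(history))  # ["Ben"]
```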
Succession planning
When a key role opens, traditional succession planning asks "Who's next in the org chart?" ONA asks a different question: who is already functioning as an informal leader in the network?
In Confirm's People Analytics view, you can filter by employees whose collaboration patterns overlap significantly with a departing leader's network, specifically people who already have relationships with that person's key stakeholders. Those employees are succession candidates who might not appear on a traditional nine-box.
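One simple way to think about "network overlap with a departing leader," sketched here with hypothetical names and Jaccard similarity of contact sets (Confirm's actual matching logic is not documented here and may differ):

```python
# Each person's active collaboration contacts: a toy stand-in for
# the network data an ONA tool derives. All names are hypothetical.
contacts = {
    "departing_vp": {"ceo", "sales_dir", "eng_lead", "pm_lead"},
    "ana": {"sales_dir", "eng_lead", "pm_lead", "designer"},
    "ben": {"eng_lead", "qa_lead"},
}

def stakeholder_overlap(candidate, leader="departing_vp"):
    """Jaccard overlap between a candidate's contacts and the leader's."""
    a, b = contacts[candidate], contacts[leader]
    return len(a & b) / len(a | b)

# Rank candidates by how much of the leader's network they already share.
ranked = sorted(
    (n for n in contacts if n != "departing_vp"),
    key=stakeholder_overlap, reverse=True,
)
print(ranked)  # candidates ordered by existing stakeholder coverage
```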
What ONA Is Not
Worth stating directly, because ONA is often misused:
- ONA is not a surveillance tool. Confirm uses metadata (who interacts with whom, how often), not content. No message reading, no monitoring of what people say.
- ONA is not a rating system. High influence does not mean high performance. It means the org depends on this person. The cause of that dependency matters.
- ONA scores don't belong in a performance review. Share ONA insights with managers as context; don't include the score in the review itself or share it with employees without careful framing.
Used well, ONA data surfaces what manager observation misses. That's its job. The interpretation and action remain human decisions.
FAQ
How is ONA data collected in Confirm?
Confirm reads collaboration metadata from integrated tools: message volume patterns, calendar co-attendance, email thread participation. It doesn't access content: no reading of messages, emails, or meeting recordings. Only aggregate interaction patterns.
How long until ONA data is useful?
Useful signal typically appears after 6–8 weeks of data collection. The first few weeks will show obvious patterns (very frequent collaborators, completely isolated individuals) but not the nuanced trends that inform development decisions. Give it a full quarter for calibration-grade data.
Should employees know their ONA data exists?
Yes. Confirm recommends transparency. Let employees know that Confirm uses collaboration metadata to provide organizational context in performance conversations. Most employees respond positively when they understand it's about being seen more accurately, not monitored.
Can ONA data replace peer reviews?
No, and it shouldn't; the two are complementary. Peer reviews capture qualitative impressions. ONA captures behavioral patterns. Used together, they produce a fuller picture than either alone.
What if the ONA data contradicts a manager's rating?
That's the point. When there's a meaningful gap between someone's network influence and their performance rating, that's a signal worth investigating. Not to automatically override the manager's assessment, but to understand why the gap exists. Most of the time, the conversation produces either a rating adjustment or a better understanding of the employee's role and contributions.
