How Linear Turns AI Suggestions into Force Multipliers
TRIGGER
Support teams needed to route incoming issues to the right team and detect duplicates, but fully automated routing risked misclassification and missed context that only humans catch, particularly for nuanced feature requests that should be aggregated with existing projects.
APPROACH
Linear's CX team uses Triage Intelligence as a first pass on incoming issues: it detects duplicates (especially useful for aggregating feature requests), applies product-area labels, and suggests the right team or assignee. Input: a raw customer issue from Intercom or Slack, annotated with AI-suggested labels, duplicates, and an assignee. Output: a triaged issue routed to the correct team with human-verified categorization. After this automated pass, a rotating 'goalie' (an engineer or PM designated weekly via triage responsibility settings) makes the final routing decision: take the bug themselves, assign it to the right expert, or link the feature request to a candidate project with added context.
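The first-pass-then-goalie loop can be sketched as a small validate-or-override checkpoint. This is a hypothetical illustration, not Linear's actual API; the data shapes, issue IDs, and team names are invented for the example:

```python
from dataclasses import dataclass
from typing import List, Optional

@dataclass
class Suggestion:
    # AI first pass: product-area labels, a duplicate link, a routing guess.
    labels: List[str]
    duplicate_of: Optional[str]
    team: str

@dataclass
class TriagedIssue:
    issue_id: str
    team: str
    labels: List[str]
    duplicate_of: Optional[str]
    decided_by: str  # "goalie-confirmed" or "goalie-overridden"

def goalie_review(issue_id: str, suggestion: Suggestion,
                  override_team: Optional[str] = None) -> TriagedIssue:
    """Human checkpoint: confirm the pre-populated decision, or override it.

    Confirming is the cheap common case; the override path is what keeps
    accuracy on edge cases the AI misreads.
    """
    if override_team is None:
        return TriagedIssue(issue_id, suggestion.team, suggestion.labels,
                            suggestion.duplicate_of, "goalie-confirmed")
    return TriagedIssue(issue_id, override_team, suggestion.labels,
                        suggestion.duplicate_of, "goalie-overridden")

# Common case: the suggestion is right, the goalie just confirms.
routed = goalie_review(
    "ISS-2217",
    Suggestion(labels=["feature-request"], duplicate_of="ISS-1041", team="Projects"),
)

# Edge case: the goalie spots a misroute and reassigns.
fixed = goalie_review(
    "ISS-2218",
    Suggestion(labels=["bug"], duplicate_of=None, team="Projects"),
    override_team="Sync Engine",
)
```

The design point the sketch makes: the AI never commits a routing decision itself; it only pre-populates the record that the goalie accepts or corrects.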
PATTERN
“'Confirm this is right' is cognitively 10x cheaper than 'figure out where this goes.' AI suggestions become force multipliers when you preserve a human checkpoint: the goalie validates or overrides a pre-populated decision rather than triaging from scratch, maintaining accuracy on edge cases without the mental load.”
✓ WORKS WHEN
- Issue volume is high enough that manual triage from scratch creates a meaningful bottleneck (>50 issues/week)
- Misrouting has real cost: a wrong team assignment delays resolution or frustrates customers
- Categories have fuzzy boundaries that require judgment (feature request vs bug, which product area)
- Team has existing rotation or on-call structure that can absorb goalie responsibility
- AI can provide useful signal even if not 100% accurate (duplicates, labels, suggested assignee)
✗ FAILS WHEN
- Volume is low enough that reviewing AI suggestions costs more than triaging from scratch (<20 issues/week)
- Categories are clear-cut and deterministic (status codes, error types with exact matches)
- No one has the context to validate AI suggestions, so the goalie becomes a rubber stamp
- Triage decisions require information not available to the AI (customer tier, active incident context)
- The team resists rotation, or the goalie role creates accountability gaps