How Canva Bridges Analytics to Non-Technical Stakeholders
TRIGGER
Impact analysis outputs contained statistical tables and technical metrics that non-technical stakeholders (product managers, finance analysts) struggled to interpret, creating a bottleneck where data scientists had to manually translate results before decisions could be made.
APPROACH
Canva integrated Snowflake Cortex (Snowflake's managed LLM service, which provides access to models such as Claude, Llama, and Mixtral) directly into their analytics app to generate natural language summaries of impact analysis results. Input: completed analysis tables with statistical outputs. Output: readable summary text explaining results in plain language. The LLM calls happen within Snowflake's environment, keeping data within their ecosystem for compliance. Users can generate summaries on demand after any impact analysis completes.
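A minimal sketch of this call shape, using Snowflake's `SNOWFLAKE.CORTEX.COMPLETE` SQL function via a Snowpark session. The table shape, prompt wording, function names, and model name are illustrative assumptions, not details from the case study:

```python
# Sketch: summarize an impact-analysis table with Snowflake Cortex,
# so the underlying metrics never leave the warehouse.
# `session` is an assumed snowflake.snowpark.Session; the row schema
# (metric / effect / p_value) is a hypothetical example.

def build_summary_prompt(rows: list[dict]) -> str:
    """Render statistical output rows into a consistent summarization prompt."""
    lines = [f"- {r['metric']}: effect={r['effect']}, p={r['p_value']}" for r in rows]
    return (
        "Summarize these impact-analysis results for a non-technical "
        "product manager. Avoid jargon; state direction, size, and "
        "confidence in plain language.\n" + "\n".join(lines)
    )

def summarize_impact(session, rows: list[dict], model: str = "claude-3-5-sonnet") -> str:
    """Run the summary inside Snowflake via SNOWFLAKE.CORTEX.COMPLETE."""
    prompt = build_summary_prompt(rows)
    df = session.sql(
        "SELECT SNOWFLAKE.CORTEX.COMPLETE(?, ?) AS summary",
        params=[model, prompt],
    )
    return df.collect()[0]["SUMMARY"]
```

Because the analysis tables have a predictable structure, one prompt template covers every run, which keeps summaries consistent across analyses.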
PATTERN
“Building a custom LLM integration, then failing security review because business metrics can't leave the data warehouse. Managed LLM services inside the data platform (Snowflake Cortex, etc.) trade model flexibility for minimal compliance overhead.”
✓ WORKS WHEN
- Output consumers include non-technical stakeholders who need to act on insights without data science support
- Data governance requires keeping sensitive business data within a specific ecosystem (Snowflake, AWS, etc.)
- Analysis outputs follow predictable structures that can be summarized with consistent prompts
- Latency tolerance allows for LLM inference time (seconds) on top of analysis computation
- Organization already has access to managed LLM services within their data platform
✗ FAILS WHEN
- All consumers are technical and prefer raw data over summaries for auditability
- Outputs require nuanced interpretation that risks oversimplification in summarization
- Cost sensitivity prohibits LLM calls for each analysis (high-volume, low-value queries)
- Compliance requirements are already satisfied by existing infrastructure, making the managed-service constraint unnecessary
- Results need to be deterministic and reproducible—LLM variation introduces inconsistency
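The determinism caveat can be partially mitigated, though not removed: Cortex's three-argument form of `COMPLETE` accepts an options object, and pinning the temperature to 0 makes sampling greedy, reducing run-to-run variation. A sketch of building that SQL; the model name and token limit are illustrative:

```python
# Sketch: COMPLETE(model, messages, options) with temperature pinned to 0.
# Reduces, but does not guarantee elimination of, output variation.

def deterministic_summary_sql(model: str, prompt: str) -> str:
    """Build a Cortex COMPLETE call with sampling options pinned."""
    escaped = prompt.replace("'", "''")  # naive SQL-literal escaping for the sketch
    return (
        "SELECT SNOWFLAKE.CORTEX.COMPLETE(\n"
        f"  '{model}',\n"
        f"  [{{'role': 'user', 'content': '{escaped}'}}],\n"
        "  {'temperature': 0, 'max_tokens': 300}\n"
        ") AS response"
    )
```

Teams that need strict reproducibility for audits would still want to persist each generated summary alongside the analysis run rather than regenerate it.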