Annotation QA Analyst - Content Platform
We design Spotify’s consumer experience—end-to-end, moment-to-moment, across every screen, platform, and partner integration. Our mission is to make listening feel effortless, personal, and joyful for billions of users around the world. That means turning complexity into clarity across hundreds of touchpoints—from our mobile and desktop apps to the smart speakers, TVs, cars, and integrations where Spotify shows up every day. If it touches a consumer, we shape it. We bring deep insight into human behavior, design, and technology to craft experiences that feel intuitive, expressive, and unmistakably Spotify.
To power the best audio network in the world by enabling creators and content distributors to deliver their content frictionlessly and by equipping Spotify teams with the richest possible catalog. That's what the Content Platform team is all about. We use our deep understanding of consumer expectations to enrich the lives of millions of users worldwide, bringing the music and audio they love to the devices, apps, and platforms they use every day.
The Annotation Platform Ops team works alongside Product and Engineering teams at the heart of the Content Platform R&D studio to develop rich, interconnected datasets that enable delightful consumer experiences through a fusion of machine and human expertise. We focus on use cases across all of Spotify’s domains - Music, Podcasts, and Audiobooks.
We are seeking an Annotation QA Analyst to help create a broad collection of labeled datasets that Content Platform teams use to train, evaluate, and better understand models and systems. This is an exciting opportunity to work at the center of ML and AI-driven development.
What You'll Do
- Review annotated data to ensure it meets Spotify’s quality standards and policies, prioritizing work based on business needs
- Deliver high-quality, timely results for Product and Engineering teams using established QA frameworks and metrics such as agreement rates and consensus
- Handle complex edge cases, helping define ground truth and reduce ambiguity across datasets
- Identify patterns, insights, and areas for improvement, and communicate findings clearly to both technical and non-technical partners
- Contribute to feedback loops between annotation teams, R&D collaborators, and content policy experts to improve workflows and outputs
- Help develop and refine annotation guidelines, supporting annotator training and continuous improvement
- Collaborate closely with teammates across multiple projects and domains
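To make the QA metrics above concrete: "agreement rates" in annotation QA typically mean the share of items where annotators assign the same label, often chance-corrected with a statistic like Cohen's kappa. The following is an illustrative sketch only, not Spotify's actual QA tooling; the function names and the two-annotator setup are assumptions for the example.

```python
from collections import Counter

def percent_agreement(labels_a, labels_b):
    """Fraction of items on which two annotators chose the same label."""
    assert len(labels_a) == len(labels_b) and labels_a
    matches = sum(a == b for a, b in zip(labels_a, labels_b))
    return matches / len(labels_a)

def cohens_kappa(labels_a, labels_b):
    """Agreement corrected for the chance that annotators agree at random."""
    n = len(labels_a)
    p_observed = percent_agreement(labels_a, labels_b)
    # Expected chance agreement, from each annotator's label distribution.
    counts_a = Counter(labels_a)
    counts_b = Counter(labels_b)
    p_expected = sum(
        (counts_a[label] / n) * (counts_b[label] / n)
        for label in counts_a.keys() | counts_b.keys()
    )
    return (p_observed - p_expected) / (1 - p_expected)

# Hypothetical labels for four catalog items from two annotators.
a = ["music", "podcast", "music", "audiobook"]
b = ["music", "podcast", "podcast", "audiobook"]
print(percent_agreement(a, b))  # 0.75
print(round(cohens_kappa(a, b), 3))  # 0.636
```

A kappa well below the raw agreement rate, as here, signals that some of the observed agreement is what chance alone would produce, which is exactly the kind of pattern a QA analyst surfaces when deciding whether guidelines need refinement.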
Who You Are
Where You'll Be
- This role is based in New York.
- We offer you the flexibility to work where you work best! There will be some in-person meetings, but the role still allows flexibility to work from home.