The Challenge
A digital marketing agency was manually pulling performance data from 8 ad platforms daily — Meta, Google Ads, TikTok, LinkedIn, Twitter, Snapchat, Pinterest, and DV360. This consumed 4 hours of analyst time every morning, and reports were often stale by the time clients received them.
The Solution
Built a Python-based ETL pipeline that pulls fresh data from all 8 platform APIs into BigQuery every hour, then connected BigQuery to Google Looker Studio for live client dashboards. Added an anomaly detection layer that sends Slack alerts whenever spend, CTR, or ROAS deviates from baseline by more than 15%.
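The alerting layer boils down to a simple rule: compare each metric's latest hourly value against a trailing baseline and fire a Slack message when the deviation exceeds 15%. A minimal sketch, with illustrative function names and a placeholder webhook URL (the production pipeline's actual helpers are not shown here):

```python
# Sketch of the anomaly-detection check: flag any metric whose latest value
# deviates from its trailing baseline by more than 15%.
THRESHOLD = 0.15  # 15% deviation triggers an alert


def detect_anomaly(metric, baseline, current, threshold=THRESHOLD):
    """Return an alert message if `current` deviates from `baseline`
    by more than `threshold`, else None."""
    if baseline == 0:
        return None  # avoid division by zero on brand-new campaigns
    deviation = (current - baseline) / baseline
    if abs(deviation) > threshold:
        return (f"{metric} deviated {deviation:+.1%} "
                f"(baseline {baseline:,.2f} -> current {current:,.2f})")
    return None


def send_slack_alert(message, webhook_url):
    # In production this would POST to a Slack incoming webhook, e.g.:
    #   requests.post(webhook_url, json={"text": message}, timeout=10)
    print(message)


# Example: hourly spend came in 20% above the trailing baseline
alert = detect_anomaly("spend", baseline=500.0, current=600.0)
if alert:
    send_slack_alert(alert, webhook_url="https://hooks.slack.com/services/...")
```

The baseline itself (e.g. a 7-day average for the same hour of day) lives in BigQuery and is recomputed on each run, so the comparison adapts as campaigns scale.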
🎯 Primary Goal
Eliminate manual bottlenecks and create a scalable, automated workflow that runs reliably without constant oversight.
⚡ Key Constraint
Implementation had to happen without disrupting ongoing operations — a zero-downtime, iterative approach was essential.
The Outcome
Reporting time dropped from 4 hours to 15 minutes. Clients now access live dashboards 24/7. The anomaly detection system flagged a $12K budget overspend within 2 hours — something that would have been missed for days under the old process.
"We were spending 4 hours a day manually pulling reports from 8 platforms. Ademola built a pipeline that does it in seconds. Now we actually use the data to make decisions."
— Tom Pringle, Head of Growth
Technical Implementation
The solution uses a modular architecture so that each component can be tested, updated, and scaled independently. Key technical decisions included:
- Webhook-based ingestion where platform APIs support push delivery, with the scheduled hourly pulls as the baseline elsewhere
- Error handling and retry logic at every integration point
- Comprehensive logging and monitoring for immediate issue detection
- Documentation and training sessions to ensure client team independence
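The retry logic at each integration point follows the standard exponential-backoff-with-jitter pattern. A minimal sketch (the wrapper name and parameters are illustrative, not the pipeline's actual API):

```python
# Generic retry wrapper: call fn(), retrying on exceptions with
# exponential backoff plus jitter, then re-raise once attempts run out.
import random
import time


def with_retries(fn, max_attempts=5, base_delay=1.0, max_delay=60.0):
    """Call fn(), retrying transient failures with backoff + jitter."""
    for attempt in range(1, max_attempts + 1):
        try:
            return fn()
        except Exception:
            if attempt == max_attempts:
                raise  # exhausted: surface the error to logging/monitoring
            delay = min(max_delay, base_delay * 2 ** (attempt - 1))
            delay *= random.uniform(0.5, 1.5)  # jitter avoids retry storms
            time.sleep(delay)


# Usage: a fetch that fails twice before succeeding
calls = {"n": 0}

def flaky_fetch():
    calls["n"] += 1
    if calls["n"] < 3:
        raise ConnectionError("transient API error")
    return {"rows": 42}

result = with_retries(flaky_fetch, base_delay=0.01)
```

Wrapping every platform client the same way means a single rate-limit blip on, say, the TikTok API delays one hourly load slightly instead of failing the whole run.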