Our client is a company pioneering the future of cross-platform media measurement, arming organizations with the insights they need to make decisions with confidence. Central to this aim are its people, who work together to simplify the complex on behalf of clients and partners.
It is a trusted partner for planning, transacting, and evaluating media across platforms. With a data footprint that combines digital, linear TV, over-the-top, and theatrical viewership intelligence with advanced audience insights, its platform allows media buyers and sellers to quantify multiscreen audience behavior and make business decisions with confidence.
You’ll be responsible for building a next-generation data warehouse and migrating the data and processes from the existing one (which will be deprecated later this year). This data supplies a broad range of clients and products, including industry-leading ad agencies and national television networks. As a member of this fast-moving team, you’ll have a large impact on the evolution and adoption of the data processing pipeline as well as on the success of the business. It’s worth mentioning that the company processes and stores dozens of petabytes of TV and web data, and its current infrastructure handles more than 15 billion requests per day.
The existing data warehouse is built on a Greenplum database, with jobs and procedures implemented across a varied set of tools and programming languages. The goal is to migrate it to AWS/Snowflake, redesigning the warehouse along the way with improved usability and performance optimizations. Jobs should be reviewed, optimized, and reimplemented using Spark (additional tools to be defined).