CBS Corporation Data Engineer in San Francisco, California
CBS BUSINESS UNIT: CBS Interactive
JOB TYPE: Full-Time Staff
JOB LOCATION: San Francisco, CA
Truly premium content. At true scale. Only CBSi. CBS Interactive is the premier online content network for the information and entertainment brands of CBS Corporation, as well as some of the top native digital brands in the entertainment industry. Our brands dive deep into the things people care about across entertainment, technology, news, games, business and sports. With over 1 billion users visiting our properties every quarter, we are a global top 10 web property and one of the largest premium content networks online.
Check us out on The Muse, Instagram and YouTube for an inside look into 'Life At CBSi' through employee testimonials, office photos and company updates.
Our team is a diverse and agile group of engineers who run data operations for the CNET Media Group Business Intelligence team. We are responsible for developing tagging, data pipelines and data products to drive user growth, engagement and revenue opportunities. This position will focus primarily on CBS Interactive’s CNET Media Group properties, including CNET, GameSpot, TVGuide, ZDNet and TechRepublic, among others.
As a Data Engineer, you'll develop workflows to ingest, store, and process data using Google Cloud Platform products and services, with an emphasis on scalability and reliability. You will work closely with Business Intelligence, Product, Revenue Optimization and other Engineering teams to build and enhance our BigQuery Data Warehouse. This role provides the opportunity to work with the latest cloud and open-source technologies to develop and evolve our data analytics platform.
Collaborate with stakeholders to understand data needs and provide end-to-end data solutions
Develop and enhance ELT/ETL pipelines to ensure data availability and data quality
Create new data models to support data products and intuitive analytics
Use your Python and SQL coding skills to process and transform data
Research and promote data engineering best practices
Migrate existing on-prem data pipelines to the cloud
Develop and enhance internal revenue data models and pipelines
Explore usage of new GCP data products and features (Data Catalog, BI Engine, BigQuery ML)
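The core loop described above - ingest data from an external source, transform it in Python, and load it into a SQL warehouse for analysis - can be sketched in miniature. This is an illustrative example only: SQLite stands in for BigQuery, and the sample payload and `page_views` table are hypothetical.

```python
import sqlite3

# Extract: a simulated API payload. In production this would come from
# an external API or data store.
raw_events = [
    {"property": "CNET", "page": "/news/ai", "views": "1200"},
    {"property": "GameSpot", "page": "/reviews", "views": "850"},
    {"property": "CNET", "page": "/reviews", "views": "430"},
]

# Transform: normalize string counts into integers.
rows = [(e["property"], e["page"], int(e["views"])) for e in raw_events]

# Load: an in-memory SQLite table stands in for a warehouse table.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE page_views (property TEXT, page TEXT, views INTEGER)")
conn.executemany("INSERT INTO page_views VALUES (?, ?, ?)", rows)

# Analyze with SQL: total views per property, highest first.
totals = conn.execute(
    "SELECT property, SUM(views) FROM page_views "
    "GROUP BY property ORDER BY SUM(views) DESC"
).fetchall()
# totals -> [('CNET', 1630), ('GameSpot', 850)]
```

In a production pipeline the same extract/transform/load steps would be authored as tasks in an orchestrator such as Airflow and the load target would be BigQuery rather than SQLite.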
What you bring to the team:
You have -
Bachelor's degree in Computer Science or equivalent experience in a related field
3+ years of hands-on experience working in a data warehousing or data engineering environment
Strong Python and SQL programming skills
2+ years of experience developing data solutions on GCP or AWS
Strong experience authoring, scheduling and monitoring workflows (e.g., Airflow, Luigi)
Experience ingesting data from external APIs and data stores
Experience designing, building and operationalizing big data pipelines on distributed processing back-ends (e.g., Cloud Dataflow, Spark, Flink)
Strong communication and interpersonal skills
A can-do attitude toward problem solving, a focus on quality, and the ability to execute
You might also have -
Experience implementing best practices for monitoring, alerting, and metadata management
Knowledge of Git, Jinja2, Docker, Bitbucket, and Bamboo
Google Cloud Certified - Professional Data Engineer certification (a plus)
Equal Opportunity Employer Minorities/Women/Veterans/Disabled