Department/Group Description:
The Disney Decision Science and Integration (DDSI) analytics consulting team supports clients across The Walt Disney Company, including Direct-to-Consumer & International, Media Networks (e.g., ABC, ESPN), Studio Entertainment (e.g., The Walt Disney Studios, Disney Theatrical Group), and Parks, Experiences & Consumer Products. DDSI leverages emerging technology, data analytics, and data science, with a focus on optimization and statistical and econometric modeling, to explore opportunities, shape business decisions, and drive business value.
The Data Engineering (DE) team within DDSI is involved in a range of data engineering and data platform support activities. As a member of the team, you will take part in activities such as data acquisition and validation, designing and implementing ETL/ELT data pipelines, and designing and implementing databases, while evolving our next-generation data platform to fulfill the needs of our applications, data services, ad-hoc analytics, and self-service/POC initiatives.
Job Description:
The Data Engineer role involves hands-on data engineering development across multiple projects. You will work with technologies such as AWS, Docker, GitLab, Airflow, Python, Snowflake, and PostgreSQL. Daily activities include designing and implementing database schemas, tables, and views; designing and coding data pipelines; implementing data services APIs; automation, testing, performance tuning, and monitoring; proactive data validation; building data visualizations; and evolving our data engineering code base.
Beyond project work, this role will also maintain our existing data pipelines and enhance current functionality to support our partner teams in data science and analytics.
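For context only (not part of the role requirements), the following is a minimal sketch of the kind of scheduled extract/validate/load pipeline described above. It assumes a recent Airflow 2.x installation with the TaskFlow API; the DAG, task, and record names are hypothetical and purely illustrative.

```python
# Illustrative sketch only: a tiny daily pipeline with extract, validation, and load steps.
# Assumes Airflow 2.4+ (TaskFlow API); all names below are hypothetical.
from datetime import datetime

from airflow.decorators import dag, task


@dag(schedule="@daily", start_date=datetime(2024, 1, 1), catchup=False, tags=["example"])
def example_daily_load():
    @task
    def extract() -> list[dict]:
        # Placeholder for pulling raw records from a source system (e.g., an API or S3).
        return [{"id": 1, "value": 42}]

    @task
    def validate(records: list[dict]) -> list[dict]:
        # Proactive data validation: keep only records with the required key present.
        return [r for r in records if r.get("id") is not None]

    @task
    def load(records: list[dict]) -> None:
        # Placeholder for writing validated records to a warehouse such as Snowflake or PostgreSQL.
        print(f"Would load {len(records)} records")

    load(validate(extract()))


example_daily_load()
```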
Basic Qualifications:
- 3-5 years of data engineering experience
- Proven experience and expertise using Python, SQL, Docker, Snowflake, and/or PostgreSQL
- High proficiency in SQL coding
- Experience managing and deploying code using GitLab/GitHub
- Experience leveraging containerization technologies such as Docker or Kubernetes
- Experience leveraging job scheduling software like Apache Airflow
- Experience with Agile project management (e.g., Scrum)
- Strong understanding of relational and dimensional database design
- Knowledgeable about cloud architecture and product offerings, preferably AWS
Preferred Qualifications:
- 3-5 years of Python programming experience with high proficiency
- Hands-on experience with SnowSQL in Snowflake (see the sketch after this list)
- Hands-on experience using Apache Airflow for workflows and automation
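As an illustration of the Snowflake work referenced above (again, not a requirement of the posting), here is a minimal sketch assuming the snowflake-connector-python package. The statements it runs are ordinary SQL of the kind also issued interactively through the SnowSQL CLI; the account, credentials, warehouse, database, and table names are all hypothetical.

```python
# Illustrative sketch only: connect to Snowflake from Python and run simple SQL.
# Assumes snowflake-connector-python is installed; all names/credentials are hypothetical.
import snowflake.connector

conn = snowflake.connector.connect(
    account="my_account",      # hypothetical account identifier
    user="my_user",            # hypothetical user
    password="my_password",    # use a secrets manager in real pipelines
    warehouse="ANALYTICS_WH",
    database="ANALYTICS_DB",
    schema="PUBLIC",
)

try:
    cur = conn.cursor()
    # These statements could equally be run through the SnowSQL CLI.
    cur.execute("CREATE TABLE IF NOT EXISTS example_events (id INTEGER, value FLOAT)")
    cur.execute("SELECT COUNT(*) FROM example_events")
    print(cur.fetchone()[0])
finally:
    conn.close()
```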
Preferred Education:
Bachelor’s degree (Computer Science, Mathematics, Engineering or related field preferred)