Docker
Senior Data Engineer
€80k - €100k
Europe (Remote)
Python
SQL
Snowflake

Senior Data Engineer

Overview

Docker is a remote-first company with employees across Europe, APAC, and the Americas that simplifies the lives of developers who are making world-changing apps. We raised our Series C funding in March 2022 for $105M at a $2.1B valuation, and we continued to see exponential revenue growth last year. Join us for a whale of a ride!

Job Description

Docker is looking for a Senior Data Engineer to join our Data Engineering team, which is led by our Director of Data Engineering. The team transforms billions of data points generated by Docker products and services into actionable insights that directly influence product strategy and development. You'll leverage both software engineering and analytics skills as part of the team responsible for managing data pipelines across the company: Sales, Marketing, Finance, HR, Customer Support, Engineering, and Product Development.

Responsibilities

  • Manage and develop ETL jobs, warehouses, and event collection tools to process, validate, transport, collate, aggregate, and distribute data
  • Build and manage the Central Data Model that powers most of our reporting
  • Integrate emerging methodology, technology, and version control practices that best fit the team
  • Build data pipelines and tooling to support our ML and AI projects
  • Contribute to enforcing SOC 2 compliance across the data platform
  • Support and enable our stakeholders and other data practitioners across the company
  • Write and maintain documentation of technical architecture

Required Skills

  • 4+ years of relevant industry experience
  • Experience with data modeling and building scalable data pipelines involving complex transformations
  • Proficiency working with a data warehouse platform (Snowflake or BigQuery preferred)
  • Experience with data governance, data access, and security controls
  • Experience with Snowflake and dbt is strongly preferred
  • Experience creating production-ready ETL scripts and pipelines using Python and SQL, and with orchestration frameworks such as Airflow, Dagster, or Prefect
  • Experience designing and deploying high-performance systems with reliable monitoring and logging practices
  • Familiarity with at least one cloud ecosystem: AWS, Azure, or Google Cloud
  • Experience with a comprehensive BI and visualization framework such as Tableau or Looker
  • Experience working in an agile environment on multiple projects and prioritizing work based on organizational priorities
  • Strong verbal and written English communication skills

Benefits

  • Freedom & flexibility; fit your work around your life
  • Home office setup; we want you comfortable while you work
  • 16 weeks of paid parental leave
  • Technology stipend equivalent to $100 net/month
  • PTO plan that encourages you to take time to do the things you enjoy
  • Quarterly, company-wide hackathons
  • Training stipend for conferences, courses, and classes
  • Equity; we are a growing start-up and want all employees to have a share in the success of the company
  • Docker swag
  • Medical benefits, retirement, and holidays vary by country

About the company

Docker provides a suite of development tools, services, trusted content, and automations, used individually or together, to accelerate the delivery of secure applications.