Netlify
Staff Data Pipeline Engineer
$168k - $227k
Worldwide (Remote)
Kafka
MongoDB
Snowflake


Overview

At Netlify, the Data, Insights & Systems team sits at the heart of our go-to-market strategy. We power the infrastructure that drives our product insights, billing, and customer experience.

Job Description

Netlify is the most popular way to build, deploy, and scale modern web applications, empowering everyone from fast-moving solo developers to enterprise teams. Millions of developers and organizations rely on Netlify as their platform of choice for composing web applications that are production-ready, performant, and flexible.

Responsibilities

  • Design, build, and optimize high-performance data pipelines that power critical business and customer-facing systems.
  • Implement and maintain real-time streaming pipelines using Kafka and both relational and non-relational databases (MongoDB, Snowflake, ClickHouse) to support operational and analytical use cases.
  • Refactor legacy data systems to improve performance, scalability, and reliability, while ensuring billing systems remain stable and data flows remain accurate for downstream processes.
  • Build robust observability and alerting systems to surface data quality or reliability issues before they impact customers or billing.
  • Contribute to systems engineering efforts within the data organization, emphasizing observability, monitoring, and infrastructure-as-code best practices.
  • Partner closely with Software Engineers, Data Analysts, Data Engineers, and Systems Administrators to enable reliable data modeling, reporting, and invoicing across self-serve and enterprise billing environments.
  • Drive architectural decisions and evaluate tooling and frameworks to improve performance and cost efficiency.
  • Mentor other engineers on data pipeline best practices, from instrumentation to consumption.
  • Document architecture, runbooks, and operational processes to ensure cross-team confidence and sustainability.
  • Balance innovation with pragmatism, knowing when to experiment versus stabilize given our small team and large production footprint.
  • Identify and implement internal process improvements: automating manual workflows, optimizing data delivery, and redesigning infrastructure for greater scalability.

Required Skills

  • Extensive experience building and operating real-time, event-driven data processing systems at scale.
  • Hands-on experience managing and scaling Kafka clusters and streaming data pipelines in a production environment.
  • Deep expertise in event streaming architectures and messaging systems (Kafka, Kinesis, or similar).
  • Strong proficiency with Snowflake, ClickHouse, and data modeling for analytical and operational workloads.
  • Experience working across multi-cloud environments (GCP and AWS preferred).
  • Proficiency in SQL and at least one of Go, Python, or Java.
  • Familiarity with observability and monitoring tools (Datadog, Grafana, Monte Carlo, etc.).
  • Experience with version control (Git) and CI/CD workflows for data systems.
  • Demonstrated ability to write production code and design scalable, fault-tolerant data infrastructure.
  • Track record of mentoring engineers, fostering a culture of data reliability, and improving team capabilities.
  • Excellent communication skills and a collaborative mindset for working cross-functionally with data, engineering, product, and finance teams.

Benefits

  • Robust benefits and participation in Netlify's equity plan.
  • Opportunity to work on a globally distributed, fully remote team that values autonomy, transparency, and shared ownership.

About the company

Netlify is the essential platform for the delivery of exceptional and dynamic web experiences, without limitations.
