Senior DataOps Engineer

DLOCAL
Lead
Remote 🌐
Published on June 9, 2025

Job Description

**Why should you join dLocal?**

dLocal enables the biggest companies in the world to collect payments in 40 countries in emerging markets. Global brands rely on us to increase conversion rates and simplify payment expansion effortlessly. As both a payments processor and a merchant of record where we operate, we make it possible for our merchants to make inroads into the world's fastest-growing emerging markets. By joining us you will be part of an amazing global team that makes it all happen. Being part of dLocal means working with 1000+ teammates from 30+ different nationalities and developing an international career that impacts millions of people's daily lives. We are builders, we never run from a challenge, we are customer-centric, and if this sounds like you, we know you will thrive on our team.

**What's the opportunity?**

As a Senior DataOps Engineer, you'll be a strategic professional shaping the foundation of our data platform. You'll design and evolve scalable infrastructure on Kubernetes, operate Databricks as our primary data platform, enable data governance and reliability at scale, and ensure our data assets are clean, observable, and accessible.

### **What will I be doing?**

* Architect and evolve **scalable infrastructure** to ingest, process, and serve large volumes of data efficiently, using **Kubernetes** and **Databricks** as core building blocks.
* Design, build, and maintain **Kubernetes-based infrastructure**, owning deployment, scaling, and reliability of data workloads running on our clusters.
* Operate **Databricks as our primary data platform**, including workspace and cluster configuration, job orchestration, and integration with the broader data ecosystem.
* Work on improvements to existing **frameworks and pipelines** to ensure performance, reliability, and cost-efficiency across batch and streaming workloads.
* Build and maintain **CI/CD pipelines** for data applications (DAGs, jobs, libraries, containers), automating testing, deployment, and rollback.
* Implement **release strategies** (e.g., blue/green, canary, feature flags) where relevant for data services and platform changes.
* Establish and maintain **robust data governance practices** (e.g., contracts, catalogs, access controls, quality checks) that empower cross-functional teams to access and trust data.
* Build a framework to move raw datasets into **clean, reliable, and well-modeled assets** for analytics, modeling, and reporting, in partnership with Data Engineering and BI.
* Define and track **SLIs/SLOs** for critical data services (freshness, latency, availability, data quality signals).
* Implement and own **monitoring, logging, tracing, and alerting** for data workloads and platform components, improving observability over time.
* Lead and participate in the **on-call rotation** for data platforms, manage incidents, and run structured **postmortems** to drive continuous improvement.
* Investigate and resolve **complex data and platform issues**, ensuring data accuracy, system resilience, and clear root-cause analysis.
* Maintain high standards for **code quality, testing, and documentation**, with a strong focus on **reproducibility and observability**.
* Work closely with the **Data Enablement team, BI, and ML stakeholders** to continuously evolve the data platform based on their needs and feedback.
* Stay current with **industry trends and emerging technologies** in DataOps, DevOps, and data platforms to continuously raise the bar on our engineering practices.
### **What skills do I need?**

* Bachelor's degree in **Computer Engineering, Data Engineering, Computer Science**, or a related technical field (or equivalent practical experience).
* Proven experience in **data engineering, platform engineering, or backend software development**, ideally in **cloud-native environments**.
* Deep expertise in **Python and/or SQL**, with strong skills in building data or platform tooling.
* Strong experience with **distributed data processing frameworks** such as **Apache Spark** (Databricks experience strongly preferred).
* Solid understanding of **cloud platforms**, especially **AWS** and/or **GCP**.
* Hands-on experience with **containerization and orchestration**: Docker, Kubernetes / EKS / GKE / AKS (or equivalent).
* Proficiency with **Infrastructure-as-Code** (e.g., Terraform, Pulumi, CloudFormation) for managing data and platform components.
* Experience implementing **CI/CD pipelines** (e.g., GitHub Actions, GitLab CI, Jenkins, CircleCI, ArgoCD, Flux) for data workloads and services.
* Experience in **monitoring & observability** (metrics, logging, tracing) using tools like Prometheus, Grafana, Datadog, CloudWatch, or similar.
* Experience with **incident management**:
  * Participating in or leading on-call rotations
  * Handling incidents and running postmortems
  * Building automation and guardrails to prevent regressions
* Strong **analytical thinking and problem-solving skills**, comfortable debugging across infrastructure, network, and application layers.
* Able to work **autonomously and collaboratively**.

**Nice to have**

* Experience designing and maintaining **DAGs with Apache Airflow** or similar orchestration tools (Dagster, Prefect, Argo Workflows).
* Familiarity with modern data and table formats (e.g., **Parquet, Delta Lake, Iceberg**).
* Experience acting as a **Databricks admin/developer**, managing workspaces, clusters, compute policies, and jobs for multiple teams.
* Exposure to **data quality, data contracts, or data observability** tools and practices.

**What do we offer?**

Besides the tailored benefits we have for each country, dLocal will help you thrive and go that extra mile by offering you:

* Flexibility: we have flexible schedules and we are driven by performance.
* Fintech industry: work in a dynamic and ever-evolving environment, with plenty to build and boost your creativity.
* Referral bonus program: our internal talents are the best recruiters - refer someone ideal for a role and get rewarded.
* Learning & development: get access to a Premium Coursera subscription.
* Language classes: we provide free English, Spanish, or Portuguese classes.
* Social budget: you'll get a monthly budget to chill out with your team (in person or remotely) and deepen your connections!
* dLocal Houses: want to rent a house to spend one week anywhere in the world coworking with your team? We've got your back!

**Flexibility in how you work:** We focus on impact and productivity over fixed hours. This means our teams have flexible schedules and, depending on your role and location, you will combine self-managed focus time with moments of in-person connection in our collaboration hubs.

**What happens after you apply?**

Our Talent Acquisition team is invested in creating the best candidate experience possible, so don't worry, you will definitely hear from us. We will review your CV and keep you posted by email at every step of the process! Also, you can check out our webpage, LinkedIn, and YouTube for more about dLocal!
We may use artificial intelligence (AI) tools to support parts of the hiring process, such as reviewing applications, analyzing resumes, or assessing responses. These tools assist our recruitment team but do not replace human judgment. Final hiring decisions are ultimately made by humans. If you would like more information about how your data is processed, please contact us.

Job originally posted on: indeed
