Data Engineer LATAM (Python/PySpark/AWS Glue/Amazon Athena/SQL/Apache Airflow)
Job Description
Let’s be direct: we’re looking for a technical powerhouse. If you’re the developer who:

- Is the clear technical leader on your team
- Consistently solves problems others can’t crack
- Ships complex features in half the time it takes others
- Writes code so clean it could be published as a tutorial
- Takes pride in elevating the entire codebase

Then we want to talk to you. This isn’t a role for everyone, and that’s by design. We’re seeking developers who know they’re exceptional and have the track record to prove it.

What you’ll do

- Build, optimize, and scale data pipelines and infrastructure using Python, TypeScript, Apache Airflow, PySpark, AWS Glue, and Snowflake.
- Design, operationalize, and monitor ingest and transformation workflows: DAGs, alerting, retries, SLAs, lineage, and cost controls.
- Collaborate with platform and AI/ML teams to automate ingestion, validation, and real-time compute workflows; work toward a feature store.
- Integrate pipeline health and metrics into engineering dashboards for full visibility and observability.
- Model data and implement efficient, scalable transformations in Snowflake and PostgreSQL.
- Build reusable frameworks and connectors to standardize internal data publishing and consumption.

Required qualifications

- 4+ years of production data engineering experience.
- Deep, hands-on experience with Apache Airflow, AWS Glue, PySpark, and Python-based data pipelines.
- Strong SQL skills and experience operating PostgreSQL in live environments.
- Solid understanding of cloud-native data workflows (AWS preferred) and pipeline observability (metrics, logging, tracing, alerting).
- Proven experience owning pipelines end-to-end: design, implementation, testing, deployment, monitoring, and iteration.

Preferred qualifications

- Experience with Snowflake performance tuning (warehouses, partitions, clustering, query profiling) and cost optimization.
- Real-time or near-real-time processing experience (e.g., streaming ingestion, incremental models, CDC).
- Hands-on experience with a backend TypeScript framework (e.g., NestJS) is a strong plus.
- Experience with data quality frameworks, contract testing, or schema management (e.g., Great Expectations, dbt tests, OpenAPI/Protobuf/Avro).
- Background in building internal developer platforms or data platform components (connectors, SDKs, CI/CD for data).

Additional information

- This is a fully remote position.
- Compensation will be in USD.
- Work hours are aligned with the EST time zone (9 AM to 6 PM EST) or PT time zone.
Originally posted on: LinkedIn