
Data Engineer - 1861

In All Media


Job description & requirements


Data Engineer
Position: Data Engineer
Location: Remote from LATAM
Contract Type: Full-time vendor
Time Zone Alignment: Central Time (CT)
About In All Media
In All Media is a nearshore managed service provider focused on team augmentation and digital product delivery. We assemble senior, LATAM-based squads from our vetted Coderfull community (500+ engineers) that integrate seamlessly with client teams to deliver software, data, cloud, and AI initiatives with speed and rigor.
Project Overview
Join a specialized data squad dedicated to a large-scale modernization initiative. This project focuses on the migration of approximately 1,000 datasets to Snowflake, utilizing Apache Airflow for orchestration. While the core architecture and technical approach are pre-defined by the internal team, you will be responsible for the execution, adaptation of pipelines, and ensuring the reliability of the new data ecosystem.
Key Responsibilities

  • Data Migration: Execute the migration of high-volume datasets to Snowflake, ensuring architectural alignment and data integrity.
  • Pipeline Adaptation: Modify and optimize existing data pipelines to leverage Snowflake’s native capabilities and performance.
  • Orchestration & Workflow: Develop, test, and monitor complex DAGs in Apache Airflow to ensure seamless scheduling.
  • Data Reliability: Implement comprehensive testing strategies to validate data quality throughout the migration lifecycle.
  • Cloud Infrastructure: Manage and interact with AWS services, specifically S3 data lakes, partitioning strategies, and lifecycle policies.

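To give a flavor of the "Data Reliability" work above: a minimal, hypothetical sketch of a post-migration integrity check, comparing a source extract against its migrated copy by row count and order-independent per-column checksums. The function and column names here are illustrative assumptions, not the project's actual tooling; a real migration would query the source system and Snowflake directly.

```python
# Hypothetical sketch: validate a migrated dataset by row count and
# per-column checksums. Inputs are plain lists of dicts for illustration.
import hashlib

def column_checksum(rows, column):
    """Order-independent checksum of one column's values."""
    digest = 0
    for row in rows:
        h = hashlib.sha256(str(row[column]).encode("utf-8")).hexdigest()
        digest ^= int(h[:16], 16)  # XOR makes the result order-independent
    return digest

def validate_migration(source_rows, target_rows, columns):
    """Return a dict of simple integrity checks for a migrated dataset."""
    report = {"row_count_match": len(source_rows) == len(target_rows)}
    for col in columns:
        report[f"{col}_checksum_match"] = (
            column_checksum(source_rows, col) == column_checksum(target_rows, col)
        )
    return report

source = [{"id": 1, "amount": 10.5}, {"id": 2, "amount": 7.0}]
target = [{"id": 2, "amount": 7.0}, {"id": 1, "amount": 10.5}]  # order differs
print(validate_migration(source, target, ["id", "amount"]))
# {'row_count_match': True, 'id_checksum_match': True, 'amount_checksum_match': True}
```

Because the checksum is XOR-combined, row order does not matter, which is convenient when the target warehouse returns rows in a different physical order than the source.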
Must-Have Skills

  • Advanced SQL: Mastery of complex joins, CTEs, window functions, and performance tuning for large-scale data sets.
  • Snowflake Proficiency: Hands-on experience with Snowflake data modeling and warehouse management.
  • Python for Data: Strong ability to develop and test pipelines using Python and Airflow.
  • Big Data Tools: Practical experience with Apache Spark and Apache Kafka for data processing and streaming.
  • AWS Ecosystem: Deep understanding of S3 (data lakes, partitioning, and storage optimization).
  • Testing Mindset: Proven experience in designing and executing testing strategies for data pipelines.

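As a small illustration of the S3 partitioning skill listed above: Hive-style `year=/month=/day=` key prefixes are the layout that both Spark readers and Snowflake external tables can prune on. The bucket and table names below are made-up examples, not project values.

```python
# Hypothetical sketch of building Hive-style partition prefixes for an
# S3 data lake. Zero-padded month/day keep keys lexicographically sortable.
from datetime import date

def partition_key(bucket: str, table: str, d: date) -> str:
    """Build a year=/month=/day= partitioned S3 key prefix."""
    return (
        f"s3://{bucket}/{table}/"
        f"year={d.year}/month={d.month:02d}/day={d.day:02d}/"
    )

print(partition_key("example-datalake", "orders", date(2024, 3, 7)))
# s3://example-datalake/orders/year=2024/month=03/day=07/
```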
Nice-to-Have Skills

  • Experience in high-velocity migration projects (legacy systems to cloud).
  • Familiarity with data governance and security best practices within Snowflake.
  • Certifications in AWS or Snowflake.

Common Requirements

  • Solid understanding of data structures, algorithms, and software design principles.
  • Commitment to CI/CD best practices and documentation.
  • Fluent English for daily technical collaboration and project reporting.
  • Ability to work within an Agile environment and follow pre-defined technical roadmaps.


Job applications are closed.

© 2026 BrighterMonday
