Remote | Other Business Support Services

Job description

This role is open to candidates based in LATAM, Africa, and Eastern Europe. Please note that as this role supports U.S.-based clients, candidates must be available to work during U.S. business hours aligned with the client’s time zone.


Our client is an AI-driven technology company building forecasting and attribution intelligence products powered by high-quality, analytics-ready data. Their teams work cross-functionally across data engineering, analytics, data science, and product to deliver reliable insights that support customer onboarding, reporting workflows, and advanced AI use cases in a fast-moving, execution-focused environment.


Location

Fully remote | 9 AM - 5 PM EST


Role Overview

The Data Engineer will help build and maintain reliable, scalable data pipelines that support analytics, forecasting, and AI-driven products. This is a hands-on, execution-focused contract role centered on data quality, pipeline reliability, and collaboration across analytics, data science, and product teams.


The role operates within a modern analytics engineering stack using Python, dbt, and Dagster, with a strong emphasis on supporting customer onboarding and reporting workflows.


Key Responsibilities

Data Pipeline Development & Orchestration
  • Build and maintain scalable, fault-tolerant ELT pipelines using Python
  • Orchestrate and monitor data workflows using Dagster (a brief illustrative sketch follows this list)
  • Troubleshoot pipeline failures, performance issues, and data inconsistencies
  • Monitor pipeline health using observability tools and metrics
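As a rough illustration of the orchestration work above, the sketch below shows how a small ELT step might be expressed as Dagster software-defined assets. The asset names and toy data are hypothetical rather than the client's actual pipeline; a real implementation would extract from a source system and load into a warehouse.

```python
# Minimal, hypothetical sketch of an ELT step as Dagster assets.
# Asset names and data are illustrative only.
import pandas as pd
from dagster import asset, materialize


@asset
def raw_orders() -> pd.DataFrame:
    # Extract: stand-in for pulling records from a source system or API.
    return pd.DataFrame({"order_id": [1, 2, None], "amount": [120.0, 75.5, 33.0]})


@asset
def orders_clean(raw_orders: pd.DataFrame) -> pd.DataFrame:
    # Transform: drop incomplete rows so downstream models see analytics-ready data.
    return raw_orders.dropna().astype({"order_id": "int64"})


if __name__ == "__main__":
    # Materialize both assets; Dagster infers the dependency graph from the
    # function signature (orders_clean depends on raw_orders).
    result = materialize([raw_orders, orders_clean])
    assert result.success
```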


Analytics Engineering & Data Modeling
  • Develop, optimize, and document dbt models following analytics engineering best practices
  • Model clean, analytics-ready datasets for BI, forecasting, and machine learning feature consumption
  • Contribute to refactoring and improvement of existing data workflows as product needs evolve


Data Quality & Reliability
  • Implement and maintain data quality checks and testing strategies (a small example follows this list)
  • Follow established team standards for SLAs, code quality, and deployments
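Data quality checks can take many forms in a stack like this (dbt tests, Dagster asset checks, custom validators). As one minimal, hypothetical illustration in plain pandas, the table and rules below are invented for the example:

```python
# Hypothetical, minimal data-quality check on an analytics-ready table.
# Column names and rules are illustrative, not the client's actual tests.
import pandas as pd


def check_orders(df: pd.DataFrame) -> list[str]:
    """Return a list of human-readable data-quality failures (empty list = pass)."""
    failures = []
    if df["order_id"].isna().any():
        failures.append("order_id contains nulls")
    if df["order_id"].duplicated().any():
        failures.append("order_id is not unique")
    if (df["amount"] < 0).any():
        failures.append("amount contains negative values")
    return failures


if __name__ == "__main__":
    sample = pd.DataFrame({"order_id": [1, 2, 2], "amount": [10.0, -5.0, 3.0]})
    for problem in check_orders(sample):
        print(f"FAILED: {problem}")
```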


Cross-Functional Collaboration
  • Collaborate closely with data scientists to support forecasting and AI-driven use cases
  • Work cross-functionally with analytics and product teams to ensure data meets business and product requirements


Qualifications

Experience
  • 3+ years of professional experience in data engineering or analytics engineering
  • Hands-on experience working with dbt (Core or Cloud)
  • Experience using Dagster or similar orchestration tools
  • Experience working with cloud data warehouses such as Snowflake, BigQuery, or Redshift
  • Experience collaborating with Product, Analytics, or Data Science teams
  • Ability to work independently and deliver results in a contract environment


Skills
  • Strong proficiency in Python, including libraries such as pandas, SQLAlchemy, or psycopg2
  • Advanced SQL skills, including CTEs, window functions, and query optimization (an illustrative snippet follows this list)
  • Familiarity with modern ELT tools such as Airbyte, Fivetran, Meltano, or dltHub
  • Strong troubleshooting skills for data pipelines, performance, and data quality issues
  • Ability to follow established standards for reliability, testing, and deployment
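To make the Python and SQL expectations above concrete, here is a small self-contained sketch that runs a CTE plus a window function through SQLAlchemy. It uses an in-memory SQLite database so it runs anywhere; against Snowflake, BigQuery, or Redshift only the connection URL and dialect details would change. Table and column names are made up for the example.

```python
# Illustrative only: a CTE + window-function query executed via SQLAlchemy.
# SQLite keeps the snippet self-contained; swap the URL for a warehouse connection.
import pandas as pd
from sqlalchemy import create_engine, text

engine = create_engine("sqlite:///:memory:")

with engine.begin() as conn:
    # Seed a tiny orders table to query against.
    conn.execute(text("CREATE TABLE orders (customer_id INTEGER, amount REAL)"))
    conn.execute(text("INSERT INTO orders VALUES (1, 120.0), (1, 75.5), (2, 33.0)"))

query = text(
    """
    WITH customer_totals AS (
        SELECT customer_id, SUM(amount) AS total_amount
        FROM orders
        GROUP BY customer_id
    )
    SELECT
        customer_id,
        total_amount,
        RANK() OVER (ORDER BY total_amount DESC) AS spend_rank
    FROM customer_totals
    """
)

with engine.connect() as conn:
    # Rank customers by total spend using a window function over the CTE.
    result = pd.read_sql(query, conn)
    print(result)
```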


Opportunity

This contract role offers the opportunity to contribute directly to AI-driven products by building high-impact data infrastructure used for forecasting, attribution, and reporting. You’ll work within a modern analytics engineering stack, collaborate closely with technical teams, and gain hands-on exposure to real-world AI and analytics use cases in a fast-paced, product-driven environment.


Application Process

To be considered for this role, complete the following steps:
  • Fill in the application form
  • Record a video showcasing your skills


