
DES2 - SnowFlake Developer

Posted: 2026/05/08 (28 days ago)
Other Business Support Services

Job description

Company Description

Sutherland is looking for a skilled Python Data Engineer with strong experience in Apache Airflow, data pipeline development, and cloud data platforms (Snowflake / AWS). The role involves building and orchestrating scalable ETL/ELT workflows and automating data processes across multiple systems.



Job Description
  • Develop and maintain data pipelines using Python, Airflow (DAGs), and AWS/Snowflake components.
  • Build and automate data ingestion, transformation, and scheduling workflows.
  • Develop Airflow DAGs including custom operators, sensors, hooks, and manage pipeline monitoring.
  • Work on Snowflake-based ELT solutions including data loads, stored procedures, and queries.
  • Write efficient SQL queries and optimize performance for data transformations.
  • Collaborate with cross-functional teams to understand requirements and deliver scalable data solutions.
  • Troubleshoot pipeline failures and ensure high availability of production workflows.
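The responsibilities above boil down to building small, testable ELT steps with ingestion, validation, and loading. A minimal sketch of such a step is below; the table, column names, and data are hypothetical, and `sqlite3` stands in for Snowflake so the example has no Airflow or cloud dependency (in practice this logic would run inside an Airflow task using the Snowflake connector):

```python
import csv
import io
import sqlite3

# Hypothetical raw extract, as it might arrive from an upstream system.
RAW_CSV = """order_id,amount
1,19.99
2,5.50
2,5.50
"""

def run_elt(conn: sqlite3.Connection, raw: str) -> int:
    """Ingest raw CSV, apply a data-quality check, deduplicate on the
    primary key, and load into `orders`. Returns the loaded row count."""
    conn.execute(
        "CREATE TABLE IF NOT EXISTS orders (order_id INTEGER PRIMARY KEY, amount REAL)"
    )
    rows = csv.DictReader(io.StringIO(raw))
    # Data-quality check: drop rows with negative amounts; the dict
    # keyed on order_id also deduplicates repeated records.
    clean = {
        int(r["order_id"]): float(r["amount"])
        for r in rows
        if float(r["amount"]) >= 0
    }
    # Idempotent load: re-running the step replaces rather than duplicates.
    conn.executemany(
        "INSERT OR REPLACE INTO orders (order_id, amount) VALUES (?, ?)",
        clean.items(),
    )
    conn.commit()
    return conn.execute("SELECT COUNT(*) FROM orders").fetchone()[0]

conn = sqlite3.connect(":memory:")
print(run_elt(conn, RAW_CSV))  # 2 — the duplicate order_id 2 collapses to one row
```

Making the load idempotent (here via `INSERT OR REPLACE`) is what lets a scheduler such as Airflow safely retry the task after a failure.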

Qualifications
  • 5–8 years of experience in Python development (advanced scripting and automation).
  • 3+ years of experience with Apache Airflow (DAG design, orchestration, scheduling).
  • Experience with Snowflake or any cloud data warehouse (Redshift / BigQuery / Databricks).
  • Experience with AWS services (S3, Glue, Lambda, Athena) or equivalent cloud technologies.
  • Strong hands-on experience with SQL (advanced querying, optimization).
  • Experience with ETL/ELT data workflows, data validation, data quality checks.
  • Familiarity with Git, CI/CD, JIRA, or similar tools.
  • Good communication skills and ability to work independently.
  • Bachelor’s degree in Computer Science, Engineering, or a related field (or equivalent experience).

Additional Information

All your information will be kept confidential according to EEO guidelines.




This job post has been translated by AI and may contain minor differences or errors.
