Data Engineer - Data Platforms - Google

Posted 7 days ago | 2026/05/30
IT Services

Job description

Introduction

A career in IBM Consulting is built on long-term client relationships and close collaboration worldwide. You'll work with leading companies across industries, helping them shape their hybrid cloud and AI journeys. With support from our strategic partners, robust IBM technology, and Red Hat, you'll have the tools to drive meaningful change and accelerate client impact. At IBM Consulting, curiosity fuels success. You'll be encouraged to challenge the norm, explore new ideas, and create innovative solutions that deliver real results. Our culture of growth and empathy focuses on your long-term career development while valuing your unique skills and experiences.

In this role, you'll work in one of our IBM Consulting Client Innovation Centers (Delivery Centers), where we deliver deep technical and industry expertise to a wide range of public and private sector clients around the world. Our delivery centers offer our clients locally based skills and technical expertise to drive innovation and adoption of new technology.

Your role and responsibilities

As a Data Engineer specializing in Google's data platforms, you will design, build, and maintain data engineering solutions on Google's Cloud ecosystem. This role requires expertise in utilizing various Google services for batch and real-time data pipelines, data migration, and data layer design. Your primary responsibilities will include:



  • Design Data Pipelines: Design and develop batch and real-time data pipelines for data warehouses and data lakes using Google Cloud services such as Dataproc, Dataflow, Pub/Sub, BigQuery, and Bigtable.
  • Develop Data Engineering Solutions: Build and maintain data engineering solutions using Cloud Storage, Bigtable, BigQuery, Dataproc with Spark and Hadoop, and Dataflow with Apache Beam or Python.
  • Manage Data Platforms: Schedule and manage the data platform using Cloud Scheduler and Cloud Composer (Airflow), ensuring efficient data pipeline operations.
  • Implement Data Migration: Develop and implement data migration solutions using Google Cloud services, ensuring seamless data transfer between systems.
  • Optimize the Data Layer: Design and optimize the data layer using services such as BigQuery, Bigtable, and Cloud Spanner, ensuring efficient data storage and retrieval.
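To illustrate the first responsibility: a real-time pipeline of this kind would typically run as an Apache Beam job on Dataflow, windowing Pub/Sub events before writing aggregates to BigQuery. The following is a minimal plain-Python stand-in for that windowed aggregation step (the event names and 60-second window size are illustrative assumptions, not from the posting):

```python
from collections import defaultdict

# Plain-Python stand-in for the GroupByKey + Count step a Dataflow/Beam
# streaming pipeline would apply to Pub/Sub events. Window size and
# event keys are assumptions for illustration.
WINDOW_SECONDS = 60

def window_start(ts: int) -> int:
    """Align a Unix timestamp to the start of its fixed window."""
    return ts - (ts % WINDOW_SECONDS)

def count_per_window(events):
    """events: iterable of (timestamp, key) pairs.
    Returns {(window_start, key): count}."""
    counts = defaultdict(int)
    for ts, key in events:
        counts[(window_start(ts), key)] += 1
    return dict(counts)

events = [(3, "click"), (45, "click"), (61, "view"), (75, "click")]
print(count_per_window(events))
# {(0, 'click'): 2, (60, 'view'): 1, (60, 'click'): 1}
```

In a Beam pipeline the same logic would be expressed with `beam.WindowInto` and a combiner rather than a dictionary, but the grouping semantics are the same.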


Required education
Bachelor's Degree

Preferred education
Master's Degree

Required technical and professional expertise
  • Google Cloud Ecosystem Expertise: Exposure to designing, building, and maintaining data engineering solutions on Google Cloud, including services such as Dataproc, Dataflow, Pub/Sub, BigQuery, Bigtable, Cloud Spanner, Cloud SQL, and AlloyDB.
  • Data Pipeline Development Experience: Exposure to developing and managing batch and real-time data pipelines for data warehouses and data lakes using Google Cloud services and open-source technologies such as Apache Airflow, dbt, Spark/Python, or Spark/Scala.
  • Google Cloud Services Proficiency: Experience with Cloud Storage, Bigtable, BigQuery, Dataproc with Spark and Hadoop, and Dataflow with Apache Beam or Python to build and maintain data engineering solutions.
  • Data Platform Management Knowledge: Exposure to scheduling and managing the data platform using Cloud Scheduler and Cloud Composer (Airflow) for efficient data pipeline operations.
  • Data Layer Design Understanding: Experience with data layer design using services such as BigQuery, Bigtable, and Cloud Spanner for efficient data storage and retrieval.
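The data platform management bullet above refers to orchestration: Cloud Composer (Airflow) takes a DAG of task dependencies and derives a valid execution order. As a plain-Python stand-in for that ordering logic (the task names below are illustrative assumptions, not from the posting):

```python
# Stand-in for the dependency ordering Cloud Composer (Airflow) derives
# from a DAG definition. Task names are illustrative assumptions.
def topo_order(deps):
    """deps: {task: set of upstream tasks}. Returns a valid run order."""
    order, done = [], set()
    def visit(task):
        if task in done:
            return
        for upstream in deps.get(task, ()):
            visit(upstream)  # run all upstream tasks first
        done.add(task)
        order.append(task)
    for task in deps:
        visit(task)
    return order

dag = {
    "extract": set(),
    "transform": {"extract"},
    "load_bigquery": {"transform"},
    "data_quality_check": {"load_bigquery"},
}
print(topo_order(dag))
# ['extract', 'transform', 'load_bigquery', 'data_quality_check']
```

In an actual Composer environment the same dependencies would be declared with Airflow operators and the `>>` operator; the scheduler then handles retries, backfills, and timing on top of this ordering.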


Preferred technical and professional experience
  • Open-Source Technologies: Exposure to open-source technologies such as Apache Airflow, dbt, Spark/Python, or Spark/Scala for developing and managing batch and real-time data pipelines.
  • Data Migration Solutions: Experience using Google Cloud services to develop and implement data migration solutions, ensuring seamless data transfer between systems.
  • Cloud Composer Expertise: Exposure to Cloud Composer (Airflow) for scheduling and managing the data platform, ensuring efficient data pipeline operations.
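"Seamless data transfer" in the migration bullets implies validating that source and target match after the copy. A common check is comparing row counts and an order-insensitive content digest; on Google Cloud this would typically be done with paired BigQuery queries, but the idea can be sketched in plain Python (the table contents and helper names below are illustrative assumptions):

```python
import hashlib

# Stand-in for post-migration validation: compare row counts and an
# order-insensitive digest between source and target tables. Table
# contents and function names are illustrative assumptions.
def table_digest(rows):
    """Order-insensitive digest of a table's rows."""
    digest = hashlib.sha256()
    for row in sorted(map(str, rows)):
        digest.update(row.encode())
    return digest.hexdigest()

def migration_ok(source_rows, target_rows):
    """True if the target holds the same rows as the source."""
    return (len(source_rows) == len(target_rows)
            and table_digest(source_rows) == table_digest(target_rows))

source = [("u1", 10), ("u2", 20)]
target = [("u2", 20), ("u1", 10)]  # same data, different order
print(migration_ok(source, target))  # True
```

For large tables one would digest per partition rather than sorting whole tables, but the pass/fail criterion is the same.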


Years of Experience: 6-10

This job post has been translated by AI and may contain minor differences or errors.
