
Data Engineer

30+ days ago 2026/03/25
Other Business Support Services

Job description

ABOUT VERTIGO GAMES

We create amazing games that rank at the top on both iOS and Android, loved and played by 150+ million fans worldwide! Check out our smash-hit games: 🎮 Critical Strike ⚔️ Polygun Arena

Now we're looking for a passionate Data Engineer to join our dynamic team in Istanbul. This is an on-site role, requiring you to work 5 days a week from our office in Levent.

We care deeply about your passion for building reliable, scalable data systems that support our games. We strongly encourage you to get familiar with both of our games before applying, and to complete your application only if you're genuinely excited to design efficient pipelines, optimize data flows, and enable high-quality analytics that power our product and growth decisions.
Responsibilities:
- Own and evolve our Airflow and DBT-based data pipeline architecture.
- Develop reliable and cost-efficient data workflows that ensure timely and accurate data delivery.
- Build and maintain ETL/ELT processes to ingest data from external and internal sources.
- Collaborate with analysts, engineers, and product teams to design scalable data models.
- Implement and optimize data warehouse structures (BigQuery or others) for analytical efficiency.
- Bring at least one proven data warehouse (DWH) project in a cloud-native architecture (GCP or AWS).
- Monitor and troubleshoot pipeline failures; optimize for performance and cost-efficiency.
- Ensure data quality, consistency, and governance across the stack.
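The responsibilities above revolve around dependency-ordered workflows: extraction tasks must finish before transformations, which must finish before the warehouse load. As a rough illustration of that idea (task names here are hypothetical, and a real setup would use an Airflow DAG rather than this plain-Python stand-in):

```python
from graphlib import TopologicalSorter

# Hypothetical task graph: each task maps to the tasks it depends on,
# mirroring how an Airflow DAG wires extract -> transform -> load steps.
dag = {
    "extract_events": [],
    "extract_purchases": [],
    "transform_sessions": ["extract_events"],
    "transform_revenue": ["extract_purchases", "transform_sessions"],
    "load_warehouse": ["transform_revenue"],
}

def run_order(graph):
    """Return one valid execution order that respects every dependency."""
    return list(TopologicalSorter(graph).static_order())

order = run_order(dag)
# The warehouse load can only run once everything upstream has completed.
assert order[-1] == "load_warehouse"
```

The same dependency-first principle is what makes pipeline failures tractable: when a task fails, only its downstream tasks need to be rerun.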
What we offer:
- A Compensation Package That Reflects Your Contribution: We keep it simple. Competitive pay that matches the work you deliver.
- Meal Allowance: Enough for a solid, satisfying meal.
- Delicious In-Office Catering: Fresh meals, good coffee, sweet treats. No place for hunger, ever.
- Private Health Insurance: Complementary private health insurance so you can get care without second thoughts.
- Continuous Learning Support: A monthly budget for courses and platforms, because staying sharp is part of the job.
- Equity That Actually Makes You a Partner: We offer real equity, not symbolic. Once you reach a certain contribution level, you earn a meaningful stake in the company. When we grow, you grow with us.
- Meaningful Time Off: Starting from your first year, you receive bonus company-wide rewind holidays: a special extra break even before standard annual leave kicks in. And your birthday is a free day on us.
- Referral Bonus: Introduce great talent to the team and earn a reward when they join.
- Milestone Awards: As you reach key milestones with us, you earn bonus rewards that recognize your long-term contribution.
- A Culture Built Around Players & Ownership: Curious, collaborative, and focused. We're here to build great games together.
- A Modern, Comfortable Office in Levent: Bright space, central location, one step from the metro, designed to keep you in the "zone".
- Game Room: A dedicated Xbox corner for fun breaks and quick gaming sessions whenever you need to unwind.
- Office Events That Keep Us Connected: Fun team moments, regular happy hours, and in-office events throughout the year.
Requirements:
- 2+ years of experience in a data engineering or backend data-focused role.
- Proficiency in DBT for data modeling and transformation.
- Hands-on experience with Apache Airflow for orchestration and scheduling.
- Solid understanding of ETL/ELT pipelines and cloud-based data workflows.
- Experience working with Google Cloud Platform (e.g., BigQuery, Cloud Run, Cloud Functions, Cloud SQL).
- Proficiency in Python for scripting and workflow automation.
- Strong command of SQL and experience building data models in a warehouse environment.
- Understanding of cost optimization practices in cloud environments.
- Familiarity with CI/CD processes and version control systems like Git.
- Good communication skills and the ability to work cross-functionally with analysts and developers.
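One concrete example of the cost-optimization practices mentioned above is incremental loading: processing only rows newer than the last successful run's high-water mark instead of re-scanning a whole table on every run. A minimal sketch in Python (the row shape and `updated_at` column are hypothetical):

```python
from datetime import datetime, timezone

def incremental_batch(rows, watermark):
    """Keep only rows newer than the previous load's watermark and
    compute the new watermark, so each run touches only fresh data."""
    fresh = [r for r in rows if r["updated_at"] > watermark]
    new_watermark = max((r["updated_at"] for r in fresh), default=watermark)
    return fresh, new_watermark

rows = [
    {"id": 1, "updated_at": datetime(2024, 1, 1, tzinfo=timezone.utc)},
    {"id": 2, "updated_at": datetime(2024, 1, 3, tzinfo=timezone.utc)},
]
last_run = datetime(2024, 1, 2, tzinfo=timezone.utc)
fresh, wm = incremental_batch(rows, last_run)
# Only the row updated after the watermark is processed.
assert [r["id"] for r in fresh] == [2]
```

In BigQuery or DBT this same pattern shows up as incremental models filtered on a timestamp column, which directly reduces bytes scanned and therefore cost.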

This job post has been translated by AI and may contain minor differences or errors.
