Job description
Project Role : Data Modeler
Project Role Description : Work with key business representatives, data owners, end users, application designers and data architects to model current and new data.
Must have skills : Snowflake Data Warehouse
Good to have skills : NA
Minimum 5 year(s) of experience is required
Educational Qualification : 15 years full time education
Summary: Seeking an experienced Data Engineer/Architect with deep expertise in Snowflake and dbt to lead end-to-end data architecture, pipeline design, and DataOps practices.
This role blends hands-on engineering with architectural leadership, driving scalable design, data quality, and performance optimization while guiding teams in best practices and governance.

Roles & Responsibilities:
- Design and manage Snowflake data models, schemas, and views for analytics.
- Architect scalable ELT pipelines using dbt and Snowflake.
- Define data modeling standards and implement dbt macros, reusable components, and tests (see the dbt sketch at the end of this posting).
- Optimize query performance, compute resources, and cost efficiency (see the Snowflake tuning sketch at the end of this posting).
- Ensure data quality, lineage, and integrity across pipelines.
- Implement and manage CI/CD pipelines for dbt and Snowflake deployments.
- Apply DataOps best practices in versioning, testing, monitoring, and integration.
- Manage environments and enforce governance, security, and deployment standards.
- Partner with analysts and product teams to deliver analytics-ready, trusted data.
- Mentor engineers on Snowflake, dbt, and DataOps best practices.
- Collaborate with DevOps and Cloud teams on scalability and reliability.
- Maintain clear documentation and development standards.

Professional & Technical Skills:
- 4+ years of hands-on experience with Snowflake – data modeling, tuning, security, and cost optimization.
- Advanced dbt expertise – models, macros, tests, and documentation.
- Proficiency with GitLab CI/CD pipelines and collaborative development workflows.
- Strong SQL skills for complex transformations and query optimization.
- Proven experience in CI/CD implementation for data workflows (GitLab / GitHub Actions / Jenkins).
- Solid grasp of DataOps principles – version control, testing, automation, and deployment governance.
- Strong collaboration and communication skills with data and business teams.

Good-to-Have Skills:
- Python scripting for automation and tooling.
- Exposure to Airflow or other orchestration tools.
- Familiarity with cloud platforms (AWS / Azure / GCP).
- Experience with Talend, Monte Carlo, Collibra, or Immuta for data governance or quality.

Additional Information:
- The candidate should have a minimum of 5 years of experience in Snowflake Data Warehouse.
- This position is based at our Bengaluru/Pune office.
- 15 years of full-time education is required.
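
For candidates unfamiliar with the stack, here is a minimal sketch of the kind of dbt model this role would design and maintain, as referenced in the responsibilities above. All model and column names (stg_orders, fct_daily_orders, order_amount) are hypothetical illustrations, not part of any actual codebase.

```sql
-- models/marts/fct_daily_orders.sql
-- Hypothetical dbt model: materialized as a table in Snowflake,
-- rolling a staging model up into daily order facts.
{{ config(materialized='table') }}

select
    order_date,
    count(*)          as order_count,
    sum(order_amount) as total_amount
from {{ ref('stg_orders') }}  -- ref() wires the model into dbt's dependency graph
group by order_date
-- Companion unique/not_null tests on order_date would be declared in the
-- model's schema .yml file and run with `dbt test`.
```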
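Similarly, a sketch of the Snowflake performance and cost tuning the role calls for; the warehouse and table names are again hypothetical.

```sql
-- Hypothetical Snowflake tuning: cap idle compute spend on a warehouse
-- and cluster a large fact table so queries prune on a common filter key.
ALTER WAREHOUSE analytics_wh SET
    WAREHOUSE_SIZE = 'SMALL'
    AUTO_SUSPEND   = 60     -- suspend after 60 seconds of inactivity
    AUTO_RESUME    = TRUE;

ALTER TABLE analytics.fct_daily_orders CLUSTER BY (order_date);
```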