Job description
Key Responsibilities:
Project & Delivery Management
- Lead end-to-end delivery of data engineering, migration, and analytics projects across multiple clients and domains.
- Develop detailed project plans, manage budgets, and ensure adherence to timelines, scope, and quality.
- Proactively identify risks, dependencies, and bottlenecks, and drive their timely resolution.
- Oversee governance, documentation, and reporting for all active engagements.
Team Leadership & Resource Management
- Lead and mentor cross-functional teams including data engineers, solution architects, and QA specialists.
- Manage resource allocation across multiple concurrent projects and optimize team productivity.
- Build a culture of excellence, accountability, and continuous improvement.
Client & Stakeholder Management
- Act as the primary point of contact for clients, driving project alignment with business objectives.
- Collaborate closely with client stakeholders to understand requirements and translate them into scalable data solutions.
- Deliver executive-level reporting and ensure client satisfaction throughout the delivery lifecycle.
Technical & Strategic Oversight
- Partner with architects and technical leads to design robust, scalable data lakehouse and warehouse architectures.
- Drive adoption of Databricks, Delta Lake, and cloud-native data solutions (AWS, Azure, GCP).
- Ensure best practices in ETL/ELT design, data governance, security, and operational excellence.
- Support pre-sales and solutioning activities for new client engagements as required.
Process & Quality Management
- Implement Agile/iterative delivery frameworks and enforce delivery excellence standards.
- Track key delivery metrics (velocity, quality, cost, client satisfaction) and ensure continuous optimization.
- Uphold compliance with Go Digital’s delivery methodology, documentation, and governance standards.
Required Skills & Qualifications:
- Bachelor’s or Master’s degree in Computer Science, Information Technology, or related field.
- 10+ years of total experience, with at least 4 years in project or delivery management for data or cloud-based projects.
- Proven experience in managing enterprise-scale data engineering programs on platforms like Databricks, Azure, AWS, or GCP.
- Strong understanding of:
  - ETL/ELT pipelines, data lakes, and data warehousing architectures.
  - Spark, PySpark, SQL, and distributed data processing frameworks.
  - CI/CD, DevOps, and DataOps methodologies.
- Excellent stakeholder management, leadership, and communication skills.
- Experience working in onshore-offshore delivery models within a consulting setup.
- PMP, CSM, or Agile certification preferred.
Preferred Experience:
- Hands-on exposure to AWS.
- Experience in data modernization, migration, or cloud transformation programs.
- Familiarity with data governance, lineage, and cataloging tools (e.g., Unity Catalog, Collibra).