Ready to build the future with AI?
At Genpact, we don’t just keep up with technology—we set the pace. AI and digital innovation are redefining industries, and we’re leading the charge. Genpact’s AI Gigafactory, our industry-first accelerator, is an example of how we’re scaling advanced technology solutions to help global enterprises work smarter, grow faster, and transform at scale. From large-scale models to agentic AI, our breakthrough solutions tackle companies’ most complex challenges.
If you thrive in a fast-moving, innovation-driven environment, love building and deploying cutting-edge AI solutions, and want to push the boundaries of what’s possible, this is your moment.
Genpact (NYSE: G) is an advanced technology services and solutions company that delivers lasting value for leading enterprises globally. Through our deep business knowledge, operational excellence, and cutting-edge solutions, we help companies across industries get ahead and stay ahead. Powered by curiosity, courage, and innovation, our teams implement data, technology, and AI to create tomorrow, today. Get to know us at genpact.com and on LinkedIn, X, YouTube, and Facebook.
Inviting applications for the role of Senior Principal Consultant, Azure Databricks
In this role, the Databricks Lead is responsible for providing technical direction and leading a group of one or more developers toward a shared goal.
Responsibilities
• Implement Databricks Lakehouse solutions (Delta Lake, Unity Catalog, SQL Endpoints/Photon, Workflows).
• Design, build, and optimize batch & streaming pipelines using Spark and Python/Scala with strong SQL.
• Define standards for CI/CD, data governance, security, and platform operations on Databricks.
• Provide technical leadership; review solution designs and guide delivery teams.
• Evangelize re-use through the implementation of shared assets.
• Enforce adherence to architectural standards/principles, global product-specific guidelines, usability design standards, etc.
• Proactively guide engineering methodologies, standards, and leading practices.
• Identify, communicate, and mitigate risks, assumptions, issues, and decisions (RAID) throughout the full lifecycle.
• Demonstrate strong analytical and technical problem-solving skills.
• Analyse and operate at various levels of abstraction.
• Balance what is strategically right with what is practically realistic.
• Support and develop our people, including learning & development, certification, and career development plans.
• Provide technical governance and oversight for solution design and implementation.
• Apply technical foresight to understand new technologies and advancements.
• Lead the team in defining best practices and repeatable methodologies in cloud data engineering, including data storage, ETL, data integration & migration, data warehousing, and data governance.
• Bring technical experience with Azure, AWS, and GCP cloud data engineering services and solutions.
• Contribute to sales and pre-sales activities, including proposals, pursuits, demonstrations, and proof-of-concept initiatives.
• Develop whitepapers, blogs, webinars, and other thought leadership material.
• Work with Learning & Development teams to establish appropriate learning and certification paths for the domain.
• Build new data capabilities, solutions, assets, accelerators, and team competencies.
• Manage multiple opportunities through the entire business cycle simultaneously, working with cross-functional teams as necessary.
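A practical habit behind several of the responsibilities above (pipeline development, CI/CD standards, unit testing) is keeping transformation logic as plain, unit-testable functions, separate from the Spark session that would drive them on Databricks. A minimal sketch; the record shape and field names (`id`, `updated_at`) are illustrative, not taken from any real pipeline:

```python
# Illustrative only: deduplication logic kept Spark-free so it can be
# unit tested in CI before being wired into a Databricks job.

def deduplicate_latest(records):
    """Keep the newest record per id; drop rows without an id."""
    latest = {}
    for rec in records:
        rec_id = rec.get("id")
        if rec_id is None:
            continue  # governance rule (assumed): reject rows missing a key
        if rec_id not in latest or rec["updated_at"] > latest[rec_id]["updated_at"]:
            latest[rec_id] = rec
    # Deterministic output order makes assertions in tests straightforward.
    return sorted(latest.values(), key=lambda r: r["id"])

rows = [
    {"id": 1, "updated_at": "2024-01-02", "value": "new"},
    {"id": 1, "updated_at": "2024-01-01", "value": "old"},
    {"id": None, "updated_at": "2024-01-03", "value": "orphan"},
    {"id": 2, "updated_at": "2024-01-01", "value": "only"},
]
print([r["value"] for r in deduplicate_latest(rows)])  # → ['new', 'only']
```

In production the same logic would typically be applied through PySpark DataFrame operations; factoring it out this way is one common pattern for making pipelines testable, not the only one.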
Qualifications we seek in you!
Minimum Qualifications
• Excellent technical architecture skills, enabling the creation of future-proof, complex global solutions.
• Excellent interpersonal communication and organizational skills are required to operate as a leading member of global, distributed teams that deliver quality services and solutions.
• Ability to rapidly gain knowledge of the organizational structure of the firm to facilitate work with groups outside of the immediate technical team.
• Knowledge of and experience with the IT methodologies and lifecycles that will be used.
• Experience managing end-to-end project delivery and working with stakeholders, including clients, leadership, and delivery teams.
• Must have strong hands-on experience with cloud services such as Azure/ADF, ADLS/S3, and cloud security, monitoring, and governance.
• Must have experience designing platforms on Databricks.
• Hands-on experience designing and building Databricks-based solutions on any cloud platform.
• Hands-on experience designing and building solutions powered by dbt models and integrating them with Databricks.
• Must be very good at designing end-to-end solutions on a cloud platform.
• Must have good knowledge of data engineering concepts and related cloud services.
• Must have good experience in Python, PySpark, and SQL.
• Must have good experience in setting up development best practices.
• Intermediate-level knowledge of data modelling is required.
• Good to have: knowledge of Docker and Kubernetes.
• Extensive, real-world experience designing technology components for enterprise solutions and defining solution architectures and reference architectures with a focus on cloud technologies.
• Exposure to infrastructure and application security technologies and approaches.
• Familiarity with requirements gathering techniques.
Preferred Qualifications/ Skills
• Must have designed the end-to-end architecture of a unified data platform covering all aspects of the data lifecycle: ingestion, transformation, serving, and consumption.
• Must have excellent coding skills in Python or Scala, preferably Python and PySpark.
• Should have experience in the insurance domain.
• Must have designed and implemented at least 2-3 projects end-to-end in Databricks.
• Must have experience with Databricks components such as:
o Delta Lake
o Databricks Connect (dbConnect)
o Databricks REST API 2.0
o SQL endpoints (Photon engine)
o Unity Catalog
o Databricks Workflows orchestration
o Security management
o Platform governance
o Data security
• Must have knowledge of new features available in Databricks, their implications, and possible use cases.
• Must have applied architectural principles to choose the design best suited to each problem.
• Must be well versed with Databricks Lakehouse concept and its implementation in enterprise environments.
• Must have strong understanding of Data warehousing and various governance and security standards around Databricks.
• Must have knowledge of cluster optimization and its integration with various cloud services.
• Must have a good understanding of building complex data pipelines.
• Must be strong in SQL and Spark SQL.
• Must have strong performance optimization skills to improve efficiency and reduce cost.
• Must have worked on designing both batch and streaming data pipelines.
• Must have extensive knowledge of the Spark and Hive data processing frameworks.
• Must have worked on any cloud (Azure, AWS, GCP) and its most common services, such as ADLS/S3, ADF/Lambda, CosmosDB/DynamoDB, ASB/SQS, and cloud databases.
• Must be strong in writing unit and integration tests.
• Must have strong communication skills and have worked with cross-functional teams.
• Must have a great attitude towards learning new skills and upskilling existing ones.
• Responsible for setting best practices around Databricks CI/CD.
• Must understand composable architecture to take full advantage of Databricks capabilities.
• Good to have: REST API knowledge.
• Good to have: understanding of cost distribution.
• Good to have: experience on a migration project building a unified data platform.
• Good to have: knowledge of dbt.
• Experience with DevOps, including Docker and Kubernetes.
• Experience with full-lifecycle software development methodologies, patterns, frameworks, libraries, and tools.
• Knowledge of programming and scripting languages such as JavaScript, PowerShell, Bash, SQL, Java, Python, etc.
• Experience with data ingestion technologies such as Azure Data Factory, SSIS, Pentaho, and Alteryx.
• Experience with visualization tools such as Tableau and Power BI.
• Experience in distilling complex technical challenges to actionable decisions for stakeholders and guiding project teams by building consensus and mediating compromises when necessary.
• Experience coordinating the intersection of complex system dependencies and interactions
• Experience in solution delivery using common methodologies especially SAFe Agile but also Waterfall, Iterative, etc.
• Demonstrated knowledge of relevant industry trends and standards.
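Several of the qualifications above touch Databricks Workflows orchestration and CI/CD best practices. One way these come together in practice is validating a job definition before deployment. A hedged sketch: the payload below follows the general shape of a Databricks Jobs API 2.1 job (tasks with `task_key` and `depends_on`), but the job name, notebook paths, and cluster key are purely illustrative, and the lint function is a hypothetical CI check, not a Databricks API:

```python
# Hypothetical sketch: a Workflows job definition in Jobs API 2.1 shape,
# plus a CI-style lint one might run before deploying it.
job = {
    "name": "nightly-lakehouse-refresh",  # illustrative name
    "tasks": [
        {
            "task_key": "bronze_ingest",
            "notebook_task": {"notebook_path": "/Repos/data/bronze_ingest"},
            "job_cluster_key": "etl_cluster",
        },
        {
            "task_key": "silver_transform",
            "depends_on": [{"task_key": "bronze_ingest"}],
            "notebook_task": {"notebook_path": "/Repos/data/silver_transform"},
            "job_cluster_key": "etl_cluster",
        },
    ],
    "schedule": {"quartz_cron_expression": "0 0 2 * * ?", "timezone_id": "UTC"},
}

def undefined_dependencies(job_spec):
    """Return task_keys referenced in depends_on but never defined."""
    defined = {t["task_key"] for t in job_spec["tasks"]}
    return [
        dep["task_key"]
        for task in job_spec["tasks"]
        for dep in task.get("depends_on", [])
        if dep["task_key"] not in defined
    ]

print(undefined_dependencies(job))  # → []
```

In a real pipeline this kind of payload would be submitted to the Jobs API or managed via Terraform/Asset Bundles; the point of the sketch is that validating orchestration config in CI is one concrete form the "Databricks CI/CD best practices" responsibility can take.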
Genpact is an Equal Opportunity Employer and considers applicants for all positions without regard to race, color, religion or belief, sex, age, national origin, citizenship status, marital status, military/veteran status, genetic information, sexual orientation, gender identity, physical or mental disability or any other characteristic protected by applicable laws. Genpact is committed to creating a dynamic work environment that values diversity and inclusion, respect and integrity, customer focus, and innovation. Get to know us at genpact.com and on LinkedIn, X, YouTube, and Facebook.
Furthermore, please do note that Genpact does not charge fees to process job applications, and applicants are not required to pay to participate in our hiring process in any other way. Examples of such scams include purchasing a 'starter kit,' paying to apply, or purchasing equipment or training.
Why join Genpact?
• Lead AI-first transformation – Build and scale AI solutions that redefine industries
• Make an impact – Drive change for global enterprises and solve business challenges that matter
• Accelerate your career – Gain hands-on experience, world-class training, mentorship, and AI certifications to advance your skills
• Grow with the best – Learn from top engineers, data scientists, and AI experts in a dynamic, fast-moving workplace
• Committed to ethical AI – Work in an environment where governance, transparency, and security are at the core of everything we build
• Thrive in a values-driven culture – Our courage, curiosity, and incisiveness - built on a foundation of integrity and inclusion - allow your ideas to fuel progress
Come join the 140,000+ coders, tech shapers, and growth makers at Genpact and take your career in the only direction that matters: Up.
Let’s build tomorrow together.