Key Responsibilities
− CI/CD Pipeline Development:
Design and maintain CI/CD pipelines for data workflows and machine learning jobs using tools like Azure DevOps, Jenkins, or GitHub Actions. For Databricks, implement automated deployment of notebooks, jobs, and Delta Live Tables, ensuring version control and environment consistency (an illustrative deployment sketch follows this list).
− Pipeline Monitoring & Reliability:
Implement monitoring solutions for data pipelines and Databricks jobs, tracking latency, throughput, and failures. Configure auto-scaling clusters and recovery strategies to guarantee high availability and resilience.
− Secrets & Environment Management:
Securely manage credentials, API keys, and Databricks Secret Scopes across development, staging, and production environments. Apply best practices for role-based access control (RBAC) and compliance (see the secret-scope sketch after this list).
− Deployment Automation:
Automate deployment of data infrastructure, Databricks clusters, and ML models using Infrastructure-as-Code (Terraform) and orchestration tools. Ensure reproducibility and reduce manual intervention.
− Observability & Alerting:
Set up end-to-end observability for pipelines using Databricks monitoring dashboards, integrate with Prometheus, Grafana, or cloud-native tools, and configure proactive alerting for SLA breaches and anomalies (see the observability sketch after this list).
− Collaboration & Documentation:
Work closely with Data Engineers, AI Engineers, and Platform teams to ensure smooth integration. Document Databricks workflows, cluster configurations, and CI/CD processes for transparency and operational excellence.
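To make the CI/CD responsibility concrete, here is a minimal sketch of deploying a Databricks job definition from a pipeline step via the Jobs 2.1 REST API. The environment variables, the job.json file name, and the match-by-name convention are assumptions for illustration, not part of this role's actual tooling.

```python
"""Sketch: idempotent Databricks job deployment from a CI step (Jobs API 2.1).

Assumptions: DATABRICKS_HOST and DATABRICKS_TOKEN are set in the CI
environment, and the job definition lives in the repo as job.json.
"""
import json
import os

import requests

HOST = os.environ["DATABRICKS_HOST"]  # e.g. https://adb-123.azuredatabricks.net
HEADERS = {"Authorization": f"Bearer {os.environ['DATABRICKS_TOKEN']}"}


def deploy_job(spec_path: str) -> int:
    """Create the job if absent, otherwise reset it to match the spec."""
    with open(spec_path) as f:
        spec = json.load(f)

    # The Jobs 2.1 list endpoint supports filtering by job name.
    resp = requests.get(f"{HOST}/api/2.1/jobs/list",
                        headers=HEADERS, params={"name": spec["name"]})
    resp.raise_for_status()
    existing = resp.json().get("jobs", [])

    if existing:
        job_id = existing[0]["job_id"]
        resp = requests.post(f"{HOST}/api/2.1/jobs/reset", headers=HEADERS,
                             json={"job_id": job_id, "new_settings": spec})
        resp.raise_for_status()
        return job_id

    resp = requests.post(f"{HOST}/api/2.1/jobs/create", headers=HEADERS, json=spec)
    resp.raise_for_status()
    return resp.json()["job_id"]


if __name__ == "__main__":
    print("Deployed job", deploy_job("job.json"))
```

Running this from a pipeline step after a successful test stage gives versioned, repeatable job deployments instead of manual edits in the workspace UI.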
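For the secrets-management bullet, a minimal sketch of pulling a credential from a Databricks Secret Scope at run time rather than hard-coding it; it also shows the JDBC ingestion pattern referenced under Preferred Qualifications. The scope, key, and connection details are illustrative placeholders, and `dbutils`/`spark` are the globals the Databricks runtime injects into notebook and job contexts (so this snippet runs only on Databricks).

```python
# Sketch: reading a credential from a Databricks Secret Scope at run time.
# "prod-warehouse" / "jdbc-password" and the JDBC URL are illustrative only;
# `dbutils` and `spark` are provided by the Databricks runtime.
jdbc_password = dbutils.secrets.get(scope="prod-warehouse", key="jdbc-password")

orders = (
    spark.read.format("jdbc")
    .option("url", "jdbc:postgresql://db.example.internal:5432/analytics")
    .option("dbtable", "public.orders")
    .option("user", "etl_user")
    .option("password", jdbc_password)  # value is redacted in notebook output
    .load()
)
```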
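For the monitoring and observability bullets, a minimal sketch of exporting pipeline metrics in Prometheus format using the prometheus_client library (an assumed dependency; metric names and the placeholder workload are illustrative). Alert rules for SLA breaches would then live in Prometheus or Grafana, not in the pipeline code itself.

```python
"""Sketch: exposing pipeline run metrics for Prometheus to scrape."""
import random
import time

from prometheus_client import Counter, Gauge, start_http_server

RUNS = Counter("pipeline_runs_total", "Pipeline runs by outcome", ["status"])
LATENCY = Gauge("pipeline_last_run_seconds", "Duration of the last run")


def run_pipeline() -> None:
    start = time.monotonic()
    try:
        time.sleep(random.uniform(0.1, 0.5))  # placeholder for real work
        RUNS.labels(status="success").inc()
    except Exception:
        RUNS.labels(status="failure").inc()
        raise
    finally:
        LATENCY.set(time.monotonic() - start)


if __name__ == "__main__":
    start_http_server(8000)  # metrics served at http://localhost:8000/metrics
    while True:
        run_pipeline()
        time.sleep(5)
```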
Preferred Qualifications
Experience: 9+ years
− Databricks Expertise:
Hands-on experience with Databricks Workflows, Delta Lake, Unity Catalog, and MLflow for model tracking and deployment (an MLflow tracking sketch follows this list).
− Programming Skills:
Proficiency in Python (PySpark) and SQL for data processing and transformation. Ability to optimize queries for large-scale analytics (see the PySpark/Delta sketch after this list).
− Database Knowledge:
Familiarity with Oracle, MySQL, PostgreSQL, and integration with Databricks for ingestion and analytics.
− Orchestration Tools:
Experience with Airflow, Prefect, Dagster, or Databricks Workflows for scheduling and monitoring complex pipelines (see the Airflow DAG sketch after this list).
− Data Transformation & Modeling:
Understanding of data modeling principles, Delta Lake architecture, and performance tuning for big data environments.
− CI/CD Tools:
Proficiency in Azure DevOps, GitHub Actions, Jenkins, and integration with Databricks for automated deployments.
− Infrastructure-as-Code:
Experience with Terraform or CloudFormation for provisioning Databricks resources and managing cloud infrastructure.
− Cloud Platforms:
Strong knowledge of AWS and Databricks, including S3 integration, IAM roles, and secure data access patterns.
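To make the MLflow qualification concrete, a minimal tracking sketch. The experiment path, parameter, and metric values are illustrative; on Databricks, MLflow logs to the workspace tracking server without extra configuration.

```python
# Sketch: MLflow experiment tracking with illustrative values.
import mlflow

mlflow.set_experiment("/Shared/demand-forecast")  # hypothetical experiment path

with mlflow.start_run(run_name="baseline"):
    mlflow.log_param("learning_rate", 0.01)
    for epoch, loss in enumerate([0.92, 0.71, 0.58]):  # stand-in training loop
        mlflow.log_metric("loss", loss, step=epoch)
```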
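A minimal PySpark and Delta Lake sketch tying together the programming, data-transformation, and AWS bullets. The S3 paths and schema are illustrative, and it assumes a Spark session with Delta Lake support and S3 credentials already configured, as on a Databricks cluster.

```python
"""Sketch: S3 ingestion, transformation, and a partitioned Delta write."""
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("orders-etl").getOrCreate()

# Ingest raw CSV landed in S3 (hypothetical bucket and prefix).
raw = (spark.read.option("header", "true")
       .csv("s3://example-lake/raw/orders/"))

# Basic transformation: type casting and a daily revenue aggregate.
daily = (raw.withColumn("amount", F.col("amount").cast("double"))
            .groupBy("order_date")
            .agg(F.sum("amount").alias("revenue")))

# Write as a Delta table, partitioned for downstream analytics.
(daily.write.format("delta")
      .mode("overwrite")
      .partitionBy("order_date")
      .save("s3://example-lake/curated/daily_revenue/"))
```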
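Finally, for the orchestration bullet, a minimal Airflow DAG that triggers an existing Databricks job nightly. It assumes the apache-airflow-providers-databricks package and an Airflow connection named databricks_default; the DAG id, schedule, and job_id are placeholders.

```python
"""Sketch: scheduling a Databricks job from Airflow."""
from datetime import datetime

from airflow import DAG
from airflow.providers.databricks.operators.databricks import (
    DatabricksRunNowOperator,
)

with DAG(
    dag_id="nightly_orders_etl",
    start_date=datetime(2024, 1, 1),
    schedule="0 2 * * *",  # Airflow 2.4+; use schedule_interval on older versions
    catchup=False,
) as dag:
    run_job = DatabricksRunNowOperator(
        task_id="run_databricks_job",
        databricks_conn_id="databricks_default",
        job_id=123,  # placeholder: the id of the deployed Databricks job
    )
```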