Submitting more applications increases your chances of landing a job.
Keep exploring and applying to maximize your chances!
You are invited to participate in a survey designed to help researchers understand how best to match workers to the types of jobs they are searching for.
Would You Be Likely to Participate?
If selected, we will contact you via email with further instructions and details about your participation.
You will receive a $7 payment for completing the survey.
Purpose of the role
To build and maintain the systems that collect, store, process, and analyse data, such as data pipelines, data warehouses, and data lakes, to ensure that all data is accurate, accessible, and secure.
Accountabilities
Analyst Expectations
All colleagues will be expected to demonstrate the Barclays Values of Respect, Integrity, Service, Excellence and Stewardship – our moral compass, helping us do what we believe is right. They will also be expected to demonstrate the Barclays Mindset – to Empower, Challenge and Drive – the operating manual for how we behave.
Join us as an IFC - ETL Developer at Barclays, where you will spearhead the evolution of our infrastructure and deployment pipelines, driving innovation and operational excellence. You will harness cutting-edge technology to build and manage robust, scalable, and secure infrastructure, ensuring seamless delivery of our digital solutions.
To be successful as an IFC - ETL Developer you should have experience with:
Ab Initio: design, develop, and maintain complex ETL processes and data flows using the Ab Initio platform.
Unix: strong command-line proficiency for job scheduling, monitoring, and system administration.
Python: develop scripts for data validation, automation, and integration tasks.
Data warehouse design: design and implement enterprise data warehouse solutions using dimensional modeling, star/snowflake schemas, and ETL architectures.
Strong knowledge of data pipeline optimization, performance tuning, and data quality management, with a proven track record of delivering scalable data warehousing solutions.
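As an illustration of the Python scripting for data validation mentioned above, here is a minimal sketch; the record fields and rules are hypothetical assumptions, not part of the posting.

```python
# Hypothetical data-validation sketch: field names ("customer_id",
# "amount") and rules are illustrative, not from the job posting.

def validate_rows(rows):
    """Split a list of record dicts into (valid, errors).

    errors is a list of (row_index, [problem descriptions]).
    """
    valid, errors = [], []
    for i, row in enumerate(rows):
        problems = []
        # Rule 1: customer_id must be present and non-empty.
        if not row.get("customer_id"):
            problems.append("missing customer_id")
        # Rule 2: amount must be a non-negative number.
        amount = row.get("amount")
        if not isinstance(amount, (int, float)) or amount < 0:
            problems.append("amount must be a non-negative number")
        if problems:
            errors.append((i, problems))
        else:
            valid.append(row)
    return valid, errors


records = [
    {"customer_id": "C001", "amount": 120.5},
    {"customer_id": "", "amount": -3},
]
good, bad = validate_rows(records)
```

In practice such checks would typically run as a step in a scheduled pipeline, rejecting or quarantining bad records before they reach the warehouse.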
Some other highly valued skills include:
AWS: cloud-based data storage and processing solutions.
Hadoop: big data ecosystem for large-scale data processing and analytics.
Strong problem-solving skills and ability to multi-task.
Experience with integration of multiple data sources and toolsets.
The location of the role is Pune, IN.