Submitting more applications increases your chances of landing a job.
Keep exploring and applying to maximize your chances!
Looking for employers with a proven track record of hiring women?
Click here to explore opportunities now!

You are invited to participate in a survey designed to help researchers understand how best to match workers to the types of jobs they are searching for.
Would You Be Likely to Participate?
If selected, we will contact you via email with further instructions and details about your participation.
You will receive a $7 payout for answering the survey.
In this role, you'll work in one of our IBM Consulting Client Innovation Centers (Delivery Centers), where we deliver deep technical and industry expertise to a wide range of public and private sector clients around the world. Our delivery centers offer our clients locally based skills and technical expertise to drive innovation and adoption of new technology.
As a Big Data Engineer, you will develop, maintain, evaluate, and test big data solutions. You will be involved in data engineering activities such as creating pipelines and workflows from source to target, and implementing solutions that address the client's needs.
Your primary responsibilities include:
* Design, build, optimize, and support new and existing data models and ETL processes based on our clients' business requirements.
* Build, deploy, and manage data infrastructure that can adequately handle the needs of a rapidly growing, data-driven organization.
* Coordinate data access and security so that data scientists and analysts can easily access data whenever they need to.
* Must have 5+ years of experience in Big Data: Hadoop, Spark, Scala, Python
* HBase, Hive
* Good to have: AWS (S3, Athena, DynamoDB, Lambda), Jenkins, Git
* Experience developing Python and PySpark programs for data analysis, including building a custom framework for generating rules (similar to a rules engine).
* Experience developing Python code to gather data from HBase and designing solutions implemented with PySpark, using Apache Spark DataFrames/RDDs to apply business transformations and HiveContext objects to perform read/write operations.
* Understanding of DevOps.
* Experience in building scalable end-to-end data ingestion and processing solutions
* Experience with object-oriented and/or functional programming languages, such as Python, Java and Scala
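To illustrate the kind of "custom framework for generating rules" mentioned in the requirements above, here is a minimal sketch in plain Python. All names here (`Rule`, `RulesEngine`, the example fields) are hypothetical, chosen for illustration only, and not drawn from any actual IBM codebase:

```python
from dataclasses import dataclass
from typing import Callable

# Hypothetical minimal rules engine: each rule pairs a predicate
# (should this rule fire on this record?) with an action
# (how should the record be transformed?).
@dataclass
class Rule:
    name: str
    condition: Callable[[dict], bool]
    action: Callable[[dict], dict]

class RulesEngine:
    def __init__(self) -> None:
        self.rules: list[Rule] = []

    def add_rule(self, rule: Rule) -> None:
        self.rules.append(rule)

    def apply(self, record: dict) -> dict:
        # Apply every matching rule, in registration order.
        for rule in self.rules:
            if rule.condition(record):
                record = rule.action(record)
        return record

# Example use: flag records with a missing field before loading downstream.
engine = RulesEngine()
engine.add_rule(Rule(
    name="flag_missing_email",
    condition=lambda r: not r.get("email"),
    action=lambda r: {**r, "valid": False},
))

result = engine.apply({"id": 1, "email": ""})
print(result)  # {'id': 1, 'email': '', 'valid': False}
```

In a data-pipeline setting, a function like `engine.apply` would typically be wrapped in a Spark UDF or applied per-row in a DataFrame transformation, so that the same rule definitions drive both validation and transformation.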