Skills: Java, Hive, Impala, Kafka, Spark, MapReduce, HDFS, HBase, and shell scripting
- Strong hands-on experience with Java, web services, and APIs
- Strong knowledge and hands-on programming experience in the Hadoop ecosystem
- Experience in data ingestion (batch and real-time), data encryption, and reconciliation
- Should have worked on large data sets and have experience with performance tuning and troubleshooting.
- Experience in all phases of the project life cycle.
- Should be a strong communicator and able to work independently with minimal involvement from client SMEs.
- Should be able to work in a team in a diverse, multi-stakeholder environment.
- Experience with NoSQL databases is preferred.
- Experience in the financial domain is preferred.
- Experience with, and a desire to work in, a global delivery environment.
- Bachelor’s degree or foreign equivalent required from an accredited institution. Will also consider three years of progressive experience in the specialty in lieu of every year of education.
- At least 4 years of experience with Information Technology.
- At least 4 years of design and development experience in Java and Big Data-related technologies.
- At least 3 years of hands-on design and development experience with Big Data technologies: Hive, Impala, Kafka, Spark, Java, MapReduce, HDFS, HBase, and shell scripting.
** U.S. citizens and those authorized to work in the U.S. are encouraged to apply. We are unable to sponsor at this time.
To apply for this job, please visit tinyurl.com.