Skills: Java, Hive, Impala, Kafka, Spark, MapReduce, HDFS, HBase, and shell scripting
- Strong hands-on experience with Java, web services, and APIs
- Strong knowledge and hands-on programming experience in the Hadoop ecosystem
- Experience in data ingestion (batch and real-time), data encryption, and reconciliation
- Should have worked on large data sets, with experience in performance tuning and troubleshooting
- Experience in all life cycle phases of a project
- Should be a strong communicator, able to work independently with minimal involvement from client SMEs
- Should be able to work in a team within a diverse, multi-stakeholder environment
- Experience with NoSQL databases is preferred
- Experience in the financial domain is preferred
- Experience and desire to work in a global delivery environment
The job entails sitting as well as working at a computer for extended periods of time. Should be able to communicate by telephone, email, or face to face. Travel may be required as per job requirements.
Qualifications
- Bachelor’s degree or foreign equivalent required from an accredited institution. Will also consider three years of progressive experience in the specialty in lieu of every year of education.
- At least 4 years of experience with Information Technology.
- At least 4 years of design and development experience in Java and Big Data related technologies
- At least 3 years of hands-on design and development experience with Big Data technologies: Hive, Impala, Kafka, Spark, Java, MapReduce, HDFS, HBase, and shell scripting
** U.S. citizens and those authorized to work in the U.S. are encouraged to apply. We are unable to sponsor at this time.
This is a full-time, permanent job opportunity.
Only U.S. citizens, Green Card holders, and TN Visa, GC-EAD, H4-EAD, and L2-EAD holders may apply.
No OPT-EAD or H-1B consultants, please.
Please mention your visa status in your email or resume.
To apply for this job please visit tinyurl.com.