Hadoop Developer

Blue Cross Blue Shield of IL, MT, NM, OK & TX

HCSC is committed to diversity in the workplace and to providing equal opportunity and affirmative action to employees and applicants.
If you are an individual with a disability or a disabled veteran and need an accommodation or assistance in either using the Careers website or completing the application process, you can email us here to request reasonable accommodations.
Please note that only requests for accommodations in the application process will be returned. All applications, including resumes, must be submitted through HCSC’s Careers website online application process. If you have general questions regarding the status of an existing application, navigate to “my account” and click on “View your job submissions”.
Job Purpose:

This position is responsible for developing, integrating, testing, and maintaining existing and new applications. It requires proficiency in one or more programming languages and familiarity with one or more development methodologies / delivery models.

Required Job Qualifications:

*Bachelor's Degree and 2 years of Information Technology experience, OR Technical Certification and/or College Courses and 4 years of Information Technology experience, OR 6 years of Information Technology experience.

*Ability to sit and perform computer entry for the entire work shift, as required.

*Ability to manage workload, multiple priorities, and conflicts with customers, employees, and managers, as applicable.

*Rapid prototyping.

Must have hands-on experience designing, developing, and maintaining software solutions on a Hadoop cluster.

Must have strong experience with UNIX shell scripting, Sqoop, Eclipse, and HCatalog.
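
A minimal sketch of the kind of Sqoop work this implies, assuming Sqoop 1.x on the classpath and driving an import from Java; the connection string, credentials, table, and paths are hypothetical placeholders:

    import org.apache.sqoop.Sqoop;

    public class SqoopImportSketch {
        public static void main(String[] args) {
            // Hypothetical import: pull a "members" table from MySQL into HDFS.
            String[] sqoopArgs = {
                "import",
                "--connect", "jdbc:mysql://dbhost:3306/claims",
                "--username", "etl_user",
                "--password-file", "/user/etl/.sqoop.pwd",
                "--table", "members",
                "--target-dir", "/data/raw/members",
                "--num-mappers", "4"
            };
            int exitCode = Sqoop.runTool(sqoopArgs);
            System.exit(exitCode);
        }
    }

In day-to-day work the same import would typically be driven from a UNIX shell script via the sqoop command-line tool.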

Must have experience with NoSQL databases such as HBase, MongoDB, or Cassandra.
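
A minimal sketch of HBase client usage, assuming the HBase 1.x+ Java client; the table name, row key, and column family are hypothetical:

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.hbase.HBaseConfiguration;
    import org.apache.hadoop.hbase.TableName;
    import org.apache.hadoop.hbase.client.Connection;
    import org.apache.hadoop.hbase.client.ConnectionFactory;
    import org.apache.hadoop.hbase.client.Get;
    import org.apache.hadoop.hbase.client.Put;
    import org.apache.hadoop.hbase.client.Result;
    import org.apache.hadoop.hbase.client.Table;
    import org.apache.hadoop.hbase.util.Bytes;

    public class HBaseSketch {
        public static void main(String[] args) throws Exception {
            Configuration conf = HBaseConfiguration.create();
            try (Connection conn = ConnectionFactory.createConnection(conf);
                 Table table = conn.getTable(TableName.valueOf("member_events"))) {
                // Write one cell: row key, column family "d", qualifier "status".
                Put put = new Put(Bytes.toBytes("member#1001"));
                put.addColumn(Bytes.toBytes("d"), Bytes.toBytes("status"),
                        Bytes.toBytes("active"));
                table.put(put);

                // Read the same cell back.
                Result result = table.get(new Get(Bytes.toBytes("member#1001")));
                byte[] status = result.getValue(Bytes.toBytes("d"), Bytes.toBytes("status"));
                System.out.println(Bytes.toString(status));
            }
        }
    }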

Must have experience developing Pig scripts, HiveQL, and UDFs for analyzing structured, semi-structured, and unstructured data flows.
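
A minimal sketch of a Hive UDF, using the classic org.apache.hadoop.hive.ql.exec.UDF base class; the function name and normalization logic are hypothetical:

    import org.apache.hadoop.hive.ql.exec.UDF;
    import org.apache.hadoop.io.Text;

    // Hypothetical UDF: normalize free-text codes to trimmed upper case.
    public class NormalizeCode extends UDF {
        public Text evaluate(Text input) {
            if (input == null) {
                return null;
            }
            return new Text(input.toString().trim().toUpperCase());
        }
    }

Packaged in a JAR, a UDF like this would be registered with ADD JAR and CREATE TEMPORARY FUNCTION, then invoked from HiveQL like any built-in function.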

Must have working experience developing MapReduce programs that run on the Hadoop cluster, using Java or Python.
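
A minimal sketch of a Java MapReduce job (the canonical word count), assuming the org.apache.hadoop.mapreduce API:

    import java.io.IOException;
    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.Path;
    import org.apache.hadoop.io.IntWritable;
    import org.apache.hadoop.io.LongWritable;
    import org.apache.hadoop.io.Text;
    import org.apache.hadoop.mapreduce.Job;
    import org.apache.hadoop.mapreduce.Mapper;
    import org.apache.hadoop.mapreduce.Reducer;
    import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
    import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

    public class WordCount {
        // Mapper: emit (token, 1) for each whitespace-separated token.
        public static class TokenMapper
                extends Mapper<LongWritable, Text, Text, IntWritable> {
            private static final IntWritable ONE = new IntWritable(1);
            private final Text word = new Text();

            @Override
            protected void map(LongWritable key, Text value, Context context)
                    throws IOException, InterruptedException {
                for (String token : value.toString().split("\\s+")) {
                    if (!token.isEmpty()) {
                        word.set(token);
                        context.write(word, ONE);
                    }
                }
            }
        }

        // Reducer (also used as combiner): sum the counts per token.
        public static class SumReducer
                extends Reducer<Text, IntWritable, Text, IntWritable> {
            @Override
            protected void reduce(Text key, Iterable<IntWritable> values, Context context)
                    throws IOException, InterruptedException {
                int sum = 0;
                for (IntWritable v : values) {
                    sum += v.get();
                }
                context.write(key, new IntWritable(sum));
            }
        }

        public static void main(String[] args) throws Exception {
            Job job = Job.getInstance(new Configuration(), "word count");
            job.setJarByClass(WordCount.class);
            job.setMapperClass(TokenMapper.class);
            job.setCombinerClass(SumReducer.class);
            job.setReducerClass(SumReducer.class);
            job.setOutputKeyClass(Text.class);
            job.setOutputValueClass(IntWritable.class);
            FileInputFormat.addInputPath(job, new Path(args[0]));
            FileOutputFormat.setOutputPath(job, new Path(args[1]));
            System.exit(job.waitForCompletion(true) ? 0 : 1);
        }
    }

An equivalent Python version would typically run on the cluster via Hadoop Streaming.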

Must have knowledge of cloud computing infrastructure (e.g., Amazon Web Services EC2) and of design considerations for scalable, distributed systems.

Must demonstrate Hadoop best practices.

Must demonstrate broad knowledge of technical solutions, design patterns, and code for medium-to-complex applications deployed in Hadoop production.

Must have working experience with data warehousing and Business Intelligence systems.

Must participate in design reviews, code reviews, unit testing, and integration testing.

Assume ownership and accountability for the assigned deliverables through all phases of the development lifecycle.

*SDLC Methodology (Agile / Scrum / Iterative Development).

*System performance management.

*Systems change / configuration management.

*Business requirements management.

*Problem solving / analytical thinking.

*Creative thinking.

*Ability to execute.

Preferred Job Qualifications:

*Bachelor's Degree in Computer Science or Information Technology.

Candidate must be willing to relocate to any of these locations: Chicago, IL; Dallas, TX; Helena, MT; Albuquerque, NM; or Tulsa, OK.

No H-1B visa candidates; US Citizens or Green Card Holders only.

To apply for this job, please visit tinyurl.com.