Hadoop Developer

Hewlett Packard Enterprise

Hewlett Packard Enterprise creates new possibilities for technology to have a meaningful impact on people, businesses, governments and society. HPE brings together a portfolio that spans software, services and IT infrastructure to serve more than 1 billion customers in over 170 countries on six continents. HPE invents, engineers, and delivers technology solutions that drive business value, create social value, and improve the lives of our clients.

Learning does not only happen through training. Relationships are among the most powerful ways for people to learn and grow, and this is part of our HPE culture. In addition to working alongside talented colleagues, you will have many opportunities to learn through coaching and stretch assignment opportunities. You’ll be guided by feedback and support to accelerate your learning and maximize your knowledge. We also have a “reverse mentoring” program which allows us to share our knowledge and strengths across our multi-generation workforce.

Enterprise Services Information Technology Outsourcing (ITO) has a comprehensive Infrastructure Services portfolio that produces better business outcomes for our clients by reducing costs, enabling growth and managing risk. HPE ES ITO combines years of innovative technology and research with a broad portfolio and global reach to deliver superior value to our clients.

For more than 50 years, we have built a strong reputation of industry expertise and delivering the best client experience in the industry. Our clients rely on HPE as a partner they can trust to deliver mission-critical services and value to their enterprise.

Your key tasks:
Participates as a member of development teams as a Hadoop Admin

Full understanding of and experience with building and supporting large-scale Hadoop environments, including design, capacity planning, cluster setup, performance tuning and monitoring.

Provide Hadoop architecture consulting to customers in support of solution design activities.

Hadoop development and implementation.

Loading from disparate data sets.

Pre-processing using Spark, Hive and/or Pig.

Designing, building, installing, configuring and supporting Hadoop.

Translate complex functional and technical requirements into detailed design.

Perform analysis of vast data stores and uncover insights and data strategies.

Maintain security and data privacy.

Create high-speed querying and alerting from different data streams.

Be part of POC efforts to help build new Hadoop capabilities.

Test prototypes and oversee handover to other operational teams to include configuration managers, testers and others.

Propose and implement best practices/standards.

Provide mentoring and guidance to other Technical Consultants.

Represents the team to clients. Fully understands Hadoop, i.e. scripting, configuration, installation and capacity planning.

Demonstrates technical leadership and exerts influence outside of the immediate team.

Contributes to strategic direction for overall architecture and operations.

Consults with team members and other organizations, clients and vendors on complex issues.

Develop innovative solutions to complex business and technology problems.

Consult with customers to troubleshoot error conditions and migrate load processes from legacy systems.

Consult with customers on the optimal design of database projections.

Qualifications

What do we expect?

Education and experience required:
Bachelor's degree in Computer Science or equivalent experience

5+ years of total experience in DBA or application DBA activities

Strong understanding of the Hadoop ecosystem, including HDFS, MapReduce, HBase, ZooKeeper, Pig, Hadoop Streaming, Sqoop, Oozie and Hive

Experience installing, administering, and supporting operating systems and hardware in an enterprise environment (CentOS/RHEL).

Expertise in typical system administration and programming skills such as storage capacity management, performance tuning

Proficient in Bash and shell scripting (e.g. ksh)

Knowledge of programming and/or scripting, specifically Java, Python, Scala, Pig Latin, HiveQL, Spark SQL/RDDs and others.

Experience with MapReduce and YARN.

Experience in the setup, configuration and management of security for Hadoop clusters using Kerberos, and integration with LDAP/AD at an enterprise level

Experienced in solutioning and architecting Hadoop-based systems.

Knowledge of proper development techniques, e.g. Agile

Knowledge of workflow/schedulers like Oozie

At least 1 year of experience managing a Hadoop cluster

Able to write and tune SQL queries

Understands high availability concepts

Understands server tuning concepts (parameters, resources, contention, etc.)

Backup/restore/disaster recovery experience

Proven ability in managing databases of 1-10 TB in size (Vertica, Greenplum, Netezza, AsterData, ParAccel, Exadata)

Must be a US Citizen

Must have the ability to work in the US

Must be eligible to obtain Clearance (varying levels – Public Trust, Secret, Top Secret, etc.)

Preference for candidates located in/around the DC area (must be able to default to the Herndon office between client engagements/projects)

Ability and desire to travel frequently to client locations, as requested by HPE Management

Must have excellent verbal and written communication skills and feel comfortable in group settings

Preferred Additional Experience

Experience integrating with enterprise BI platforms such as Spotfire, Tableau, QlikView, MicroStrategy, Business Objects, Cognos, etc.

Hadoop Certification

Hardware knowledge (including network, disk subsystems, etc.)

Experienced with data warehousing concepts and techniques including extensive knowledge and use of star/snowflake schema

Unix/Linux skills (including scripting) and C++

Understanding of OLTP vs OLAP data administration needs

Good knowledge of and expertise with ODBC/JDBC data connectivity

Experience in ETL/ELT workflow management

QA/Testing process experience

Understanding and ability to participate in all phases of the SDLC including requirements gathering, business analysis, configuration management and quality control

Hewlett Packard Enterprise Values:

Partnership first: We believe in the power of collaboration – building long term relationships with our customers, our partners and each other

Bias for action: We never sit still – we take advantage of every opportunity

Innovators at heart: We are driven to innovate – creating both practical and breakthrough advancements

HPE is an EOE / Female / Minority / Individual with Disabilities / Protected Veteran Status

What do we offer?

Extensive social benefits, a competitive salary and shared values make Hewlett Packard Enterprise one of the world's most attractive employers. At HPE our goal is to provide equal opportunities, work-life balance, and constantly evolving career opportunities.

If you are looking for challenges in a pleasant and international work environment, then we definitely want to hear from you. Apply now below, or directly via our Careers Portal at www.hpe.com/careers

You can also find us on:
https://www.facebook.com/HPECareers

https://twitter.com/HPE_Careers

Job: Services

Primary Location: United States-Virginia-Herndon

Other Locations: United States-Maryland-Baltimore, United States-Texas-Plano, United States-Michigan-Pontiac, United States-Texas-El Paso

Schedule: Full-time

Shift: Day Job

Travel: No

Job Posting: Sep 28, 2016

EEO Tagline: Hewlett Packard Enterprise is EEO F/M/Protected Veteran/Individual with Disabilities
