Hadoop Developer (Principal Software Engineer) - IoT BigData Jobs


CareFusion

Job Description

Hadoop Developer (Principal Software Engineer)

Life-changers work here

At CareFusion, we create innovative ways to help our customers improve patient care. We rely on bold and inspired employees who share our commitment to helping solve some of healthcare's most critical challenges.

CareFusion is now part of Becton Dickinson, a global medical technology company focused on addressing many of the world's most pressing and evolving health needs. With our combined technology and expertise, we will become a global leader helping to transform the quality and cost of care for patients and clinicians worldwide. Join us in our mission to improve the future of healthcare and help all people lead healthy lives.

Job Title: Hadoop Developer (Principal Software Engineer)

CareFusion Business Description

Department Overview

The Data & Analytics department is responsible for developing and maintaining commercial-grade software products that comply with and support the overall architectural and product strategy. Using industry best practices and the currently defined technology stack, the department delivers high-quality, highly stable, and robust applications that perform to present and future expectations.


The Hadoop Developer will be responsible for designing and developing data-related solutions (databases, ETL systems, code, scripts, data models, reports, documentation) and supporting various products developed by D&A. The position requires extensive hands-on experience developing data-related code (Java, Pig, Hive, SQL, Flume, Kafka, Python, ETL code) and associated code in a Linux/Windows environment.

Specific Duties, Activities, and Responsibilities:

  • Design big data solution components, including data analysis and design, HDFS layout, stream ingestion, data transformation and integration, data quality checks, software deployments, configuration management, automated testing, maintenance, troubleshooting, and analytic support.
  • Design, plan, and develop programs that automatically extract, transform, and load data between data sources when working with large data sets (TB+ range).
  • Understand relational and non-relational database design principles, including normalization and de-normalization, and be able to read and interpret source and target systems.
  • Provide documentation of requirements in the form of architecture diagrams, data models, source to target mappings, data dictionaries and detailed design documents.
  • Maintain developed source code in the repository for all databases and keep all build programs up to date.
  • Diagnose and resolve performance issues.
  • Ensure adherence to corporate standards, build-process guidelines, maintainability, and unit testing.
  • Work with the Operations group to design and implement maintenance routines, automated monitoring solutions, and a backup and disaster recovery strategy.


Qualifications:

  • BS in Computer Science or Information Systems.
  • 5+ years’ experience programming in a Java or .NET environment.
  • 5+ years’ experience working with database platforms in large data environments (TB+).
  • 2+ years’ experience working with non-traditional data platforms (Hadoop, Cassandra, HBase, Azure, etc.).
  • Strong written and verbal communication skills
  • Production delivery experience in Big Data-related technologies: Hadoop, Kafka, Flume, Oozie, HBase, Java/Linux
  • Strong SQL skills
  • Strong understanding of MapReduce and distributed file systems
  • Database design skills including normalization and data warehouse design
  • Performance tuning and optimization
  • Strong analytical skills
  • Troubleshooting skills
  • Experience with Cloudera distribution, a plus
  • Experience in the healthcare industry, a plus


Technology Stack:

  • Hadoop 2.0/Hive/Pig/Impala/HBase
  • Java/Linux
  • MySQL/Microsoft SQL Server
  • Talend/Pentaho

Physical/Mental Requirements:

  • Ability to communicate clearly both verbally and in writing.
  • Ability to analyze complex application and business operational issues.

The Developer will additionally be responsible for the following tasks:

  • Ensuring appropriate and adequate unit test cases are created and executed
  • Ensuring appropriate and detailed documentation for developed modules
  • Supporting data analysts
  • Estimation and timely completion of tasks
  • Following quality assurance processes
  • Following and enhancing departmental development guidelines

CareFusion is an equal opportunity employer. All qualified applicants will receive consideration for employment without regard to race, color, age, religion, sex, gender identity, sexual orientation, national origin, genetic information, disability status, veteran status, or any other characteristic protected by law.


Requisition ID 160500YW
Primary Location California-San Diego

Travel Yes, 5% of the Time
Schedule Full-time
Job Posting Sep 15, 2016, 1:51:51 PM

To apply for this job, please visit tinyurl.com.