Big Data Architect

Seattle, WA 98109 (Westlake area) · Posted 2016-10-27

Fred Hutchinson Cancer Research Center

Summary
The Hutch Data Commonwealth (HDC) is a transformative organization that aims to bring novel big data resources and analytics to the fingertips of all Hutch investigators. The person in this role will be a key member of the team that is defining, designing, implementing, and ultimately delivering the big data architecture that will allow scientists at Fred Hutchinson Cancer Research Center to prevent and cure disease.

Working independently, the Data Solutions Architect (Data Engineer V) will be responsible for identifying and resolving issues that impact the flow and analysis of data at Fred Hutch. The incumbent will research, design, develop, and socialize creative solutions to those problems. While some of these issues may be addressed with traditional database technologies, the focus of this position is building out the new HDC big data platform.

The successful incumbent will be an expert-level individual contributor with the ability to execute highly complex, specialized projects and, if needed, make significant departures from traditional approaches. The incumbent will possess well-developed skills in the design, development, implementation, and management of data solutions, and will be expected to clearly articulate technical information to all levels of the organization.

Reporting
The Data Solutions Architect reports to leadership in the HDC.

Responsibilities

  • Lead the research and development of data solutions that meet the big data needs of the HDC, ensuring system scalability, elasticity, security, performance and reliability.
  • Design and deploy solutions to ingest data, manage metadata, process streaming data, improve data discoverability, and support the analysis of data in the HDC big data platform (a minimal sketch of such a pipeline follows this list).
  • Serve as a technical mentor to staff new to big data technologies.
  • Develop prototypes and pilot new data solutions to determine their viability for adoption.
  • Provide input on the development and enforcement of data standards, data governance, and data management processes.
  • Lead efforts to improve processes and meet strategic goals for data management for the HDC.
  • Provide high-level support for operational issues in the HDC big data platform. Communicate with vendors, cloud providers and Center IT systems engineers to ensure the smooth functioning, optimization and growth of the platform.
  • Prepare and present information on data solutions to all levels of the organization.
  • Independently organize and manage multiple complex projects simultaneously.
  • Prepare written technical design specifications, flow charts and data flow diagrams as needed for new and existing data solutions.
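
As an illustration of the ingestion and streaming responsibilities above, the following is a minimal sketch of one way such a pipeline might look: a Spark Structured Streaming job that reads events from a Kafka topic and appends them to durable storage. This is a sketch under assumptions, not an HDC implementation; the broker address, topic, job name, and paths are all hypothetical placeholders.

```python
# A minimal sketch, assuming a PySpark environment with the Kafka connector
# available. Broker address, topic, and storage paths are hypothetical.
from pyspark.sql import SparkSession
from pyspark.sql.functions import col

spark = (
    SparkSession.builder
    .appName("hdc-ingest-sketch")  # hypothetical job name
    .getOrCreate()
)

# Subscribe to a raw-events topic; Kafka delivers key/value as binary columns.
events = (
    spark.readStream
    .format("kafka")
    .option("kafka.bootstrap.servers", "broker:9092")  # hypothetical broker
    .option("subscribe", "raw-events")                 # hypothetical topic
    .load()
)

# Cast the binary payload to text and keep the event timestamp.
parsed = events.select(
    col("key").cast("string").alias("key"),
    col("value").cast("string").alias("payload"),
    col("timestamp"),
)

# Continuously append parsed records to columnar storage; the checkpoint
# directory lets the stream recover its progress after a failure.
query = (
    parsed.writeStream
    .format("parquet")
    .option("path", "/data/landing/raw-events")        # hypothetical path
    .option("checkpointLocation", "/chk/raw-events")   # hypothetical path
    .outputMode("append")
    .start()
)

query.awaitTermination()
```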

Qualifications
Minimum Qualifications

  • 5 years of hands-on experience with the Hadoop ecosystem, including technical work with some combination of HDFS, Hive, Spark, HBase, NiFi, Ambari, Kafka, and Sqoop (a brief sketch combining several of these follows this list).
  • Proven track record of researching, implementing and socializing big data solutions.
  • In-depth knowledge of and expertise in data technologies, along with solid programming, design and system analysis skills.
  • Minimum of 10 years of experience with various data platforms and data structures.
  • BS in Computer Science, Engineering or equivalent experience.
  • Expertise in one or more programming languages such as Java, Scala, R or Python.
  • Sharp analytical abilities and proven design and problem-solving skills.
  • Proven record of rapid, positive contribution to the work of the team.
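
As a small illustration of the Hadoop-ecosystem experience named above, the sketch below combines three of the listed technologies: Spark reads a dataset from HDFS and registers it as a Hive table so other tools can query it. This is a generic sketch, not Hutch code; the HDFS path, database, and table names are hypothetical.

```python
# A minimal sketch, assuming Spark built with Hive support and access to a
# Hive metastore. The HDFS path, database, and table names are hypothetical.
from pyspark.sql import SparkSession

spark = (
    SparkSession.builder
    .appName("hdfs-to-hive-sketch")
    .enableHiveSupport()  # connect Spark SQL to the Hive metastore
    .getOrCreate()
)

# Read a delimited dataset directly from HDFS (hypothetical path).
df = spark.read.csv("hdfs:///data/samples.csv", header=True, inferSchema=True)

# Register it as a managed Hive table so other ecosystem tools can query it.
spark.sql("CREATE DATABASE IF NOT EXISTS research")
df.write.mode("overwrite").saveAsTable("research.samples")

# The table is now queryable through Spark SQL (or Hive itself).
spark.sql("SELECT COUNT(*) AS n FROM research.samples").show()
```
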
Preferred Qualifications

  • Graduate degree in computer science, engineering, or a related field.
  • Demonstrated leadership in driving operational excellence and best practices.
  • Demonstrated presentation skills.
  • Expertise in gathering and documenting analytic/reporting requirements.
  • Previous experience in a research/educational setting and/or support of clinical trials.