Big Data Architect

Maryland Heights, MO 63043 (posted 2016-10-26)

Charter Communications

Responsible for the design, development, and implementation of Big Data projects. Oversee, perform, and manage Big Data projects and operations. Resolve issues regarding development, operations, implementations, and system status. Research and recommend options for department direction on Big Data systems, automated solutions, and server-related topics. Provide technical leadership on complex projects and research new technology for use by the department.


• Implement end-to-end Hadoop solutions with a deep understanding of the Hadoop ecosystem

• Integrate technical functionality (e.g. scalability, security, performance, data recovery, reliability, etc.)

• Research, evaluate, architect, and deploy new tools, frameworks, and patterns to build sustainable Big Data platforms

• Design and implement complex highly scalable statistical models and solutions that comply with security requirements

• Expert knowledge of the entire Hadoop ecosystem, including HDFS, Hive, YARN, Flume, Oozie, Kafka, Storm, Spark, and Spark Streaming, as well as NoSQL databases

• Good understanding of Hadoop Administration

• Define and develop APIs for integration with various data sources in the enterprise

• Actively collaborate with other architects and developers in developing client solutions

• Architect solutions in building frameworks for data ingestion, loading and transformation

• Lead innovations by exploring, investigating, recommending, benchmarking and implementing new technologies for the Big Data platform.

• Ensure that solutions are delivered by working closely with various teams and organizations within the company including on-site and off-shore team members, production support, project managers, and platform and application support teams.

• Collaborate with external business partners such as vendors and technical consultants to provide support for system issues or introduce new solutions.

• Design and document system standards and operating procedures.

• Provide technical roadmaps and Hadoop solutions to better position the business to meet new technical and competition challenges.

• Review and improve existing system solutions and processes towards best practices.

• Provide technical leadership and guidance on various projects and initiatives.

• Tune performance for large-scale data volumes and transformations; translate high-level business requirements into detailed designs

• Aptitude to identify, create, and use best practices and reusable elements

• Ability to solve practical problems and deal with a variety of concrete variables in situations where only limited standardization exists

• Strong desire to learn a variety of technologies and processes with a “can do” attitude

• Experience guiding and mentoring 2-5 developers on various tasks

• Own a complete functional area, from analysis and design through development and ongoing support


Skills / Abilities and Knowledge

Ability to read, write, speak and understand English.

Ability to communicate orally and in writing in a clear and straightforward manner

Ability to communicate with all levels of management and company personnel

Ability to handle multiple projects and tasks

Ability to make decisions and solve problems while working under pressure

Ability to prioritize and organize effectively

Ability to show judgment and initiative and to accomplish job duties

Ability to use a personal computer and software applications (e.g., word processing, spreadsheets)

Ability to work independently

Ability to work with others to resolve problems, handle requests or situations

Ability to effectively consult with department managers and leaders


• BS in Information Technology, Computer Science, MIS or related field or equivalent experience.


• 10+ years of hands-on experience in handling large-scale software development and integration projects.

• 6+ years of experience working with multiple projects through the entire SDLC in data warehouse environments, with solid SQL skills

• 3+ years of hands-on experience with the technologies in the Hadoop ecosystem like Hadoop, HDFS, Spark, MapReduce, Pig, Hive, Flume, Sqoop, Cloudera Impala, Zookeeper, Oozie, Hue, Kafka

• Experience in Spark and Kerberos authentication/authorization, and a clear understanding of cluster security

• Proven ability to make decisions in technology selection and implementation approach based on long-term strategic objectives, while taking into consideration short-term implications for ongoing or planned implementations

• Demonstrated ability to apply technology in solving business problems and to communicate with both technical and non-technical audiences

• Exposure to high availability configurations, Hadoop cluster connectivity and tuning, and Hadoop security configurations

• Good understanding of operating systems (Unix/Linux) and networks, plus system administration experience

• Good understanding of Change Management Procedures

• Experience managing LDAP, Active Directory or Kerberos is desirable

• Experience with hardware selection and capacity planning

• Experience with Java, Python, Pig, Hive, or other languages a plus


• Experience with NoSQL databases like MongoDB, Cassandra etc.

• Experience with cloud technologies (AWS)

• Experience in programming languages like Java, Scala etc.

• Certification in Hadoop Operations or Cassandra is desired


Office environment

EOE Race/Sex/Vet/Disability

Charter is an equal opportunity employer that complies with the laws and regulations set forth in the EEO Is the Law poster.

Charter is committed to diversity, and values the ways in which we are different.

To apply for this job, please visit