Samsung Electronics America
Samsung Digital's charter is to build the industry-leading e-commerce platform for Samsung.com, enabling a best-in-class consumer experience and a seamless purchase experience. We're committed to bringing passion and customer focus to the business of enterprise applications. We work hard in a fast-paced environment, challenge the status quo, and relentlessly strive for excellence, yet we like to have a good time in a fun-filled atmosphere.
In today's fast-evolving technology world, one aspect remains constant – reliance on data to drive the next wave of innovation. Strategic Analytics is Samsung's Center of Excellence for driving the adoption of data-driven decision making and product development across the company. The team's core focus is developing best-in-class solutions that provide Samsung's marketing and service organizations with a 360-degree view of Samsung's customers.
Strategic Analytics is powering a paradigm shift at Samsung and across the global industry. We are looking for highly technical team members who are passionate about data, have the rigor needed to solve billion-dollar problems, and possess an innate entrepreneurial spirit to explore the uncharted. Strategic Analytics combines the engineering backbone of a best-in-class Big Data Platform with the analytic expertise of advanced mining and predictive modeling.
If you want to work among the very best talent in the industry on the most innovative products in the world, Samsung is the place to be.
The Big Data Platform Engineer is accountable for designing, maintaining, and operationalizing outputs from the Big Data Platform. The ideal candidate has deep experience in data management and analytics solutions using big data technologies, including Hadoop. This role is responsible for architecting and developing a Hadoop-based end-to-end architecture/framework, designing and implementing modularized ETL elements, performing capacity planning, and leading overall big data ecosystem design, operation, upgrades, and evolution. The candidate must have in-depth knowledge of the emerging Big Data market and hands-on experience with the Hadoop ecosystem.
The Big Data Platform Engineer is a senior technical leadership role responsible for overseeing the Hadoop platform architecture as well as the big data development framework. This involves both strategic and tactical planning, execution, and support of the Hadoop platform and Samsung's big data analytics strategy. This role provides direction within the team – especially to the Big Data Engineers – and communicates technical decisions to senior management.
Responsibilities include:
- Translate complex functional and technical requirements into architecture/platform design
- Design for current needs and future growth
- Define the overall big data/Hadoop architecture, including high availability, disaster recovery, multi-tenancy management, and replication
- Design and implement a big data development framework with standardized modules to increase ETL development efficiency and quality
- Design and implement best practices for security and data privacy that adhere to Samsung business requirements and policies
- Fine-tune overall system performance and continuously identify performance bottlenecks and improvement opportunities
- Provide guidance and support to big data ETL engineers on performance and facilitate ETL performance tuning
- Provide production support and operations planning/hardening
Qualifications should include:
- Accountable for the Hadoop platform, from strategic design all the way to framework development and daily operations
- Provide technical direction within the organization
- Hands-on experience developing "Big Data" frameworks at scale (using Hadoop and related technologies to manage high-volume, high-velocity structured and unstructured data)
- Establish monitoring and management practices to proactively provide necessary capacity and performance
- Define and champion engineering best practices
- Work with other infrastructure teams to ensure platform stability
- Serve as liaison to Hadoop vendors
- Develop procedural and technical documentation
- Ability to think outside the box and rapidly prototype and deliver innovative solutions
- Ability to collaborate with Data Scientists and Business Analysts to define solution requirements and develop processes for provisioning data for wide-ranging analytics
Necessary Skills / Attributes
- 12-15 years of directly related experience is required
- 8+ years' experience designing and implementing highly available, highly scalable big data systems with Hadoop technology
- 5-8 years of Python or Java/J2EE development experience with demonstrated technical proficiency
- Full lifecycle project development involving Hadoop and related technologies
- Experience overseeing Hadoop environments and operating big data system to enable business success
- Fluent in writing shell scripts [Bash, Korn]
- Writing high-performance, reliable and maintainable code
- Training and certification on Hadoop (preferably through Cloudera)
- Ability to set up, maintain, and implement key Hadoop components, e.g. Kafka, Spark, Impala/Hive, Oozie
- Industry expertise in database structures, theories, principles, and practices
- Extensive experience working with AWS components [EC2, S3, SNS, SQS]
- Analytical and problem-solving skills applied to the Big Data domain
- Proven understanding of and hands-on experience with Hadoop, Hive, Pig, Impala, and Spark
- Strong aptitude for multi-threading and concurrency concepts
- B.S. or M.S. in Computer Science or Engineering