Hadoop Architect - IoT BigData Jobs

Hadoop Architect

Rakuten, Inc.

Rakuten, Inc. is the largest e-commerce company in Japan and the third-largest e-commerce marketplace worldwide.

We seek to empower merchants to deliver Omotenashi, a hospitality mindset that helps sellers create lasting relationships with customers.

The Japanese word Rakuten means optimism. Along with its global marketplaces, Rakuten supports an ever-expanding list of acquisitions and strategic investments in disruptive industries and growing markets.

At Rakuten, we offer competitive salaries, benefits, annual bonuses, a stocked kitchen (including catered lunch daily), and a dynamic office environment. We love investing in our people, and when it comes down to it, we think our entire team is pretty awesome. As a technology-focused company, we understand the importance of an energizing atmosphere that promotes collaboration and innovation.


Our Big Data team is responsible for delivering a large-scale data platform to Rakuten business units globally. This includes the implementation and maintenance of our Hadoop platform. We are looking for someone who can develop a strategy by creating architecture blueprints, validating designs, and providing recommendations on the enterprise platform's strategic roadmap. Experience designing and developing high-volume, real-time big data platforms is preferred.




- Administration, management, and tuning of the Hadoop environment.
- Collaborate with the infrastructure team to coordinate OS-level patching and to identify and resolve hardware-related issues.
- Cluster and node maintenance, health checks, and automation of job monitoring by creating …
- Assist the development team in identifying the root cause of slow-performing jobs/queries.
- Capacity planning / …
- Develop a strategy to automate management and deployment processes (DevOps).
- Deploy and manage all Hadoop platform components.
- Plan and conduct platform upgrades.
- Work with development staff to ensure all components are ready for release/deployment.
- Collaborate with project managers, developers, and business staff to develop products & …
- Participate in managing and maintaining the product on an ongoing basis.



- Strong UNIX/Linux knowledge, including the ability to understand the interaction between applications and the operating system.
- Ability to provide recommendations and suggestions related to troubleshooting and performance …
- Experience designing and administering a reasonably sized Hadoop cluster (100+ nodes).
- Experience running, using, and troubleshooting the Apache big data stack, e.g. Hadoop (HDFS), Hive, HBase, Kafka, Pig, Oozie, YARN, Sqoop, Flume, etc.
- Ability to create infrastructure capacity plans based on quantitative and qualitative data.
- Experience implementing and managing security for a multi-tenancy environment.
- Familiarity with the networking stack, from TCP/IP up.
- Good work ethic with extremely high standards of code quality, system reliability, and …
- Experience processing large amounts of structured and unstructured data with MapReduce.
- Experience with data movement and transformation technologies.
- Experience tuning and troubleshooting the JVM environment.


Nice to Have

- Experience with data virtualization using Presto DB, JBoss Teiid, or any other similar …
- Experience developing data extraction applications with SQL and relational databases.

To Recruiting Agencies:
Recruiters and/or agencies must hold a valid contract for service and obtain approval from Rakuten's Global Human Resources Department on an individual requisition basis in order to submit resumes. Rakuten will NOT bear any costs for any placement resulting from the receipt of an unsolicited resume, and Rakuten reserves the right to pursue and hire the candidate without any financial responsibility to the recruiter or agency.

Primary Location

Americas-United States-Massachusetts-Boston

Employee Status
To apply for this job, please visit tinyurl.com.