Big Data Integration Engineer

Wargaming Seattle

Wargaming is looking for a Big Data Integration Engineer with experience in ETL, Hadoop, and Oracle. This is a fantastic opportunity to join a fast-growing, award-winning global brand in the gaming industry that is at the cutting edge of PC and console gaming technology. Our games generate massive amounts of data – over 100 million lines of data from just a few dozen game sessions – and this role will be critical in using that data to better understand player behavior so we can continually improve our games and increase player happiness.

In this role, the Big Data Integration Engineer will support and perform the design and development of ETL/ELT/data integration processes and programs, including data analysis, source-to-target data mapping, job scheduling, and the development and testing of PL/SQL packages.
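
To make "source-to-target data mapping" concrete, here is a minimal Python sketch of the mapping step at the heart of most ETL/ELT jobs (the column names and sample row are hypothetical, not from an actual Wargaming schema):

    # Project a raw source row onto the target reporting schema.
    SOURCE_TO_TARGET = {
        "acct_id":       "player_id",
        "sess_start_ts": "session_start",
        "dur_sec":       "session_duration_sec",
    }

    def transform_row(source_row: dict) -> dict:
        """Rename/select source columns according to the mapping."""
        return {target: source_row[source]
                for source, target in SOURCE_TO_TARGET.items()}

    print(transform_row({"acct_id": 42,
                         "sess_start_ts": "2016-01-01T12:00:00Z",
                         "dur_sec": 1800}))

In practice the same mapping document drives both the transformation code and its tests, which keeps the ETL job and the data dictionary from drifting apart.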

What you will do:

  • Manage and maintain ETL functions between Oracle and Tableau.
  • Build/maintain transformation scripts for different types of data (see the sketch after this list).
  • Build/maintain loader scripts for several different BI data stores.
  • Work with game engineers to build data integration processes between Hadoop and the game engine.
  • Create custom reporting frameworks where an off-the-shelf solution is not a fit.
  • Work with hybrid data warehousing architecture (Hadoop, Oracle, NoSQL, graph DBs).
  • Build and provision analytical sandboxes for data scientists and analysts. Parse massive datasets into manageable chunks for business customers.
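
To make the transformation and loader bullets above concrete, here is a minimal Python sketch of a transform-then-load step (the telemetry fields, file names, and CSV staging target are hypothetical; a production loader would write to the actual BI data store):

    import csv
    import json
    from datetime import datetime, timezone

    def transform(record: dict) -> dict:
        """Normalize one raw telemetry event into the reporting schema."""
        return {
            "player_id": int(record["playerId"]),
            "event": record["event"].lower(),
            "occurred_at": datetime.fromtimestamp(record["ts"],
                                                  tz=timezone.utc).isoformat(),
        }

    def load(rows, path: str) -> None:
        """Append transformed rows to a CSV staging file."""
        with open(path, "a", newline="") as fh:
            writer = csv.DictWriter(
                fh, fieldnames=["player_id", "event", "occurred_at"])
            if fh.tell() == 0:  # empty file: write the header once
                writer.writeheader()
            writer.writerows(rows)

    raw_lines = ['{"playerId": "7", "event": "MATCH_END", "ts": 1451649600}']
    load([transform(json.loads(line)) for line in raw_lines],
         "staging_events.csv")

Keeping the transform and the loader as separate functions mirrors the split in the bullets above: the same transformation can feed several different BI data stores.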

Requirements:

  • Experience with Oracle SQL (DML & DDL) and PL/SQL, including packages, stored procedures, and triggers (see the sketch after this list).
  • Experience working with analysts to understand business needs and deliver the data resources necessary to support the development of actionable insights.
  • Strong interest and experience in using big data to help better understand customer/player behavior.
  • Must have a solid understanding of different types of data streams (real-time, traditional warehouse, big data) and when/how to use each.
  • Exposure to Hadoop technologies is a plus.
  • Exposure to Python, Java, and related frameworks is a plus.
  • Experience in creating integration tasks for very large mixed-workload data warehouses (streaming/near real-time, batch cycles).
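
As a concrete note on the Oracle requirements above: integration code frequently just orchestrates set-based logic that lives inside PL/SQL packages. A minimal Python sketch using cx_Oracle (the etl_pkg.load_sessions procedure and the connection details are hypothetical):

    import cx_Oracle

    # Connect to the warehouse and invoke a stored procedure that owns
    # the set-based load logic for one partition date.
    connection = cx_Oracle.connect(user="etl_user", password="secret",
                                   dsn="dwhost/ORCLPDB1")
    try:
        cursor = connection.cursor()
        cursor.callproc("etl_pkg.load_sessions", ["2016-01-01"])
        connection.commit()
    finally:
        connection.close()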

To apply for this job, please visit tinyurl.com.