Gogo Commercial Aviation
Gogo is bursting at the seams with data, and we need your help to contain it! We are looking for a Data Engineer with extensive ETL/ELT and cloud database experience to write processes that drive Gogo’s robust Business Intelligence reports.
This Data Engineer will focus their skills on populating our data warehouses and big data clusters to enable our BI analysts to consume data in near real time. They maintain and improve our data collection and presentation capabilities on an incredible team with a never-ending passion for data!
- Develop robust end-to-end data solutions for structured and unstructured data including, but not limited to, ingestion, parsing, integration, auditing, logging, aggregation, normalization, and error handling
- Collaborate with cross-functional teams to resolve data quality and operational issues
- Participate in an on-call rotation to support the Information Management cluster
- Interact directly with end users to gather requirements and consult on data integration solutions. Regularly keep their best interests in mind and proactively recommend value-added items, even if not requested
- Maintain, support, and enhance most elements of Gogo’s Information Management Cluster with minimal assistance from your peers
- Provide mentoring / coaching to junior developers
- Meet project delivery deadlines
- Migrate code across environments and leverage a source code management system
- Create jobs to perform auditing and error handling
- Proficiency in traditional RDBMS with an emphasis on SQL Server, Postgres, and MariaDB
- General understanding of ETL/ELT frameworks, error handling techniques, data quality techniques and their overall operation
- Generally proficient in performing data transformations via scripting, stored procedures, or an ETL framework
- Good understanding of the 4Vs of data and development strategies for accommodating them in integration
- Proficient in developing and supporting all aspects of a big data cluster: ingestion (rsync), processing (Apache NiFi), parsing (Java), integration (Python, Spark, Scala, and Pig), data movement (Sqoop), workflow management (Oozie and ActiveBatch), and querying (Hive and Impala)
- Reasonably proficient in writing Apache Spark jobs, including an understanding of optimization techniques
- Proficiency in Unix and Linux operating systems
- Capable of navigating and working effectively in a DevOps model, including leveraging related technologies such as Jenkins and GitLab
- Strong foundational experience with SQL
Experience and Education
- A Bachelor’s Degree in Computer Science or related field required
- 4+ years of Data Integration experience
- 4+ years of hands-on experience with one of the following technologies: Hadoop, SQL Server, Redshift, PostgreSQL
Gogo’s worldwide inflight Wi-Fi services have made internet and video entertainment a regular part of flying. We are a diverse group of technologists, marketers, strategists, and any other function you can think of, all working together in extraordinary harmony. And that’s just the beginning.
We connect the aviation industry and its travelers with innovative technology and applications, and we do it all in a high-energy environment that welcomes the next challenge. Be prepared for a dynamic ride with people who are passionate about what they’re building.
Gogo is an equal opportunity employer and works in compliance with both federal and state laws. We are committed to the principle of equal employment opportunity. Qualified candidates will be considered for employment regardless of race, color, religion, age, sex, national origin, marital status, medical condition, or disability. The “EEO is the Law” poster is available here.
Gogo participates in E-Verify. Details in English and Spanish. Right to Work Statement in English and Spanish.
To apply for this job, please visit tinyurl.com.