As a Big Data Architect in the Aero Services organization, you will be responsible for defining the Aerospace-wide big data platform strategy that meets the needs of our data science teams and our customer-facing web and mobile applications. You will work closely with product management and engineering teams to understand customer requirements, aircraft systems capabilities, and the available disparate data sources, and to determine the optimal approach to managing (loading, storing, accessing, and governing) the data in our Big Data platform.
Key to the growth of the Aero Services Org will be our ability to monetize critical data that is produced by our suite of Aerospace products.
- Understand and translate project requirements into technical requirements and solutions for the data engineering team to execute
- Architect and design a Big Data analytics platform that is scalable, optimized, and fault-tolerant
- Perform program reviews to ensure that data design elements are reusable and repeatable across projects
- Define and develop guidelines, standards, and processes to ensure the highest data quality and integrity in the data stores residing on the data lake
- Participate in setting strategy and standards through data architecture and implementation leveraging big data and analytics tools and technologies
- Work closely with data scientists and product managers to understand their data requirements for existing and future projects on data analytics applications
- Work with IT and data owners to understand the types of data collected in various databases and data warehouses
- Research and suggest new toolsets/methods to improve data ingestion, storage, and data access in the analytics platform
- Provide guidance to and mentor the data engineering team
- Possess hands-on technical experience in big data technologies
- Keen business acumen to recognize and recommend cost-effective and scalable platform solutions that best meet current and future business needs
- Ability to execute projects using an agile approach in a multi-disciplinary, matrixed environment
- Comfortable working in a dynamic, research and development environment with several ongoing concurrent projects
- Enjoys exploring and learning new technologies
You Must Have:
- Bachelor's, Master's, or PhD degree in computer science, IT, engineering, or another relevant field, with 10+ years of data management experience
- Minimum of 5 years of hands-on experience in designing, deploying, and supporting enterprise data warehouses and distributed data processing platforms
- Minimum of 5 years of ETL/ELT experience with traditional databases (MSSQL, MySQL, Oracle, etc.) and Big Data platforms built on the Hadoop ecosystem (Spark, Hive, Pig, Sqoop, Flume, etc.)
- Minimum of 3 years of experience with programming/scripting languages (Perl, Python, Java, etc.)
- Minimum of 3 years of experience with NoSQL solutions (HBase, Cassandra, MongoDB, CouchDB, etc.) and managing unstructured data
- Must be a US Citizen due to contractual requirements.
We Value:
- Certification in Hadoop and other big data tools and technologies
- Working experience on IoT projects
- Experience with data management on public cloud hosting services
- Deep knowledge in data mining, machine learning, natural language processing, or information retrieval
- Experience with Agile software development methodology
- Ability to work in a fast-paced and ambiguous environment
Location: Phoenix, AZ
Honeywell is an equal opportunity employer.
Qualified applicants will be considered without regard to age, race, creed, color, national origin, ancestry, marital status, affectional or sexual orientation, gender identity or expression, disability, nationality, sex, or veteran status.
To apply for this job please visit tinyurl.com.