As a member of the Enterprise Architecture Data Platform team, the Principal Data Platform Engineer will bring breadth and depth of experience to help drive long-term strategies for Data Management and Analytics, including big data, distributed computing, and advanced analytics. The Principal Data Platform Engineer will work as part of a small team responsible for the design, build, and maintenance of a comprehensive big data platform for Cargill. This includes defining the architecture, developing ingestion/storage patterns, developing exposure and analysis patterns, and defining a service around the platform. The role works closely with Solution Architects, Solution Analysts, and Process Designers to understand and evaluate business requirements that require use of the big data platform. This role will play a key part in changing the culture in which new platforms are delivered, and will champion the culture and tooling that support a DevOps model of delivery across Technology and Engineering programs.
This position will work across multiple Business Units to determine a right-fit roadmap and solution set to support their capability needs and strategic choices around Data and Analytics. The role will work across multiple teams to ensure consistency of approach and, within those parameters, help local teams determine the approach that works best for them.
This position will be engaged with programs that are transforming the way Global IT drives business value by moving technology delivery to a model that is composed of Lean principles, agile development approaches, and DevOps practices. These concepts include test driven development, CI/CD pipelines, configuration management and cloud adoption.
This position will partner closely with members of the Cloud Platform team to bring Data and Analytics Platform as a Service (PaaS) computing into the environment, understand its impact on the delivery model, and help shape the technology solutioning and its appropriate use.
The keys to success in this position are a courageous and innovative approach to problem solving, deep engineering expertise, technical leadership, excellent communication (written and verbal, formal and informal), collaboration, flexibility, and a self-motivated working style with attention to detail.
To be successful, the candidate must demonstrate a passion for new and evolving technologies and be willing to push their application throughout the organization.
50% Data Platform Strategy and Execution
- Design and build across the comprehensive component set of the Cargill Data Platform supporting a Platform as a Service model that enables Big Data, Data Management, Analytics and Reporting.
- Design and build future state ingestion, storage, access and analytics frameworks and capabilities to meet the needs of Cargill.
- Deliver coaching on data management and analytics techniques and technologies, including open-source and in-house developed software platforms, that ensure automated and continuous testing, integration, and deployment of software and infrastructure across multiple cloud providers as well as internal datacenters.
- Design and build complex enterprise solutions that solve business problems using technology.
- Lead the development of a developer-focused environment that allows for a natural delivery method to fit multiple developer personas (e.g., Java/.NET).
- Assist in driving a software delivery model that supports the current multi-location, multi-continent, multi-cultural operating framework of Cargill.
- Lead and participate in continual analysis and planning to ensure Global IT toolsets and technologies are relevant, reliable, and cost effective.
15% Business Relationship Management and Consulting
- Work with Scrum Masters, Product Owners, Development Leads, and QA Leads on CI/CD, source code management, containerized solutions, and cloud technologies, especially with respect to Data Management and Advanced Analytics
- Regularly interface with architects, analysts, process designers, and BU/Function subject matter experts to understand and evaluate data and analytics capabilities and/or functional capability requirements
- Partner with Businesses to determine functional requirements and translate into platform specific design (including ingestion and storage patterns)
15% Architecture Definition Methodology and Implementation
- Agile Training/Tools: Responsible for working as part of a matrixed team to define and provide hands-on training for all critical software delivery tools and processes as well as the supporting tools that teams will use. You will also be expected to provide input for which toolset will best support our operating needs.
- Assess and help drive adoption of new technologies and methods within the team and across Cargill.
- Build prototypes to prove out concepts
10% Coaching and Collaboration
- Coach and mentor development teams on usage and adoption of our Continuous Delivery toolsets and overall infrastructure as code and automation best-practices in the context of the Cargill Data Platform
- Work closely with application teams looking to shift to a more iterative delivery model and ensure that their full stack is fully automated, tested, and successfully packaged into production releases.
- Build and coach on technical capabilities to enable faster innovation, accelerate time-to-market for all of our consumer experiences, and deliver industry leading developer experiences.
- Work closely with our application teams to ensure all capabilities align with actual application delivery needs and pain points
- Mentor Architects and Engineers on Cargill teams
10% Run Operations
- Collaborate with Ops and Architecture organizations to maintain awareness on the health of overall Data Platform
Required Qualifications
- Bachelor’s Degree in IT or Business Related field or equivalent work experience
- 15 years of IT and business/industry work experience, including at least 5 years influencing senior-level management and key stakeholders.
- Experience with Hadoop and YARN based architectures
- Experience with infrastructure automation, infrastructure as code, automated application deployment, monitoring/telemetry, logging, reporting/dashboarding
- Experience in building high-performance infrastructures that are scalable and resilient
- Experience with container technologies, e.g. Docker, etc.
- Experience with test-driven development frameworks for application and infrastructure code.
- Ability to articulate complex architectures in a concise way
- Ability to create clear and detailed technical diagrams and documentation.
- Experience with cloud-based infrastructure as a service (IaaS) platforms, e.g., AWS, Google Compute Engine, Azure, or OpenStack.
- Experience with configuration management and automation tools such as Chef, Puppet, Salt, and Ansible.
- Experience with development using GitHub, TSVS, and TFS
- Experience with Windows and Linux systems administration
- Experience with the Agile mindset
- Ability to travel up to 20%, including internationally
Preferred Qualifications
- Bachelor’s or Master’s Degree in technical or business discipline
- Experience with business case development.
- Knowledge of all components of enterprise architecture
Equal Opportunity Employer, including Disability/Vet.