Big Data Platform Engineer

General Mills

Food. Purpose. You.

We serve the world by making food people love. As one of the world’s leading food companies, General Mills believes that food should make us better. Food brings us joy and nourishes our lives, connecting us to each other and the earth. General Mills operates in more than 100 countries and markets more than 100 consumer brands, including Cheerios, Häagen-Dazs, Nature Valley, Betty Crocker, Pillsbury, Green Giant, Old El Paso, Yoplait and more. Headquartered in Minneapolis, General Mills had global net sales of US$17.6 billion during fiscal 2015.

We seek out the best talent, then give them development resources, support and the chance to lead something big. Choosing a career with General Mills means joining a company where you can make a difference in the lives of millions of people. There is tremendous opportunity here for individuals who want to advance food through innovation and serve the world.



Technology at General Mills accelerates process transformation and business growth around the globe. To achieve business success, the Global Business Solutions team uses leading-edge technology, innovative thinking and agile processes.

General Mills is seeking a Big Data Platform Engineer with Hadoop experience to act as a key technical leader in our Data and Analytics organization.

The General Mills Data and Analytics organization is currently building a big data platform based on Cloudera Hadoop as part of a multi-year strategic Data Lake program, to advance the data-driven decision-making capabilities of our enterprise. If you are an agile learner, have strong problem-solving skills and are able to function as part of a highly technical, cross-functional team, we would like to hear from you.


In this role you will:

  • Act as a key Data & Analytics technical leader within General Mills
  • Collaboratively develop the technology & capability roadmap for the General Mills big data ecosystem
  • Lead the design and implementation of sustainable tools & processes to support the big data ecosystem
  • Generate and implement your own ideas on how to improve the operational and strategic health of the big data ecosystem
  • Participate in the evaluation, implementation and deployment of emerging tools & processes in the big data space
  • Develop communication & education plans for technical teams on technologies and processes in our big data ecosystem
  • Partner with business analysts and solutions architects to develop technical architectures for strategic enterprise projects and initiatives
  • Communicate data architecture to development teams and ensure business needs are met through consulting and design reviews
  • Collaboratively troubleshoot technical and performance issues in the big data ecosystem

Minimum Qualifications

  • Bachelor’s Degree required; Computer Science, MIS, or Engineering preferred
  • 3 years of IT experience (5+ preferred)
  • 1 year of hands-on experience with the technologies in the Hadoop ecosystem
  • Experience working directly with business clients to design a solution that meets business requirements; ability to clearly articulate the pros and cons of various technologies, platforms and architectural options, and to document use cases, solutions and recommendations
  • Experience with tools and concepts related to data and analytics, such as dimensional modeling, ETL, reporting tools, data governance, data warehousing, and structured and unstructured data
  • Database development experience using Oracle, SQL Server, SAP BW or SAP HANA
  • Big Data Development experience using Spark and Hive/Impala
  • Effective verbal and written communication and influencing skills
  • Effective analytical and technical skills
  • Ability to work in a team environment
  • Ability to research, plan, organize, lead, and implement new processes or technology

Preferred Qualifications

  • Python, Scala or Java development experience
  • Familiarity with Kafka
  • Familiarity with the Linux operating system
  • Familiarity with Active Directory
  • Familiarity with virtualization or containers
  • Experience with agile techniques or methods



To apply for this job please visit