Principal Data Engineer - IoT BigData Jobs

Principal Data Engineer

Capital One

Plano 3 (31063), United States of America, Plano, Texas

Principal Data Engineer

Who We Are: Capital One is a technology company, a research laboratory, and a nationally recognized brand with over 65 million customers. We offer a broad spectrum of financial products and services to consumers, small businesses, and commercial clients – and data is at the center of everything we do. In December 2015, Capital One was named a Blue Ribbon Company by Fortune Magazine as one of only 25 companies in the world to make its top company lists four times in 2015 (Fortune's 100 Best Companies to Work For, Global 500, Fortune 500, World's Most Admired Companies). Capital One was also named a Top Workplace in the Greater Chicago area for 2015 by The Chicago Tribune, ranking 5th among large companies. Come learn more about the great opportunities we have to offer!

Responsibilities:
– Lead and develop sustainable, data-driven solutions with current and emerging data technologies to meet the needs of our organization and business customers
– Build and run forums to train resources across the enterprise
– Champion the rollout of industry frameworks across the company (e.g., continuous integration)
– Contribute to the strategic roadmap for a technical domain
– Handle highly ambiguous situations, e.g., a severe operational breakdown of a mission-critical platform, or structuring and sequencing projects with impacts across multiple platforms
– Proactively sponsor process and technology improvements
– Resolve problems, e.g., manage a severe operational breakdown of a mission-critical platform
– Provide technical guidance to team members
– Serve as the "guru" for a technical domain, sought out by technical resources
– Develop frameworks that are used by multiple teams and applications
– Participate in external speaking engagements

Basic Qualifications:
– Bachelor's degree or military experience
– At least 7 years of coding experience in data management, data warehousing, or unstructured data environments
– At least 3 years of experience with leading big data technologies (Cassandra, Accumulo, HBase, Spark, Hadoop, HDFS, Avro, MongoDB, ZooKeeper, or similar)

Preferred Qualifications:
– Master's degree
– 2+ years of experience with Agile engineering practices
– 7+ years of in-depth experience with the Hadoop stack (MapReduce, Pig, Hive, HBase)
– 7+ years of experience with NoSQL implementations (MongoDB, Cassandra, etc. a plus)
– 7+ years of experience developing Java-based software solutions
– 7+ years of experience in at least one scripting language (Python, Perl, JavaScript, Shell)
– 7+ years of experience developing software solutions to solve complex business problems
– 7+ years of experience with relational database systems and SQL
– 7+ years of experience designing, developing, and implementing ETL
– 7+ years of experience with UNIX/Linux, including basic commands and shell scripting
– Industry experience and visibility, e.g., code committer, published work, white papers, etc.
– 7+ years of experience providing technical leadership on relevant applications
– 7+ years of experience leading the full life cycle of IT development and platform support

To apply for this job, please visit tinyurl.com.