Data Engineer

Dallas, TX 75254 (Far North area) | 2016-11-05

Copart, Inc.

Copart is seeking an experienced Data Engineer to join the Data Services Team. This team works closely with all aspects of data, both internal and external. We are looking for a professional with the software engineering skills to build data pipelines for efficient, reliable data movement across systems, and to build the next generation of data tools that let us take full advantage of this data. In this role, your work will broadly influence the company's data consumers, executives, and analysts.

Key Responsibilities:
Design, build, and launch highly efficient and reliable data pipelines to move data to our Data Warehouse/Data Mart

Develop ETL routines to populate databases from multiple disparate data sources and create aggregates

Create and run data migrations across different servers and databases, including Enterprise CRM and ERP applications

Design and develop new systems and tools that enable stakeholders to consume and understand data faster

Cleanse and manipulate data using expert SQL and programming skills

Troubleshoot data issues and present solutions

Prepare activity and progress reports on database and data health and status

Design, code and automate data quality checks, metrics, standards and guidelines

Work across multiple teams in a high-visibility role and own solutions end-to-end

Required Skills

3 years’ experience as a Database Engineer with exposure to Big Data solutions

Recent experience in SQL tuning, indexing, partitioning, data access patterns and scaling strategies

Deep understanding of logical and physical data modeling for OLTP and OLAP systems, with the ability to translate a logical data model into a relational or non-relational solution as appropriate

Programming/Scripting experience in Java and/or Scala

Scripting experience with Shell, Python, Ruby, or similar

Excellent analytical problem-solving and decision-making skills

Experience working with large complex sets of data in a high-availability environment

Experience with the Scrum agile methodology and development practices

Bachelor’s degree or higher in computer technology or related field

Nice to have:
Experience in NoSQL/Big Data technologies

Experience in Big Data ingestion frameworks

Experience in Business Intelligence tools and technologies
