Chicago, IL

Data Engineer



Job Description

Job Responsibilities Include:

  • Primary responsibility for our Data Platform
    • Data model ongoing design & development
    • Conceptual, logical and physical design (database, ODS, aggregates, etc.)
    • Database administration
    • Capacity analysis & management
  • MDM Lead
    • Identify key domains that would benefit from an MDM approach (e.g. Product, Customer), along with the best data sources & necessary attributes, and integrate them into the platform
    • Define governance strategy with associated roles & responsibilities (e.g. Data Steward, Quality Specialist)
    • Define & implement Policies & SOPs
    • Monitor operations, develop and report quality metrics to key stakeholders
  • Data Pipeline development:
    • Participate in Requirements Gathering: work with key business partner groups (e.g. Product Management) and other Data Engineering personnel to understand department-level data requirements for the platform
    • Design Data Pipelines: work with other Data Engineering personnel on an overall design for flowing data from various internal and external sources into the platform
    • Build Data Pipelines: leverage the standard toolset and develop ETL/ELT code to move data from various internal and external sources into the platform
  • Support Data Quality Program: work with Data QA Engineer to identify automated QA checks and associated monitoring & alerting to ensure the platform maintains consistently high quality data
  • Support Operations: triage alerts channeled to you and remediate as necessary
  • Technical Documentation: leverage templates provided and create clear, simple and comprehensive documentation for your development
  • Key contributor to defining, implementing and supporting:
    • Data Services
    • Data Dictionary
    • Tool Standards
    • Best Practices
    • Data Lineage
    • User Training
  • Define Best Practices and Guidelines for other Data Engineering team members
  • Lead the team in developing the new technical skills necessary for a cloud-native data engineering platform
    • Explore new technologies
    • Share and document learnings
    • Productionalize proofs of concept


Skills & Qualifications:

  • Expert Python
  • Expert-level data modeler (back-end and semantic layer)
  • Expert-level ETL/ELT designer/developer
  • Strong database administration and operations proficiency
  • Strong SQL
  • Structured & unstructured data expertise
  • Cloud environment development & operations experience (Google Cloud Platform/GCP a plus)
  • Excellent verbal and written communications
  • Strong team player
  • Working knowledge of eCommerce data a plus
  • Prior experience with Git, Terraform, GCP Deployment Manager, CI/CD, Docker, Kubernetes, Apache Airflow, Apache Beam, or Apache Spark is a plus


Success Criteria:

  • Knowledge of data modeling concepts and data relationships
  • Advanced Analytical Thinking and Problem Solving skills
  • Solid experience in architecture, advanced reporting and dashboards
  • Strong SQL skills and experience with performance tuning are required
  • "Get it done" attitude with a high degree of autonomy, ownership and responsibility
  • Superior Communication and Business-Technical Interaction skills


To qualify, you must possess the following skills:

  • Bachelor's degree in computer science, management information systems, or a related discipline
  • 5+ years hands-on data warehouse data modeling experience
  • 5+ years hands-on database admin/ops experience
  • 5+ years hands-on ETL/ELT design/development experience
  • Key resource on team(s) that have delivered successful enterprise-level analytics platforms


Company Description

Global leaders!
