Data Engineer

Daniels Sharpsmart · Chicago, IL

Who are we looking for?
A technical Data Engineer responsible for designing, building, and implementing the data infrastructure that powers the organization’s analytics, system integrations, and AI initiatives. This role architects scalable pipelines that ingest data from a wide range of applications, APIs, databases, files, and SaaS platforms into Databricks.
 
Why Daniels Health?
Daniels Health is a mission-driven global innovator in healthcare waste management, operating across the U.S., Australia, Canada, Europe, and South Africa. We're experiencing significant global growth—expanding our U.S. footprint through key facility acquisitions, strengthening our presence in Europe, and growing operations in Australia, Canada, and South Africa. As a consistently recognized industry leader, we're scaling our data and AI capabilities to power smarter decision-making, seamless integrations, and sustainable innovation across our rapidly expanding global operations.

What you will do:

  • Architect, build, and maintain robust ELT/ETL pipelines ingesting data from custom applications, third‑party APIs, databases, files, and SaaS platforms into Databricks.
  • Develop and maintain reverse ETL pipelines to move curated data from Databricks back into operational or application systems.
  • Transform raw data into structured and governed bronze and silver datasets using PySpark, SQL, and Databricks notebooks/jobs.
  • Implement schema management, data normalization, change-data-capture processing, and incremental load strategies.
  • Optimize ETL performance, data storage, and job execution costs.
Candidate Profile:

  • 5+ years of data engineering or software engineering experience
  • Advanced SQL and Python expertise with proven ability to write efficient, scalable code
  • Expert-level experience using Databricks
  • Familiarity with GitHub, version control workflows, and automated testing practices
  • Understanding of data quality frameworks and experience working in Agile/Scrum environments
  • Deep understanding of data modeling techniques, ETL/ELT, and medallion architecture
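
The responsibilities above center on change-data-capture processing and incremental loads into medallion (bronze/silver) layers. As a rough illustration of the merge step only, here is a minimal pure-Python sketch with hypothetical record shapes; in the Databricks stack this posting describes, the same logic would typically be a Delta Lake merge in PySpark or SQL.

```python
# Sketch of a CDC-style incremental merge: apply an ordered batch of
# change events from a bronze layer onto a keyed silver table, keeping
# the latest version of each row. Record shapes here are hypothetical.

def merge_changes(silver: dict, changes: list[dict]) -> dict:
    """silver: {key: row}; changes: CDC events with 'op' in {'upsert', 'delete'}."""
    result = dict(silver)
    for event in changes:
        key = event["id"]
        if event["op"] == "delete":
            result.pop(key, None)
        else:
            # Upsert: apply only if the event is at least as new as the stored row.
            current = result.get(key)
            if current is None or event["updated_at"] >= current["updated_at"]:
                result[key] = {
                    "id": key,
                    "value": event["value"],
                    "updated_at": event["updated_at"],
                }
    return result

silver = {1: {"id": 1, "value": "a", "updated_at": 10}}
changes = [
    {"op": "upsert", "id": 1, "value": "b", "updated_at": 20},
    {"op": "upsert", "id": 2, "value": "c", "updated_at": 15},
    {"op": "delete", "id": 2},
]
merged = merge_changes(silver, changes)
```

After the batch, row 1 carries the newer value and row 2 (inserted then deleted within the batch) is absent, which is the net-change behavior an incremental load strategy aims for.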