Say hello to Hagerty
Hagerty is a company built by drivers for drivers. We put our members at the center of everything we do and are dedicated to making it easier and more enjoyable for enthusiasts to drive and celebrate the machines they love. We’re proud to be the world’s largest insurer of collectible and enthusiast vehicles and are home to the Hagerty Drivers Club, the world’s largest car club. Our Marketplace business presents live and digital sales across the U.S. and Europe, we host a number of driving events and concours, and our award-winning automotive journalists produce the most popular car magazine globally, alongside internationally awarded videos. We’re committed to Never Stop Driving. Ready to get in the driver’s seat? Join us!
As a Data Engineer II, you will join a fast-paced, high-functioning team to build and maintain the data pipelines and services that support Hagerty’s Enterprise Data Hub (EDH). Engineered and developed in-house, the EDH encompasses data processing and storage, services, and APIs. In this role, you will develop data pipelines, services, and cloud-based infrastructure to support the growth of Hagerty’s insurance business and automotive lifestyle brand.
You will be partnering with a team of talented engineers working in an agile environment, leveraging modern cloud-based technologies to drive data-driven decision making in analytics and Hagerty’s data products.
What you’ll do
- Implement best practices around software development and big data engineering
- Develop and implement robust and scalable data pipelines using Python, SQL, parallel processing frameworks, and other AWS/Salesforce cloud solutions
- Develop and implement batch data pipelines using tools such as Apache Airflow, Snowflake, and numerous AWS products (EC2, Fargate, ECS, Lambda, and RDS)
- Develop streaming data integrations to support products across the Hagerty portfolio and support real-time reporting
- Develop Enterprise Data Hub platform infrastructure using Terraform infrastructure-as-code
- Develop and support Hagerty’s cloud-based data warehouse to enable analytics and product reporting
- Partner with internal and external stakeholders to collect requirements, recommend best practice solutions, and productionize new data ingestions/analytic workloads
- Develop solutions to catalog and manage metadata to support data governance and data democratization
- Partner with Data Quality Engineers to define and implement automated test cases and data reconciliation to validate ETL processes and data quality & integrity
- Mentor junior team members in software and big data engineering best practices
- Partner with Data Scientists to design, code, train, test, deploy and iterate machine learning algorithms and systems at scale
This might describe you
- You have strong problem-solving abilities and attention to detail
- You communicate authentically and effectively, both in writing and verbally, with a variety of stakeholders
- You create and share technical artifacts and documentation to support development and maintenance of data products
- You have experience successfully delivering data products as production-ready software solutions
- You ensure quality through rigorous code development, testing, automation, and other software engineering best practices
- You have experience developing solutions using Python and cloud-based infrastructure (AWS, Azure, or GCP)
- You have demonstrated experience in imperative (e.g., Apache Airflow, NiFi) or declarative (e.g., Informatica, Talend, Pentaho) ETL design, implementation, and maintenance
- You have functional knowledge of relational databases and query authoring (SQL)
Pluses
- Associate’s degree, preferably in a technical/analytical field, or relevant work experience
- An additional 3+ years in another role on an IT delivery team, such as developer, engineer, data analyst, quality assurance analyst, ETL developer, or DBA
- Experience developing infrastructure as code in a cloud-based environment (Terraform preferred)
- Experience cataloging and processing non-relational data
- Experience with open-source data processing technologies, such as streaming services (Kafka, SQS), big data processing frameworks (MapReduce, Spark), and big data file stores (EMRFS, HDFS)
- Experience evaluating different data formats based on workload needs (JSON, delimited files, Avro, Parquet)
- Experience with container-based development
Other things to note
- This position is open to U.S. remote work. However, team members who reside within 20 miles of the Traverse City headquarters will follow a hybrid schedule, working from the office three days per week.
- May require travel for quarterly events.
- Familiarity with public company requirements, including Sarbanes-Oxley (SOX) and other key regulations, where applicable. For SOX-compliant roles, you will be responsible for designing, executing, and documenting the internal controls you own to prevent errors in financial reporting, processes, and business operations, including attesting to the completeness, accuracy, and compliance of all financial reporting data, where applicable.
If you reside in one of the following jurisdictions, please email [email protected] for details on compensation, our comprehensive benefits, and the perks that set us apart: Illinois; Colorado; California; District of Columbia; Hawaii; Maryland; Minnesota; Nevada; New York; Jersey City, New Jersey; Cincinnati or Toledo, Ohio; Rhode Island; Washington; or British Columbia, Canada.
At Hagerty, we share the road. We are an inclusive automotive community where all are welcomed, valued and belong regardless of race, gender, age, or car preference. We are united by our shared passion for driving, our commitment to preserve car culture for future generations and our desire to make a positive impact in the world.
EEO/AA