
Advance Local

Senior Data Engineer

Posted Yesterday
Remote
Hiring Remotely in USA
120K-140K Annually
Senior level
Job Summary & Responsibilities

Advance Local is looking for a Senior Data Engineer to design, build, and maintain the enterprise data infrastructure that powers our cloud data platform. This position combines deep technical expertise in data engineering with team leadership responsibilities, overseeing the ingestion, integration, and reliability of data systems across Snowflake, AWS, Google Cloud, and legacy platforms. You'll partner with the data product team and business units to translate requirements into technical solutions, integrate data from numerous third-party platforms (CDPs, DMPs, analytics platforms, marketing tech) into the central data platform, collaborate closely with the Data Architect on platform strategy, and ensure scalable, well-engineered solutions for modern data infrastructure using infrastructure as code and API-driven integrations.
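
As a hedged illustration of the kind of API-driven ingestion described above, the sketch below pulls a JSON export from a hypothetical third-party marketing platform, lands it in S3, and copies it into Snowflake. Every endpoint, bucket, stage, table, and credential name here is an assumption for illustration only, not part of this posting.

```python
# Minimal ingestion sketch, assuming Python 3 with requests, boto3, and
# snowflake-connector-python installed. The API endpoint, bucket, stage,
# table, and credential names below are hypothetical placeholders.
import json
import os

import boto3
import requests
import snowflake.connector

API_URL = "https://api.example-cdp.com/v1/events"  # hypothetical CDP export endpoint
BUCKET = "example-raw-landing"                      # hypothetical S3 landing bucket
KEY = "cdp/events/latest.json"


def land_export_in_s3() -> None:
    """Pull the latest export from the third-party API and stage it in S3."""
    resp = requests.get(
        API_URL,
        headers={"Authorization": f"Bearer {os.environ['CDP_TOKEN']}"},
        timeout=60,
    )
    resp.raise_for_status()
    boto3.client("s3").put_object(Bucket=BUCKET, Key=KEY, Body=json.dumps(resp.json()))


def copy_into_snowflake() -> None:
    """Load staged files into a raw Snowflake table via an external stage."""
    conn = snowflake.connector.connect(
        account=os.environ["SNOWFLAKE_ACCOUNT"],
        user=os.environ["SNOWFLAKE_USER"],
        password=os.environ["SNOWFLAKE_PASSWORD"],
        warehouse="INGEST_WH",
        database="RAW",
        schema="CDP",
    )
    try:
        # @cdp_stage is assumed to be an external stage pointing at the bucket above.
        conn.cursor().execute(
            "COPY INTO raw.cdp.events FROM @cdp_stage/cdp/events/ FILE_FORMAT = (TYPE = 'JSON')"
        )
    finally:
        conn.close()


if __name__ == "__main__":
    land_export_in_s3()
    copy_into_snowflake()
```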

 

The base salary range is $120,000 - $140,000 per year.

 

What you’ll be doing:

  • Lead the design and implementation of scalable data ingestion pipelines from diverse sources into Snowflake.
  • Partner with platform owners across business units to establish and maintain data integrations from third-party systems into the central data platform.
  • Architect and maintain data infrastructure using infrastructure as code (IaC), ensuring reproducibility, version control, and disaster recovery capabilities.
  • Design and implement API integrations and event-driven data flows to support real-time and batch data requirements.
  • Evaluate technical capabilities and integration patterns of existing and potential third-party platforms, advising on platform consolidation and optimization opportunities.
  • Partner with the Data Architect and the data product team to define the overall data platform strategy, ensuring alignment between raw data ingestion and analytics-ready data products that serve business unit needs.
  • Develop and enforce data engineering best practices including testing frameworks, deployment automation, and observability.
  • Support rapid prototyping of new data products in collaboration with the data product team by building flexible, reusable data infrastructure components.
  • Design, develop, and maintain scalable data pipelines and ETL processes; optimize and improve existing data systems for performance, cost efficiency, and scalability.
  • Collaborate with the data product team, third-party platform owners, Data Architects, Analytics Engineers, Data Scientists, and business stakeholders to understand data requirements and deliver technical solutions that enable business outcomes across the organization.
  • Implement data quality validation, monitoring, and alerting systems to ensure the reliability of data pipelines from all sources (see the sketch after this list for a minimal example).
  • Develop and maintain comprehensive documentation covering data engineering processes, system architecture, integration patterns, and runbooks.
  • Lead incident response and troubleshooting efforts for data pipeline issues, ensuring minimal business impact.
  • Stay current with emerging data engineering technologies, cloud services, SaaS platform capabilities, and industry best practices.
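
As a minimal example of the data quality validation and alerting responsibility above, the sketch below runs a freshness-and-volume check against a Snowflake table and posts an alert to a webhook when it fails. It assumes snowflake-connector-python and requests are available; the table name, column, threshold, and Slack webhook are hypothetical placeholders.

```python
# Minimal data quality check sketch, assuming snowflake-connector-python and
# requests. The table name, loaded_at column, freshness threshold, and Slack
# webhook URL are hypothetical placeholders used only for illustration.
import os

import requests
import snowflake.connector

FRESHNESS_SQL = """
    SELECT COUNT(*) AS row_count,
           DATEDIFF('hour', MAX(loaded_at), CURRENT_TIMESTAMP()) AS hours_stale
    FROM raw.cdp.events
"""


def alert(message: str) -> None:
    """Post a simple alert to a (hypothetical) Slack incoming webhook."""
    requests.post(os.environ["SLACK_WEBHOOK_URL"], json={"text": message}, timeout=10)


def run_check() -> None:
    conn = snowflake.connector.connect(
        account=os.environ["SNOWFLAKE_ACCOUNT"],
        user=os.environ["SNOWFLAKE_USER"],
        password=os.environ["SNOWFLAKE_PASSWORD"],
        warehouse="MONITOR_WH",
    )
    try:
        row_count, hours_stale = conn.cursor().execute(FRESHNESS_SQL).fetchone()
        # Fail the check if the table is empty or has not been loaded in 24 hours.
        if row_count == 0 or hours_stale is None or hours_stale > 24:
            alert(f"raw.cdp.events failed validation: {row_count} rows, {hours_stale}h since last load")
    finally:
        conn.close()


if __name__ == "__main__":
    run_check()
```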

 

Our ideal candidate will have the following:

  • Bachelor's degree in computer science, engineering, or a related field
  • Minimum of seven years of experience in data engineering with at least two years in a lead or senior technical role
  • Expert proficiency in Snowflake data engineering patterns
  • Strong experience with AWS services (S3, Lambda, Glue, Step Functions) and Google Cloud Platform
  • Experience integrating data from SaaS platforms and marketing technology stacks (CDPs, DMPs, analytics platforms, CRMs)
  • Proven ability to work with third-party APIs, webhooks, and data exports
  • Experience with infrastructure as code (Terraform, CloudFormation) and CI/CD pipelines for data infrastructure
  • Proven ability to design and implement API integrations and event-driven architecture
  • Experience with data modeling, data warehousing, and ETL processes at scale
  • Advanced proficiency in Python and SQL for data pipeline development
  • Experience with data orchestration tools (Airflow, dbt, Snowflake Tasks); see the sketch after this list for how these pieces typically fit together
  • Strong understanding of data security, access controls, and compliance requirements
  • Ability to navigate vendor relationships and evaluate technical capabilities of third-party platforms
  • Excellent problem-solving skills and attention to detail
  • Strong communication and collaboration skills
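
To make the orchestration expectation above concrete, here is a minimal sketch of an Airflow DAG that loads staged files into Snowflake and then builds analytics models with dbt. It assumes Airflow 2.4+ with the apache-airflow-providers-snowflake package and a dbt project on the worker; the DAG name, connection ID, table, stage, and dbt path are hypothetical.

```python
# Minimal orchestration sketch, assuming Airflow 2.4+ with the
# apache-airflow-providers-snowflake package installed. The DAG name,
# connection ID, table, stage, and dbt project path are hypothetical.
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator
from airflow.providers.snowflake.operators.snowflake import SnowflakeOperator

with DAG(
    dag_id="daily_marketing_ingest",  # hypothetical DAG name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    # Load the latest staged files into a raw Snowflake table.
    load_raw = SnowflakeOperator(
        task_id="copy_into_raw",
        snowflake_conn_id="snowflake_default",  # assumed Airflow connection ID
        sql="COPY INTO raw.marketing.events FROM @raw.marketing.events_stage",
    )

    # Build analytics-ready models with dbt once the raw data has landed.
    run_dbt = BashOperator(
        task_id="dbt_run",
        bash_command="cd /opt/dbt/analytics && dbt run --select marketing",
    )

    load_raw >> run_dbt
```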

Top Skills

Airflow
AWS
CloudFormation
dbt
GCP
Python
Snowflake
SQL
Terraform


