
NUVIEW

Data Engineer

Reposted 6 Hours Ago
Remote
Hiring Remotely in United States
$100K–$140K Annually
Mid level
About NuView Analytics

At NuView Analytics, we help companies accelerate the time to insights from their data. We do this in three ways: data analytics, data diligence, and fractional data science. Our clients are growth-stage companies looking to drive additional value from the data they are sitting on. Through our values of humility, intellectual rigor, and stewardship, we help companies gain a new perspective on their business through their data.

The Role

We're looking for a Data Engineer to join our growing team and help clients build scalable, reliable data infrastructure. This role is remote, but candidates must reside in the United States; initial onboarding will be onsite. You'll work across the modern data stack, designing pipelines, architecting warehouses, and enabling the analytical layer that our clients depend on. This is a high-impact, client-facing role that combines deep technical execution with strategic thinking.

Responsibilities
  • Design, build, and maintain scalable data pipelines for clients across industries
  • Architect and optimize cloud data warehouse solutions, adapting to each client's stack, which may include Snowflake, BigQuery, Redshift, Microsoft Fabric, or similar platforms
  • Lead data integration projects from source system to analytical layer, including scoping, delivery, and handoff
  • Work fluidly across a range of modern data tools and platforms as client engagements demand, picking up new technologies quickly and applying best practices regardless of the toolset
  • Collaborate with analysts and data scientists to ensure data is clean, reliable, and well-modeled
  • Champion data quality, testing, and observability best practices across client engagements
  • Produce and maintain clear technical documentation including pipeline architecture, data dictionaries, lineage maps, and runbooks so clients can understand and own their infrastructure long-term
  • Document engineering decisions, standards, and workflows in a way that supports knowledge transfer to both clients and junior team members
  • Research and evaluate new technologies and advocate for tooling investments that benefit the firm
  • Train and mentor junior team members on engineering standards, pipeline design, and best practices
  • Participate in client-facing communication, including requirements gathering and progress updates
  • Flex to support analyst-side deliverables when capacity allows, such as Power BI dashboard development, ad-hoc reporting, or data visualization. We're a lean team and value versatility

Projects Include
  • ETL/ELT pipeline development and optimization
  • Data warehouse modeling (dimensional, medallion/lakehouse architectures)
  • Data integration across client systems such as CRM, ERP, marketing, and operational systems
  • Infrastructure setup across the modern data stack (ingestion, transformation, orchestration)
  • Implementations across platforms such as Microsoft Fabric, Databricks, and Snowflake, meeting clients where they are
  • Data modeling and deployment across medallion architecture layers (bronze, silver, gold)
  • Data quality frameworks and automated pipeline testing
  • Cloud infrastructure provisioning and cost optimization (Azure, AWS, GCP)
  • Technical documentation projects including data dictionaries, ER diagrams, lineage documentation, and metrics catalogs
  • Power BI semantic model development and dashboard support when business needs require it

Qualifications
  • Bachelor's Degree in Computer Science, Engineering, Mathematics, or a related field
  • 2–5 years of relevant data engineering or software engineering experience
  • SQL Expert: complex query authoring, query optimization, stored procedures
  • Python Required: pipeline scripting, automation, data processing
  • Transformation Tools: dbt required; Spark experience a plus
  • Ingestion Tools: Fivetran, Airbyte, Rivery, Microsoft Fabric Data Factory, or similar
  • Orchestration: Airflow, Prefect, Azure Data Factory, Microsoft Fabric, or equivalent
  • Cloud Platforms: Azure (preferred), AWS, or GCP experience
  • Data Warehouses: Snowflake, BigQuery, Redshift, Microsoft Fabric, Azure Synapse, or equivalent
  • Version Control: Git required; branching strategies, pull requests, and code review workflows
  • Strong communication skills with the ability to translate technical concepts for non-technical stakeholders
  • Self-starter who thrives in a remote environment and can manage multiple client workstreams
  • Player-coach mindset: capable of leading projects while growing junior teammates
  • Willingness to travel: certain client projects may require in-person work or limited travel to client offices
  • Intellectually curious about evolving data tooling, architecture patterns, and AI-augmented engineering

NuView Analytics is an equal opportunity employer. We celebrate diverse perspectives and are committed to building an inclusive team.


