Join a high-performing, tight-knit team at a fast-growing company using the Internet of Things (IoT) to transform how organizations maintain compliance, enhance safety, and reimagine operations. SmartSense by Digi and Jolt are trusted by some of the world’s most recognizable brands including CVS Health, Walgreens, Walmart, McDonald’s, Jack in the Box, Hartford HealthCare, and Children’s Minnesota to protect their operations and the people they serve. We’re looking for team-oriented change agents who want to help shape the future of IoT.
- Join a tightly knit team solving hard problems the right way
- Understand the various sensors and environments critical to our customers’ success
- Understand the data models, flows, and technology currently in use to transform raw data into analytic products
- Build relationships with the awesome team members across other functional groups
- Learn our code practice, work in our code base, write tests, and collaborate with us in our workflows
- Contribute to onboarding processes and recommend improvements to them
- Demonstrate your capabilities defining solutions, implementing, and delivering data products for your user stories and tasks
- Contribute to systems and processes that implement and automate quality checks on data pipeline deliverables
- Implement data quality tests, improve inefficient tooling, and adopt transformative new technologies while maintaining operational continuity (a minimal example of such a check follows this list)
- Contribute to the quality of our data and pipelines, working closely with the product team and stakeholders to understand how our products are used
- Identify opportunities to improve our infrastructure, operational performance, and data pipeline deliverables, and influence us all to be better
- Evaluate new technologies and build proof-of-concept systems to enhance Data Engineering capabilities and data products
- Contribute to improving the efficiency of our pipeline scripts, automation, and general data operations
- Demonstrate command of, and accountability for, the design, implementation, and performance of new features
- Develop and support data operations and efficiencies in production
- Demonstrate competency in modeling new and existing data capabilities while advancing the maturity of our data
- Influence your peers through excellence in delivering high-quality data products and code reviews
- Deliver operational data from the data platform to software and analytic teams that produce aggregate metrics from real-time data streams
- Establish a reputation for reliability in data contextualization and troubleshooting with the team
- Improve the velocity of development across data ingestion, orchestration, fusion, transformation, and analysis
- Deliver infrastructure required for optimal extraction, transformation, and data loading in predictive analytic contexts
- Transform ETL development with optimizations for efficient storage, retention policies, access, and computation while accounting for cost
- Contribute to the strategic maturity of all our operations and delivery of product requests
- Define orchestrations of data transformations that distill information into high-value signals for ML models
- Collaborate with your teammates to deliver a data analytics and AI platform for advanced analytic data product development
- Implement data quality tests, support existing pipelines and procedures, and optimize warehouse performance
- Deliver high-quality operational data
- Generate high-quality documentation and detailed analysis
- Articulate conceptual, logical, and physical data models in Confluence
- Join the on-call rotation for your team, supporting product services and responding to incidents
- Establish a reputation as a partner in data analysis and contextualization, clearly articulating our data space for targeted internal audiences
- Be a key player in the design and delivery of the data pipelines and engineering infrastructure that support machine learning systems at scale
- Collaborate with your teammates to advance our architecture in support of the predictive analytics roadmap
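To make the quality-check idea above concrete, here is a minimal sketch in Python using the Snowflake connector. The table name, schema, freshness threshold, and environment-variable credentials are hypothetical stand-ins for illustration, not details of our actual platform:

```python
# Minimal data-quality sketch: fail a pipeline run if a (hypothetical)
# telemetry table is empty or stale. Table, schema, and credentials are
# illustrative assumptions, not our production configuration.
import os

import snowflake.connector


def check_sensor_readings_freshness(max_lag_minutes: int = 30) -> None:
    """Raise if the hypothetical SENSOR_READINGS table is empty or stale."""
    conn = snowflake.connector.connect(
        account=os.environ["SNOWFLAKE_ACCOUNT"],
        user=os.environ["SNOWFLAKE_USER"],
        password=os.environ["SNOWFLAKE_PASSWORD"],
        warehouse=os.environ.get("SNOWFLAKE_WAREHOUSE", "ANALYTICS_WH"),
    )
    try:
        cur = conn.cursor()
        # One pass: total rows and minutes since the newest reading.
        cur.execute(
            """
            SELECT COUNT(*),
                   DATEDIFF('minute', MAX(reading_ts), CURRENT_TIMESTAMP())
            FROM ANALYTICS.TELEMETRY.SENSOR_READINGS
            """
        )
        row_count, lag_minutes = cur.fetchone()
        if row_count == 0:
            raise RuntimeError("SENSOR_READINGS is empty")
        if lag_minutes is None or lag_minutes > max_lag_minutes:
            raise RuntimeError(f"SENSOR_READINGS is stale ({lag_minutes} min behind)")
    finally:
        conn.close()


if __name__ == "__main__":
    check_sensor_readings_freshness()
```

A check like this would typically run as a gating step in pipeline orchestration, failing the run and alerting on-call rather than letting stale data reach downstream consumers.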
- Bachelor’s or master’s degree in a technical or quantitative field.
- 5+ years of hands-on Data Engineering experience, delivering production-grade solutions at scale.
- Expert in Snowflake, with proven ability to design, optimize, and deploy high-quality solutions for large-scale environments.
- Advanced SQL and Python skills, including writing efficient, reusable, and well-documented code.
- Proven experience building and maintaining ETL/data pipelines, including orchestration, monitoring, and optimization for performance and cost.
- Strong knowledge of data warehousing, data lakes, and relational/non-relational databases.
- Experience with managed cloud services (AWS or GCP) and implementing secure, scalable data solutions.
- Experience delivering and articulating data models to support enterprise and data product needs.
- Proficiency in DBT, including authoring transformations and automated tests.
- Experience implementing automated testing frameworks (unit tests, integration tests, data-quality checks) for data pipelines (see the test sketch below).
- Strong Git and Agile/Scrum experience, including code reviews and collaborative workflows.
- Excellent communication skills to articulate complex technical concepts simply and collaborate effectively across teams.
- Experience participating in design and code reviews and communicating feedback respectfully.
- Must have experience authoring stories and bugs independently and in team grooming sessions.
- Core technologies: SQL, Python, JavaScript, Snowflake, RESTful APIs, Atlassian tools, DBT, Git, AWS
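As a small illustration of the kind of automated testing meant above, the pytest sketch below exercises a hypothetical transformation function; the function and its rules are invented for this example only:

```python
# Hypothetical transformation plus pytest-style unit tests; the function
# names and normalization rules are illustrative, not our actual codebase.
import pytest


def celsius_to_fahrenheit(celsius: float) -> float:
    """Convert a Celsius reading to Fahrenheit."""
    return celsius * 9.0 / 5.0 + 32.0


def normalize_reading(raw: dict) -> dict:
    """Normalize a raw sensor payload into the shape downstream models expect."""
    if "temp_c" not in raw:
        raise ValueError("payload missing temp_c")
    return {
        "sensor_id": str(raw["sensor_id"]),
        "temp_f": round(celsius_to_fahrenheit(raw["temp_c"]), 2),
    }


def test_normalize_reading_converts_units():
    out = normalize_reading({"sensor_id": 42, "temp_c": 0.0})
    assert out == {"sensor_id": "42", "temp_f": 32.0}


def test_normalize_reading_rejects_missing_temperature():
    with pytest.raises(ValueError):
        normalize_reading({"sensor_id": 42})
```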
- Experience using GenAI tools (e.g., Windsurf, Claude, Copilot, Cursor) to accelerate development and improve data workflows.
- Proven ability to build REST APIs using Python web frameworks such as FastAPI (see the sketch after this list).
- Familiarity with the Data Science lifecycle, including Machine Learning DataOps and supporting ML model training pipelines.
- Hands-on experience with orchestration tools such as Airflow or Luigi for managing complex data workflows.
- Knowledge of data governance practices, including handling PII and implementing secure data access paradigms.
- Experience working with time-series telemetry data, including aggregation and optimization for analytics.
- Snowflake SnowPro Certification is a plus; familiarity with other cloud data platforms or Lakehouse architectures is desirable.
- Experience integrating with BI platforms and building data workflows for analytics and reporting.
- Background in supporting production environments, including participation in on-call rotations and incident response.
- Experience working with Kubernetes or other container-orchestration systems.
- Experience deploying data pipelines and data models to a production environment.
- Experience operating and monitoring production data pipelines.
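And as a hedged sketch of the FastAPI experience described above, the example below serves a single aggregate-metric endpoint; the path, response model, and in-memory store are illustrative stand-ins for a real warehouse-backed service:

```python
# Minimal FastAPI sketch; the endpoint, response model, and in-memory
# "store" are illustrative stand-ins, not our actual service.
from fastapi import FastAPI, HTTPException
from pydantic import BaseModel

app = FastAPI(title="telemetry-metrics (sketch)")


class SensorMetric(BaseModel):
    sensor_id: str
    avg_temp_f: float
    reading_count: int


# Stand-in for a warehouse query; a real service would read from Snowflake.
_FAKE_STORE = {
    "42": SensorMetric(sensor_id="42", avg_temp_f=38.6, reading_count=1440),
}


@app.get("/metrics/{sensor_id}", response_model=SensorMetric)
def get_sensor_metric(sensor_id: str) -> SensorMetric:
    """Return the aggregate metric for one sensor, or 404 if unknown."""
    metric = _FAKE_STORE.get(sensor_id)
    if metric is None:
        raise HTTPException(status_code=404, detail="unknown sensor")
    return metric
```

Saved as main.py, this runs with `uvicorn main:app --reload` and answers GET /metrics/42.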
Please note that we are unable to provide visa sponsorship for this position. This includes, but is not limited to, work visas, employment-based visas, or residency sponsorship. Candidates must have valid work authorization in the United States at the time of application. Visa applications of any kind will not be considered.
Digi International offers a distinctive Total Rewards package including a short-term incentive program, new hire stock award, paid parental leave, open (uncapped) PTO, and hybrid work environment in addition to our competitive medical, health & wellbeing and compensation offerings.
The anticipated base pay range for this position is $95,000 – $149,000. Pay ranges are determined by role, job level, and primary job location. The range displayed reflects the reasonable range we anticipate paying for this position and reflects the cost of labor within several U.S. geographic markets. The specific salary offered within the range will depend on various factors including, but not limited to, the candidate’s relevant and prior experience, education, skills, and primary work location. It is not typical for an individual to be hired at or near the top of the range for their role, and compensation decisions depend on the facts and circumstances of each position. Pay ranges are typically reviewed and updated annually.
At Digi, we embrace diversity and inclusion among our teammates. It is critical to our success as a global company, and we seek to recruit, develop and retain the most talented people from a diverse candidate pool. We are committed to providing an environment of respect where equal employment opportunities are available to all applicants and teammates.