Wynd Labs

Data Engineer

Remote | $100K-$140K | Mid level

Who We Are.

Wynd Labs is an early-stage startup that is on a mission to make public web data accessible for AI through contributions to Grass.

Grass is a network sharing application that allows users to share their unused bandwidth. Effectively, this is a residential proxy network that directly rewards individual residential IPs for the bandwidth they provide. Grass will route traffic equitably among its network and meter the amount of data that each node provides to fairly distribute rewards.

In non-technical terms: Grass unlocks everyone's ability to earn rewards by simply sharing their unused internet bandwidth on personal devices (laptops, smartphones).

This project is for those who lead with initiative, challenge themselves, and thrive on curiosity.

We operate with a lean, highly motivated team who revel in the responsibility that comes with autonomy. We have a flat organizational structure: the people making decisions are also the ones implementing them. We are driven by ambitious goals and a strong sense of urgency. Leadership is given to those who show initiative, consistently deliver excellence, and bring out the best in those around them. Join us if you want to set the tone for a fair and equitable internet.

The Role.

We are seeking a Data Engineer with expertise in building and maintaining robust data pipelines and integrating scalable infrastructure. You will join a small, talented team and play a critical role in designing and optimizing our data systems, ensuring seamless data flow and accessibility. Your contributions will directly support our mission to position Grass as a key player in the evolution of data-driven innovation on the internet.

Who You Are.

  • Bachelor’s degree in Computer Science, Information Systems, Data Engineering, or a related technical field.
  • Extensive experience with database systems such as Redshift, Snowflake, or similar cloud-based solutions.
  • Advanced proficiency in SQL and experience with optimizing complex queries for performance.
  • Hands-on experience building and managing data pipelines using tools such as Apache Airflow, AWS Glue, or similar technologies.
  • Solid understanding of ETL (Extract, Transform, Load) processes and best practices for data integration.
  • Experience with infrastructure automation tools (e.g., Terraform, CloudFormation) for managing data ecosystems.
  • Knowledge of programming languages such as Python, Scala, or Java for pipeline orchestration and data manipulation.
  • Strong analytical and problem-solving skills, with an ability to troubleshoot and resolve data flow issues.
  • Familiarity with containerization (e.g., Docker) and orchestration (e.g., Kubernetes) technologies for data infrastructure deployment.
  • Collaborative team player with strong communication skills to work with cross-functional teams.
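The pipeline tooling named above follows a common extract-transform-load shape. As a minimal, library-free sketch of that pattern (using Python's built-in sqlite3 in place of a warehouse like Redshift or Snowflake; the record fields and values are hypothetical, not from the posting):

```python
import sqlite3

# Hypothetical raw records, as might be extracted from an upstream source.
raw_events = [
    {"user": "alice", "bytes_shared": "1048576"},
    {"user": "bob", "bytes_shared": "524288"},
    {"user": "alice", "bytes_shared": "262144"},
]

def transform(records):
    """Cast string counts to integers and aggregate bytes shared per user."""
    totals = {}
    for rec in records:
        totals[rec["user"]] = totals.get(rec["user"], 0) + int(rec["bytes_shared"])
    return sorted(totals.items())

def load(rows, conn):
    """Full-refresh load into a reporting table, so reruns are idempotent."""
    conn.execute(
        "CREATE TABLE IF NOT EXISTS usage (user TEXT PRIMARY KEY, total_bytes INTEGER)"
    )
    conn.execute("DELETE FROM usage")  # rerunning the job yields the same result
    conn.executemany("INSERT INTO usage VALUES (?, ?)", rows)
    conn.commit()

conn = sqlite3.connect(":memory:")
load(transform(raw_events), conn)
report = conn.execute("SELECT user, total_bytes FROM usage ORDER BY user").fetchall()
print(report)  # [('alice', 1310720), ('bob', 524288)]
```

In production, an orchestrator such as Apache Airflow would schedule the extract, transform, and load steps as separate tasks with retries and monitoring, rather than running them inline as above.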

What You'll Be Doing.

  • Designing, building, and optimizing scalable data pipelines to process and integrate data from various sources in real-time or batch mode.
  • Developing and managing ETL/ELT workflows to transform raw data into structured formats for analysis and reporting.
  • Integrating and configuring database infrastructure, ensuring performance, scalability, and data security.
  • Automating data workflows and infrastructure setup using tools like Apache Airflow, Terraform, or similar.
  • Collaborating with data scientists, analysts, and other stakeholders to ensure efficient data accessibility and usability.
  • Monitoring, troubleshooting, and improving the performance of data pipelines and infrastructure to ensure data quality and flow consistency.
  • Working with cloud infrastructure (AWS, GCP, Azure) to manage databases, storage, and compute resources efficiently.
  • Implementing best practices for data governance, data security, and disaster recovery in all infrastructure designs.
  • Staying current with the latest trends and technologies in data engineering, pipeline automation, and infrastructure as code.
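The data-quality and monitoring responsibilities above often start with a validation gate that quarantines bad records before load instead of failing the whole batch. A minimal sketch, with field names and rules that are illustrative assumptions rather than anything specified in the posting:

```python
def validate(rows, required=("user", "total_bytes")):
    """Split rows into loadable and quarantined sets.

    A row is quarantined if a required field is missing or the byte
    count is negative; a real pipeline would also log these for review.
    """
    good, bad = [], []
    for row in rows:
        if all(k in row for k in required) and row["total_bytes"] >= 0:
            good.append(row)
        else:
            bad.append(row)
    return good, bad

rows = [
    {"user": "alice", "total_bytes": 100},
    {"user": "bob"},                       # missing field -> quarantined
    {"user": "carol", "total_bytes": -5},  # invalid value -> quarantined
]
good, bad = validate(rows)
print(len(good), len(bad))  # 1 2
```

Routing the bad rows to a quarantine table keeps the pipeline flowing while still surfacing quality issues to the monitoring the bullets describe.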

Why Work With Us.

  • Opportunity. We are at the forefront of developing a web-scale crawler and knowledge graph that allows ordinary people to participate in the process and share in the benefits of AI development.
  • Culture. We’re a lean team working together to achieve a very ambitious goal of improving access to public web data and distributing the value of AI to the people. We prioritize low ego and high output.
  • Compensation. You’ll receive a competitive salary and equity package.

Top Skills

Apache Airflow
AWS
AWS Glue
Azure
CloudFormation
Docker
GCP
Java
Kubernetes
Python
Redshift
Scala
Snowflake
SQL
Terraform


