
Gigster

Data Engineer

Reposted 6 Days Ago
Remote
2 Locations
Mid level

Do you want to work on cutting-edge projects with the world’s best IT engineers? Do you wish you could control which projects to work on and choose your own pay rate? Are you interested in the future of work and how teams will form in the cloud? If so, the Gigster Talent Network is for you.

Our clients rely on our Network in two main areas: Software Development and Cloud Services. In some cases they need help building great new products; in others they want our expertise in migrating, maintaining, and optimizing their cloud solutions.

At Gigster, whether working with entrepreneurs to realize ‘the next great vision’ or with Fortune 500 companies to deliver a big product launch, we build really cool enterprise software on cutting-edge technology.


The Role:

We are seeking an experienced Data Engineer with deep expertise in data transformation at scale, particularly in integrating and processing data from third-party public APIs. This role is critical to enhancing and maintaining data pipelines that feed into Natural Language Processing (NLP) models.


What you’ll do:
  • Design, build, and optimize scalable ETL/ELT data pipelines using Apache Spark, Apache Kafka, and orchestration tools such as Prefect or Airflow

  • Integrate external data sources and public APIs with internal data systems

  • Work with large-scale datasets to support NLP model training and inference

  • Analyze existing pipelines and recommend enhancements for performance, reliability, and scalability

  • Collaborate with cross-functional teams, including data scientists and ML engineers

  • Own the end-to-end engineering process—from planning and technical design to implementation

  • Regularly report progress and outcomes to client stakeholders

 What we’re looking for:
  • Proficiency in Python and experience with data transformation and data engineering best practices

  • Strong experience with Apache Spark, Apache Kafka, and Google Cloud Platform (GCP)

  • Hands-on experience with workflow orchestration tools (e.g., Prefect, Airflow)

  • Demonstrated experience working with large datasets and real-time data processing

  • Experience building and maintaining ETL/ELT pipelines for analytical or machine learning use cases

  • Self-motivated, with excellent communication and project ownership skills


Preferred Qualifications:
  • Familiarity with financial services data or regulated data environments

  • Experience with Snowflake or Google BigQuery

  • Experience with PostgreSQL and GCS (Google Cloud Storage)

  • Exposure to NLP workflows and data requirements for machine learning models


Logistics:
  • This is a part-time, short-term contract lasting 4 to 6 weeks
  • Preferred location: Remote US
 

Top Skills

Airflow
Apache Kafka
Spark
Google BigQuery
Google Cloud Platform
Google Cloud Storage
Postgres
Prefect
Python
Snowflake


