Massive Rocket | Global Braze & Snowflake Agency

Senior Data Engineer (Kafka)

Posted 2 Days Ago
Remote
28 Locations
Senior level
Who We Are
Massive Rocket is a high-growth Braze & Snowflake agency that has made significant strides in connecting digital marketing teams with product and engineering units. Founded just 5 years ago, we have experienced swift growth and are now at a crucial juncture, aspiring to reach $100M in revenue. Our focus is on delivering human experiences at scale, leveraging the latest in web, mobile, cloud, data, and AI technologies. We pride ourselves on innovation and the delivery of cutting-edge digital solutions.

Every role at Massive Rocket is entrepreneurial: successful people here think beyond their own role, understand the roles and goals around them, and contribute to the success and growth of their team, customers, and partners.

What We Offer
🚀 Fast-moving environment – you will never stop learning and growing
❤️ Supportive and positive work culture with an emphasis on our values
🌍 International presence – work with team members in Europe, the US, and around the globe
🪐 100% remote forever
🧗🏼‍♂️ Career progression paths and opportunities for promotion/advancement
🍕 Organized team events and outings

What We’re Looking For
As a Senior Data Engineer (Kafka), you will own our real-time data streaming stack. You will design, build, and operate Kafka-based pipelines (Confluent Cloud or self-managed) that deliver reliable, compliant, and scalable event flows across systems, and you will partner with cross-functional teams to turn business needs into event-driven solutions.

Collaborate with dynamic teams to deliver amazing solutions to our clients. Your keen insight into industry trends and best practices will keep our customer data management at the cutting edge, driving innovation and excellence.

Elevate your career with us and be a catalyst for transformative customer experiences.

Responsibilities

1) Platform & Pipelines
Architect topics, partitions, retention, and compaction; build ingestion with Kafka Connect/Debezium; implement stream processing (Kafka Streams/ksqlDB/Flink/Spark).

2) Data Quality & Governance
Enforce schemas via Schema Registry (Avro/Protobuf), data validation, PII handling, and GDPR/CCPA compliance; implement monitoring and alerting for lag, errors, and anomalies.

3) Operations
Ensure HA, DR, and security (ACLs/RBAC, encryption); manage CI/CD, IaC (Terraform/Helm), cost/throughput tuning.

4) Collaboration
Work with Data/Analytics/Marketing/Product to define events, SLAs, and interfaces; provide best‑practice guidance and enablement.

5) Optimization & Innovation
Continuously reduce latency/cost, improve reliability, evaluate new streaming tools/patterns.



Required Skills and Qualifications:

Data Engineering: 6+ years of experience as a Data Engineer or Data Integration Engineer.
Kafka Expertise: 3+ years of hands-on production experience with Kafka (Confluent Platform/Cloud), including Kafka Connect, Schema Registry, and Kafka Streams/ksqlDB (or alternatives such as Flink/Spark Structured Streaming).
Programming: Proficiency in Python, with experience in building services, tooling, and test frameworks.
Event-Driven Systems: Strong understanding of event modeling, idempotency, DLQs, backpressure handling, and data formats (Avro, Protobuf, JSON).
Communication & Collaboration: Excellent communication skills (English C1 level); proven experience in agency, consulting, or client-facing environments.



Preferred Qualifications:

Cloud & Infrastructure: 2+ years working with AWS, Azure, or GCP; strong knowledge of Docker and Kubernetes; CI/CD pipelines; Git.
Observability & Reliability: Experience with monitoring and observability tools (Prometheus, Grafana, Datadog) and incident response processes.
CI/CD Tools: Familiarity with GitLab for pipeline automation.
Programming: Kotlin for Kafka Streams development (≈20% of workload).
Data Analysis: Working knowledge of Jupyter Notebook for exploratory analysis.
Infrastructure as Code & Operations: Experience with Terraform or CloudFormation for infrastructure setup and management.
Data Stores: Strong understanding of relational databases and big data management practices.
Customer Data Platforms (CDPs): Hands-on experience with mParticle.
Tag Management: Familiarity with Tealium (a strong plus).


If you're ready to launch your career to new heights at a company fuelled by passion and innovation, we want to hear from you!

Top Skills

Avro
Confluent Cloud
Datadog
Docker
Flink
GitLab
Grafana
Jupyter Notebook
Kafka
Kafka Connect
Kafka Streams
ksqlDB
Kubernetes
mParticle
Prometheus
Protobuf
Python
Spark
Tealium
Terraform


