Who we are:
Shape a brighter financial future with us.
Together with our members, we’re changing the way people think about and interact with personal finance.
We’re a next-generation financial services company and national bank using innovative, mobile-first technology to help our millions of members reach their goals. The industry is going through an unprecedented transformation, and we’re at the forefront. We’re proud to come to work every day knowing that what we do has a direct impact on people’s lives, with our core values guiding us every step of the way. Join us to invest in yourself, your career, and the financial world.
The role:
As a Risk Engineering - Senior Data Engineer, you will be a pivotal, hands-on expert within our data team. Leveraging your extensive experience, you'll design, build, and maintain highly scalable and robust data solutions that directly support our independent risk management goals. This role demands deep expertise in modern data warehousing, data modeling, and end-to-end pipeline development, preferably gained working with data products and technologies in an AI applications context. You will be an expert in our core stack: Snowflake, dbt, Apache Airflow, SQL, and Airtable, and you will be responsible for designing, building, and evolving our data infrastructure.
You will be expected to autonomously drive and own projects from inception to completion, proactively identifying and mitigating risks. Your strategic thinking and technical leadership will not only deliver high-impact solutions but also help shape the data strategy for the team and department. You will provide mentorship and technical guidance to other engineers, championing best practices and fostering a culture of technical excellence.
What you’ll do:
- Lead the architecture and implementation of complex data solutions, collaborating with risk analysts and business stakeholders to translate advanced requirements into robust data models and pipelines.
- Design and build highly optimized data workflows on Snowflake, utilizing its advanced features for large-scale financial data processing, cost optimization, and performance tuning.
- Develop, manage, and optimize sophisticated ETL/ELT pipelines using dbt, Python, and Apache Airflow to ensure data integrity and address high-exposure risks within financial datasets.
- Integrate and manage data from various sources, including external data sources, into our core data platform, ensuring seamless and automated data flow for critical business applications.
- Design and configure Airtable-based workflows for complex task orchestration.
- Drive the strategic evolution of our modern data platform, taking full ownership of its design, maintenance, and long-term direction.
- Establish and enforce advanced data governance and quality processes, acting as a primary point of contact for escalations related to data accuracy and compliance within the financial domain.
- Mentor and provide technical leadership to a team of data engineers, guiding them on best practices, complex problem-solving, and career development.
- Conduct expert-level code reviews and champion data engineering best practices across the team.
- Proactively troubleshoot and resolve highly complex data pipeline issues, preempting negative cross-functional impact.
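To give candidates a concrete flavor of the data-integrity work described above, here is a minimal sketch in Python of the kind of row-level quality gate a pipeline task might apply before loading records into the warehouse. This is purely illustrative, not part of the role's actual codebase; all field names (loan_id, balance) are hypothetical.

```python
# Hypothetical data-quality gate of the sort an orchestrated pipeline task
# might run before loading rows downstream. Field names are invented for
# this sketch and do not reflect any real SoFi schema.

def validate_rows(rows):
    """Split rows into (good, bad) lists based on simple quality rules.

    Bad rows are annotated with an "_errors" key listing each rule they
    violated, so they can be routed to a quarantine table for review.
    """
    good, bad = [], []
    for row in rows:
        errors = []
        if not row.get("loan_id"):
            errors.append("missing loan_id")
        balance = row.get("balance")
        if not isinstance(balance, (int, float)) or balance < 0:
            errors.append("invalid balance")
        if errors:
            bad.append({**row, "_errors": errors})
        else:
            good.append(row)
    return good, bad

# Example: one clean row passes, one malformed row is quarantined.
good, bad = validate_rows([
    {"loan_id": "L-1", "balance": 1200.50},
    {"loan_id": "", "balance": -5},
])
```

In practice a check like this would typically live as a dbt test or an Airflow task rather than inline Python, but the separation of clean rows from quarantined rows is the common pattern.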
What you’ll need:
- 4-7 years of hands-on experience in a senior data engineering role, with a strong background in data warehousing and modern data stack architectures.
- Demonstrated expertise in Snowflake, including deep knowledge of its architecture, advanced features, and performance optimization techniques for large datasets.
- Expert-level proficiency with dbt for complex data transformation, dependency management, and building production-grade data models.
- Advanced experience with Apache Airflow for orchestrating and managing mission-critical data workflows.
- Hands-on experience with streaming data platforms, Apache Kafka preferred.
- Experience building and maintaining CI/CD pipelines using tools such as GitHub Actions or GitLab CI/CD.
- Experience with Infrastructure as Code (IaC) using Terraform (preferred), CloudFormation, or Ansible.
- Familiarity with Airtable, specifically in configuring workflows, integrating it as a data source, and automating data synchronization for business operations.
- Mastery of SQL for complex querying, data manipulation, and performance tuning.
- Experience in the banking or financial services industry is highly preferred, with a strong understanding of financial data models, compliance, and regulatory requirements.
- Advanced expertise in data governance, data cataloging, metadata management, and implementing robust data quality frameworks.
- Proven track record of providing technical leadership and mentoring other engineers.
Nice to have:
- Familiarity with data observability tools and practices.
- Knowledge of other cloud-native data platforms (e.g., AWS, GCP).