Abacus Insights is a mission-driven startup technology company focused on improving health outcomes, lowering the cost of healthcare, and delivering a more seamless healthcare experience. At our core, we are passionate about advancing healthcare and improving people’s lives through technology.
Abacus Insights provides a flexible, efficient, and secure platform that organizes and exchanges data from various sources and formats, allowing healthcare companies to uncover differentiated insights that address their customers’ needs. Our employees know that they play an active role in keeping our customers’ data safe and are responsible for ensuring that our comprehensive policies and practices are followed.
With our deep expertise in cloud-enabled technologies and knowledge of the healthcare industry, we have built an innovative data integration and management platform that gives healthcare payers access to data that has historically been siloed and inaccessible. Through our platform, these health insurance payers can ingest and manage all the data they need to transform their business, supporting their analytical, operational, and financial needs.
Through this mission and passion to aid people and population health, we have built a highly successful SaaS business that is heavily funded (since our founding in 2017, we have raised over $53 million) by leading VC firms with deep expertise in the healthcare and technology industries. At Abacus, we are solving problems of massive scale and complexity in an industry that is not only ripe for disruption but also in need of innovation. We see massive growth in our future and would love for you to be a part of it!
Our Connector Engineering team is looking to bring on an experienced Senior Data Engineer. If you are interested in being a guiding voice on a critical feature delivery team, this position is for you. The Connector Engineering team is responsible for delivering the data pipelines that ingest and process diverse sources of data, including building large batch processing and streaming systems. You will be exposed to every area of our platform and our AWS services (Serverless – Lambda, EMR – Hadoop/Spark, EKS – Kubernetes, etc.) and, more importantly, help drive the evolution of the product and team as we continue to grow rapidly. We believe strongly in evolving our architecture, and you will bring your experience with best practices as we build out new components of the platform. In this role, you will:
- Help deliver against our roadmap for upcoming new features
- Design, implement, test, and maintain our data pipelines
- Optimize data delivery, and design or redesign infrastructure for greater scalability, availability, and reliability
- Spearhead the implementation of data management and governance procedures within the company
- Ensure data integrity and accuracy within our various data pipelines
- Become an expert in AWS and other cloud service providers (currently we deploy on AWS, but who knows what the future holds)
- Help make the deployment process predictable through proactive thinking, persistence, promotion of best practices, and cross-team collaboration with our technical and non-technical teams
- Consult on architecture to separate sensitive healthcare data among multiple customers in a multi-tenant environment
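As a purely illustrative sketch of that last responsibility (every name below is hypothetical, not part of the Abacus platform), tenant separation in a multi-tenant data lake is often enforced with a per-tenant storage prefix, with application code rejecting any access outside the caller's prefix; in practice this check is typically backed by per-tenant IAM policies on the storage prefixes as well:

```python
# Hypothetical prefix-per-tenant isolation check for a multi-tenant data lake.
# Real deployments would pair this with per-tenant IAM policies, not rely on
# application code alone.

def tenant_prefix(tenant_id: str) -> str:
    """Each tenant's data lives under its own top-level object-store prefix."""
    if not tenant_id.isalnum():
        raise ValueError(f"invalid tenant id: {tenant_id!r}")
    return f"tenants/{tenant_id}/"


def assert_tenant_access(tenant_id: str, key: str) -> None:
    """Reject any object key outside the caller's tenant prefix."""
    # Refuse path traversal and any key that escapes the tenant's prefix.
    if ".." in key or not key.startswith(tenant_prefix(tenant_id)):
        raise PermissionError(f"tenant {tenant_id!r} may not access {key!r}")


# A tenant reading its own data passes the check silently.
assert_tenant_access("acme", "tenants/acme/claims/2024/part-0.parquet")
```

The design choice being illustrated: isolation lives in the key layout itself, so every pipeline stage can cheaply verify it owns the data it is touching.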
What you bring to the Abacus Team:
- In-depth knowledge of the design and maintainability of “big data” architectures, notably data lakes, columnar databases, large batch processing (Spark, Hadoop), stream processing (Kafka), and message queuing
- In-depth experience with AWS data services (EMR, Glue, Athena, Redshift, Kinesis, SQS, DynamoDB or MongoDB), serverless architectures (Lambda) and container architectures (Kubernetes)
- In-depth experience building and shipping highly scalable and distributed systems on cloud platforms
- Strong programming skills (Python, Java, or other OOP languages)
- Strong experience with hands-on data analysis and modeling of large data sets
- Strong project management and organizational skills
- Experience performing root cause analysis on internal and external data and processes to answer specific business questions and identify opportunities for improvement
- Experience coupled with a willingness to learn new things
- Desire to contribute individually, with the ability both to self-direct work and to collaborate with others on the technology team
- Familiarity with infrastructure automation using Terraform and GitLab CI/CD
- Comfort with uncertainty and a desire to be close to business problems
- Background working in an Agile delivery framework
- 5+ years of experience with data engineering and “big data” architecture
- Experience with other cloud data warehouses (BigQuery, Snowflake)