*This role can be remote.*
Agero is powering the next generation of software-enabled driver safety services and technology, pushing the limits of big data to transform the entire driving experience. The majority of leading vehicle manufacturers and insurance providers use Agero’s roadside assistance, accident management, dispatch, consumer affairs and telematics innovations to strengthen their businesses and create stronger, lasting connections with their customers. Together, we’re making driving smarter and safer for everyone.
The Data Science and Analytics group at Agero is a central resource for innovative data products, scientific analysis, and actionable insights. We are a collaborative, consultative team that works cross-functionally to:
- support partners throughout the organization in making informed, data-driven decisions,
- unlock the value within our data to create innovative new product offerings, drive efficiency, and improve customer experience, and
- provide greater access to information and insights through dashboards, data self-service tools, and training.
We believe that data is a key asset and that, thanks to Agero’s scale and history, it is a true competitive advantage.
About the Role:
Agero’s Data Science and Analytics team is building a new high-performance, cloud-native data platform to support our analytical products and machine learning pipelines. We are looking for an exceptional DevOps Engineer to join the team to help develop and manage deployment processes for the platform itself, and the various data intensive applications that the team produces.
Our applications move large volumes of complex data and involve a wide variety of underlying technologies. The ideal candidate is someone who loves to be challenged and to learn, and who is driven to improve the efficiency of the development team they work with. We are an agile, results-driven group focused on delivering impactful products. We highly value experimentation, adaptability, curiosity, and critical thought.
The platform is built on AWS with Snowflake as the data warehouse. We rely heavily on Python and use Airflow to manage a variety of complex workflows. The platform provides the company as a whole with secure, centralized, and reliable access to all enterprise data. At the same time, it reduces effort required to ingest new sources of data or build workflows to support new modeling or reporting use cases. The team is also responsible for a wide variety of applications built on top of the platform including APIs, reporting, and machine learning products that are used throughout the business.
The primary responsibility of the role will be to design and implement automated processes to support various Data Science, Data Engineering, and Analytics products. This includes CI/CD processes for APIs, Airflow-based ETL workflows, and machine learning applications; automation of infrastructure provisioning and management; and management of security practices.
- Develop CI/CD processes and tooling for Airflow and API deployments
- Automate, manage, and test the provisioning of cloud infrastructure
- Develop processes and tooling for security practices
- Help to create testing framework for key applications
- Ensure applications remain highly available and resilient to outages
- Address performance issues as they relate to infrastructure capacity planning
- Help to create automated data validation processes
- Develop processes for securely and reliably managing data sets
- Help to automate migration of reporting and end-user applications from our legacy analytics platform to the new platform
- Define and implement the software development lifecycle for the delivery of high-performance analytics APIs
- Develop APIs that expose data to other internal applications
Skills, Experience and Education:
- Experience developing and managing DevOps processes and tooling in a team setting
- 5+ years of coding experience with expertise in Python and SQL
- Experience with Docker, Kubernetes, and Serverless
- Cloud computing experience (ideally AWS technologies including S3, Lambda, DynamoDB, and API Gateway)
- Excellent communication skills both in written (technical documents, Python notebooks) and spoken (meetings, presentations) forms
- Willing and able to learn and meet business needs
- Independent, self-organizing, and able to prioritize multiple complex assignments
- Experience using Git and working on shared code repositories
- B.S. in Computer Science, Engineering or related technical discipline required
- API development
- Apache Airflow
Agero’s mission is to safeguard consumers on the road through a unique combination of platform intelligence and human powered solutions, strengthening our clients’ relationships with their drivers. We are a leading provider of driving solutions, including roadside assistance, accident management, consumer affairs and telematics. The company protects 115 million vehicles in partnership with leading automobile manufacturers, insurance carriers and other diversified clients. Managing one of the largest national networks of service providers, Agero responds to more than 12 million requests annually for assistance. Agero, a member company of The Cross Country Group, is headquartered in Medford, Mass., with operations throughout North America. To learn more, visit www.agero.com and follow on Twitter @AgeroNews.