Cogito is growing our Data Engineering team as we continue to improve how enterprise clients speak with their customers. As a Senior Software Engineer, you will contribute engineering expertise to the development, testing, and delivery of large-scale data assets and workflows. You have industry experience working with large datasets and an interest in reporting platforms and data visualization.
You’ll be the genius who understands data at Cogito, knows where to find it, and manages the process of making that data actionable for analytics. You love thinking about the ways the business can consume this data, then figuring out how to build the data pipelines and data models that provide it for business use.
You'll own a problem end-to-end, so those skills will come in handy not just to collect, extract, clean, and transform data, but also to understand the systems that generated it and to automate your analyses and reporting. On an ongoing basis, you'll be responsible for improving the data by adding new sources, coding business rules, and producing new metrics that support the business.
Responsibilities
- Design, develop, test, and deliver industry-strength data pipelines that collect, organize, and standardize data for generating insights and addressing reporting needs, including both batch and real-time data flows.
- Design and develop new capabilities to allow for self-serve data and analytics.
- Collaborate with peers, ship quality products and help foster a great culture that is both high-performing and fun to be a part of.
Qualifications
- BS in Computer Science or a related field.
- 5+ years of progressively responsible, related experience, or an equivalent combination of education and work experience.
- Experience with and understanding of AWS and the broader public cloud ecosystem.
- Strong programming experience in Java, Python, or an equivalent language, and the ability to learn new languages as required.
- Strong SQL and NoSQL skills.
- Strong verbal and written communication skills.
- Superior attention to detail.
- Ability and willingness to learn new tools and technologies.
- Experience building and architecting solutions with public cloud offerings such as AWS S3, Lambda, DynamoDB, Redshift, EMR/Spark, and Presto/Athena or similar query engines.
- Deep understanding of big data challenges and ecosystems.
- Experience with Serverless architecture or other Big Data architecture best practices.
- Experience with a streaming platform such as Kinesis, Kafka, Spark Streaming, Flink, etc.
- Experience with PostgreSQL and/or Aurora.
- Working understanding of container technologies such as Docker.
- Familiarity with agile development methodologies.
- Experience with CI/CD pipelines, processes, and automation.
- Experience in test automation and ensuring data quality across multiple datasets used for analytical purposes.
- Fluency with the Unix/Linux operating system and development environment.
Benefits
- Comprehensive health, dental, vision, disability, and life insurance options for you and your family
- Frequent catered lunches and live product demos
- 401(k) retirement plan options
- Ongoing professional development and cross-training
- 20 days vacation time, 5 days sick time, 2 floating holidays, and 11 company holidays (yes, Patriots' Day is a holiday)
- 2 "Be Gentle" personal days
- Company-paid parental leave upon hire
- Competitive pay, stock options, and annual bonus eligibility
- Casual dress and a fun office atmosphere
- Pre-tax commuter benefits
- Stocked groceries in the kitchen
- Office location in the heart of Boston, convenient to the MBTA lines
Equal Opportunity Employer
Cogito is a proud equal opportunity employer. We are committed to fair hiring practices and to creating a welcoming environment for all team members. All qualified applicants will receive consideration for employment without regard to race, color, religion, gender, gender identity or expression, sexual orientation, national origin, disability, age, familial status or veteran status.
Authorization to Work
Applicants for employment in the US must be authorized to work in the US.