Data Modeler (Analytical Systems) at MassMutual
At MassMutual, we’re passionate about helping millions of people find financial freedom, and this passion drives our approach to developing meaningful experiences for our customers. The Data Engineering team, part of MassMutual’s Enterprise Technology & Experience organization, comprises highly skilled, collaborative problem solvers who are motivated to create innovative solutions that exceed the changing needs of our customers and move MassMutual, and the industry, forward.
To continue our cutting-edge work, we are hiring a Data Modeler to join our team.
What great looks like in this role
- Our ideal Data Modeler has advanced knowledge of application, data, and infrastructure disciplines. You’ll use your skills to implement data strategies, build data flows, and develop conceptual data models. You’re capable of working independently and communicating effectively to provide feedback on policies, procedures, processes, and standards. The collaborative, cross-functional team culture, the opportunity to work with new technologies, and the work/life balance MassMutual provides are core reasons people enjoy working on the Data Engineering team.
Objectives of this role
- Manage database design and data models for a specific application.
- Analyze existing data and recommend changes that lead to a reduction of stored data without impacting business requirements.
Daily and monthly responsibilities
- Perform as a technical data steward – understanding tables, data use, data replication, data connection, and data lineage – and take accountability for data hygiene.
- Review existing data models and recommend changes to optimize data requirements.
- Analyze source system data and look for data redundancy and duplication.
- Work with Application Architects to review existing data models and suggest actions that reduce input data for storage optimization.
- Work with the development team to implement data strategies, build data flows, and develop conceptual data models.
- Create logical and physical data models using best practices.
- Deliver and provide feedback on data modeling policies, procedures, processes, and standards.
- Assist with capturing and documenting system flows and other pertinent technical information about data, database design, or systems.
Basic qualifications
- Bachelor’s degree in computer engineering, computer science, information systems, or a related field.
- 5+ years of experience with data analytics, data modeling, or data architecture and database design.
- Experience in data modeling or data architecture in transactional and operational reporting and analytical (EDW, Data Lake, NoSQL) solutions.
- Experience with capacity planning, database scripting, and package deployment.
- Good knowledge of data replication methodologies.
- Good knowledge of data warehouses, data marts, and data lakes.
- Experience with AWS and cloud-based databases and data warehouses.
- Authorized to work in the United States with or without sponsorship now or in the future.
Ideal qualifications
- Advanced knowledge of application, data, and infrastructure disciplines.
- Experience with Vertica Database.
- Experience with life insurance or other insurance-related products.
- Expertise in tuning and debugging SQL and resolving application specific bottlenecks.
- Experience facilitating meetings and providing presentations to stakeholders and senior leaders.
- Experience using a data modeling tool such as Erwin or TOAD.
- Expertise in data modeling, data warehousing, dimensional modeling, data modeling for big data, and metadata management.
- Data lake and big data modeling experience.
- Excellent communication, problem-solving, organizational, and analytical skills.
- Understanding of enterprise and reporting modeling concepts, including dimensional modeling, snowflake schemas, slowly changing dimensions, schema-on-read, irregular dimensions, and surrogate, compound, and intelligent keys.
- Advanced degree in computer engineering, computer science, information systems, or a related field.
- Able to work independently.
- Experience with Hadoop, Spark, and Kafka.
- Experience with Scala.
- Strong communication and interpersonal skills.
- Ability to present design of the solution to various stakeholders.
- Experience in data modeling and design in the insurance and financial industries.