Sr Database Engineer/Architect
Headquartered in the Boston area, Interactions, LLC is the world’s largest independent AI company. We operate at the intersection of customer experience and AI, two of today’s most innovative and dynamic industries. Leading global brands in a variety of industries rely on Interactions’ conversational AI technology to communicate with their customers every day.
At Interactions we are committed to transforming customer experience and passionate about the professional and personal development of our talented and enthusiastic team. We endeavor to create opportunities that advance the skills, interests, careers and lives of our employees. Come join our growing team!
Position Overview:
The Senior Database Engineer/Architect is responsible for designing, implementing, maintaining and managing the big data infrastructure for our Virtual Assistant Platforms.
Essential Job Functions:
- Design, implement, and install distributed big data infrastructure for high-volume/high-velocity multi-tiered data storage, high availability, and fault tolerance.
- Install and maintain clustered big data environments based on Hadoop and related technologies.
- Design, construct, implement and support data warehouse databases for optimal reporting and analytics.
- Optimize database operations for maximum performance.
- Work with engineering teams on near-real-time data pipelines and streaming technologies.
- Work with the Operations team to perform routine and periodic database maintenance, and develop and implement automated maintenance, compliance, and archival strategies.
- Design and implement security strategies to ensure role-based access, auditing, and authenticated access to data for internal users, external users, and automated tools.
- Maintain the database infrastructure in multiple locations to handle the high throughput use cases that are critical to the Platform.
Other Duties and Responsibilities:
Ability to demonstrate Interactions Values of:
- Being passionate about customer service
- Obsessing over our customers’ success
- Respecting each other
- Creating opportunity
- Embracing disruption
- Doing what we say we will do
Preparation, Knowledge, Skills and Abilities:
Required:
- Bachelor’s Degree in Computer Science or equivalent.
- 10+ years of relevant experience.
- Solid understanding of query optimization of database technologies built on top of Hadoop and related technologies.
- Experience delivering data ingestion / ETL solutions at scale including logging, monitoring, debugging, and security.
- Experience with distributed processing technologies such as MapReduce and Spark, including internals like scheduling and resource management.
- Experience in data modeling and schema design.
- Practical experience implementing data warehouse architectures.
- Experience with data warehouse front-end and reporting tools.
- Experience with database sizing, server specification, and network architecture specification.
- Virtualization and data migration experience.
- Experience designing architectures for high availability, redundancy, and fault tolerance.
- Basic systems administration skills with Linux-based systems.
- Scripting experience with Bash, Perl, or Python.
Pluses:
- Expertise with Hadoop ecosystem internals (HDFS, Hive, HBase, Oozie, Pig, Sqoop, etc.), including storage, tuning, and replication.
- Experience managing Hadoop clusters, including provisioning new nodes, managing alerts, tuning performance, and managing security.
- Prior experience managing big data platforms in both colocation and public cloud environments.
- Prior experience with PostgreSQL 9.x.