DESCRIPTION SUMMARY:
This position is responsible for architecting solutions centered on real-time data processing, with a focus on complex applications, new algorithms, performance, integration of existing third-party algorithms, and development of new visualizations. The Big Data Architect performs extensive deep dives into a wide range of Big Data and traditional BI technologies. The Big Data Architect reviews, evaluates, and recommends architecture strategy, and troubleshoots solutions.
ESSENTIAL FUNCTIONS:
- Drives the development of a Big Data and predictive analytics solution architecture; serves as the technical architect and point person for the Big Data platform
- Provides pre-sales consultation to Product, Analyst, Customer Experience, Sales, and other business teams
- Plays a hands-on role in architecting the physical and logical topology of the Big Data platform
- Effectively communicates solution architecture to internal customer and project teams
- Understands business objectives and suggests technical strategies to meet those objectives
- Serves as a senior technical contributor on multiple Big Data projects; assigns tasks to junior engineers, oversees their execution, and provides mentorship and guidance as needed
- Contributes to business requirement definition and use case design as a technical expert. Converts business requirements into architectural designs and detailed technical designs
- Designs, architects and builds a data platform over Big Data Technologies
- Leads innovation by exploring, investigating, recommending, benchmarking and implementing data centric technologies for the platform
- Plans and executes large-scale Hadoop/Hive deployments and fine-tunes the performance of the predictive analytics solution
- Identifies tasks, effort, and dependencies based on software architecture and specifications
- Translates designs and specifications into software components and writes software code/components using Java, Python, and Perl
- Guides performance testing and recommends solutions for any performance bottlenecks
REQUIREMENTS:
EDUCATION:
Bachelor's degree in Computer Science, Engineering, or Mathematics and 8 years of experience working with database and backend server technologies, OR a Master's degree in Computer Science, Engineering, or Mathematics and 5 years of experience working with database and backend server technologies.
EXPERIENCE:
- 3 years with Big Data technologies
- 2 years developing with Apache Spark and Docker containers
- 1 year developing with the AWS stack, including AWS Lambda
- 1 year developing with Mesos DC/OS
- 2 years developing with DynamoDB and MongoDB
- 3 years developing with HDFS and EMR
- 5 years developing data-intensive applications using SQL
- 5 years developing backend server applications in Java