
Abbott

Staff Data Engineer

Posted 6 Days Ago
Remote
Hiring Remotely in United States of America
97K-195K Annually
Senior level

Abbott is a global healthcare leader that helps people live more fully at all stages of life. Our portfolio of life-changing technologies spans the spectrum of healthcare, with leading businesses and products in diagnostics, medical devices, nutritionals and branded generic medicines. Our 114,000 colleagues serve people in more than 160 countries.


JOB DESCRIPTION:

Are you interested in applying your technical knowledge and experience to the medical field and improving the lives of people with diabetes? The candidate will be responsible for big data engineering, data wrangling, and data analysis in the cloud. The role will also contribute to defining and implementing the organization's big data strategy and drive the implementation of IT solutions for the business. The candidate will work with other data engineers, data analysts, and data scientists, applying data engineering, data science, and machine learning approaches to solve business problems.
As a senior member of the Data Engineering & Analytics team, you will build big data collection and analytics capabilities to uncover customer, product, and operational insights. The candidate should be able to work on a geographically distributed team to develop data pipelines that handle complex data sets quickly and securely, and to operationalize data science solutions. They will work in a technology-driven environment using the latest tools and techniques, such as Databricks, Redshift, S3, Lambda, DynamoDB, Spark, and Python.

The candidate should have a passion for software engineering and help shape the direction of the team. Highly sought-after qualities include versatility and a desire to continuously learn, improve, and empower other team members. The candidate will help build scalable, highly available, efficient, and secure software solutions for big data initiatives.

Responsibilities

  • Design and implement data pipelines whose output can be processed and visualized across a variety of projects and initiatives
  • Develop and maintain optimal data pipeline architecture by designing and implementing data ingestion solutions on AWS using AWS-native services
  • Design and optimize data models on the AWS Cloud using Databricks and AWS data stores such as Redshift, RDS, and S3
  • Integrate and assemble large, complex data sets that meet a broad range of business requirements
  • Read, extract, transform, stage, and load data to selected tools and frameworks as required and requested
  • Customize and manage integration tools, databases, warehouses, and analytical systems
  • Process unstructured data into a form suitable for analysis, and assist in analyzing the processed data
  • Work directly with the technology and engineering teams to integrate data processing and business objectives
  • Monitor and optimize data performance, uptime, and scale; maintain high standards of code quality and thoughtful design
  • Create software architecture and design documentation for the supported solutions, along with overall best practices and patterns
  • Support the team with technical planning, design, and code reviews, including peer code reviews
  • Provide architecture and technical-knowledge training and support for the solution groups
  • Develop good working relationships with other solution teams and groups, such as Engineering, Marketing, Product, Test, and QA
  • Stay current with emerging trends, making recommendations as needed to help the organization innovate
  • Proactively plan complex projects from scope and timeline development through technical design and execution
  • Demonstrate leadership by mentoring other team members
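The responsibilities above center on turning raw, unstructured data into analysis-ready records. As an illustrative sketch only (the event schema and field names below are hypothetical, not Abbott's), that kind of wrangling step might look like:

```python
import json

def parse_events(raw_lines):
    """Parse raw JSON-lines event data into flat records, skipping malformed rows."""
    records = []
    for line in raw_lines:
        try:
            event = json.loads(line)
        except json.JSONDecodeError:
            continue  # skip malformed rows rather than failing the whole batch
        # Flatten the nested event into a warehouse-friendly row
        records.append({
            "user_id": event.get("user"),
            "glucose_mgdl": event.get("reading", {}).get("value"),
            "ts": event.get("timestamp"),
        })
    return records

raw = [
    '{"user": "u1", "reading": {"value": 105}, "timestamp": "2024-01-01T00:00:00Z"}',
    'not valid json',
]
structured = parse_events(raw)
```

In practice, a transformation like this would run at scale in PySpark on Databricks and land in a store such as S3 or Redshift, per the tooling listed above.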

Required Qualifications

  • Bachelor's degree in Computer Science, Information Technology, or another relevant field
  • 5 to 10 years of recent experience in software engineering, data engineering, or big data
  • Ability to work effectively within a team in a fast-paced, changing environment
  • Knowledge of or direct experience with Databricks and/or Spark
  • Software development experience, ideally in Python, PySpark, Kafka, or Go, and a willingness to learn new software development languages to meet goals and objectives
  • Knowledge of strategies for processing large amounts of structured and unstructured data, including integrating data from multiple sources
  • Knowledge of data cleaning, wrangling, visualization, and reporting
  • Ability to explore new alternatives or options to solve data-mining issues, drawing on a combination of industry best practices, data innovations, and experience
  • Familiarity with databases, BI applications, data quality, and performance tuning
  • Excellent written, verbal, and listening communication skills
  • Comfortable working asynchronously with a distributed team

Preferred Qualifications

  • Knowledge of or direct experience with the following AWS services: S3, RDS, Redshift, DynamoDB, EMR, Glue, and Lambda
  • Experience working in an agile environment
  • Practical knowledge of Linux


The base pay for this position is $97,300.00 – $194,700.00. In specific locations, the pay range may vary from the range posted.

JOB FAMILY: Product Development

DIVISION: ADC Diabetes Care

LOCATION: United States of America : Remote

ADDITIONAL LOCATIONS:

WORK SHIFT: Standard

TRAVEL: Yes, 5% of the time

MEDICAL SURVEILLANCE: No

SIGNIFICANT WORK ACTIVITIES: Continuous sitting for prolonged periods (more than 2 consecutive hours in an 8-hour day); keyboard use (greater than or equal to 50% of the workday)

Abbott is an Equal Opportunity Employer of Minorities/Women/Individuals with Disabilities/Protected Veterans.

EEO is the Law link - English: http://webstorage.abbott.com/common/External/EEO_English.pdf

EEO is the Law link - Español: http://webstorage.abbott.com/common/External/EEO_Spanish.pdf

Top Skills

AWS
Databricks
DynamoDB
Go
Kafka
Lambda
PySpark
Python
Redshift
S3
Spark
