Greater Boston Area
Organize, clean and analyze raw data sets to continually test hypotheses.
Plan, organize and build infrastructure that supports the dynamic environment of a young company.
Create and maintain ETL pipelines to and from a variety of sources and destinations. Familiarity with, or interest in, tools such as Spark, Dask, Hadoop, Airflow, Kafka, Alooma, or AWS Data Pipeline is desired.
Follow best-practice engineering standards, such as architectural design, unit testing and test-driven development.
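The ETL responsibility above can be sketched in miniature. This is a hedged, toolkit-agnostic example using only the Python standard library; the names (`extract`, `transform`, `load`, the `signups` table, and the sample CSV) are illustrative assumptions, and the posting's actual stack (Spark, Airflow, Kafka, etc.) is not assumed:

```python
import csv
import io
import sqlite3

# Minimal ETL sketch: extract rows from a CSV source, clean them,
# and load them into a SQLite destination. In production this logic
# would typically live in an orchestrated pipeline (e.g. an Airflow
# task or Spark job), but the extract/transform/load shape is the same.

# Hypothetical raw input; row 2 is incomplete and row 3 needs normalizing.
RAW_CSV = """user_id,signup_date,plan
1,2023-01-05,pro
2,,free
3,2023-02-11,PRO
"""

def extract(source: str) -> list[dict]:
    """Read raw records from a CSV text source."""
    return list(csv.DictReader(io.StringIO(source)))

def transform(rows: list[dict]) -> list[dict]:
    """Drop incomplete rows and normalize the plan field."""
    return [
        {**row, "plan": row["plan"].lower()}
        for row in rows
        if row["signup_date"]  # discard rows missing a signup date
    ]

def load(rows: list[dict], conn: sqlite3.Connection) -> None:
    """Write cleaned rows to the destination table."""
    conn.execute(
        "CREATE TABLE IF NOT EXISTS signups"
        " (user_id TEXT, signup_date TEXT, plan TEXT)"
    )
    conn.executemany(
        "INSERT INTO signups VALUES (:user_id, :signup_date, :plan)", rows
    )

conn = sqlite3.connect(":memory:")
load(transform(extract(RAW_CSV)), conn)
count = conn.execute("SELECT COUNT(*) FROM signups").fetchone()[0]
print(count)  # 2 rows survive cleaning
```

Keeping the three stages as separate functions makes each one unit-testable in isolation, which is what the testing standard in the posting is asking for.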