Job Details
Location:
Redwood City, San Mateo County, California, USA
Posted:
Feb 06, 2020
Job Description
Shutterfly is seeking an experienced Data Engineer with software engineering skills to join the Data Warehouse Development team. You will own, manage, and drive end-to-end solutions and data infrastructure. You will work with analytics and business partners to deliver data solutions that support insights and analysis for a multi-million-customer e-commerce business, drawing on both internal and external data. If you like applying your expertise in data-warehousing concepts, CS fundamentals, and data and system architecture to multi-terabyte, multi-source data, come join the Shutterfly Data Warehouse Development team!
Responsibilities
- Build data expertise and own data quality for the pipelines you build
- Architect, build and launch new data models and data marts that provide intuitive analytics to your customers
- Design, build, and launch efficient and reliable data pipelines to move data (both large and small volumes) into and out of the Shutterfly Data Warehouse
- Design and develop new systems and tools to enable folks to consume and understand data faster
- Use your coding skills across a number of languages including Python and Java
- Have a clear understanding of the reports/analyses/insights to be driven by data and build data solutions to optimally support the analytics needs
- Integrate third party data to enrich our data environment and enable new analytic perspectives
- Become fully immersed in the context of Business Development and Partner business initiatives
- Work across multiple teams in high visibility roles and own solutions end-to-end
- Work with program managers, business partners, and other engineers to develop and prioritize project plans
Qualifications
- 5+ years of experience implementing big data business solutions at production scale
- Master's degree in Computer Science or a related field
- Expert knowledge of SQL
- History of building, maintaining, and automating reliable and efficient ETL/ELT jobs
- Strong CS fundamentals and experience developing with object-oriented programming (Python, Java)
- Expertise with dimensional warehouse data models (star, snowflake schemas)
- Experience working with various AWS services such as S3, EMR, EC2/ECS, and Athena
- Experience with Cloud Data Warehouses, such as Amazon Redshift
- Understanding of Hadoop and Spark, including manipulating data with Hive, Python, or Java
- Experience scaling data pipelines built on EMR is preferred
- Understanding of streaming technologies and concepts used with data warehouses is preferred
- Understanding of automation and orchestration platforms such as Airflow
- Experience with multi-terabyte MPP relational databases, such as Teradata, and related concepts is preferred