Data Infrastructure Engineer

San Mateo, CA, USA

Snowflake Inc.

Posted 1 month ago

There is only one Data Cloud. Snowflake’s founders started from scratch and designed a data platform built for the cloud that is effective, affordable, and accessible to all data users. But it didn’t stop there. They engineered Snowflake to power the Data Cloud, where thousands of organizations unlock the value of their data with near-unlimited scale, concurrency, and performance. This is our vision: a world with endless insights to tackle the challenges and opportunities of today and reveal the possibilities of tomorrow.

As a Data Infrastructure Engineer, you will be responsible for the architecture and reliability of the data infrastructure tooling used for analytics and machine learning across Snowflake Services. Our infrastructure stores and processes data for our most critical business operations, so reliability and performance are our highest priorities.

RESPONSIBILITIES:

  • Collaborate cross-functionally to develop, test, deploy, and scale new solutions.
  • Design, implement, test, and deploy the tools that allow other members of data science and analytics teams to easily write and run effective data pipelines.
  • Develop best practices around observability and data processing, and implement the changes to make those practices a reality.
  • Implement observability systems to track data quality and consistency.

MINIMUM QUALIFICATIONS:

  • Bachelor's degree in Computer Science, a related technical field involving software engineering, or equivalent practical experience.
  • Experience programming in at least one of the following languages: Java, Scala, or Python.
  • Experience working with databases and SQL, and with distributed big data infrastructure such as Presto, Airflow, Hadoop, Spark, and Snowflake.
  • Systematic problem-solving skills and effective communication.

PREFERRED QUALIFICATIONS:

  • Experience working cross-functionally to establish an overall data architecture for a company's needs, building data pipelines, and establishing best data practices.
  • Experience with cloud environments (e.g. AWS, Azure, or GCP) or resource management systems (e.g. Kubernetes).
  • Experience improving efficiency, scalability, and stability of data systems.
  • Experience with alerting, monitoring, and remediation automation in a large-scale distributed environment.

Snowflake is growing fast, and we’re scaling our team to help enable and accelerate our growth. We are looking for people who share our values, challenge ordinary thinking, and push the pace of innovation while building a future for themselves and Snowflake. 

How do you want to make your impact?

Job tags: Airflow AWS Azure GCP Hadoop Java Kubernetes Python Scala Spark SQL
Job region(s): North America