Data Engineer - Cloud Infrastructure Operations

Irvine, California, United States

Full Time
Twilio

Posted 3 weeks ago

Because you belong at Twilio

The Who, What, Why and Where

Twilio is growing rapidly and seeking a Data Engineer to be a key member of the Enterprise Data & Business Intelligence organization, with a focus on cloud infrastructure data engineering services, based in Irvine, California. You will partner with other engineers and product managers to translate data needs into critical information that can be used to implement scalable data platforms and self-service tools. We are looking for someone who is passionate about solving problems using engineering and data, thrives in an evolving environment, brings an enthusiastic and collaborative attitude, and delights in making a difference. As a successful candidate, you must have a deep background in data engineering and a proven record of solving data problems at scale using distributed data systems. You are a self-starter, embody a growth mindset, and can collaborate effectively across the entire Twilio organization.

Who?

Twilio is looking for an exceptional individual who lives the Twilio Magic and has a demonstrated track record of conceiving and delivering data warehouse solutions at company-wide scale.

  • 5-7 years of data engineering experience in a fast-paced company that delivers software
  • Knowledge of all phases of software development including requirements analysis, design, coding, testing, debugging, implementation, and support. 
  • Experience with AWS services such as EC2, and cloud cost-management tools such as CloudHealth
  • Experience with at least one programming language such as Java, Scala, or Python
  • Experience with at least one distributed data system such as Kafka, Spark, Hive, or Presto
  • Experience working with file and table formats such as Parquet, Avro, and Hudi for large volumes of data
  • Experience with modern data warehouse systems such as Redshift or Snowflake is a plus
  • Experience building data solutions in a full-stack environment using microservices principles
  • Ability to work independently to find answers and solutions
  • Strong understanding of engineering best practices and design principles
  • Experience working in an agile environment with iterative development
  • Collaborative mindset and ability to work with distributed, cross-functional teams
  • Solid communication skills and the ability to clearly articulate your point of view
  • Bachelor's or Master's degree in Computer Science, or equivalent experience, required

What?

As a Data Engineer, you will live the Twilio Magic and:

BE AN OWNER 

  • Design and implement data management services for data trust, data compliance, data access, and metadata management in the form of scalable, configurable services, while clearly articulating the technical rationale behind your design and implementation choices
  • Participate in Agile/Scrum activities including planning, standups, and retrospectives, and provide a point of view on user stories

WEAR THE CUSTOMER’S SHOES

  • Partner with data architects, product managers and other engineers to ensure they have the right information about our services and platforms while ensuring happy customers.
  • Listen to your customers’ challenges, identify opportunities, craft solutions, and deliver the right value at the right time.

WRITE IT DOWN

  • Demonstrate excellent verbal and written communication: ensure that complex ideas, thoughts, and vision can be communicated simply and effectively. You are expected to thrive in a highly collaborative environment.

DRAW THE OWL 

  • You’ll build highly scalable platforms and services that support rapidly growing data needs in Twilio. There’s no instruction book, it’s yours to write. You’ll figure it out, ship it, and iterate. You’ll invent the future, but you won’t wing it.

Why?

The Enterprise Data and Business Intelligence team is a central organization within Twilio that provides data infrastructure and related services in the form of a data lake, data warehouse, business intelligence, and data governance, supporting long-term growth and sustainability. Our mission is to enable fact-based decision making by providing clean, governed, accurate data in scalable, easy-to-use systems in a timely manner. We play an integral role in shaping the business decisions that enable the company's growth and success, and we are the backbone of Twilio's data-driven culture.

Twilio is a company that is empowering the world’s developers with modern communication in order to build better applications. Twilio is truly unique; we are a company committed to your growth, your learning, your development, and your entire employee experience. We only win when our employees succeed and we're dedicated to helping you develop your strengths. We have a cultural foundation built on diversity, inclusion, and innovation and we want you and your ideas to thrive at Twilio.

Where?

This position will be located in our office in Irvine, California. You will enjoy our office perks: catered meals, snacks, a game room, ergonomic desks, massages, bi-weekly All Hands, and more. You will also experience a company that believes in small teams for maximum impact; seeks well-rounded talent to ensure a full perspective on our customers' experience; understands that this is a marathon, not a sprint; and continuously and purposefully builds an inclusive culture where everyone is able to do and be the best version of themselves.

About Us

Millions of developers around the world have used Twilio to unlock the magic of communications to improve any human experience. Twilio has democratized communications channels like voice, text, chat, video and email by virtualizing the world’s communications infrastructure through APIs that are simple enough for any developer to use, yet robust enough to power the world’s most demanding applications. By making communications a part of every software developer’s toolkit, Twilio is enabling innovators across every industry — from emerging leaders to the world’s largest organizations — to reinvent how companies engage with their customers.

Job tags: AWS EC2 Java Kafka Python Redshift Scala Spark