Data Infrastructure Engineer

Weave - Headquarters

Weave is a remote business toolbox. Looking for ways to manage your business through unpredictability and get back to work (safely) when the time is right? Weave has your back, no matter how much you modify the ways you interact with customers,...

The first Utah company to join Y Combinator, Weave has set the bar for Utah startup achievement and work culture. In the past year, Weave has been included in the Forbes Cloud 100, the Inc. 5000 list of America's fastest-growing companies, and Glassdoor's Best Places to Work.

At the core of Weave's growth are our people. We are passionate about providing an amazing workplace for accomplished people who demonstrate our core values: Hungry, Creative, and Caring. Don't believe us? Check out why our employees, their families, and our 20,000+ customers love Weave: visit our website or head to our Instagram page @workatweave to see what our employees are up to.

What you will love about us

  • Medical, Dental, & Vision Insurance - we cover 75%!

  • Flexible PTO and work schedules

  • Free haircuts at our onsite salon

  • Parental leave + baby benefits! House cleanings, meals, and one year of free diapers provided

  • Brand new building including a huge onsite gym

  • 401k with company match

  • Commuter benefits (UTA Pass)

  • Company holiday and summer events

  • Weave’s in-house coaching initiative

  • We believe in diversity and inclusion! Join one of our Peer Resource Groups 

  • A “People, not Employees” culture

Weave is looking for engineers hungry for fun challenges who can join our self-empowered teams and contribute in both technical and non-technical ways.

You will be joining a team of talented developers who share a common interest in distributed backend systems, data, scalability, and continued development. You will get the chance to apply these and other skills to new and ongoing projects that make data more available and easier to discover and use.

Our teams are cross-functional agile teams composed of a product owner, backend and frontend developers, and DevOps engineers. Teams are highly autonomous, with the ownership and ability to act in Weave's best interest.

Above all, your work will impact the way our customers experience Weave, while you work closely with a highly skilled team to accomplish varying goals and cultivate our phenomenal culture.

The Data Platform Team's mission is to enable product innovation by making it painless for developers to build applications that require access to large sets of data. Many of the core Weave products and features (auto scheduling, AI/ML, real-time notifications, etc.) and backend services (Search indexing, Conversion, etc.) are powered by our data platform infrastructure. We handle data for thousands of customers daily. The team's mission in 2021 is to rethink our data architecture and make it ready for millions of customers worldwide. This will require out-of-the-box thinking.

Our data platform components include the following (a brief code sketch follows the list):

  • The messaging layer (pub/sub)

  • Async job processing

  • Real-time stream compute

  • Offline data processing

  • Events index

  • Data access services
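To give a flavor of the kind of code this work involves, here is a minimal sketch of a consumer for the messaging layer. It assumes Google Cloud Pub/Sub (one of the technologies named in the qualifications below) and is written in Go, one of the back-end languages listed; the project ID and subscription name are illustrative placeholders, not actual Weave infrastructure.

```go
// Minimal Pub/Sub consumer sketch. Assumes the messaging layer is backed by
// Google Cloud Pub/Sub; names below are placeholders for illustration only.
package main

import (
	"context"
	"log"

	"cloud.google.com/go/pubsub"
)

func main() {
	ctx := context.Background()

	// Connect to a (hypothetical) GCP project.
	client, err := pubsub.NewClient(ctx, "example-project")
	if err != nil {
		log.Fatalf("pubsub.NewClient: %v", err)
	}
	defer client.Close()

	// Pull messages from a (hypothetical) subscription and ack each one
	// after it has been handled.
	sub := client.Subscription("example-events-sub")
	err = sub.Receive(ctx, func(ctx context.Context, msg *pubsub.Message) {
		log.Printf("received event: %s", msg.Data)
		msg.Ack()
	})
	if err != nil {
		log.Fatalf("sub.Receive: %v", err)
	}
}
```

A production consumer on the platform would also need retries, dead-lettering, and observability built in; that is the kind of scalable, resilient service work this role covers.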

What you will be doing 

  • Design and develop core data platform components that enable business process orchestration and workflows.

  • Build scalable, resilient services to support data integration, event processing, and platform extensions.

  • Contribute to the continued evolution of product functionality that services large amounts of data and traffic.

  • Write code that is high-quality, performant, sustainable, and testable while holding yourself accountable for the quality of the code you produce.

  • Coach and collaborate inside and outside the team. You enjoy working closely with others - helping them grow by sharing expertise and encouraging best practices.

  • Work in a hybrid cloud infrastructure, considering the implementation of functionality through several distributed components and services.

  • Work with our stakeholders to translate product goals into actionable engineering plans.

What you will need to accomplish the job (minimum qualifications)

  • High integrity, team-focused approach, and collaboration skills to build tight-knit relationships across Weave

  • 3+ years of experience in a back-end language such as Go, Java, or Python (C/C++ experience is a plus)

  • Experience moving and storing terabytes of data or hundreds of millions of records

  • Understanding of distributed systems and building scalable, redundant, and observable services

  • Expertise in architecting messaging systems, distributed data stores or NoSQL technologies (e.g., Kafka, Google PubSub, Bigtable, Spanner, Vertica, Vitess, Cassandra, Postgres, S3, Iceberg, etc.)

  • Experience building solutions to run on one or more of the public clouds (e.g., AWS, GCP, etc.)

  • Experience providing stable, well-designed libraries and SDKs for internal use

  • Responsive person with a strong bias for action

  • Entrepreneurial spirit and a thirst for learning

  • A demonstrated track record of delivering complex projects on time and experience working in enterprise-grade production environments

  • Strategic thinker with a strong technical aptitude and a passion for execution 

What will make us love you (preferred qualifications)

  • A background in data analysis and visualization

  • 5+ years of experience in any back-end language, preferably Java, Go or Python

  • Proficient understanding of containers, orchestrators, and usage patterns at scale including networking, storage, service meshes, and multi-cluster communication

  • Advanced experience with SQL and large multi-tenant relational databases

  • Experience with automation and container-based workflows

  • Experience owning and operating datastores and databases

  • Experience with GitOps, IaC, and configuration driven systems

  • A preference for open source solutions

  • A track record of clean abstractions and simple to use APIs

  • Deep understanding of distributed data technologies such as streaming, data mesh, data lakes, warehouses, or distributed machine learning

  • Experience with Kubernetes or GKE and the Operator Pattern (GCP)

  • A desire to advance the state of the art with new and innovative technologies

  • Enjoys working in a greenfield environment using rapid prototyping

Job region(s): Remote/Anywhere