Senior Data Platform Engineer (Kafka)


Full Time Senior-level / Expert
EasyPost's best-in-class Shipping APIs provide end-to-end flexibility and more control over parcel shipping and logistics processes for anyone shipping online.

Founded in 2012 as the first RESTful API for shipping, EasyPost is helping e-commerce companies with accurate tracking and logistics. We are delivering hope and spreading smiles to homes all across the country. EasyPost pushes boundaries and changes the status quo through our RESTful API, allowing companies greater control over their shipping. We continue to disrupt the shipping industry, and this is the best time to get on board. We are out to do things differently, to consistently change, grow, and progress. Join us in building simple shipping solutions to enable sellers to define & rate postage, buy it, and track it in transit.

As part of the Data team, you will be responsible for building a scalable data ingestion/processing platform and low-latency, customer-facing data APIs. People ship millions of packages with us every day, and these shipments go through multiple stages that generate tens of millions of events a day. The platform we build is the foundation of the intelligence offerings we provide to our customers: it enables us to build complex models that power the data API products that set us apart from our competitors.
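To make the event flow above concrete: each shipment stage change can be thought of as an event published to a topic that downstream consumers (tracking, analytics, ML features) subscribe to. Kafka itself requires a running cluster, so this is only an illustrative sketch using a toy in-memory broker in Go (the posting's tagged language); the `Broker` type and the `shipment.events` topic name are hypothetical, not part of EasyPost's actual stack.

```go
package main

import (
	"fmt"
	"sync"
)

// Broker is a toy in-memory stand-in for a pub-sub system like Kafka.
// It only sketches the subscribe/publish flow; real brokers add
// partitioning, persistence, and consumer-group offset management.
type Broker struct {
	mu   sync.Mutex
	subs map[string][]chan string
}

func NewBroker() *Broker {
	return &Broker{subs: make(map[string][]chan string)}
}

// Subscribe registers a buffered consumer channel for a topic.
func (b *Broker) Subscribe(topic string) <-chan string {
	b.mu.Lock()
	defer b.mu.Unlock()
	ch := make(chan string, 16)
	b.subs[topic] = append(b.subs[topic], ch)
	return ch
}

// Publish fans an event out to every subscriber of the topic.
func (b *Broker) Publish(topic, event string) {
	b.mu.Lock()
	defer b.mu.Unlock()
	for _, ch := range b.subs[topic] {
		ch <- event
	}
}

func main() {
	b := NewBroker()
	tracking := b.Subscribe("shipment.events") // hypothetical topic name
	b.Publish("shipment.events", "label_created")
	b.Publish("shipment.events", "in_transit")
	fmt.Println(<-tracking) // label_created
	fmt.Println(<-tracking) // in_transit
}
```

In a real deployment, each shipment stage transition would be one Kafka record keyed by shipment ID, so that all events for a shipment land in the same partition and are consumed in order.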

What you will do: 

  • Architect fault-tolerant, self-healing distributed systems
  • Work with data scientists to create highly scalable API services based on ML and statistical models
  • Rebuild our data warehouse to support growth and design next-generation schemas for analytics
  • Find new ways to improve our in-house batch-processing framework and workflow orchestration
  • Establish standard methodologies for creating systems and datasets for company-wide use
  • Work closely with teams across the organization
  • Build and maintain the automation that manages our data storage technologies
  • Mentor teammates on algorithms, data structures, design patterns, and best practices
  • Find new ways to improve data team initiatives and workflow orchestration

About You: 

  • Experience with online pub-sub systems at scale
  • Comfort working in a polyglot environment
  • Familiarity with building scalable microservices
  • Experience diagnosing and resolving complex multi-system performance problems
  • Experience working on cross-team engineering projects, communicating complex technical problems clearly, and making judgment calls
  • Multiple years of experience with an open-source stream-processing platform such as Kafka
  • Strong desire to work in a fast-paced, start-up environment with multiple releases a day
  • A passion for working as part of a team

What We Offer: 

  • Comprehensive medical, dental, vision, and life insurance
  • Competitive compensation package and equity
  • Monthly work-from-home stipend of $100 net
  • Flexible work schedule and paid time off
  • Collaborative culture with a supportive team
  • A great place to work with unlimited growth opportunities
  • The opportunity to make massive contributions at a hyper-growth company
  • Make an impact on a product helping ship millions of packages per day

Data Privacy Notice for Job Applicants:

For information on personal data processing, please see our Privacy Policy:

Job tags: Go Kafka
Job region(s): Remote/Anywhere North America
