Data Platform Engineer

Kyiv, Ukraine


Posted 2 weeks ago

About thredUP
thredUP is the world’s largest fashion resale platform, inspiring a new generation to think secondhand first. The company has spent the past 10 years reinventing resale, building a marketplace and infrastructure now poised to power the $50B resale economy and usher in a more sustainable fashion future. Millions of consumers use thredUP as the easiest way to sell their clothes and shop over 35,000 brands at up to 90% off — online, in stores or via “try-before-you-buy” Goody Boxes. Backed by world-class investors, thredUP designed a resale engine that has redistributed nearly 100 million unique garments from closets across America and is now powering resale for the broader fashion industry via its Resale-As-A-Service (RAAS) platform.
About the Role: 
At thredUP, we live a truly data-driven culture, with an ever-growing appetite for data and a mindset of generating insights to make informed business decisions. In this role, you will shape the roadmap for the next generation of our data platform: one that is easy to use, elastic, and cost-efficient. This platform should make it easy for data engineers to build batch and streaming pipelines, and easy for end consumers to interact with the data. You will also build a robust machine-learning platform for developing, testing, and deploying models for both batch and real-time use cases. Additionally, you will support the current technology stack and evaluate newer tools and technologies that reduce support cost and improve productivity and developer experience. You will work in a highly collaborative environment of data engineers, data scientists, product managers, domain experts, and business leaders to deliver data solutions.


What You'll Do:

  • Ideate and build the next-generation product roadmap for the data platform team.
  • Own Airflow deployments, including integration with Datadog and CI/CD.
  • Own the MLflow machine-learning platform; enhance it (or evaluate alternatives) to support input-data versioning and model deployment in A/B-testing mode.
  • Build and evangelize reusable components for data pipelines.
  • Implement a solution to capture metadata and lineage.
  • Own the enterprise event bus solution.
  • Own the Databricks environment.
  • Own and shape the feature store infrastructure and roadmap.


What We're Looking For:

  • 8+ years of experience building scalable data platforms and tools.
  • Demonstrated experience implementing Spark for data processing.
  • Demonstrated experience implementing Kafka for real-time data processing.
  • Demonstrated experience providing REST endpoints to expose data to other applications.
  • Experience integrating and supporting Databricks.
  • Experience working with NoSQL stores such as HBase or DynamoDB.
  • Experience providing solutions for data privacy.
  • Experience implementing machine-learning platforms such as MLflow, TensorFlow, or SageMaker.
  • Experience deploying Airflow as a scheduling tool.
  • Experience implementing CI/CD pipelines using Jenkins.
  • Expert-level proficiency in Python and PySpark.
  • Understanding of AWS data technologies (such as Redshift, S3, and Glue).
  • Experience with repositories such as Git, Maven, and JFrog.


Nice to Have:

  • Experience working with cloud databases such as Snowflake, Redshift, or BigQuery.
  • Experience migrating legacy data platforms to cloud-native solutions.
  • Experience implementing Delta Lake.
  • Experience working with machine-learning models.
  • Experience with Java and building microservices.

What We Offer:

  • The opportunity to make a massive impact and influence outcomes for our business and customers alongside passionate coworkers
  • Autonomy: the ability to make, own, and carry out decisions
  • Working within a modern tech stack
  • Competitive top-market salary
  • Flexible working hours (possibility to work from home on Tuesdays and Thursdays)
  • Full-coverage medical insurance, free lunches on Wednesdays, English classes, etc.
  • Relocation assistance and cost-coverage program for candidates from other countries and cities
At thredUP, our mission has been built on extending the lives of millions of unique clothing items. Much like our inventory, we believe diversity is key. As a diverse and inclusive workplace, we are committed to ensuring our employees are comfortable bringing their authentic selves to work every day. A unique perspective is critical to solving complex problems and inspiring a new generation to think secondhand first. Everyone is welcome - be you.
Job tags: Airflow AWS CD CI Git Java Kafka Python Redshift REST S3 Spark Streaming