Senior Data Platform Engineer (Remote)

Remote - Canada

Full Time · Senior-level / Expert
Upgrade Inc.
Check your rate for a fixed-rate personal loan and borrow up to $50,000. Or get started with Upgrade Card in just minutes. Check out Rewards Checking with cash back rewards and more. We're here to help you build the future you want.

Upgrade is a fintech unicorn founded in 2017. In the last four years, over 15 million people have applied for an Upgrade card or loan, and we have delivered over $7 billion in affordable and responsible credit. Our innovative Upgrade Card combines the flexibility of a credit card with the low cost of an installment loan. Our latest offering, Rewards Checking, gives customers access to no-fee checking accounts with 2% cash back rewards on common everyday spending.

Upgrade has been named a “Best Place to Work in the Bay Area” by the San Francisco Business Times and Silicon Valley Business Journal three years in a row, and has received “Best Company for Women” and “Best Company for Diversity” awards from Comparably. Upgrade was also included in the 2021 Inc. 5000 list of the fastest-growing private companies in America.

We are looking for new team members who get excited about designing and implementing new and better products to join our team of 750 talented and passionate professionals. Come join us if you like to tackle big problems and make a meaningful difference in people's lives.
This is a remote position based in Canada. At this time we are unable to consider international applicants for this role.

What You'll Do

  • Design, architect, and maintain distributed, fault-tolerant data infrastructure that supports the needs of data pipeline, data warehouse, and business intelligence engineers.
  • Quickly gain a deep understanding of the business and of how data flows through the organization and the data engineering codebase, while playing a key role in building an efficient, scalable data and reporting layer for the organization.
  • Build and scale the tools data engineers need to create robust data pipelines that enrich our enterprise data warehouse.
  • Set up tools and processes for effective data loading, lineage tracking, and monitoring, and drive quality across data in the data warehouse.
  • Continuously improve our data infrastructure and stay ahead of new technology.
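To give a flavor of the lineage-tracking and monitoring work above, here is a minimal, hypothetical sketch in pure Python. The `track_lineage` decorator, the in-memory registry, and the dataset names are invented for illustration; a production system would persist this metadata to a dedicated store rather than a list.

```python
import functools
import time

# Hypothetical in-memory lineage registry; a real platform would persist
# these records to a metadata store or lineage backend.
LINEAGE = []

def track_lineage(inputs, outputs):
    """Decorator that records which datasets a task reads and writes."""
    def decorator(fn):
        @functools.wraps(fn)
        def wrapper(*args, **kwargs):
            start = time.time()
            result = fn(*args, **kwargs)
            LINEAGE.append({
                "task": fn.__name__,
                "inputs": list(inputs),
                "outputs": list(outputs),
                "duration_s": round(time.time() - start, 3),
            })
            return result
        return wrapper
    return decorator

@track_lineage(inputs=["raw.loans"], outputs=["staging.loans_clean"])
def clean_loans(rows):
    # Toy transformation: drop rows that have no loan amount.
    return [r for r in rows if r.get("amount") is not None]

cleaned = clean_loans([{"amount": 100}, {"amount": None}])
```

With this pattern, every decorated task leaves behind a record of its upstream and downstream datasets, which is the raw material for lineage graphs and pipeline monitoring.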

What You'll Bring

  • Solid understanding of core computer science including algorithms & data structures, operating systems, distributed systems, networking, and concurrent programming.
  • Experience scaling data environments with distributed batch and real-time systems and self-serve visualization environments.
  • High development standards, especially for code quality, code reviews, unit testing, and continuous integration and deployment.
  • Proficiency in writing object-oriented code for data processing in Python and/or Java, including development of web services.
  • Experience building complex, customized, highly scalable data pipelines with task orchestrators such as Airflow, while building and maintaining the code base for data integration.
  • Experience building complex, fault-tolerant, Docker-containerized batch data processing tasks in Python and SQL, distributed over a Kubernetes cluster using distributed task frameworks such as Celery.
  • Experience automating the production and consumption of real-time data from event-driven microservices using streaming platforms such as Kafka, Kinesis, or RabbitMQ.
  • Experience building and managing real-time and near-real-time data replication systems from OLTP databases into OLAP databases.
  • A good understanding of cloud-based columnar data warehouses/data lakes (Redshift and/or Snowflake) backed by distributed file systems such as S3 or HDFS, maintaining data in standard and columnar compressed file formats.
  • Expertise in working with third-party APIs to push and pull data.
  • Understanding of securely storing sensitive data in transit and at rest.
  • Excellent verbal and written communication skills, with the ability to synthesize complex ideas and communicate them simply.
  • Highly analytical and detail-oriented.
  • Ability to troubleshoot and fix issues quickly in a fast-paced environment.
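The fault-tolerant batch processing mentioned above usually comes down to retrying transient failures with backoff. The sketch below is a hypothetical, hand-rolled illustration of that pattern in pure Python; in a real deployment this would typically be handled by a framework such as Celery running on Kubernetes rather than a custom loop.

```python
import time

def run_with_retries(task, max_attempts=3, base_delay=0.01):
    """Run a batch task, retrying transient failures with exponential backoff.

    Illustrative only: a production system would lean on a task framework's
    built-in retry support instead of this hand-rolled loop.
    """
    for attempt in range(1, max_attempts + 1):
        try:
            return task()
        except Exception:
            if attempt == max_attempts:
                raise  # Exhausted retries: surface the failure to the caller.
            time.sleep(base_delay * 2 ** (attempt - 1))

# Hypothetical flaky extract: fails twice with a transient error, then succeeds.
calls = {"n": 0}

def flaky_extract():
    calls["n"] += 1
    if calls["n"] < 3:
        raise ConnectionError("transient upstream failure")
    return ["row1", "row2"]

rows = run_with_retries(flaky_extract)
```

The key design choice is distinguishing transient errors (worth retrying) from permanent ones (fail fast); the sketch retries everything for brevity, which real pipelines should not.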

Strong Plus

  • Experience building web services with Flask, Django, and/or FastAPI.
  • Knowledge of serverless data computing with AWS Lambda, iron.io, etc.
  • Financial services experience.
  • Reporting and data visualization skills.
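As a small illustration of the serverless data computing mentioned above, here is a hypothetical AWS-Lambda-style handler that decodes records from a Kinesis event and aggregates transaction amounts. The event shape follows the standard Kinesis-to-Lambda format; the transaction fields and totals are invented for this sketch.

```python
import base64
import json

def handler(event, context=None):
    """Hypothetical Lambda handler: decode Kinesis records, sum amounts.

    Kinesis delivers each record's payload base64-encoded under
    event["Records"][i]["kinesis"]["data"].
    """
    total = 0.0
    for record in event.get("Records", []):
        payload = base64.b64decode(record["kinesis"]["data"])
        txn = json.loads(payload)
        total += txn.get("amount", 0.0)
    return {"statusCode": 200, "body": json.dumps({"total_amount": total})}

# Simulated invocation with one encoded record.
event = {
    "Records": [
        {"kinesis": {"data": base64.b64encode(
            json.dumps({"amount": 42.5}).encode()).decode()}}
    ]
}
result = handler(event)
```

Because the handler is a plain function taking an event dict, it can be unit-tested locally with simulated events before being deployed behind a trigger.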

Benefits/Perks

  • Competitive salary and stock option plan. 
  • 100% paid coverage of medical, dental and vision insurance. 
  • Unlimited vacation. 
  • Learning stipend for personal growth and development. 
  • Paid parental leave.  
We are an equal opportunity employer and value diversity at our company. We do not discriminate on the basis of race, religion, color, national origin, gender, sexual orientation, age, marital status, veteran status, or disability status.
Job region(s): Remote/Anywhere North America