Location: Anywhere

Junior Big Data Engineer - Riskified

Riskified is the AI platform powering the eCommerce revolution. We use cutting-edge technology, machine-learning algorithms, and behavioral analytics to identify legitimate customers and keep them moving toward checkout. Merchants use Riskified to increase revenue, prevent fraud, and eliminate customer friction. Riskified has reviewed hundreds of millions of transactions and approved billions of dollars of revenue for merchants across virtually all industries, including Wish, Prada, Aldo, Finish Line, and many more. We’re privately funded and VC-backed, and our recent Series E round raised $165 million at a valuation in excess of $1 billion. Check out the Riskified Technology Blog for a deeper dive into our R&D work.

Our R&D Team

  • We are passionate about scale, performance engineering, high availability, robust distributed architecture, and clean code.
  • As part of our team, you’ll have the chance to work with a real highly available architecture and take part in the ongoing effort to enhance our scale and resilience. You’ll work with the most advanced technologies and contribute to the creation and maintenance of one of the most interesting and complex products on the market, including:
    • Our analysis flow, involving a variety of data sources and complex analytical logic, with its high availability and strict SLA requirements
    • Our business applications, meant to serve our customers’ payments teams and their varied use cases
    • Our operational tools, built to support onboarding at considerable scale and to monitor our merchants’ analytical performance
  • Our culture is fast-paced, empowering, and collaborative. We work together as a department and in smaller teams, enabling each developer to have a significant impact on our product. Each team member has a high level of independence and makes important decisions that influence our customers’ payment flow and risk management.

About the Role:

Big data analysis is at the core of our technology. Riskified’s broad spectrum of departments – from data science and performance analytics to customer support and finance – all depend on fast, easy access to high-volume data. Our growth depends heavily on a system that can quickly and efficiently process data at scale.

Our revolutionary approach to data science, a key component of our success, relies heavily on our Big Data team to ingest, process, and serve data produced by billions of events per day.

Our key data pipelines include:

  • Streaming events in near-real-time from multiple high-scale production services into our data lake
  • Processing data for our complex model-training flow: data pipelines supporting complex fraud-detection patterns, potentially preventing fraud on a scale of millions of dollars per day

Our technological stack includes: Scala/Spark, DynamoDB, Elasticsearch, Airflow, Redshift, AWS, Kafka, Kubernetes, and more.

As a Big Data Engineer, you will solve complex problems that require a varied and multi-disciplinary skill set. You’ll be required to understand the bigger picture, design system architecture, build highly complex data flows, and manage multiple, multi-faceted projects at once.

What You’ll Be Doing:

  • Be a significant part of our data operation – help our fraud detection engine make sub-second decisions, support our model training pipeline that ingests terabytes of data and provides near real-time BI and analytics to our customers
  • Architect highly scalable data solutions for diverse and complex data flows using Apache Spark
  • Build and develop high-performance, near real-time ETL processes incorporating Apache Kafka

Qualifications:

  • B.Sc. in Computer Science from a leading university with a grade point average of 85 and above (or equivalent)
  • At least 1 year of full-time coding experience at a SaaS startup or hi-tech company
  • Experience with SQL
  • Development experience with Spark or similar frameworks – big advantage
  • Experience with data lakes on Hadoop or cloud storage (S3 or similar) – an advantage
  • Experience with Scala – an advantage
  • Experience working with Kubernetes/Docker, etc. – an advantage