
Oscilar

Sr. Data Engineer

Reposted 14 Days Ago
Remote
2 Locations
Senior level

Shape the future of trust in the age of AI
At Oscilar, we're building the most advanced AI Risk Decisioning™ Platform. Banks, fintechs, and digitally native organizations rely on us to manage their fraud, credit, and compliance risk with the power of AI. If you're passionate about solving complex problems and making the internet safer for everyone, this is your place.

Why join us:
  • Mission-driven teams: Work alongside industry veterans from Meta, Uber, Citi, and Confluent, all united by a shared goal to make the digital world safer.

  • Ownership and impact: We believe in extreme ownership. You'll be empowered to take responsibility, move fast, and make decisions that drive our mission forward.

  • Innovate at the cutting edge: Your work will shape how modern finance detects fraud and manages risk.

Job Description

As a Senior Data Engineer at Oscilar, you will be responsible for designing, building, and maintaining the data infrastructure that powers our AI-driven decisioning and risk management platform. You will collaborate closely with cross-functional teams, ensuring the delivery of highly reliable, low-latency, and scalable data pipelines and storage solutions that support real-time analytics and mission-critical ML/AI models.

Responsibilities
  • Architect and implement scalable ETL and data pipelines spanning ClickHouse, Postgres, Athena, and diverse cloud-native sources to support real-time risk management and advanced analytics for AI-driven decisioning.

  • Design, develop, and optimize distributed data storage solutions to ensure both high performance (low latency, high throughput) and reliability at scale—serving mission-critical models for fraud detection and compliance.

  • Drive schema evolution, data modeling, and advanced optimizations for analytical and operational databases, including sharding, partitioning, and pipeline orchestration (batch, streaming, CDC frameworks).

  • Own the end-to-end data flow: integrate multiple internal and external data sources, enforce data validation and lineage, automate and monitor workflow reliability (CI/CD for data, anomaly detection, etc.).

  • Collaborate cross-functionally with engineers, product managers, and data scientists to deliver secure, scalable solutions that enable fast experimentation and robust operationalization of new ML/AI models.

  • Champion radical ownership—identify opportunities, propose improvements, and implement innovative technical and process solutions within a fast-moving, remote-first culture.

  • Mentor and upskill team members, cultivate a learning environment, and contribute to a collaborative, mission-oriented culture.

Qualifications
  • 5+ years in data engineering (or equivalent), including architecting and operating production ETL/ELT pipelines for real-time, high-volume analytic platforms.

  • Deep proficiency with ClickHouse, Postgres, Athena, and distributed data systems (Kafka, cloud-native stores); proven experience with both batch and streaming pipeline design.

  • Advanced programming in Python and SQL, with bonus points for Java; expertise in workflow orchestration (Airflow, Step Functions), CI/CD, and automated testing for data.

  • Experience in high-scale, low-latency environments; understanding of security, privacy, and compliance requirements for financial-grade platforms.

  • Strong communication, business alignment, and documentation abilities—capable of translating complex tech into actionable value for customers and stakeholders.

  • Alignment with Oscilar’s values: customer obsession, radical ownership, bold vision, efficient growth, and unified teamwork with a culture of trust and excellence.

Nice-to-have
  • Experience integrating Kafka with analytics solutions like ClickHouse.

  • Knowledge of event-driven architecture and streaming patterns like CQRS and event sourcing.

  • Hands-on experience with monitoring tools (e.g., Prometheus, Grafana, Kafka Manager).

  • Experience automating infrastructure with tools like Terraform or CloudFormation.

  • Proficiency with Postgres, Redis, ClickHouse, and DynamoDB. Experience with data modeling, query optimization, and high-transaction databases.

  • Familiarity with encryption, role-based access control, and secure API development.

Benefits
  • Compensation: Competitive salary and equity packages, including a 401(k)

  • Flexibility: Remote-first culture — work from anywhere

  • Health: 100% employer-covered comprehensive health, dental, and vision insurance, with a top-tier plan for you and your dependents (US)

  • Balance: Unlimited PTO policy

  • Technical: AI-first company; both co-founders are engineers at heart, and over 50% of the company works in Engineering and Product

  • Culture: Family-friendly environment; regular team events and offsites

  • Development: Unparalleled learning and professional development opportunities

  • Impact: Making the internet safer by protecting online transactions

Top Skills

Airflow
Athena
ClickHouse
CloudFormation
DynamoDB
Grafana
Java
Kafka
Postgres
Prometheus
Python
Redis
SQL
Terraform


