
Pagos

Software Engineer, Data Platform

Reposted 14 Days Ago
Remote
Hiring Remotely in Canada
Senior level
Build and maintain backend services, APIs, and data platform components; contribute to ELT/ETL pipelines (real-time and batch), integrations with data providers, testing, and cross-functional engineering projects with high ownership.
About Us

At Pagos, we’re passionate about empowering businesses to take control of their payments stack and solve the puzzles standing between them and optimized growth. Our global platform provides developers, product teams, and payments leaders with both a deeper understanding of their payments data and access to new payments technology through user-friendly tools that are easy to implement. To succeed in this, we need creative thinkers who are willing to roll up their sleeves and start building alongside us.

About the Role

As a Software Engineer, you'll play a key part in building and maintaining the platform that powers our products. You'll also contribute across our broader engineering ecosystem, including data engineering, APIs, backend services, and internal tooling, wherever your skills can have the most impact.

By collaborating with backend engineers, product management, and other team members, you'll build and own new features, modules, and extensions of our systems. We're seeking an action-oriented and collaborative problem solver who thrives in ambiguity and can take on new challenges with optimism in a fast-paced environment. We value team members who are not only skilled in their area of expertise but are also continuous learners committed to growth and contributing to our collective success.

In this role, you will:

  • Design, build, and maintain backend services, APIs, and infrastructure components that power our platform

  • Contribute to data pipelines and processes (ELT/ETL) that extract, process, and transform data, in both real-time and batch modes

  • Build and maintain integrations with data providers using various data transfer protocols

  • Contribute to the broader platform by supporting and developing backend services, APIs, and other infrastructure components as needed

  • Drive engineering projects from start to finish with a high level of ownership and autonomy

  • Collaborate cross-functionally, stepping in where needed to support the team's goals beyond your core data engineering responsibilities

  • Ensure the quality of our products and data through both manual and automated testing, as well as code reviews

What We’re Looking For

We’re looking for someone with:

  • 8+ years of software engineering experience with an emphasis on Data Engineering

  • Strong backend development skills, including building and supporting REST/gRPC APIs and services

  • Experience with complex SQL queries and database/lakehouse technologies such as Redshift, Clickhouse, Trino, StarRocks, Apache Iceberg, Postgres, or similar

  • Familiarity with cloud platforms like AWS, GCP, or Azure, and common data-related services (e.g. S3, Redshift, EMR, Glue, Kinesis, Athena)

  • A bias for action, where no task is too small, and an eagerness to learn and grow with our industry

Nice to have:

  • Experience with big data technologies and frameworks such as Apache Spark and dbt, including data quality and testing practices

  • Experience with the .NET ecosystem

  • Experience with orchestration tools such as Temporal, Apache Airflow, or similar

  • Experience working in high-growth, venture-backed startups

Pagos does not accept unsolicited resumes from third-party recruiting agencies. All interested candidates are encouraged to apply directly.

Top Skills

.NET
Apache Airflow
Apache Iceberg
Spark
Athena
AWS
Azure
ClickHouse
dbt
EMR
GCP
Glue
gRPC
Kinesis
Postgres
Redshift
REST
S3
SQL
StarRocks
Temporal
Trino


