
Charger Logistics

Data Engineer

Posted 2 Days Ago
In-Office
Brampton, ON
Senior level
The Data Engineer will design and implement SQL pipelines, develop Python applications for data processing, and manage real-time streaming data with RisingWave and Apache Kafka, while optimizing database performance and orchestrating workflows on GCP.

Charger Logistics Inc. is a world-class asset-based carrier with locations across North America. With over 20 years of experience delivering logistics solutions, Charger Logistics has grown into a world-class transport provider and continues to expand.

Charger Logistics invests time and support in its employees, giving them room to learn, grow their expertise, and work their way up. We are seeking a Data Engineer with expertise in SQL, Python, dbt, and RisingWave to join our modern data team.

Responsibilities:

  • Design high-performance SQL pipelines across PostgreSQL, BigQuery, Snowflake, and MongoDB.
  • Develop Python applications for data ingestion, transformation, and automation.
  • Implement RisingWave streaming pipelines for real-time analytics.
  • Build Apache Kafka architectures for high-throughput data processing.
  • Orchestrate workflows using Apache Airflow on Google Cloud Platform.
  • Optimize queries and implement data quality checks across multiple platforms.
  • Mentor team members and collaborate with business stakeholders.
  • Deploy CI/CD workflows using Git for reliable pipeline management.
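
As a rough illustration of the batch side of these responsibilities, here is a minimal Python ingestion-and-transformation step using only the standard library. The file layout, field names, and cleaning rules are invented for illustration; a production pipeline would load the result into a warehouse such as BigQuery or Snowflake.

```python
import csv
import io

def transform_shipments(raw_csv: str) -> list[dict]:
    """Parse raw shipment records, normalize fields, and drop invalid rows.

    A simplified stand-in for the ingestion/transformation step a pipeline
    would run before loading data into a warehouse.
    """
    rows = []
    for rec in csv.DictReader(io.StringIO(raw_csv)):
        try:
            weight = float(rec["weight_kg"])
        except (KeyError, ValueError):
            continue  # data quality check: skip malformed rows
        rows.append({
            "shipment_id": rec["shipment_id"].strip(),
            "origin": rec["origin"].strip().upper(),
            "weight_kg": weight,
        })
    return rows

# Hypothetical raw extract: one clean row, one with a bad weight value.
raw = "shipment_id,origin,weight_kg\nS1, yyz ,120.5\nS2,yul,not-a-number\n"
print(transform_shipments(raw))
```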

Requirements

Required Qualifications:

  • Bachelor's degree in Computer Science, Engineering, or related field.
  • 5+ years of data engineering experience with SQL, Python, and RisingWave.
  • Hands-on AlloyDB and CDC experience (DataStream/Debezium) is required.
  • Expert dbt skills across BigQuery, Snowflake, and AlloyDB.
  • Expert SQL skills: CTEs, window functions, optimization across PostgreSQL, BigQuery, Snowflake.
  • Advanced Python: pandas, sqlalchemy, API integration, streaming data processing.
  • Production experience with Apache Kafka, Apache Airflow, and Google Cloud Platform.
  • Experience with MongoDB, dimensional modeling, and both batch/streaming ETL pipelines.
  • Strong Git and collaborative development experience.
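
For a concrete sense of the SQL bar described above, the following sketches a CTE combined with a window function. It runs against SQLite only so the example is self-contained; the `loads` table and its columns are hypothetical, and the same pattern carries over to PostgreSQL, BigQuery, and Snowflake.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE loads (carrier TEXT, picked_up DATE, revenue REAL);
    INSERT INTO loads VALUES
        ('A', '2024-01-01', 100), ('A', '2024-01-02', 150),
        ('B', '2024-01-01', 200), ('B', '2024-01-03', 50);
""")

# CTE + window function: running revenue per carrier, ordered by pickup date.
query = """
WITH daily AS (
    SELECT carrier, picked_up, SUM(revenue) AS rev
    FROM loads
    GROUP BY carrier, picked_up
)
SELECT carrier,
       picked_up,
       SUM(rev) OVER (PARTITION BY carrier ORDER BY picked_up) AS running_rev
FROM daily
ORDER BY carrier, picked_up;
"""
for row in conn.execute(query):
    print(row)
```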

Technical Skills:

  • Core: SQL (advanced), Python, RisingWave (required).
  • Cloud: Google Cloud Platform, BigQuery, GCP native services.
  • Streaming: Apache Kafka, real-time data processing.
  • Orchestration: Apache Airflow (production experience).
  • Databases: PostgreSQL, Snowflake, MongoDB.
  • Tools: Git, Docker, CI/CD pipelines.

Preferred Qualifications:

  • GCP certifications, Terraform/CloudFormation experience.
  • Previous production experience with RisingWave is strongly preferred.
  • Data visualization tools (Looker, Tableau, Power BI).
  • DataOps and analytics engineering best practices.
  • ClickHouse experience.

What You'll Build:

  • Scalable SQL pipelines across multiple database systems.
  • Python-based ETL/ELT solutions spanning cloud and on-premise.
  • Real-time streaming pipelines using RisingWave and Kafka.
  • GCP-native data solutions with automated quality checks.
  • Airflow-orchestrated workflows with CI/CD deployment.
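
RisingWave and Kafka cannot be demonstrated in a self-contained snippet, but the core idea behind the streaming pipelines listed above, maintaining incrementally updated state per event rather than recomputing in batch, can be sketched with a plain Python generator. The event schema here is hypothetical.

```python
from collections import defaultdict
from typing import Iterable, Iterator

def running_counts(events: Iterable[dict]) -> Iterator[dict]:
    """Incrementally maintain per-status counts as events arrive.

    A toy stand-in for the materialized-view style of computation a
    streaming engine performs over a topic: state is updated per event,
    and a fresh snapshot of the "view" is emitted each time.
    """
    counts: dict[str, int] = defaultdict(int)
    for event in events:
        counts[event["status"]] += 1
        yield dict(counts)  # emit the updated view after each event

stream = [{"status": "delivered"}, {"status": "in_transit"}, {"status": "delivered"}]
for snapshot in running_counts(stream):
    print(snapshot)
```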

Benefits
  • Competitive Salary
  • Healthcare Benefits Package
  • Career Growth

Top Skills

Apache Airflow
Apache Kafka
BigQuery
dbt
Docker
Git
Google Cloud Platform
MongoDB
Postgres
Python
RisingWave
Snowflake
SQL

Charger Logistics Brampton, Ontario, CAN Office

25 Production Road, Brampton, Ontario, Canada, L6T 4N8


