
Vanguard

Senior Data Engineer

In-Office
Toronto, ON
Senior level
We are looking for a Senior Data Engineer with in-depth expertise in AWS, Databricks, modern data architecture, and data modeling to help build the next generation of our data foundation, including data platforms such as Advisor360, CRM integrations, ETF/mutual fund data pipelines, and other enterprise analytics assets.
This is a senior, highly autonomous role. You will operate as a technical consultant and architectural thinker, working closely with stakeholders to understand ambiguous business needs, conduct independent investigations, and design solutions that are scalable, reliable, and aligned with both business and technology goals.
Ideal candidates have 5+ years of data engineering experience and thrive in environments where they drive clarity, define standards, and elevate engineering best practices across the team.

Core Responsibilities:

Data Engineering & Architecture

  • Design, build, and optimize scalable, secure, and repeatable data pipelines using AWS (S3, Glue, Lambda, Step Functions, Redshift, IAM) and Databricks (PySpark, Lakeflow, Delta Lake, Unity Catalog); see the sketch after this list.

  • Serve as the technical leader for data ingestion pipelines, modeling new datasets from sources such as Advisor360, CRM, ETF, and mutual fund platforms.

  • Apply strong data modeling principles (dimensional, canonical, and domain-driven) to support analytics, reporting, and AI/ML use cases.

  • Ensure alignment with enterprise data architecture standards, promoting reusability, governance, and long-term maintainability.
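
For concreteness, here is a minimal PySpark sketch of the kind of ingest-and-conform step described in the first bullet above. The S3 prefix, column names, and Unity Catalog table name (analytics_dev.crm.accounts_raw) are hypothetical placeholders, not details of Vanguard's actual platform.

```python
# Minimal sketch of an S3 -> Delta ingestion step (illustrative names only).
from pyspark.sql import SparkSession, functions as F

# On Databricks a SparkSession named `spark` is already provided; getOrCreate()
# simply reuses it when run there.
spark = SparkSession.builder.appName("crm_ingest_sketch").getOrCreate()

# Read raw CRM extracts landed in S3 (hypothetical bucket and prefix).
raw = spark.read.format("json").load("s3://example-bucket/raw/crm/")

# Light conformance: standardize a key column name and stamp the load time.
conformed = (
    raw.withColumnRenamed("acct_id", "account_id")
       .withColumn("load_ts", F.current_timestamp())
)

# Append to a Delta table registered in Unity Catalog (three-level name is hypothetical).
(
    conformed.write.format("delta")
    .mode("append")
    .saveAsTable("analytics_dev.crm.accounts_raw")
)
```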

Consultative Problem Solving

  • Perform deep investigations independently, identify data issues, and propose solutions that balance performance, cost, risk, and business needs.

  • Engage business stakeholders to gather ambiguous requirements, ask the right questions, and translate them into clear technical designs.

  • Provide thought leadership and recommend technical patterns, frameworks, and toolsets.

Data Quality, Reliability & Operations

  • Implement robust data quality frameworks, monitoring, and alerting to ensure high trust in business-critical data assets (see the sketch after this list).

  • Troubleshoot data inconsistencies and ensure proper logging, testing, and recovery mechanisms across pipelines.

  • Lead regression testing, software upgrades, and production deployments with strong change control discipline.
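
As a rough illustration of the kind of quality gate the first bullet above describes, the sketch below hand-rolls a row-level check in PySpark against a hypothetical table; in practice a dedicated framework (for example Lakeflow/Delta Live Tables expectations or Great Expectations) and a proper alerting channel would replace the bare exception.

```python
# Minimal sketch of a row-level data quality gate (illustrative names only).
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("dq_gate_sketch").getOrCreate()

# Hypothetical Unity Catalog table produced by an upstream ingestion step.
df = spark.table("analytics_dev.crm.accounts_raw")

# Count records violating basic expectations: non-null business key, non-negative balance.
violations = df.filter(
    F.col("account_id").isNull() | (F.col("balance") < 0)
).count()

if violations > 0:
    # A real pipeline would log metrics and trigger alerting rather than just raising.
    raise ValueError(f"Data quality gate failed: {violations} bad rows in accounts_raw")
```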

Collaboration & Leadership

  • Lead all phases of solution development—from design to deployment and operationalization.

  • Mentor and guide other engineers in coding standards, architecture patterns, Databricks best practices, and AWS platform usage.

  • Partner with Data Architecture, Analytics, Product, and Business teams to deliver solutions that improve decision-making.

  • Provide training sessions and documentation to uplift the data engineering maturity across the organization.

Special Projects

  • Participate in strategic initiatives such as AI readiness, data unification efforts, metadata strategy, and enterprise integration roadmaps.

  • Drive continuous improvement in engineering frameworks, onboarding workflows, and platform capability.

Qualifications:

Required

  • 5+ years of experience in data engineering, data architecture, or large-scale distributed data systems.

  • Expert-level experience with cloud platforms such as AWS, GCP, or Azure, leveraging services for data storage, ingestion, pipeline orchestration, database or lakehouse management, and data transformation.

  • Strong background in data modeling (dimensional, canonical, data vault, or domain-driven).

  • Proven ability to work independently with minimal direction and deliver high-quality solutions in ambiguous environments.

  • Demonstrated experience translating complex business problems into scalable technical solutions.

  • Strong SQL and Python skills, with emphasis on ETL/ELT pipeline development.

  • Experience with CI/CD, GitHub, DevOps workflows, and automated testing (see the sketch after this list).
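
To make the testing expectation concrete, below is a minimal pytest sketch of an automated unit test for a PySpark transformation, runnable on a local SparkSession in CI; the conform_accounts() helper and the sample data are hypothetical and exist only for illustration.

```python
# Minimal sketch of a CI-friendly unit test for a PySpark transformation (illustrative).
import pytest
from pyspark.sql import DataFrame, SparkSession


def conform_accounts(df: DataFrame) -> DataFrame:
    # Hypothetical helper under test: rename the business key and drop null keys.
    return df.withColumnRenamed("acct_id", "account_id").dropna(subset=["account_id"])


@pytest.fixture(scope="module")
def spark():
    # Local session so the test runs in CI without a cluster.
    return SparkSession.builder.master("local[1]").appName("unit_tests").getOrCreate()


def test_conform_accounts_drops_null_keys(spark):
    df = spark.createDataFrame([("A1", 100.0), (None, 50.0)], ["acct_id", "balance"])
    out = conform_accounts(df)
    assert out.columns == ["account_id", "balance"]
    assert out.count() == 1
```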

Preferred

  • Experience in asset management, wealth management, or financial services (ETF, Mutual Funds, CRM, Advisor analytics).

  • Experience with enterprise data quality tools and metadata management concepts.

  • Familiarity with modern semantic layers, dbt, or domain-oriented data mesh concepts.

  • Undergraduate degree or equivalent professional experience in Computer Science, Engineering, Information Systems, or a related field.

Expected Salary Range: $90,000 - $140,000

How We Work

Vanguard has implemented a hybrid working model for the majority of our crew members, designed to capture the benefits of enhanced flexibility while enabling in-person learning, collaboration, and connection. We believe our mission-driven and highly collaborative culture is a critical enabler to support long-term client outcomes and enrich the employee experience.

Top Skills

AWS
Databricks
Glue
Lambda
PySpark
Python
Redshift
SQL
Step Functions


