
QuadReal Property Group

Data Engineer, Data Platform

Sorry, this job was removed at 04:16 p.m. (EST) on Wednesday, Jan 21, 2026.
In-Office
Toronto, ON


About QuadReal Property Group

QuadReal Property Group is a global real estate investment, operating and development company headquartered in Vancouver, British Columbia. Its assets under management are $94 billion. From its foundation in Canada as a full-service real estate operating company, QuadReal has expanded its capabilities to invest in equity and debt in both the public and private markets. QuadReal invests directly, via programmatic partnerships and through operating platforms in which it holds an ownership interest.

QuadReal seeks to deliver strong investment returns while creating sustainable environments that bring value to the people and communities it serves. Now and for generations to come.

QuadReal: Excellence lives here.

www.quadreal.com

Reporting to the Team Lead, Data Platform, this role will help build, scale, and optimize our enterprise data platform. This is a hands-on engineering position for someone who thrives in code — designing and delivering high-quality pipelines, APIs, and integrations using modern Python-based tools. You will work closely with Data Governance, IT, and business teams to ensure our solutions are scalable, maintainable, secure, and trusted. In addition to development work, you will contribute to platform standards and collaborate with team members to deliver high-impact, production-ready solutions.

Responsibilities

Engineering & Development

  • Design, code, and deploy robust, maintainable data pipelines and APIs using Python and modern frameworks such as dbt, dltHub, and Apache Airflow.
  • Automate environment provisioning with Infrastructure as Code (IaC), using Terraform for Azure storage, compute, and orchestration.
  • Implement data quality checks, validation, and lineage to maintain enterprise trust and compliance.
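To illustrate the kind of data-quality gate the bullets above describe, here is a minimal, self-contained Python sketch: rows are validated before being published downstream, and each rule returns a result that could feed lineage or alerting. The column name (`lease_id`) and the specific rules are illustrative assumptions, not QuadReal's actual schema or implementation.

```python
from dataclasses import dataclass, field

@dataclass
class CheckResult:
    """Outcome of one data-quality rule over a batch of rows."""
    rule: str
    passed: bool
    failures: list = field(default_factory=list)

def check_not_null(rows, column):
    # Flag rows where `column` is absent or None.
    bad = [r for r in rows if r.get(column) is None]
    return CheckResult(f"not_null({column})", not bad, bad)

def check_unique(rows, column):
    # Flag rows whose value in `column` duplicates an earlier row.
    seen, dups = set(), []
    for r in rows:
        v = r.get(column)
        if v in seen:
            dups.append(r)
        seen.add(v)
    return CheckResult(f"unique({column})", not dups, dups)

def run_checks(rows):
    # In a production pipeline these results would be recorded for
    # lineage and trigger alerts; here they are simply returned.
    return [check_not_null(rows, "lease_id"),
            check_unique(rows, "lease_id")]
```

In practice a team using the stack named above would more likely express such rules as dbt tests or as a validation task inside an orchestrated pipeline; the sketch only shows the underlying idea of rule, result, and failure capture.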

Architecture & Standards

  • Participate in platform architecture discussions, bringing a coder’s perspective to designing scalable, performant data solutions.
  • Champion coding excellence: consistent naming conventions, clear documentation, version control discipline, and clean, readable Python.
  • Ensure adherence to data governance and security standards, integrating with Microsoft Purview for classification, tagging, and access control.

Collaboration & Mentorship

  • Work closely with Data Governance, Data Solutions, Advanced Analytics, and IT Security teams to ensure business alignment.
  • Participate in code reviews and knowledge-sharing sessions with peers.
  • Support a culture of continuous learning and experimentation within the team.

Qualifications and Experience

  • Bachelor’s degree in Computer Science, Information Systems, Business Technology Management, or related field.
  • 2–4 years of experience in data engineering or similar roles, with a strong programming background.
  • Hands-on experience building enterprise-wide data pipelines and transformations using Python and dbt.
  • Advanced Python skills — comfortable writing clean, efficient, production-grade code for pipelines, APIs, and data transformations.
  • Proficiency in SQL for complex querying and data manipulation.
  • Hands-on experience with workflow orchestration (e.g., Apache Airflow) and modern data stack tools (dbt, dltHub, Microsoft Fabric).
  • Experience using Infrastructure as Code (IaC) to automate provisioning of cloud data environments and platform components.
  • Hands-on experience with Terraform to define, version, and deploy reproducible data platform infrastructure — ideally in Azure.
  • Familiarity with modern data architectures (data lakehouse, data mesh, warehousing best practices).
  • Strong communication and collaboration skills.
  • Experience with governance tools (Microsoft Purview) and CI/CD workflows (Azure DevOps, GitHub Actions) is an asset.

What Success Looks Like in 12 Months

  • 2–3 production-grade pipelines delivered, meeting performance SLAs and governance standards.
  • Reduction in pipeline latency and improved data freshness across key domains.
  • Contribution to at least one major platform standard or automation initiative adopted enterprise-wide.
  • Recognition as a collaborative, reliable contributor to the data platform team.

This role is ideal for a data engineer who loves to code — someone who can move quickly from concept to working software, is comfortable tackling complex data challenges, and wants to shape the future of our enterprise data platform.

#LI-TV1

#LI-Hybrid

Note to Recruiters: QuadReal does not accept unsolicited resumes from any source other than directly from a candidate. Any unsolicited resumes sent to QuadReal, directly or indirectly, will be considered QuadReal property. QuadReal will not pay a fee for any placement resulting from the receipt of an unsolicited resume. A recruiting agency must first have a valid, written, and fully executed agency agreement for engaged services to submit resumes.

QuadReal Property Group will provide reasonable accommodation at any time throughout the hiring process for applicants with disabilities or for those needing job postings in an alternate format. If you require accommodation, please advise the Talent Acquisition team member you are working with and include the following: Job posting #, your name and your preferred method of contact.

