
LotLinx, Inc.

Data Engineer

Reposted 6 Days Ago
Easy Apply
In-Office
2 Locations
Senior level

Since our founding in 2012, Lotlinx has consistently pioneered advancements in the automotive landscape. We specialize in empowering automobile dealers and manufacturers with cutting-edge data and technology, delivering a distinct market advantage for every vehicle transaction. Today, we stand as the foremost automotive AI- and machine-learning-powered technology platform, excelling in digital marketing, risk management, and strategic inventory management.

Lotlinx provides employees with a dynamic work environment that is challenging, team-oriented, and full of passionate people. We offer great incentives to our employees, such as competitive compensation and benefits, flex time off, and career development opportunities.

Job Summary

We are seeking an experienced Data Engineer to join our growing Data team. In this role, you will be the primary architect of the cloud-first data infrastructure that powers LotLinx's Automotive Digital Advertising platform. You will be responsible for designing, building, and scaling the robust data pipelines that ingest, process, and store massive volumes of data from highly disparate sources.

Reporting to the Senior Director of Data Analytics, you will thrive in a fast-paced environment where your infrastructure decisions directly enable Analytics, Product, and Design teams to extract maximum value from our data. This is a highly collaborative, high-impact role where you will not just maintain pipelines, but actively optimize, secure, and innovate our entire data aggregation architecture.

Why Join Us?

Direct Impact: You hold the keys to the engine. The infrastructure you build directly impacts our ability to deliver real-time insights to our automotive clients. Your architectural decisions will have a highly visible, direct impact on the company's success.

Autonomy & Fast Shipping: We dislike red tape just as much as you do. We trust our engineers. You will have the autonomy to make high-level technical decisions and deploy infrastructure quickly without waiting on long committee approvals.

Outcome-Driven Culture: We measure success by the reliability and scalability of your pipelines, not the hours you sit at your desk. We value an async-friendly approach.

The Best of Both Worlds (Hybrid Flexibility): Enjoy the perk of working from home two days a week, while joining your team for three days of engaging, in-person collaboration at our Winnipeg, Oakville, or Vancouver offices.

Top-Tier Developer Experience: To ensure you have the best tools for the job, we provide top-of-the-line laptops and let you choose your preferred hardware environment (Mac, Windows, or Linux). Furthermore, we actively encourage leveraging cutting-edge technologies, including LLMs and AI coding assistants, to supercharge your daily workflows.

Key Responsibilities
  • Architect High-Scale Data Infrastructure: Design, build, and maintain robust, scalable, and highly available data pipelines to process massive datasets from internal and external sources.
  • Build Robust Ingestion & Transformation: Develop and manage complex ELT/ETL workflows for data ingestion, transformation, and loading into our data lakes and cloud data warehouses.
  • Cloud Architecture & Security: Architect and implement modern cloud-based solutions (AWS, GCP) while partnering closely with DevOps and Security to ensure strict compliance with data governance, privacy, and security standards.
  • Deep Performance Optimization: Proactively identify and resolve performance bottlenecks, scaling challenges, and technical issues. Engineer solutions for large-scale data storage and compute efficiency.
  • Cross-Functional Enablement: Work closely with Analytics, Product, and Design teams to assist with data-related technical issues and build the infrastructure they need to succeed.
  • Ownership of Data Quality: Act as the internal expert for our core data sources. Explore new technologies to continuously improve workflow reliability, data quality, and system scalability.
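As a rough illustration of the ELT work described above, a pipeline like this typically extracts raw records, enforces data-quality rules during transformation, and loads only clean rows into the warehouse. The sketch below is a simplified, hypothetical example in plain Python (the event schema, VIN check, and in-memory "warehouse" are illustrative assumptions, not LotLinx's actual code):

```python
import json
from datetime import datetime, timezone

# Hypothetical raw vehicle-event records, as they might arrive from a feed.
RAW_EVENTS = [
    '{"vin": "1HGCM82633A004352", "price": "21999", "ts": "2024-05-01T12:00:00Z"}',
    '{"vin": "", "price": "n/a", "ts": "2024-05-01T12:01:00Z"}',  # fails quality checks
    '{"vin": "5YJ3E1EA7KF317000", "price": "38900", "ts": "2024-05-01T12:02:00Z"}',
]

def extract(raw_lines):
    """Parse raw JSON lines; skip records that are not valid JSON."""
    for line in raw_lines:
        try:
            yield json.loads(line)
        except json.JSONDecodeError:
            continue

def transform(record):
    """Normalize types and enforce simple data-quality rules.

    Returns a cleaned record, or None if the record fails validation.
    """
    vin = record.get("vin", "").strip().upper()
    if len(vin) != 17:  # basic VIN sanity check
        return None
    try:
        price = int(record["price"])
    except (KeyError, ValueError):
        return None
    return {
        "vin": vin,
        "price_usd": price,
        "ingested_at": datetime.now(timezone.utc).isoformat(),
    }

def load(records, table):
    """Stand-in for a warehouse load step: append rows to an in-memory table."""
    rows = [r for r in records if r is not None]
    table.extend(rows)
    return len(rows)

warehouse_table = []
loaded = load((transform(r) for r in extract(RAW_EVENTS)), warehouse_table)
print(f"loaded {loaded} of {len(RAW_EVENTS)} raw events")  # loaded 2 of 3 raw events
```

In production, `load` would write to a platform like BigQuery and rejected rows would typically be routed to a quarantine table for inspection rather than silently dropped.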
Qualifications
  • Experience: 3+ years of professional experience in data engineering, with a proven track record of designing, building, and managing scalable distributed data systems.
  • Cloud Infrastructure Mastery: Proven, hands-on expertise with major cloud platforms, specifically AWS and/or GCP.
  • Advanced Programming & SQL: Strong software engineering skills in Python, Scala, or Java, coupled with proficiency in advanced SQL and query optimization.
  • Big Data & Streaming: Deep experience with big data processing frameworks (Apache Spark, Hadoop, Beam) and real-time event streaming platforms (Apache Kafka, Pub/Sub, Kinesis).
  • Orchestration & Transformation: Extensive experience with workflow orchestration (Airflow, Dataflow) and modern transformation tools (dbt).
  • DevOps & CI/CD: Strong working knowledge of CI/CD pipelines, Git, and containerization/orchestration technologies (Docker, Kubernetes).
  • Modern Warehousing: Experience managing massive datasets within modern data platforms like Snowflake, BigQuery, or Redshift.
  • Problem Solving: Demonstrated ability to creatively troubleshoot complex distributed systems and innovate within modern cloud ecosystems.
Nice-to-Haves
  • Previous experience in the Automotive or AdTech industry.
  • Familiarity with workflow orchestration tools like Apache Airflow.
  • Experience with Apache Pinot.
  • Experience with data quality and testing frameworks.
  • Familiarity with Google Cloud Platform (GCP) services beyond BigQuery.
Our Tech Stack
  • Cloud/Warehouse: Google Cloud Platform (GCP), AWS, Google BigQuery, Apache Pinot
  • Transformation: SQL, Python
  • Orchestration: Airflow, Cloud Composer
  • Ingestion: Custom Scripts, Pub/Sub, Lambda
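On the ingestion side of a stack like this, Pub/Sub delivers messages at-least-once, so handlers generally need to be idempotent to avoid double-counting redelivered events. A minimal sketch of that pattern, with an in-memory deduplication set standing in for durable state (the message format and dedup strategy are assumptions for illustration, not LotLinx's implementation):

```python
import json

class EventIngestor:
    """Idempotent handler for at-least-once message delivery (e.g. Pub/Sub).

    Deduplicates on a message id so redelivered messages are not double-counted.
    """

    def __init__(self):
        self._seen_ids = set()
        self.events = []

    def handle(self, message: str) -> bool:
        """Process one raw message; returns True if the event was ingested."""
        try:
            event = json.loads(message)
        except json.JSONDecodeError:
            return False  # malformed payload: drop (or route to a dead-letter queue)
        msg_id = event.get("id")
        if msg_id is None or msg_id in self._seen_ids:
            return False  # missing id or duplicate delivery
        self._seen_ids.add(msg_id)
        self.events.append(event)
        return True

ingestor = EventIngestor()
messages = [
    '{"id": "m1", "type": "vdp_view"}',
    '{"id": "m1", "type": "vdp_view"}',  # redelivered duplicate
    '{"id": "m2", "type": "lead"}',
    'not json',
]
results = [ingestor.handle(m) for m in messages]
print(results)  # [True, False, True, False]
```

At scale, the dedup set would live in durable storage (or dedup would be pushed down to the warehouse, e.g. via `MERGE` on the message id), since an in-memory set does not survive restarts.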

The salary range for this position is $108,000 - $162,000 with an annual target bonus.

Lotlinx provides a comprehensive benefits package.

This job posting is for an existing vacancy.

Lotlinx is committed to creating a diverse environment and is proud to be an equal opportunity employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, gender, gender identity or expression, sexual orientation, national origin, genetics, disability, age, or veteran status.

Lotlinx is not currently able to offer sponsorship for employment visa status.

Lotlinx is headquartered in Peterborough, NH, and has locations in Holmdel, NJ, and in Manitoba, Ontario, and British Columbia, Canada, in addition to a large distributed team spanning the US and Canada.

Our success relies heavily on our customers, but also on the dedicated talent that continuously moves our platform forward. We value our employees and their abilities, and we seek to foster an open, cooperative, dynamic environment where the team and company alike can thrive.

Top Skills

Airflow
Amazon Kinesis
Apache Beam
Apache Kafka
Apache Pinot
Spark
AWS
AWS Lambda
BigQuery
Cloud Composer
Dataflow
dbt
Docker
GCP
Git
Google Pub/Sub
Hadoop
Java
Kubernetes
Python
Redshift
Scala
Snowflake
SQL


