
Varsity Brands

Senior Data Engineer

Hybrid
4 Locations
Mid level
The Senior Data Engineer will architect and implement data pipelines, manage their performance, and partner with stakeholders to ensure data accessibility and quality. Responsibilities include assessing data sourcing needs and providing visibility into pipeline status.

WORK TYPE: Hybrid; applicants may be located in Texas, Indiana, Tennessee, or Kansas

 

WORK HOURS: Monday–Friday, 8am–5pm CST

 

Applicants must be authorized to work in the U.S.; no sponsorship is offered for this role.

 

TRAVEL REQUIREMENT: Less than 5%

 

BASE PAY RATE: $110,000 – $130,000

Base salary will vary based on the applicant's education, experience, and qualifications, as well as location, internal equity, and alignment with the market.

HOW YOU WILL MAKE AN IMPACT

The Senior Data Engineer (Sourcing & Pipeline Management) will play an integral role within the growing Varsity Brands Data Center of Excellence team.

 

The role’s key elements are:

  • Architect + Implement: Design, build, and launch efficient, reliable data pipelines that move data from source platforms, including front-end applications, back-end systems, and third-party analytics and data services, into our enterprise data hub. In addition, design and build pipelines that supply downstream enterprise applications with prepared reference data from the enterprise data hub.
  • Orchestrate + Monitor: Manage data pipelines as an interdependent network, with proactive visibility into pipeline errors and costs over time.
  • Partner + Educate: Partner with stakeholders to understand business requirements, work with cross-functional data and product teams, and build efficient, scalable data solutions. Use your data and analytics experience to identify gaps and propose improvements to existing systems and processes, and make your source data pipelines easily accessible to data stakeholders.

 

 

WHAT YOU WILL DO  

  • Working with data modelers and analysts to identify and prioritize data sourcing gaps.
  • Assessing the best-fit tool for any given data source.
  • Establishing pipeline cadences and timing based on analytics needs and use cases while remaining cost conscious.
  • Providing downstream data stakeholders with visibility into pipeline scheduling and status.
  • Responsively troubleshooting errors or alerts in existing pipelines.
  • Tracking and summarizing current-period pipeline costs and trends for business and IT stakeholders.
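
The last two duties, pipeline status visibility and period cost summaries, boil down to aggregating run records per pipeline. A minimal sketch of that aggregation follows; all names here (PipelineRun, credits_used, the success/failed statuses) are illustrative assumptions, not part of any specific stack named in this posting:

```python
from dataclasses import dataclass
from datetime import date

@dataclass
class PipelineRun:
    pipeline: str
    run_date: date
    status: str          # "success" or "failed"
    credits_used: float  # e.g. warehouse compute credits consumed by the run

def summarize_costs(runs, period_start, period_end):
    """Aggregate per-pipeline credit usage and failure counts for a reporting period."""
    summary = {}
    for r in runs:
        # Skip runs outside the reporting window.
        if not (period_start <= r.run_date <= period_end):
            continue
        entry = summary.setdefault(r.pipeline, {"credits": 0.0, "failures": 0})
        entry["credits"] += r.credits_used
        if r.status == "failed":
            entry["failures"] += 1
    return summary
```

In practice the run records would come from an orchestrator's metadata store or a warehouse metering view rather than in-memory objects; the shape of the summary is what stakeholder reports build on.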

 

 

QUALIFICATIONS

KNOWLEDGE/ SKILLS/ ABILITIES

  • Familiarity with modern data stack tools and services used to replicate data from source systems to cloud data warehouses or lakes (particularly Snowflake), and a solid understanding of when and where a given tool is appropriate.
  • Experience using data replication tools and services such as HVR, Fivetran, Airbyte, Meltano, and Matillion is a MUST.
  • Proficiency writing custom code to source data from APIs when needed.
  • Ability to work collaboratively with product or application owners to tease out what relevant raw data is available to source.
  • Ability to identify source system data capture opportunities to unlock analytics capabilities.
  • Strong knowledge of data architecture, data modeling, schema design, and software development principles.
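
The "custom code to source data from APIs" skill above typically amounts to a paginated extraction loop. A minimal sketch, assuming a hypothetical cursor-paginated REST endpoint (the URL, auth header, and the data/next_cursor response keys are illustrative, not from any API named in this posting):

```python
import json
from urllib.request import Request, urlopen

def fetch_all_pages(base_url, api_key, fetch=None):
    """Collect every record from a cursor-paginated REST endpoint.

    The response shape ({"data": [...], "next_cursor": ...}) is a common
    convention assumed here for illustration. `fetch` may be injected
    (e.g. in tests) in place of the default HTTP GET.
    """
    def default_fetch(url):
        req = Request(url, headers={"Authorization": f"Bearer {api_key}"})
        with urlopen(req) as resp:
            return json.load(resp)

    fetch = fetch or default_fetch
    records, cursor = [], None
    while True:
        # First request hits the bare endpoint; subsequent ones pass the cursor.
        url = base_url if cursor is None else f"{base_url}?cursor={cursor}"
        page = fetch(url)
        records.extend(page["data"])
        cursor = page.get("next_cursor")
        if cursor is None:
            return records
```

A production version would add retry/backoff, rate-limit handling, and incremental checkpointing before landing the records in a warehouse staging table.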

 

 

EDUCATION/ EXPERIENCE

  • 3+ years of experience in the data engineering/warehousing space, custom ELT/ETL design, implementation, and maintenance.
  • 3+ years of experience writing SQL in an analytics or data pipeline context.
  • 2+ years of experience in at least one language (Python, Scala, Java, etc.) in a data engineering or analytics context.
  • 1+ year of experience using an orchestration tool or service to coordinate ELT and downstream analytics pipelines.
  • Experience using REST APIs to acquire and flow data from source to target systems.
  • Experience working with cloud data analytics platforms and tools, particularly Snowflake, dbt, Tableau, and Power BI, is a MUST.
  • Experience standing up data pipelines from SAP ERP is a plus.
  • Experience standing up data pipelines from Google Analytics 4 data is a plus.

 

 

PHYSICAL REQUIREMENTS

This job operates in a professional office environment. It is largely a sedentary role; some filing requires lifting files, opening filing cabinets, and bending or standing on a stool as necessary. The ability to sit or stand for long periods during meetings and while operating office equipment (PCs, laptops, telephones) is required.

Top Skills

Airbyte
dbt
Fivetran
HVR
Java
Matillion
Meltano
Power BI
Python
Scala
Snowflake
SQL
Tableau

