E2open

Data Engineer

Sorry, this job was removed at 01:47 a.m. (EST) on Friday, Feb 21, 2025
Remote
3 Locations

E2open is the connected supply chain platform that enables the world’s largest companies to transform the way they make, move, and sell goods and services. We connect more than 400,000 partners as one multi-enterprise network. Powered by the network, data, and applications, our SaaS platform anticipates disruptions and opportunities to help companies improve efficiency, reduce waste, and operate sustainably. Our employees around the world are focused on delivering enduring value for our clients.

Job Summary:

E2open seeks a Data Engineer with approximately 3-5 years of experience building and maintaining scalable data pipelines, architectures, and infrastructure. The ideal candidate will have hands-on experience with Databricks and/or Snowflake, as well as a strong understanding of data governance, regulatory requirements, and global data hosting.


***This is a hybrid role requiring 3 days per week in the office.***

Key Responsibilities:

  • Design, build, and maintain scalable data pipelines, architectures, and infrastructure using Databricks and/or Snowflake (see the illustrative sketch after this list)
  • Work with large data sets, ensuring data quality, integrity, and compliance with regulatory requirements
  • Collaborate with cross-functional teams, including data science, product, and engineering, to identify and prioritize data requirements
  • Develop and implement data governance policies, procedures, and standards to ensure data quality, security, and compliance
  • Ensure compliance with global data hosting regulatory requirements such as GDPR
  • Optimize data infrastructure for performance, scalability, and reliability
  • Develop and maintain technical documentation for data infrastructure and pipelines
  • Stay current with industry trends, best practices, and emerging technologies in data engineering
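
For illustration only (not part of the original posting): a minimal PySpark sketch of the kind of batch pipeline described in the first responsibility. The source path, column names, and target table are hypothetical assumptions, and Delta is assumed as the storage format since it is the Databricks default.

    # Minimal batch-pipeline sketch; every path, column, and table name
    # below is an illustrative assumption, not a detail from this posting.
    from pyspark.sql import SparkSession, functions as F

    spark = SparkSession.builder.appName("orders_pipeline").getOrCreate()

    # Ingest raw events (hypothetical S3 location).
    raw = spark.read.json("s3://example-bucket/raw/orders/")

    # Basic quality gate: drop records missing required keys.
    clean = raw.dropna(subset=["order_id", "customer_id"])

    # Transform: aggregate to daily order counts and revenue.
    daily = (
        clean.groupBy(F.to_date("created_at").alias("order_date"))
             .agg(F.count("order_id").alias("orders"),
                  F.sum("amount").alias("revenue"))
    )

    # Persist as a managed Delta table for downstream consumers.
    daily.write.format("delta").mode("overwrite").saveAsTable("analytics.daily_orders")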

Requirements:

  • 5+ years of experience in data engineering, with a focus on building and maintaining scalable data pipelines, architectures, and infrastructure
  • Hands-on experience with Databricks and/or Snowflake
  • Strong understanding of data governance, regulatory requirements, and global data hosting
  • Experience working with large data sets, ensuring data quality, integrity, and compliance (a minimal example follows this list)
  • Strong programming skills in languages such as Python or Java
  • Experience with data warehousing, ETL/ELT, and data modeling
  • Strong understanding of data security, access controls, and compliance
  • Excellent problem-solving skills, with the ability to work in a fast-paced environment
  • Strong communication and collaboration skills, with the ability to work with cross-functional teams
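
For illustration only: a self-contained Python sketch of the kind of data-quality check these requirements imply. The sample records, fields, and validation rules are assumptions, not details from the posting.

    # Toy integrity check; the records and rules below are illustrative
    # assumptions, not details from this posting.
    SAMPLE_ROWS = [
        {"order_id": "1001", "customer_id": "C-01", "amount": "49.99"},
        {"order_id": "1002", "customer_id": "",     "amount": "15.00"},
        {"order_id": "1003", "customer_id": "C-07", "amount": "not_a_number"},
    ]

    def validate(rows):
        """Yield (row_number, error) for records breaking basic integrity rules."""
        for n, row in enumerate(rows, start=1):
            if not row["customer_id"]:
                yield n, "missing customer_id"
            try:
                float(row["amount"])
            except ValueError:
                yield n, "non-numeric amount: " + row["amount"]

    for rownum, error in validate(SAMPLE_ROWS):
        print(rownum, error)
    # -> 2 missing customer_id
    # -> 3 non-numeric amount: not_a_number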

Nice to Have:

  • Experience with cloud-based data platforms, such as AWS, Azure, or GCP
  • Knowledge of data discovery, metadata management, and data cataloging
  • Experience with agile development methodologies and version control systems, such as Git
  • Certification in data engineering, data governance, or related fields



E2open is proud to be an Equal Employment Opportunity employer. We do not discriminate based upon race, religion, color, national origin, sex (including pregnancy, childbirth, reproductive health decisions, or related medical conditions), sexual orientation, gender identity, gender expression, age, status as a protected veteran, status as an individual with a disability, genetic information, political views or activity, or other applicable legally protected characteristics.

E2open participates in the E-Verify program in certain locations, as required by law.

E2open does not accept unsolicited referrals or resumes from any source other than directly from candidates or preferred vendors. We will not consider unsolicited referrals.
