Job code
IRC279569
Published on 15 October 2025

Senior Data Engineer IRC279569

Designation

Senior Software Engineer

Areas

Engineering

Experience

5-10 years

Locations

Slovakia

Skills

Azure, Azure Data Factory, Databricks, ETL, PySpark, Python, SQL

Work arrangement

Remote

Description

Our customer (originally the Minnesota Mining and Manufacturing Company) is an American multinational conglomerate operating in the fields of industry, worker safety, and consumer goods. Based in the Saint Paul suburb of Maplewood, the company produces over 60,000 products, including adhesives, abrasives, laminates, passive fire protection, personal protective equipment, window films, paint protection film, electrical and electronic connecting and insulating materials, car-care products, electronic circuits, and optical films.

Requirements

We are looking for a highly skilled and experienced Senior Data Engineer to join our team. In this role, you will be a key player in designing, building, and optimizing our data architecture and pipelines. You will work on a complex data project, transforming raw data into reliable, high-quality assets ready for analytics, data science, and business intelligence. As a senior member of the team, you will also be expected to mentor junior and mid-level engineers, drive technical best practices, and contribute to the strategic direction of our data platform.

Required Qualifications & Skills

  • 5+ years of professional experience in data engineering or a related role.
  • A minimum of 3 years of deep, hands-on experience using Python for data processing, automation, and building data pipelines.
  • A minimum of 3 years of strong, hands-on experience with advanced SQL for complex querying, data manipulation, and performance tuning.
  • Proven experience with cloud data services, preferably Azure (Azure Data Factory, Azure Databricks, Azure SQL Database, Azure Data Lake Storage).
  • Hands-on experience with big data processing frameworks like Spark (PySpark) and platforms such as Databricks.
  • Solid experience working with large, complex data environments, including data processing, data integration, and data warehousing.
  • Proficiency in data quality assessment and improvement techniques.
  • Experience working with and cleansing a variety of data formats, including unstructured and semi-structured data (e.g., CSV, JSON, Parquet, XML).
  • Familiarity with Agile and Scrum methodologies and project management tools (e.g., Azure DevOps, Jira).
  • Excellent problem-solving skills and the ability to communicate complex technical concepts effectively to both technical and non-technical audiences.

Preferred Qualifications & Skills

  • Knowledge of DevOps methodologies and CI/CD practices for data pipelines.
  • Experience with modern data platforms like Microsoft Fabric for data modeling and integration.
  • Experience with consuming data from REST APIs.
  • Experience with database design, optimization, and performance tuning for software application backends.
  • Knowledge of dimensional data modeling concepts (Star Schema, Snowflake Schema).
  • Familiarity with modern data architecture concepts such as Data Mesh.
  • Real-world experience supporting and troubleshooting critical, end-to-end production data pipelines.

Job responsibilities

  • Architect & Build Data Pipelines: Design, develop, and maintain robust, scalable, and reliable data pipelines using Python, SQL, and Spark on the Azure cloud platform.
  • End-to-End Data Solutions: Architect and implement end-to-end data solutions, from data ingestion and processing to storage in our data lake (Azure Data Lake Storage, Delta Lake) and data warehouse.
  • Cloud Data Services Management: Utilize Azure services like Azure Data Factory, Databricks, and Azure SQL Database to build, orchestrate, and manage complex data workflows.
  • Data Quality & Governance: Implement and enforce comprehensive data quality frameworks, including data profiling, cleansing, and validation routines to ensure the highest levels of data integrity and trust.
  • Performance Optimization: Analyze and optimize data pipelines for performance, scalability, and cost-efficiency, ensuring our systems can handle growing data volumes.
  • Mentorship & Best Practices: Mentor and provide technical guidance to junior and mid-level data engineers. Lead code reviews and champion best practices in data engineering, coding standards, and data modeling.
  • Stakeholder Collaboration: Work closely with data analysts, data scientists, and business stakeholders to understand data requirements, provide technical solutions, and deliver actionable data products.
  • System Maintenance: Support and troubleshoot production data pipelines, identify root causes of issues, and implement effective, long-term solutions.
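To give candidates a flavor of the data quality and validation work described above, here is a minimal, hypothetical sketch in Python (standard library only; field names and rules are illustrative, not taken from the actual project):

```python
import csv
import io

def validate_rows(rows, required_fields, numeric_fields):
    """Split records into valid and rejected entries, with reasons.

    A simplified stand-in for the profiling/validation routines a
    pipeline would run before loading data into a warehouse.
    """
    valid, rejected = [], []
    for row in rows:
        errors = []
        # Required fields must be present and non-blank.
        for field in required_fields:
            if not row.get(field, "").strip():
                errors.append(f"missing {field}")
        # Numeric fields must parse as numbers.
        for field in numeric_fields:
            value = row.get(field, "")
            try:
                float(value)
            except ValueError:
                errors.append(f"non-numeric {field}: {value!r}")
        (rejected if errors else valid).append((row, errors))
    return valid, rejected

# Example: two good rows, one row with a missing id and a bad amount.
raw = "id,amount\n1,10.5\n,abc\n2,7\n"
rows = list(csv.DictReader(io.StringIO(raw)))
valid, rejected = validate_rows(rows, ["id"], ["amount"])
```

In production, the same pattern would typically be expressed with PySpark DataFrame operations on Databricks and orchestrated via Azure Data Factory, with rejected records routed to a quarantine table for review rather than silently dropped.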

What we offer

Culture of caring. At GlobalLogic, we prioritize a culture of caring. Across every region and department, at every level, we consistently put people first. From day one, you’ll experience an inclusive culture of acceptance and belonging, where you’ll have the chance to build meaningful connections with collaborative teammates, supportive managers, and compassionate leaders. 

Learning and development. We are committed to your continuous learning and development. You’ll learn and grow daily in an environment with many opportunities to try new things, sharpen your skills, and advance your career at GlobalLogic. With our Career Navigator tool as just one example, GlobalLogic offers a rich array of programs, training curricula, and hands-on opportunities to grow personally and professionally.

Interesting & meaningful work. GlobalLogic is known for engineering impact for and with clients around the world. As part of our team, you’ll have the chance to work on projects that matter. Each is a unique opportunity to engage your curiosity and creative problem-solving skills as you help clients reimagine what’s possible and bring new solutions to market. In the process, you’ll have the privilege of working on some of the most cutting-edge and impactful solutions shaping the world today.

Balance and flexibility. We believe in the importance of balance and flexibility. With many functional career areas, roles, and work arrangements, you can explore ways of achieving the perfect balance between your work and life. Your life extends beyond the office, and we always do our best to help you integrate and balance the best of work and life, having fun along the way!

High-trust organization. We are a high-trust organization where integrity is key. By joining GlobalLogic, you’re placing your trust in a safe, reliable, and ethical global company. Integrity and trust are a cornerstone of our value proposition to our employees and clients. You will find truthfulness, candor, and integrity in everything we do.

About GlobalLogic

GlobalLogic, a Hitachi Group Company, is a trusted digital engineering partner to the world’s largest and most forward-thinking companies. Since 2000, we’ve been at the forefront of the digital revolution – helping create some of the most innovative and widely used digital products and experiences. Today we continue to collaborate with clients in transforming businesses and redefining industries through intelligent products, platforms, and services.
