
Consultant
Engineering
5-10 years
India - Hyderabad
Apache, AWS, Azure, cloud, Databricks, Power BI, Python, Redshift, Snowflake, SQL, Tableau
On-site
About the Role
We’re looking for a proactive and enthusiastic Data Engineer to join our team. This is a hands-on role where you will design, build, and maintain scalable data platforms across on-premise open-source technologies and cloud environments (preferably Azure and AWS) that fuel our business decisions. You’ll be a key contributor, working closely with data analysts, product managers, and business stakeholders to transform raw data into reliable and accessible resources. This is a great opportunity for someone who is passionate about data and eager to take ownership of end-to-end data pipelines across hybrid ecosystems.
Key Responsibilities: Data Engineering
Design and build efficient and scalable data pipelines using cloud-native (AWS Glue, Azure Data Factory) or open-source frameworks (Apache Airflow, Apache NiFi, Spark, Kafka).
Develop robust ETL/ELT workflows to ingest, transform, and load data into our data warehouse solutions such as Snowflake, Redshift, Azure Synapse, or open-source alternatives (Hive, Presto, Trino, PostgreSQL-based warehouses).
Contribute to the data ecosystem by performing exploratory data analysis (EDA) to validate pipeline outputs and ensure data accuracy.
Take ownership of data quality by implementing proactive checks and monitoring to maintain data integrity and reliability.
Write and optimize complex SQL queries to support the data needs of our analytical and reporting teams.
Collaborate with data analysts and business users to understand their requirements and translate them into effective data models.
Participate in CI/CD practices using Git-based workflows (GitHub Actions, GitLab CI/CD, Jenkins, or Azure DevOps) to automate pipeline deployment and ensure smooth delivery.
Assist in maintaining and optimizing data models that power BI tools such as Power BI, Tableau, or open-source dashboards (e.g., Superset, Metabase).
Proactively document data flows, transformation logic, and pipeline architecture to foster knowledge sharing within the team.
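The "proactive checks and monitoring" responsibility above can be sketched in a few lines of plain Python. This is a minimal, hypothetical illustration; the schema (order_id, amount, order_date) and validation rules are invented for the example, not taken from the role description:

```python
# Hypothetical data-quality gate a pipeline step might run before
# loading a batch into the warehouse. Schema and rules are illustrative.
from datetime import date


def validate_row(row: dict) -> list[str]:
    """Return a list of data-quality violations for one record."""
    errors = []
    if not row.get("order_id"):
        errors.append("missing order_id")
    if row.get("amount", 0) < 0:
        errors.append("negative amount")
    if row.get("order_date") and row["order_date"] > date.today():
        errors.append("order_date in the future")
    return errors


def quality_report(rows: list[dict]) -> dict:
    """Summarize violations across a batch, for monitoring or alerting."""
    bad = {i: validate_row(r) for i, r in enumerate(rows)}
    bad = {i: e for i, e in bad.items() if e}
    return {"total": len(rows), "failed": len(bad), "violations": bad}


rows = [
    {"order_id": "A1", "amount": 19.99, "order_date": date(2024, 1, 5)},
    {"order_id": "", "amount": -5.0, "order_date": date(2024, 1, 6)},
]
report = quality_report(rows)
# The second row fails with two violations; the first passes.
```

In a real deployment this kind of check would typically run as a task in the orchestrator (e.g. an Airflow task or a Glue job step) and feed its report into alerting rather than a local dictionary.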
Required Skills and Qualifications
3–5 years of hands-on experience in data engineering using a mix of cloud-native (Azure, AWS) and open-source tools.
Proficiency with data integration and processing frameworks such as Databricks, Apache Airflow, Apache NiFi, Spark, Kafka, Azure Data Factory, or AWS Glue.
Strong command of SQL for data manipulation and analysis, along with solid Python skills for scripting, automation, and data processing.
Good understanding of data lakes, data warehouses, and data modeling concepts.
Exposure to data governance, data quality practices, and performance tuning in large-scale environments.
Familiarity with version control (Git/GitHub/GitLab) and CI/CD pipelines (GitHub Actions, Jenkins, GitLab CI/CD, or Azure DevOps).
Experience with business intelligence tools such as Power BI, Tableau, or open-source dashboards (Superset, Metabase).
Knowledge of basic machine learning concepts and their integration into data pipelines is a plus.
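As a taste of the analytical SQL this role emphasizes, here is a self-contained sketch using an in-memory SQLite database. The table and column names are hypothetical; the window-function pattern (a running total per partition) carries over directly to Snowflake, Redshift, or Synapse:

```python
# Illustrative only: an analytical query with a window function, run
# against in-memory SQLite so the example needs no external warehouse.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (region TEXT, month TEXT, revenue REAL)")
conn.executemany(
    "INSERT INTO sales VALUES (?, ?, ?)",
    [("north", "2024-01", 100.0), ("north", "2024-02", 150.0),
     ("south", "2024-01", 80.0), ("south", "2024-02", 60.0)],
)

# Running total of revenue per region, ordered by month.
query = """
SELECT region, month, revenue,
       SUM(revenue) OVER (
           PARTITION BY region ORDER BY month
       ) AS running_revenue
FROM sales
ORDER BY region, month
"""
rows = conn.execute(query).fetchall()
# Each row carries the cumulative revenue within its region.
```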
Communication and Soft Skills
Excellent verbal and written communication skills.
Ability to translate business requirements into technical specifications and vice versa.
Collaborative team player with experience working in cross-functional, agile environments.
Strong analytical thinking and proactive approach to problem-solving.
Exposure to machine learning pipelines and real-time analytics.
Preferred Qualifications
Bachelor’s or Master’s degree in Computer Science, Engineering, or a related field.
Familiarity with Agile methodologies and version control systems like Git.
Enthusiasm for learning new tools, technologies, and best practices in data engineering.
Culture of caring. At GlobalLogic, we prioritize a culture of caring. Across every region and department, at every level, we consistently put people first. From day one, you’ll experience an inclusive culture of acceptance and belonging, where you’ll have the chance to build meaningful connections with collaborative teammates, supportive managers, and compassionate leaders.
Learning and development. We are committed to your continuous learning and development. You’ll learn and grow daily in an environment with many opportunities to try new things, sharpen your skills, and advance your career at GlobalLogic. With our Career Navigator tool as just one example, GlobalLogic offers a rich array of programs, training curricula, and hands-on opportunities to grow personally and professionally.
Interesting & meaningful work. GlobalLogic is known for engineering impact for and with clients around the world. As part of our team, you’ll have the chance to work on projects that matter. Each is a unique opportunity to engage your curiosity and creative problem-solving skills as you help clients reimagine what’s possible and bring new solutions to market. In the process, you’ll have the privilege of working on some of the most cutting-edge and impactful solutions shaping the world today.
Balance and flexibility. We believe in the importance of balance and flexibility. With many functional career areas, roles, and work arrangements, you can explore ways of achieving the perfect balance between your work and life. Your life extends beyond the office, and we always do our best to help you integrate and balance the best of work and life, having fun along the way!
High-trust organization. We are a high-trust organization where integrity is key. By joining GlobalLogic, you’re placing your trust in a safe, reliable, and ethical global company. Integrity and trust are a cornerstone of our value proposition to our employees and clients. You will find truthfulness, candor, and integrity in everything we do.
GlobalLogic, a Hitachi Group Company, is a trusted digital engineering partner to the world’s largest and most forward-thinking companies. Since 2000, we’ve been at the forefront of the digital revolution – helping create some of the most innovative and widely used digital products and experiences. Today we continue to collaborate with clients in transforming businesses and redefining industries through intelligent products, platforms, and services.