Technology Capabilities
Consultant
Engineering
10-15 years
India - Hyderabad
Hybrid
Key Responsibilities
Technical Delivery
Develop robust ETL/ELT workflows using Informatica to ingest, transform, and load data into our Azure Synapse Analytics data warehouse.
Contribute to the data ecosystem by performing exploratory data analysis (EDA) to validate pipeline outputs and ensure data accuracy.
Take ownership of data quality by implementing proactive checks and monitoring to maintain data integrity and reliability (an illustrative sketch follows this list).
Write and optimize complex SQL queries to support the data needs of our analytical and reporting teams.
Collaborate with data analysts and business users to understand their requirements and translate them into effective data models.
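To make the data-quality expectation above concrete, here is a minimal, purely illustrative sketch of the kind of post-load check a pipeline might run. It assumes a PySpark environment (in line with the skills listed later in this posting); the table name sales_curated and the columns order_id and order_ts are hypothetical placeholders, not details of this role.

```python
# Illustrative only: minimal data-quality checks on a pipeline output table.
# Table and column names (sales_curated, order_id, order_ts) are hypothetical.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("dq-checks").getOrCreate()

df = spark.table("sales_curated")  # hypothetical curated table produced by the ETL workflow

# Row-count sanity check: an empty load usually signals an upstream failure.
row_count = df.count()
assert row_count > 0, "Curated table is empty - investigate the ingestion step"

# Null checks on key columns that downstream reports depend on.
null_counts = df.select(
    [F.sum(F.col(c).isNull().cast("int")).alias(c) for c in ("order_id", "order_ts")]
).first().asDict()
bad = {c: n for c, n in null_counts.items() if n and n > 0}
assert not bad, f"Null values found in key columns: {bad}"

# Duplicate-key check to protect integrity of joins in the warehouse.
dup_count = df.groupBy("order_id").count().filter("count > 1").count()
assert dup_count == 0, f"{dup_count} duplicate order_id values detected"
```

In practice such checks would typically run as a scheduled step after each load and raise an alert rather than a bare assertion; the sketch only shows the shape of the validation.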
Technical Leadership
Contribute to system-level design, story breakdowns, and technical planning.
Collaborate across backend, frontend, and DevOps teams to deliver integrated solutions.
Continuously learn and adopt new technologies, frameworks, and best practices.
Champion the use of AI, automation, and modern frameworks to enhance product performance and developer efficiency.
Understand business context and build features that align with digital product and customer experience goals.
Required Skills and Qualifications
3 to 5 years of professional experience building ETL/ELT workflows.
Familiarity with NoSQL databases (e.g., MongoDB, Cassandra) and relational databases.
Exposure to event-driven messaging platforms (Kafka or similar).
Knowledge of API Gateway tools (e.g., Kong, Zuul, Apigee, AWS API Gateway).
Solid foundation in distributed systems, scalability, and performance tuning.
Bachelor’s degree in Computer Science, Software Engineering, or related field.
Preferred Qualifications
Experience in digital products, e-commerce platforms, or transaction systems.
Familiarity with regulatory frameworks (PCI, SOX, GDPR).
Certifications in cloud platforms (AWS, GCP, Azure) or front-end frameworks.
Communication and Soft Skills
Excellent verbal and written communication skills.
Ability to translate business requirements into technical specifications and vice versa.
Strong problem-solving and analytical thinking skills.
Collaborative team player with experience in Agile/Scrum environments.
Comfortable presenting ideas and engaging in technical discussions.
Education:
Bachelor’s degree in Computer Science, Software Engineering, or a related technical discipline (Required).
Additional certifications in backend technologies, cloud platforms, or security frameworks (Preferred but not required).
Skills: Python, Azure Data Factory (ADF), Azure Databricks, PySpark, Delta Lake, ETL/ELT, data pipelines, data lakehouse architecture.
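As a purely illustrative companion to the stack listed above, the sketch below shows what an incremental (upsert) load into a Delta Lake table could look like with PySpark. The source view raw_orders, the path /mnt/lake/curated/orders, and the key column order_id are hypothetical placeholders, not specifics of this role.

```python
# Illustrative only: a minimal incremental (upsert) load into a Delta Lake table.
# Names and paths (raw_orders, /mnt/lake/curated/orders, order_id) are hypothetical.
from delta.tables import DeltaTable
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("incremental-load").getOrCreate()

# New or changed records landed by the ingestion step (hypothetical source view).
updates = spark.table("raw_orders").dropDuplicates(["order_id"])

target_path = "/mnt/lake/curated/orders"  # hypothetical curated Delta table location
target = DeltaTable.forPath(spark, target_path)

# Merge (upsert): update rows whose keys already exist, insert the rest.
(
    target.alias("t")
    .merge(updates.alias("s"), "t.order_id = s.order_id")
    .whenMatchedUpdateAll()
    .whenNotMatchedInsertAll()
    .execute()
)
```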
Culture of caring. At GlobalLogic, we prioritize a culture of caring. Across every region and department, at every level, we consistently put people first. From day one, you’ll experience an inclusive culture of acceptance and belonging, where you’ll have the chance to build meaningful connections with collaborative teammates, supportive managers, and compassionate leaders.
Learning and development. We are committed to your continuous learning and development. You’ll learn and grow daily in an environment with many opportunities to try new things, sharpen your skills, and advance your career at GlobalLogic. With our Career Navigator tool as just one example, GlobalLogic offers a rich array of programs, training curricula, and hands-on opportunities to grow personally and professionally.
Interesting & meaningful work. GlobalLogic is known for engineering impact for and with clients around the world. As part of our team, you’ll have the chance to work on projects that matter. Each is a unique opportunity to engage your curiosity and creative problem-solving skills as you help clients reimagine what’s possible and bring new solutions to market. In the process, you’ll have the privilege of working on some of the most cutting-edge and impactful solutions shaping the world today.
Balance and flexibility. We believe in the importance of balance and flexibility. With many functional career areas, roles, and work arrangements, you can explore ways of achieving the perfect balance between your work and life. Your life extends beyond the office, and we always do our best to help you integrate and balance the best of work and life, having fun along the way!
High-trust organization. We are a high-trust organization where integrity is key. By joining GlobalLogic, you’re placing your trust in a safe, reliable, and ethical global company. Integrity and trust are a cornerstone of our value proposition to our employees and clients. You will find truthfulness, candor, and integrity in everything we do.
GlobalLogic, a Hitachi Group Company, is a trusted digital engineering partner to the world’s largest and most forward-thinking companies. Since 2000, we’ve been at the forefront of the digital revolution – helping create some of the most innovative and widely used digital products and experiences. Today we continue to collaborate with clients in transforming businesses and redefining industries through intelligent products, platforms, and services.