Expert Data Engineer IRC184002
Job: IRC184002
Location: Argentina - Buenos Aires
Designation: Associate Specialist Engineer
Experience: 5-10 years
Function: Engineering
Skills: Amazon Simple Storage Service (S3), AWS Athena, AWS Glue, AWS Redshift, Data Warehousing, DynamoDB, ETL, Google BigQuery, Google Cloud Storage, MySQL, NoSQL, Oracle, PostgreSQL, SQL, SQL Server (MS-SQL), SQL Server SSIS
Work Model: Remote
Description
We are seeking an experienced Staff Data Engineer with a strong background in SQL Server, DynamoDB, and AWS Glue to join the Linq team. The candidate will be responsible for developing, testing, and maintaining data pipelines and workflows, and will work closely with the data team to design, build, and optimize the client’s data infrastructure, ensuring high data quality, availability, and reliability.
We are looking for someone who is self-driven and autonomous, and who can work across our entire product offering, supporting multiple teams as needed.
Requirements
• At least 5 years of experience in data engineering, with a strong focus on SQL Server.
• In-depth knowledge of SQL Server Integration Services (SSIS), AWS Glue, and other relevant ETL tools.
• Strong SQL coding skills and proficiency in database design and development.
• Experience working with other databases, such as MySQL, Oracle, and PostgreSQL.
• Familiarity with data warehousing and dimensional modeling.
• Experience implementing data solutions on AWS and GCP data services, such as Amazon S3, Redshift, Athena, Google BigQuery, and Cloud Storage.
• Experience designing, developing, and maintaining NoSQL data solutions using DynamoDB.
• Bachelor’s degree in Computer Science, Information Technology, or a related field.
• Excellent problem-solving skills and the ability to work independently or as part of a team.
• Strong communication and documentation skills.
Job Responsibilities
• Design and develop ETL processes and workflows using SQL Server Integration Services (SSIS), AWS Glue, and other relevant tools.
• Build and maintain data pipelines, ensuring high data quality and accuracy.
• Perform data analysis and develop solutions for data-related problems.
• Collaborate with data analysts and stakeholders to identify business requirements and design data models.
• Optimize data infrastructure for performance and scalability.
• Develop and maintain database schemas and structures.
• Implement data solutions on AWS and GCP data services, such as Amazon S3, Redshift, Athena, Google BigQuery, and Cloud Storage.
• Design, develop, and maintain NoSQL data solutions using DynamoDB.
• Participate in code reviews, testing, and deployment activities.
• Document data engineering processes and procedures.
We Offer
Exciting Projects: Come take your place at the forefront of digital transformation! With clients across all industries and sectors, we offer an opportunity to work on market-defining products using the latest technologies.
Collaborative Environment: Expand your skills by collaborating with a diverse team of highly talented people in an open, laid-back environment, or even abroad in one of our global centers or client facilities!
Work-Life Balance: GlobalLogic prioritizes work-life balance, which is why we offer flexible work schedules. We offer you the best quality of work life so that you can exceed the expectations of our clients while achieving your professional and personal ambitions.
Professional Development: Our dedicated Learning & Development team regularly organizes English classes, professional certifications, and technical and soft-skills training. We also offer the chance to travel internationally.
Excellent Benefits: We provide our employees with competitive salaries, family medical insurance, extended paternity leave, annual performance bonuses, and referral bonuses.