PySpark Developer IRC179577
Job: IRC179577
Location: India - Noida
Designation: Senior Software Engineer
Experience: 3-5 years
Function: Engineering
Skills: AWS, Azure Data Warehouse, Cloud, Databases, DB Performance Tuning, Hadoop, Python, Spark, Spark MLlib
Work Model: Hybrid
Description:
Join GlobalLogic to become a valued part of the team working on a huge software project for a world-class company that provides M2M / IoT 4G/5G modules to industries such as automotive, healthcare, and logistics. Through our engagement, we contribute to our customer's development of the end-user modules' firmware, implementing new features, maintaining compatibility with the latest telecommunication and industry standards, and performing analysis and estimation of customer requirements.
Requirements:
Exp: 3-5 Years
Required Skills:
Python and the Spark framework, with knowledge of all of its APIs
Strong knowledge of DataFrames, RDDs, Spark query tuning, and performance optimization (see the brief sketch after this list)
Strong knowledge of MLlib and Spark Streaming
Knowledge of the Hadoop ecosystem (HDFS, Hive, Impala)
Knowledge of SQL and NoSQL across databases such as Redshift, Snowflake, MongoDB, Azure Synapse Analytics, DynamoDB, etc.
Strong knowledge of working with different data file formats such as Parquet, JSON, CSV, etc.
Knowledge of cloud platforms (AWS, Azure) and Databricks
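To give a concrete flavour of the DataFrame, file-format, and tuning skills listed above, here is a minimal, illustrative PySpark sketch; the S3 paths, tables, and column names are hypothetical and serve only as an example.

    # Illustrative PySpark sketch (hypothetical bucket, table, and column names).
    from pyspark.sql import SparkSession, functions as F

    spark = SparkSession.builder.appName("skills-demo").getOrCreate()

    # Reading different file formats into DataFrames.
    orders = spark.read.parquet("s3://example-bucket/orders/")
    customers = spark.read.option("header", True).csv("s3://example-bucket/customers.csv")

    # DataFrame API: filter, join, aggregate; broadcasting the small table avoids a shuffle.
    totals = (
        orders
        .filter(F.col("status") == "COMPLETED")
        .join(F.broadcast(customers), "customer_id")
        .groupBy("country")
        .agg(F.sum("amount").alias("total_amount"))
    )

    totals.write.mode("overwrite").parquet("s3://example-bucket/reports/totals/")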
Preferences:
Python, Spark, Azure Data Warehouse, AWS, Cloud, Hadoop, Spark MLlib, DB Performance Tuning, Databases.
Job Responsibilities:
The PySpark developer's responsibilities include designing, building, and unit testing applications in Spark/PySpark. The candidate should understand existing ETL, Impala, and Hive queries and logic in order to convert them into PySpark, and should have in-depth knowledge of Hadoop, PySpark, and similar frameworks with a strong focus on the functional programming paradigm (a brief migration sketch follows this list)
Excellent working knowledge of DataFrames, RDDs, Spark query tuning, and performance optimization
A very good understanding and implementation of the Apache Spark RDD, SQL DataFrame, MLlib, Streaming, and GraphX APIs using Python
Actively participate in all phases of the development lifecycle
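As an illustration of the Hive-to-PySpark migration work described above, here is a minimal sketch; the database, table, and column names are hypothetical and assume the Hive table is already registered in the metastore.

    # Hypothetical example: a Hive/Impala aggregation rewritten with the PySpark DataFrame API.
    from pyspark.sql import SparkSession, functions as F

    spark = (
        SparkSession.builder
        .appName("hive-migration-demo")
        .enableHiveSupport()
        .getOrCreate()
    )

    # Original Hive query, kept for reference:
    #   SELECT region, COUNT(*) AS cnt
    #   FROM sales_db.transactions
    #   WHERE txn_date >= '2024-01-01'
    #   GROUP BY region;

    migrated = (
        spark.table("sales_db.transactions")
        .filter(F.col("txn_date") >= "2024-01-01")
        .groupBy("region")
        .agg(F.count("*").alias("cnt"))
    )

    migrated.show()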
What We Offer
Exciting Projects: We focus on industries such as high-tech, communication, media, healthcare, retail, and telecom. Our customer list is full of fantastic global brands and leaders who love what we build for them.
Collaborative Environment: You can expand your skills by collaborating with a diverse team of highly talented people in an open, laid-back environment, or even abroad in one of our global centers or client facilities!
Work-Life Balance: GlobalLogic prioritizes work-life balance, which is why we offer flexible work schedules, opportunities to work from home, and paid time off and holidays.
Professional Development: Our dedicated Learning & Development team regularly organizes communication skills training (GL Vantage, Toast Master), stress management programs, professional certifications, and technical and soft-skill trainings.
Excellent Benefits: We provide our employees with competitive salaries, family medical insurance, Group Term Life Insurance, Group Personal Accident Insurance, NPS (National Pension Scheme), periodic health awareness programs, extended maternity leave, annual performance bonuses, and referral bonuses.
Fun Perks: We want you to love where you work, which is why we host sports events and cultural activities, offer food at subsidized rates, and throw corporate parties. Our vibrant offices also include dedicated GL Zones, rooftop decks, and a GL Club where you can have coffee or tea with colleagues over a game, and we offer discounts at popular stores and restaurants!