Job Title: Python Developer
Experience: 4 to 15+ years
Location: Remote
Responsibilities:
- Design, implement, and maintain scalable data pipeline solutions using Spark, Python, AWS Glue, and Snowflake
- Develop and deploy high-performance ETL processes using Spark and Python to ingest data from various sources and load it into Snowflake (a minimal sketch follows this list)
- Write efficient, maintainable, and scalable data transformation code using Spark, Python, and SQL
- Collaborate with team members to develop, test, and maintain data pipeline solutions
- Optimize data processing performance through tuning, caching, partitioning, and indexing techniques
- Monitor and troubleshoot issues related to data pipelines, ensuring data quality and integrity
- Develop and maintain documentation on data pipeline architecture, processes, data flows, and the data dictionary
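
To illustrate the kind of pipeline described above, here is a minimal PySpark sketch of an ETL job that reads raw source files, applies a simple transformation, and writes the result to Snowflake. It is a sketch under assumptions, not a project implementation: it assumes the Snowflake Spark connector is on the cluster, and every path, credential, database, and table name (for example s3://example-bucket/raw/events/ and ANALYTICS.PUBLIC.EVENTS) is a placeholder.

# Minimal PySpark ETL sketch: read raw CSV data, apply a simple
# transformation, and write the result to Snowflake via the
# Snowflake Spark connector. All connection values, paths, and
# table names are placeholders, not actual project settings.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = (
    SparkSession.builder
    .appName("events-etl")  # hypothetical job name
    .getOrCreate()
)

# Ingest: read raw source files (placeholder S3 path)
raw = spark.read.option("header", "true").csv("s3://example-bucket/raw/events/")

# Transform: basic cleanup, typing, and de-duplication
events = (
    raw.filter(F.col("event_id").isNotNull())
       .withColumn("event_ts", F.to_timestamp("event_ts"))
       .dropDuplicates(["event_id"])
)

# Load: write into Snowflake (assumes the Spark-Snowflake connector
# is available; all option values below are placeholders)
sf_options = {
    "sfURL": "example_account.snowflakecomputing.com",
    "sfUser": "ETL_USER",
    "sfPassword": "********",
    "sfDatabase": "ANALYTICS",
    "sfSchema": "PUBLIC",
    "sfWarehouse": "ETL_WH",
}

(
    events.write
    .format("net.snowflake.spark.snowflake")
    .options(**sf_options)
    .option("dbtable", "EVENTS")
    .mode("append")
    .save()
)
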
Requirements:
- Solid experience with Spark is required
- Strong expertise in Python is required
- Expertise in the AWS cloud platform is required
- Knowledge of data pipeline concepts and best practices
- Experience with AWS Glue and Snowflake is highly desirable
- Proficient in SQL and database concepts, with the ability to optimize queries and data processing performance
- Strong problem-solving and analytical skills
- Excellent communication and collaboration skills, with the ability to work effectively in a team environment
- Bachelor’s degree in Computer Science, Information Technology, or a related field