Skills : Python, AWS, GCP, Apache Spark, Hadoop, Kafka, Apache Airflow, Scala, ETL Pipelines
Job Title : Data Engineer (2-4 Years Experience)
Location : Indore, Madhya Pradesh
Company : Golden Eagle IT Technologies Pvt. Ltd.
Job Description :
Golden Eagle IT Technologies Pvt. Ltd. is seeking a skilled Data Engineer with 2 to 4 years of experience to join our team in Indore. The ideal candidate will have a strong background in data engineering, big data technologies, and cloud platforms. You will work on designing, building, and maintaining efficient, scalable, and reliable data pipelines.
Key Responsibilities :
- Develop and maintain ETL pipelines using tools like Apache Airflow, Spark, and Hadoop.
- Design and implement data solutions on AWS, including services such as DynamoDB, Athena, Glue Data Catalog, and SageMaker.
- Work with messaging systems like Kafka to manage data streaming and real-time data processing.
- Utilize Python and Scala for data processing, transformation, and automation.
- Ensure data quality and integrity across multiple sources and formats.
- Collaborate with data scientists, analysts, and other stakeholders to understand data needs and deliver solutions.
- Optimize and tune data systems for performance and scalability.
- Implement best practices for data security and compliance.
Preferred Skills (Plus Points) :
- Experience with infrastructure-as-code tools such as Pulumi.
- Familiarity with GraphQL for API development.
- Experience with machine learning and data science workflows, especially using SageMaker.
Qualifications :
- Bachelor's degree in Computer Science, Information Technology, or a related field.
- 2-4 years of experience in data engineering or a similar role.
- Proficiency in AWS cloud services and big data technologies.
- Strong programming skills in Python and Scala.
- Knowledge of data warehousing concepts and tools.
- Excellent problem-solving and communication skills.