Senior Big Data & DevOps Engineer
We are seeking a highly experienced Senior Big Data & DevOps Engineer to manage end-to-end data operations for enterprise-scale platforms. The ideal candidate will have 8 years of experience in Big Data technologies, ETL development, and DevOps automation, with hands-on expertise in HDFS, Hive, Impala, PySpark, Python, Jenkins, and uDeploy. This role is critical to ensuring the stability, scalability, and efficiency of our data platforms while enabling smooth development-to-production workflows.
Required Qualifications
- Bachelor's degree in Computer Science, IT, or a related field.
- 8 years of experience in Big Data engineering and DevOps practices.
- Advanced proficiency in HDFS, Hive, Impala, PySpark, Python, and Linux.
- Hands-on experience with CI/CD tools such as Jenkins and uDeploy.
- Strong understanding of ETL development, orchestration, and performance optimization.
- Experience with ServiceNow for incident/change/problem management.
- Excellent analytical, troubleshooting, and communication skills.
Nice to Have
- Exposure to cloud-based Big Data platforms (e.g., AWS EMR).
- Familiarity with containerization (Docker, Kubernetes) and infrastructure automation tools (Ansible, Terraform).