Responsibilities
Design, develop, and optimize ETL/ELT pipelines using Informatica DE and related tools.
Build and maintain real-time streaming solutions with Apache Kafka and Confluent.
Develop big data workflows on Cloudera using Spark (Python/Scala).
Automate and orchestrate data pipelines with NiFi or similar tools.
Collaborate with data architects, BI teams, and business stakeholders to deliver scalable, reliable solutions.
Ensure data quality, governance, and security compliance across pipelines.
Troubleshoot performance, scalability, and integration issues.
Mentor junior engineers and provide technical guidance.
Maintain documentation for designs, data mappings, and operational procedures.
Stay updated on big data, streaming, and Informatica technology trends.
Qualifications
Bachelor’s or Master’s in Computer Science, Information Systems, or related field.
7+ years of hands-on experience in data engineering with a strong focus on Informatica DE.
Expertise in ETL/ELT design, data modeling, and large-scale data warehousing.
Strong SQL and Python programming skills.
Experience with Apache Kafka / Confluent and Apache Spark.
Hands-on experience with Cloudera stack (HDFS, Hive, Impala).
Proven ability in performance tuning and optimizing pipelines.
Strong problem-solving, collaboration, and communication skills.
Informatica certification is a plus.
Preferred: Experience with cloud platforms (AWS, GCP, Azure), DevOps (CI/CD, Git, Jenkins), in-memory databases, and Agile delivery.
Required Skills
CI/CD
Python
AWS
Data Modeling
ETL (Extract, Transform, Load)
Azure Cloud Services
Kafka
Job Details
Location: Riyadh, Saudi Arabia
Sector: Information and Communications Technology
Job Type: Full-time
Education: Bachelor's degree
Experience: 7+ years
Nationality: Unspecified