Responsibilities
Collaborate with product owners and team leads to identify, design, and implement new data-driven features.
Build and maintain scalable ETL pipelines to extract, transform, and load data from APIs, streams, and data lakes.
Implement data privacy and security measures in accordance with applicable standards and compliance requirements.
Stay up to date on data engineering trends and propose relevant improvements.
Share knowledge and best practices with cross-functional teams.
Evaluate and select the right tools and strategies for various data integration scenarios.
Qualifications
5+ years of commercial experience in data engineering.
Strong programming skills in Python.
Hands-on experience with distributed computing frameworks (e.g., PySpark).
Familiarity with at least one major cloud platform (GCP, AWS, or Azure) and its data services.
Proficiency in SQL and query optimization.
Solid understanding of data warehousing and modeling (OLTP/OLAP, SCD, dimensional modeling).
Expertise in relational databases such as PostgreSQL, MSSQL, or MySQL.
Experience orchestrating data workflows (e.g., Apache Airflow, Prefect, AWS Glue, Azure Data Factory).
Excellent teamwork and collaboration skills.
English language competency at B2 level or higher.
Job Details
Location: Europe
Sector: Information and Communication Technology
Job Type: Full-time
Degree: Bachelor's
Experience: 5 - 7 years
Nationality: Not specified