Organize and structure raw data from various formats and databases into a functional, centralized data warehouse.
Develop and maintain data pipelines and ETL/ELT processes.
Ensure data integrity and optimize workflows.
Build and manage CI/CD pipelines.
Implement real-time data streaming and monitoring.
Collaborate with stakeholders and provide technical support.
Requirements
Solid experience in Data Engineering (3+ years).
Strong Python, SQL, and Docker skills.
Strong familiarity with data warehousing concepts and infrastructure.
Hands-on experience with ETL/ELT processes, Airflow, and Git.
Self-management and ability to meet sprint goals.
Company offers
Private health insurance.
Flexible hybrid work environment.
Ownership of data processes.
Professional growth in a collaborative environment.