GCP Data Engineer
130 - 170 PLN/hour, B2B (net)
Senior · Full-time · B2B
#312015 · Added about a month ago
Source: nofluffjobs.com. Offer expired: this job posting is no longer active and recruitment has ended.
Tech Stack / Keywords
GCP, Google Cloud Platform, BI, ETL, Big Data, Spark, Airflow, dbt, modern data warehousing
Requirements
- 5+ years of hands-on experience in engineering complex, enterprise-grade data solutions.
- Minimum 3 years of practical experience with Google Cloud Platform (GCP).
- Strong command of BI, ETL, and Big Data technologies such as Spark, Airflow, and dbt.
- Deep understanding of modern data warehousing, including real-time data flow and complex aggregations.
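The warehousing requirement above (real-time data flow and complex aggregations) can be illustrated with a minimal, library-free sketch of a tumbling-window aggregation. The event shape and window size are illustrative assumptions, not part of the posting; in a real GCP pipeline this logic would typically live in Dataflow/Beam:

```python
from collections import defaultdict

# Hypothetical event shape: (timestamp_seconds, user_id, amount).
# A plain-Python stand-in for windowed aggregation that a streaming
# engine such as Dataflow/Beam would normally perform.
def tumbling_window_sums(events, window_size=60):
    """Sum `amount` per (window_start, user_id) over fixed windows."""
    sums = defaultdict(float)
    for ts, user, amount in events:
        window_start = (ts // window_size) * window_size
        sums[(window_start, user)] += amount
    return dict(sums)

events = [(3, "a", 10.0), (59, "a", 5.0), (61, "b", 2.0), (130, "a", 1.0)]
# Windows: [0, 60) -> a: 15.0; [60, 120) -> b: 2.0; [120, 180) -> a: 1.0
```

A production pipeline would add watermarks and late-data handling; this sketch only shows the core grouping-by-window idea.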
Responsibilities
- Build and maintain robust, scalable data solutions within the GCP ecosystem.
- Develop complex data pipelines ensuring high performance in real-time and batch processing.
- Build, deploy, and optimize end-to-end ETL/ELT processes using GCP-native technologies (BigQuery, Dataflow, etc.).
- Implement sophisticated relational and Big Data structures tailored for high-volume environments.
- Apply modern data architectures such as Data Fabric, Data Mesh, and Data Vault within the data platform.
- Develop and maintain systems for stream processing and data warehousing aggregations to support instant analytics.
- Participate in design workshops, providing technical insights and ensuring feasibility of architectural choices.
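The pipeline duties above (building end-to-end ETL/ELT flows with ordered, dependent steps) can be sketched with a tiny stdlib-only DAG runner. The step names and graph shape are illustrative assumptions; in practice Airflow would own the scheduling and each step would call BigQuery, Dataflow, or dbt:

```python
from graphlib import TopologicalSorter

# Hypothetical ELT steps; in production these would be Airflow tasks
# invoking GCP services, not local placeholders.
graph = {
    "extract": set(),
    "load_raw": {"extract"},
    "transform": {"load_raw"},   # e.g. a dbt model run in BigQuery
    "publish": {"transform"},
}

def run_pipeline(graph):
    """Execute steps in dependency order; return the execution order."""
    order = list(TopologicalSorter(graph).static_order())
    for step in order:
        pass  # placeholder: invoke the real task here
    return order
```

`TopologicalSorter` raises `CycleError` on circular dependencies, which mirrors how an orchestrator rejects an invalid DAG definition.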
Link Group
163 active offers