New
GCP Data Platform Engineer - Automation & Innovation Department
No salary information provided
Senior · Full-time
#351171 · Added yesterday
Source: nofluffjobs.com
Tech Stack / Keywords
Cloud Storage · Pub/Sub · Looker · Data migration · GCP · Infrastructure as Code · Terraform · Linux · BigQuery · Docker · Kubernetes · GitLab · CI/CD pipelines · Communication skills · Spark · Java
Company and position
Join a new, strategic data transformation project that moves analytics from on-premise to GCP and builds the data architecture and data model from the ground up, with a focus on business value creation and customer experience. The project uses technologies such as GCP, Spark, Python, Kubernetes, BigQuery, Vertex AI, Terraform, and Looker. The work covers integrating diverse, high-volume data sources; designing streaming and batch processing layers; implementing data governance, lineage, data quality, and data security; and setting up CI/CD and monitoring/SLOs to support AI/LLM-driven solutions.
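The description mentions both streaming and batch processing layers feeding one platform. As a minimal pure-Python sketch (no GCP dependencies; all class and function names here are hypothetical, not part of the project's actual codebase), one common pattern is to put batch and streaming inputs behind a shared source interface so the downstream load step is identical:

```python
from abc import ABC, abstractmethod
from typing import Iterable, Iterator


class Source(ABC):
    """Common interface so batch and streaming inputs share one load path."""

    @abstractmethod
    def records(self) -> Iterator[dict]:
        """Yield records one at a time, regardless of how they arrive."""


class BatchFileSource(Source):
    """Hypothetical batch source: all rows are available up front."""

    def __init__(self, rows: Iterable[dict]):
        self._rows = list(rows)

    def records(self) -> Iterator[dict]:
        yield from self._rows


class StreamingSource(Source):
    """Hypothetical streaming source: events are consumed as they arrive."""

    def __init__(self, events: Iterable[dict]):
        self._events = iter(events)

    def records(self) -> Iterator[dict]:
        for event in self._events:
            yield event


def ingest(source: Source) -> list[dict]:
    """Shared ingestion step: normalise field names before loading."""
    return [{k.lower(): v for k, v in rec.items()} for rec in source.records()]
```

In a real GCP deployment the sources would wrap Pub/Sub subscriptions or Cloud Storage reads, but the interface-level split is the same design idea.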
Requirements
- 3+ years of experience as a Data Platform Engineer in a data-driven environment (preferably with GCP).
- Experience in developing enterprise-ready solutions based on GCP data services (BigQuery, Cloud Storage, Pub/Sub, Dataproc, Composer, Cloud Run, Looker, Vertex AI).
- Experience in large-scale data migration or cloud transformation projects.
- Experience with modern data platform patterns, including data lakehouse architectures on GCP (Cloud Storage + BigQuery).
- Hands-on experience with Infrastructure-as-Code (IaC) tools, including Terraform/Terragrunt.
- Proficiency in Python.
- Experience with Linux, Docker/Kubernetes and GitLab CI/CD pipelines.
- Very good command of English (spoken and written).
- Strong communication skills with the ability to explain complex technical concepts to business stakeholders.
Nice to have:
- Degree in Computer Science.
- Experience with Spark.
- Scala or Java knowledge.
- Knowledge of data governance, metadata and data quality tools.
- Experience collaborating with business stakeholders.
Responsibilities
- Develop reusable frameworks for data processing and testing on GCP (e.g., BigQuery, Dataflow/Dataproc, Composer).
- Build and maintain batch and streaming data ingestion pipelines from various sources (databases, Kafka/MQ, APIs, files) into GCP.
- Implement automated tests and data quality checks for data pipelines.
- Collaborate with analysts and data scientists to deliver reliable, well-documented datasets.
- Monitor, optimize and secure data pipelines in line with data governance and compliance standards.
T-Mobile
117 active job offers