2 x Mid+/Senior Flink Data Engineer

165 PLN/hour B2B (net)
Senior · Full-time · B2B
#336404 · Added today
Source: emagine

Tech Stack / Keywords

Cloud · Apache · API · BigQuery · AI · Architecture · Google Cloud Platform · GCP

Company and position

Our company is a global digital platform operating at massive scale, serving hundreds of millions of users worldwide. We focus on building consumer-facing subscription products that connect users with content creators through personalized, data-driven experiences. Our culture emphasizes collaboration, experimentation, and engineering excellence in a highly distributed, cloud-native environment.


Requirements

  • Strong hands-on experience with Apache Flink, especially development using the DataStream API
  • Proven experience maintaining and upgrading Flink environments, ideally with exposure to Flink 2.0
  • Deep understanding of streaming pipeline architecture, performance tuning, state management, and fault tolerance
  • Experience migrating large-scale datasets from BigQuery (BQ) to Data Cloud Storage (DCS)
  • Strong proficiency in data format conversion, particularly Avro to Parquet
  • Ability to design, scale, and automate migration workflows while ensuring data integrity and minimal service disruption
  • Solid knowledge of Google Cloud Platform (GCP) and its data services
  • Good understanding of distributed systems, schema evolution, and storage optimization strategies
  • Ability to break down complex migration and platform challenges into clear, actionable steps
  • Proactive mindset with strong ownership of solutions and risk identification
  • Clear and effective communication skills, especially in explaining technical topics to non-technical stakeholders
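The schema-evolution requirement above is worth a concrete illustration. A minimal sketch (not part of this posting's stack, and deliberately simplified — real Avro resolution also covers type promotion, unions, and aliases): under Avro's backward-compatibility rules, a field added to a record must carry a default so readers can still process data written with the old schema. The helper name below is hypothetical.

```python
import json

def added_fields_without_defaults(old_schema: str, new_schema: str) -> list[str]:
    """Return names of fields added in new_schema that lack a default.

    Simplified backward-compatibility check: an added field without a
    default breaks readers of data written under the old schema.
    """
    old_fields = {f["name"] for f in json.loads(old_schema)["fields"]}
    new_fields = json.loads(new_schema)["fields"]
    return [f["name"] for f in new_fields
            if f["name"] not in old_fields and "default" not in f]

old = json.dumps({
    "type": "record", "name": "Event",
    "fields": [{"name": "id", "type": "long"}],
})
new = json.dumps({
    "type": "record", "name": "Event",
    "fields": [
        {"name": "id", "type": "long"},
        {"name": "source", "type": ["null", "string"], "default": None},
        {"name": "region", "type": "string"},  # no default: incompatible
    ],
})
print(added_fields_without_defaults(old, new))  # -> ['region']
```

In practice this kind of check would run against a schema registry before deploying a new producer, rather than as ad-hoc JSON inspection.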

Nice to have:

  • Interest in and familiarity with emerging AI-driven practices, with a willingness to explore and experiment beyond standard approaches
  • Experience working on high-scale, consumer-facing data platforms
  • Background in long-running migration programs involving multiple data sources and formats
  • Familiarity with observability, monitoring, and alerting for streaming systems
  • Practical experience using AI-powered assistants to improve productivity, quality, or decision-making in software delivery

Responsibilities

  • Developing and enhancing real-time streaming pipelines using Apache Flink
  • Migrating existing Flink jobs using the DataStream API and adapting them to newer platform standards
  • Leading and executing the upgrade of the Flink platform to version 2.0
  • Designing, optimizing, and maintaining high-throughput, fault-tolerant streaming architectures
  • Migrating large-scale datasets from BigQuery (BQ) to Data Cloud Storage (DCS)
  • Scaling and automating ongoing data migration processes to support growing data volumes
  • Converting datasets from Avro to Parquet format with emphasis on performance, schema evolution, and storage optimization
  • Leveraging AI-powered tools to accelerate migration, validation, and transformation workflows
  • Ensuring data quality, integrity, and minimal downtime during migrations
  • Collaborating with cross-functional teams and effectively communicating technical concepts to non-technical stakeholders
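Several of the responsibilities above hinge on verifying data integrity during migration. One illustrative approach (an assumption for this sketch, not a described part of the role): compute an order-independent checksum over each copied batch on both the source and destination sides, and flag mismatched batches for re-copy. The function name is hypothetical.

```python
import hashlib
from typing import Iterable

def batch_checksum(rows: Iterable[tuple]) -> str:
    """Order-independent checksum over a batch of rows.

    XOR-combining per-row SHA-256 digests makes the result independent
    of read order, so source and destination scans can be compared even
    when the storage systems return rows in different orders.
    """
    acc = 0
    for row in rows:
        digest = hashlib.sha256(repr(row).encode()).digest()
        acc ^= int.from_bytes(digest, "big")
    return f"{acc:064x}"

source = [(1, "a"), (2, "b"), (3, "c")]
copied = [(3, "c"), (1, "a"), (2, "b")]   # same rows, different order
assert batch_checksum(source) == batch_checksum(copied)
```

Note the trade-off: an XOR combine detects missing, extra, or altered rows but not duplicated pairs of identical rows; production pipelines typically pair it with row counts per batch.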

Other information

Remote from Poland
