Data Flow Engineer
No salary information provided
Senior · Full-time
#347261 · Added yesterday
Source: SquareDev
Tech Stack / Keywords
AI · Cloud · Cybersecurity · SAP S/4HANA · SAP · ServiceNow · Apache Kafka
Company and Role
SquareDev is a member of the QnR Group, a leading technology organization specializing in end-to-end custom software solutions, Artificial Intelligence, Cybersecurity, SAP S/4HANA, SAP Business One, ServiceNow, and FinTech solutions. SquareDev uses state-of-the-art technology to build solutions for customers and partners, participating in research projects across Europe and collaborating with top universities and enterprises on AI, Data, and Cloud.
Requirements
- At least 6 years of relevant experience.
- Bachelor’s or Master’s degree in Computer Science, Engineering or a related technical field.
- At least one of the following certifications: Cloudera Certified Developer for Apache NiFi or Cloudera Data Flow (CFM) related certification, or equivalent.
- Skills in designing, building and maintaining complex flows in Apache NiFi, with 2-3 years of daily hands-on work, ideally in a CDP environment, and at least one large delivered integration project where NiFi was the central tool.
- Strong Python skills for data processing, custom NiFi logic, automation and integrations.
- Solid experience with REST API integrations — endpoint calls, OAuth/JWT, rate limiting and error recovery.
- Hands-on experience building CDC pipelines to and from relational databases, using native NiFi processors, connectors and SQL Builder.
- Practical knowledge of Apache Iceberg (tables, schema evolution, partitioning) and its integration with NiFi, Spark or Flink, preferably in CDP.
- Experience with data governance in CDP — Apache Atlas for metadata, lineage and tagging, and Apache Ranger for security policies and audit on NiFi flows.
- Experience with Apache Kafka as a message broker (topics, producers and consumers, schema registry, NiFi integration) and Apache Avro for serialisation and schema evolution.
- English at B2 level (CEFR) or higher.
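The API-integration requirement above calls for handling rate limiting and error recovery around REST endpoint calls. A minimal Python sketch of retry-with-backoff logic is shown below; `call_with_retries`, `RateLimitError`, and the fake endpoint are illustrative names invented for this example, not part of any specific client library or of this employer's stack.

```python
import time

class RateLimitError(Exception):
    """Raised when the remote API answers 429 Too Many Requests (illustrative)."""

def call_with_retries(fetch, max_attempts=4, base_delay=0.01):
    """Call `fetch` and retry on rate limiting with exponential backoff.

    `fetch` stands in for any REST call (e.g. an HTTP GET carrying an
    OAuth bearer token); this is a hedged sketch, not a production client.
    """
    for attempt in range(max_attempts):
        try:
            return fetch()
        except RateLimitError:
            if attempt == max_attempts - 1:
                raise  # retries exhausted: surface the error to the caller
            time.sleep(base_delay * (2 ** attempt))  # exponential backoff

# Simulated endpoint: rejects the first two calls, then succeeds.
calls = {"n": 0}

def fake_endpoint():
    calls["n"] += 1
    if calls["n"] < 3:
        raise RateLimitError("429 Too Many Requests")
    return {"status": "ok"}

result = call_with_retries(fake_endpoint)
```

In a NiFi flow this kind of recovery is usually configured on the processor (retry counts, penalty duration) rather than hand-coded, but the same backoff idea applies when writing custom scripted logic.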
Responsibilities
- Designing, building, testing and maintaining complex data flows in Cloudera DataFlow (Apache NiFi) — ingest, transform, enrich, route and deliver data.
- Building and tuning Change Data Capture (CDC) pipelines in real time or near real time, using NiFi together with Kafka, Debezium or SQL-based CDC connectors.
- Connecting external systems through REST APIs, JDBC, Kafka and other protocols.
- Managing data schemas in Avro and keeping metadata and lineage clean in Apache Atlas.
- Setting up security and governance for data flows through Apache Ranger policies.
- Monitoring pipelines, setting up alerts and fixing performance or reliability issues.
- Working with data engineers, architects and business stakeholders to gather requirements and shape the architecture of data flows.
- Writing and maintaining SOPs, runbooks and technical documentation.
- Taking part in upgrades and migrations of CDP, NiFi and Kafka.
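The CDC responsibility above centres on applying change events downstream. As a rough illustration of the idea, the sketch below applies Debezium-style change events (the `op`/`before`/`after` envelope) to an in-memory table keyed by primary key; the function name and the sample rows are hypothetical, and a real pipeline would route such events through NiFi and Kafka rather than a Python dict.

```python
def apply_cdc_event(table, event):
    """Apply one Debezium-style change event to a dict keyed by primary key.

    op codes follow the Debezium envelope: 'c' create, 'u' update,
    'r' snapshot read, 'd' delete. Illustrative sketch only.
    """
    op = event["op"]
    if op in ("c", "u", "r"):
        row = event["after"]          # upsert the new row state
        table[row["id"]] = row
    elif op == "d":
        table.pop(event["before"]["id"], None)  # remove the deleted key
    return table

# Hypothetical event stream: insert, update, insert, delete.
table = {}
events = [
    {"op": "c", "before": None, "after": {"id": 1, "name": "alice"}},
    {"op": "u", "before": {"id": 1, "name": "alice"},
     "after": {"id": 1, "name": "alicja"}},
    {"op": "c", "before": None, "after": {"id": 2, "name": "bob"}},
    {"op": "d", "before": {"id": 2, "name": "bob"}, "after": None},
]
for ev in events:
    apply_cdc_event(table, ev)
```

After replaying the stream, only the updated row with id 1 remains, which is the invariant a CDC sink is expected to preserve.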
SquareDev
2 active job offers