Data Engineer
120–137 PLN/hour, B2B (net)
Senior · Full-time · B2B
#346857 · Added today
Source: nofluffjobs.com
Tech Stack / Keywords
Python · Airflow · Kafka · ETL · GCP · BigQuery
Company and position
The role is in the banking industry. Work is hybrid, with one day per week in the Warsaw office. The recruitment process consists of a phone interview, a technical interview, a project interview, and a final decision.
Requirements
- Minimum 5 years of experience as a Data Engineer in GCP environment.
- Very good knowledge of BigQuery, Bigtable, Scylla Cloud, and relational databases (Oracle, PostgreSQL, ScyllaDB).
- Experience in database design, optimization, and tuning.
- Good knowledge of Apache Airflow, Dataflow, Dataproc, and on-premise ETL tools (Informatica PowerCenter, Apache NiFi).
- Experience working with Kafka and Google Pub/Sub.
- Very good knowledge of Python and PySpark.
- Ability to design and expose REST APIs.
- Knowledge of AutomateNow by Infinity Data.
- Experience building high-performance data processing solutions.
Responsibilities
- Designing and developing data integration processes in GCP and on-premise environments.
- Building and maintaining ETL/ELT processes.
- Designing, optimizing, and tuning databases.
- Developing streaming and queueing solutions.
- Creating and maintaining workflows and data processing schedules.
- Exposing and developing REST API services.
- Creating tools to support efficient data loading in Python.
- Monitoring and optimizing data process performance.
Offer
- Sport subscription
- Private healthcare
ITFS
146 active job offers