Data ETL Engineer

130–160 PLN/hour · B2B (net)
Mid · Full-time · B2B
#328084 · Added 20 days ago
Source: nofluffjobs.com

Tech Stack / Keywords

SQL · BigQuery · ETL · CI/CD · Python · Terraform

Requirements

  • 3+ years of experience in SQL development and query optimization, particularly in BigQuery environments.
  • Experience designing and implementing ETL/ELT pipelines and data transformation processes.
  • Hands-on experience with GCP data services such as BigQuery, Data Fusion, Cloud Composer/Airflow, or similar tools.
  • Practical experience with Data Vault modeling.
  • Programming experience in Python and familiarity with Terraform.
  • Experience with CI/CD pipelines and DevOps tools (e.g., Git, Jenkins, Ansible).
  • Experience working in Agile environments and DataOps practices.
  • Strong analytical and problem-solving skills.
  • Important: The client requires a visit to Kraków for two days each month.

Nice to have:

  • Experience designing data ingestion pipelines for formats such as CSV, JSON, and XML.
  • Experience integrating data from REST or SOAP APIs, SFTP servers, and enterprise data sources.
  • Knowledge of data contract best practices.
  • Experience with Java development or building custom plugins for data integration tools.
  • Experience with continuous testing and delivery for cloud-based data platforms.
  • Strong communication and collaboration skills.
  • Ability to work independently and manage multiple tasks.
  • Proactive mindset with a strong problem-solving approach.
  • Willingness to learn and continuously improve technical skills.
  • Team-oriented attitude and ability to work effectively in cross-functional teams.

Responsibilities

  • Design, build, test, and deploy data models and transformations in BigQuery using SQL and related technologies.
  • Develop and maintain ETL/ELT pipelines to transform raw and unstructured data into structured datasets using Data Vault modeling.
  • Integrate data from multiple sources, including on-premise systems, APIs, and cloud-based platforms.
  • Monitor and troubleshoot data pipelines for performance issues, failures, or data inconsistencies.
  • Optimize ETL/ELT processes for performance, scalability, and cost efficiency.
  • Review and implement business and technical requirements in data transformation processes.
  • Ensure solutions meet non-functional requirements, including security, reliability, scalability, and compliance with IT standards.
  • Manage code repositories and CI/CD pipelines using tools such as Git and Jenkins.
  • Collaborate with DevOps and data teams to enable automated deployment, testing, and monitoring.
  • Provide bug fixes, enhancements, and technical documentation, and support knowledge transfer to operational teams.

Offer

  • Sport subscription
  • Private healthcare
  • International projects

Other information

Hybrid from Kraków, 2 days per week in the office

CRESTT sp. z o.o.
