Tech Stack / Keywords
Python, SQL, Cloud, Power BI, BigQuery, Agile
Company and position
You will join a data-driven organization focused on building scalable and efficient data platforms to support business decision-making. The project involves designing and implementing modern data solutions within a cloud-based environment, with a strong emphasis on data quality, performance optimization, and cost efficiency.
Requirements
- At least 3 years of professional experience in data engineering roles, preferably in retail or consulting environments
- Strong knowledge of Python and SQL, with hands-on experience in building and maintaining data pipelines
- Experience working with cloud platforms, preferably GCP
- Familiarity with BI tools such as Power BI or Looker
- Experience with BigQuery and Airflow
- Solid understanding of Agile methodologies and project management in Agile environments
- Strong analytical and statistical skills, including a proven ability to validate and ensure data accuracy and quality
- English and Polish proficiency at a minimum B2 level
Responsibilities
- Designing and developing data pipelines (both batch and streaming) to ensure reliable data flow across systems
- Optimizing data processing performance and controlling infrastructure costs
- Implementing monitoring systems, alerts, and data quality control mechanisms
- Collaborating with Data Analysts and business stakeholders to gather and refine data requirements
- Supporting junior team members through mentoring and conducting code reviews
- Actively participating in Agile ceremonies, including sprint planning and effort estimation
- Working closely with global teams to align on solutions and best practices
Offer
- Duration: 3 months, starting in June, with the possibility of extension
- Type of cooperation: B2B contract
- Equipment: will be provided by the client
Spyrosoft