New
Data Analyst
20,200–26,900 PLN/month, B2B (net)
Mid · Full-time · B2B
#350693 · Added today
Source: SOLID.Jobs
Tech Stack / Keywords
Python · BigQuery · Azure · SQL
Company and position
apreel was founded in April 2010. As the company grew, alongside rising client trust, its activities expanded to include IT Specialist Outsourcing services. Today this area is the main pillar of apreel's business.
Requirements
- Experience developing data models from scratch for greenfield projects in multiple domains.
- Deep understanding of data warehousing concepts, dimensional modelling, and normalization/denormalization techniques.
- Expertise in tools such as Erwin Data Modeler, PowerDesigner, or similar.
- Strong understanding of data product design principles and lifecycle.
- Strong SQL skills and experience with relational (Oracle, SQL Server, PostgreSQL) and cloud databases (Snowflake, BigQuery, Redshift).
- Good understanding of Azure cloud data services (Data Lake, Data Factory, Azure SQL).
- Familiarity with various databases, operating systems, file types, and data formats.
Preferred Skills:
- Experience with advanced data modelling techniques.
- Business analysis expertise to bridge technical and business requirements.
- Skilled in Python, Spark and SQL programming.
- Project management skills.
- Proficiency with data visualization tools such as Tableau, Power BI, or similar.
- Minimum 3 years of experience in a similar position.
- Knowledge of Python, BigQuery, Azure, SQL.
- Languages: Polish, English.
Responsibilities
- Lead initiatives to extract and standardize financial data from various formats, including PDF, HTML, XBRL and iXBRL, ensuring data accuracy and consistency.
- Conduct exploratory data analysis to identify trends, raise important questions, and derive actionable insights.
- Design, implement and optimize conceptual, logical and physical data models for enterprise-scale data products.
- Develop and maintain data models using ERD diagrams, and manage data dictionaries for transactional, star, and flat schemas across different storage structures.
- Partner with data engineering teams to democratize the data model for designing efficient data pipelines.
- Define and enforce data modelling standards and best practices.
- Use Python, Spark, SQL, regular expressions, and shell scripts for data manipulation, analysis, and process automation, including meta-programming and dynamic code generation.
- Manage and optimize databases (SQL Server, Neo4j, Snowflake, Postgres), understanding join types, aggregate functions, and data storage formats (Parquet, AVRO, Delta).
Offer
- 20.2k–26.9k PLN netto/month (B2B)
- B2B contract with flexible working hours (100%)
- Fully remote work
Flexible hours
apreel · 231 active job offers