Senior Data Engineer with Databricks Azure

120-135 PLN / hour
Senior · Full-time
#350882 · Added today
Source: LinkGroup

Tech Stack / Keywords

Databricks · Azure · Azure Databricks · Cloud · Security · Network · GitHub · CI/CD

Company and position

You own the day-to-day operations and continuous improvement of an Azure Databricks-based data platform. You bridge cloud infrastructure and data products/business applications, ensuring the platform is stable, secure, monitored, and production-ready, while delivering targeted enhancements such as new interfaces and new data products.


Requirements

  • Hands-on experience operating a cloud data platform in production
  • Strong practical knowledge of Azure Databricks (jobs/workflows, clusters/compute, permissions, troubleshooting)
  • Data pipeline/integration experience (batch and/or streaming), production hardening mindset
  • Monitoring/observability skills (dashboards, alerts, health checks)
  • Security fundamentals: access management, secrets, least-privilege, secure connectivity
  • Strong ownership and communication; comfortable coordinating across infra/security/app/data teams

Nice to have:

  • Azure ecosystem around Databricks (e.g., storage/data lake, networking, identity)
  • Delta/Delta Lake optimization concepts and operational tuning
  • Familiarity with service request / change processes (ITSM-style)
  • Data governance exposure (metadata, lineage, quality reporting)

Responsibilities

Run & Operate (≈70%):

  • Own cloud data platform (CDP) operations: stability, controls, security, reliability, and cost awareness
  • Monitor pipelines/platform health; improve alerting and Data Health Monitoring
  • Handle service requests: access, configuration, troubleshooting, incident triage/support
  • Coordinate production changes/releases for platform and data product updates
  • Maintain secure connectivity to the application landscape (identity/access, secrets, network patterns)
  • Drive operational improvements: runbooks, automation, root-cause analysis, prevention actions

Improve & Extend (≈30%):

  • Deliver data integrations (build/test/release) across multiple data products
  • Implement/extend interfaces for downstream consumers (APIs and/or consumption patterns)
  • Extend platform capabilities for new data interfaces and new data products
  • Support metadata/discovery and data quality enablement (pragmatic, operational tooling)
  • Implement role-level security and authentication for Databricks apps and the underlying data