Databricks Specialist / Data Engineer

emagine Polska

Remote

B2B

Senior Databricks Specialist / Data Engineer needed to design and maintain Python libraries for data pipelines in a Scrum team. Requires significant experience in Python and Databricks.

Keywords

Python
Databricks
PySpark
CI/CD
Scrum
DataOps
Grafana
Data Quality
Azure ADLS Gen2
Airflow

About the Role

We are looking for a Senior Databricks Specialist with strong Python skills to design, evolve, and govern a common, reusable Python library that serves as a core foundation for batch and streaming pipelines across the Medallion Architecture (Bronze, Silver, Gold). This is a highly technical, standards-driven role, ideal for professionals with strong software engineering maturity who enjoy defining best practices, enforcing consistency, and building scalable frameworks used by multiple data engineering teams.

You will operate as an individual contributor within Scrum-based teams, collaborating closely with engineers, Product Owners, and DataOps to deliver robust, maintainable, and production-grade data solutions.

Key Responsibilities

Design, implement, and maintain a shared Python library for Databricks, supporting batch and streaming pipelines.
Develop reusable PySpark modules, base classes, and abstractions for the Bronze, Silver, and Gold layers.
Actively participate as a Scrum team member in Sprint Planning, Daily, Refinement, Review, and Retrospective ceremonies.
Define and enforce software engineering best practices, including coding standards, documentation, testing strategies, and versioning.
Establish and maintain code quality standards, including linting, formatting, and static analysis.
Collaborate with Product Owners and fellow engineers to clarify requirements and deliver incremental value.
Maintain and improve CI/CD pipelines using GitLab and Databricks Asset Bundles (DABs).
Ensure controlled releases, backward compatibility, and smooth adoption of the common library across teams.
Integrate logging, monitoring, and data quality controls using Grafana and DQX.
Work closely with DataOps to ensure stability, observability, and reliability in production environments.

Key Requirements

Minimum 10 years of professional experience developing in Python.
At least 5 years of hands-on experience with Databricks, including PySpark development in production environments.
Proven experience working as a member of Scrum or Agile teams.
Solid experience designing Python libraries, frameworks, or shared components.
Strong knowledge of software engineering best practices, including object-oriented programming (OOP), design patterns, unit and integration testing, and CI/CD pipelines.
Experience with code standardization and quality tools, such as linters and formatters (e.g., pylint, flake8, black, or equivalent).
Strong understanding of batch and streaming data processing.
Experience with the Medallion Architecture and data lifecycle best practices.
Familiarity with Airflow, Terraform, and Azure ADLS Gen2.
Professional working proficiency in English, both written and spoken.

Nice to Have

Databricks Certified Associate or Professional certification.
Microsoft Azure Fundamentals (AZ-900) or an equivalent basic Azure certification.
Experience contributing to shared platforms or internal frameworks used by multiple teams.
Experience working in international or distributed environments.

Soft Skills & Ways of Working

Strong communication and collaboration skills in cross-functional and multicultural teams.
Sense of ownership and accountability for delivered solutions.
Ability to give and receive constructive feedback, particularly during code reviews.
Comfort working with ambiguity and evolving requirements.
Proactive mindset with a focus on continuous improvement.
Team-oriented attitude, valuing shared success over individual ownership.
Ability to balance technical excellence with delivery commitments.

Work Model

Predominantly remote.
Availability to work from the Leiria office once per month for team alignment, ceremonies, or workshops.

Who You Are

You care deeply about clean, maintainable, and well-tested code.
You enjoy building platforms and frameworks that others rely on.
You thrive as an individual contributor within Scrum teams.
You value collaboration, transparency, and engineering discipline.
You actively contribute to raising the technical maturity of the team.
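The posting centres on reusable base classes and abstractions for Medallion-layer pipelines. Purely as an illustration of that pattern, here is a minimal sketch in plain Python (standing in for PySpark; all names such as `MedallionStep` and `DropNullIds` are hypothetical, not taken from the employer's actual library):

```python
from abc import ABC, abstractmethod


class MedallionStep(ABC):
    """Illustrative base class for one layer transition (e.g. Bronze -> Silver).

    Shared concerns (logging, data-quality hooks, versioning) would live in
    the common library; each pipeline team only implements `transform`.
    """

    def __init__(self, source_layer: str, target_layer: str):
        self.source_layer = source_layer
        self.target_layer = target_layer

    @abstractmethod
    def transform(self, records: list) -> list:
        """Layer-specific logic, implemented by each subclass."""

    def run(self, records: list) -> list:
        # In a real library this wrapper would add logging, metrics,
        # and data-quality checks around the team-specific transform.
        return self.transform(records)


class DropNullIds(MedallionStep):
    """Example Silver-layer step: discard records missing an 'id' field."""

    def transform(self, records: list) -> list:
        return [r for r in records if r.get("id") is not None]


step = DropNullIds("bronze", "silver")
cleaned = step.run([{"id": 1}, {"id": None}, {"id": 2}])
```

In a PySpark setting, `transform` would accept and return DataFrames rather than lists, but the inheritance structure, which is the point of a shared library like the one described above, stays the same.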

Views: 14
Published: 8 days ago
Expires: in 2 months
Contract type: B2B
