
Mid-Level Data Engineer – Cloud Data Pipelines
ITDS Polska Sp. z o.o.
21000 - 24150 PLN / month
Hybrid
B2B
Hexjobs Insights
Position: Data Engineer. Responsibilities: building and optimizing data pipelines. Requirements: 4+ years of experience, knowledge of Python, Java, and Google Dataflow. Benefits: medical package, flexible hours, professional growth in the financial industry.
Keywords
Python
Java
Google Dataflow
ETL
Google Cloud
SQL
NoSQL
JSON
Parquet
data governance
Your responsibilities
- Develop and implement efficient data pipelines for collecting, transforming, and storing data across various platforms, ensuring reliable data flow.
- Integrate data from a range of sources including cloud platforms, databases, APIs, and external services.
- Troubleshoot and optimize existing pipelines for performance and scalability.
- Implement ETL processes to convert raw data into valuable insights for analytics and reporting.
- Collaborate with cross-functional teams to understand data needs and support application requirements, including weekend or non-office hours support.
- Build scalable, automated workflows capable of handling large data volumes with high reliability and low latency.
- Set up monitoring and alert systems to minimize downtime and maximize pipeline performance.
- Document data flows, architecture, and processing logic to ensure maintainability and transparency.
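For illustration only, a minimal sketch of the kind of transform step an ETL pipeline like the one described above might contain. The record fields and the `clean_record` helper are hypothetical, not taken from this posting; a production pipeline would run such a step inside an orchestration framework (e.g. Google Dataflow) rather than as a standalone script:

```python
import json

# A hypothetical raw event, as it might arrive from an API or message queue.
RAW = '{"user_id": "42", "amount": "19.99", "ts": "2024-05-01T12:00:00Z"}'

def clean_record(raw: str) -> dict:
    """Parse one raw JSON event and cast its fields to typed values
    (the Extract and Transform stages of ETL)."""
    rec = json.loads(raw)
    return {
        "user_id": int(rec["user_id"]),
        "amount": float(rec["amount"]),
        # Left as an ISO-8601 string; a real pipeline might parse it to a datetime.
        "ts": rec["ts"],
    }

print(clean_record(RAW))
```

In a distributed setting this per-record function is exactly the unit a pipeline framework parallelizes across workers, which is why keeping it pure and self-contained matters for scalability.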
Our requirements
- 4+ years of experience as a Dataflow Engineer, Data Engineer, or similar, working with large datasets and distributed systems.
- Proficiency in programming languages such as Python and Java.
- Hands-on experience with data pipeline orchestration tools, especially Google Dataflow.
- Experience working with cloud data platforms like Google Cloud (BigQuery, Dataflow).
- Strong expertise in ETL frameworks, real-time data streaming, and processing.
- Familiarity with data formats like JSON and Parquet.
- Knowledge of SQL and NoSQL databases, along with best practices in data governance, quality, and security.
- Excellent troubleshooting skills for complex data issues.
- Strong communication skills to effectively collaborate with both technical and non-technical stakeholders.
Optional
- Certifications or additional experience with Google Cloud services or data engineering tools.
What we offer
- Stable and long-term cooperation with very good conditions
- Enhance your skills and develop your expertise in the financial industry
- Work on the most strategic projects available in the market
- Define your career roadmap and grow quickly by delivering strategic projects for different ITDS clients over several years
- Participate in Social Events, training, and work in an international environment
- Access to attractive Medical Package
- Access to Multisport Program
- Access to Pluralsight
- Flexible hours
#GETREADY to meet with us!
| Published | 29 days ago |
| Expires | in 1 day |
| Contract type | B2B |
| Work mode | Hybrid |