Data Engineer (m/f/d)
München, DE, 80335
We are seeking an experienced Data Engineer (m/f/d) to design, build, and optimize data pipelines and infrastructure, ensuring reliable, secure, and high-quality data for PPRE’s platforms. As a key player in PPRE’s Data-Driven Organization, you will transform data through the Data Lakehouse, embed governance standards, and support integration into the Operational Data Store (ODS). In this role, you will collaborate with Data Stewards, Data Solution Architects, Data Analysts, and Product Owners to deliver Data Products and maintain data lifecycle management. With a high degree of autonomy, you will drive innovation and ensure governed, high-quality data for enterprise-wide use.
Key Responsibilities
Pipeline & Infrastructure Development
- Independently develop, build, and optimize complex data pipelines and infrastructure for reliable ingestion, transformation, and delivery of governed data.
- Connect source systems to the ODS and transform Bronze → Silver layer data according to approved business and technical rules.
- Maintain scalable, secure, and high-performance pipeline components.
- Lead complex, multi-domain pipeline design decisions, ensuring architectural alignment across departments and anticipating cross-platform implications.
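The Bronze → Silver step above can be pictured as follows. In a Databricks Lakehouse this would typically be a PySpark or Delta Live Tables job; the sketch below uses plain Python on dict records purely for illustration, and the field names (`lease_id`, `rent_eur`, `signed_at`) are hypothetical, not PPRE's actual schema.

```python
# Minimal sketch of a Bronze -> Silver transformation step:
# deduplicate on the business key, normalize types, keep approved fields.
# Field names are illustrative only.
from datetime import date

def bronze_to_silver(bronze_rows):
    """Deduplicate, type-cast, and apply approved business rules."""
    silver, seen = [], set()
    for row in bronze_rows:
        key = row["lease_id"]
        if key in seen:  # keep the first occurrence per business key
            continue
        seen.add(key)
        silver.append({
            "lease_id": key,
            "rent_eur": round(float(row["rent_eur"]), 2),      # normalize numeric type
            "signed_at": date.fromisoformat(row["signed_at"]), # normalize date type
        })
    return silver
```

A real implementation would express the same rules as Spark transformations over Delta tables, with the rule set version-controlled and approved by the Data Stewards.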
Implementation of Governance & Quality Standards
- Drive advanced governance standardization, converting governance requirements into scalable, repeatable operational processes across multiple Data Products.
- Lead the implementation of data governance standards, including validation rules, schema checks, format rules, referential integrity checks, and quality thresholds; proactively identify quality gaps and drive resolutions.
- Champion innovation in automated DQ frameworks and lead the definition of new quality KPIs and lineage monitoring capabilities across domains.
- Develop and execute automated Data Quality gates in Databricks: ingestion validations, transformation gates, and certification gates.
- Log failures and events for Steward review and support continuous remediation.
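An ingestion-time quality gate of the kind described above can be sketched as follows. In Databricks this logic would typically live in a PySpark job or in Delta Live Tables expectations; plain Python is used here for illustration, and the column names and threshold are hypothetical examples, not PPRE's actual rules.

```python
# Minimal sketch of an ingestion-time Data Quality gate:
# schema check plus a null-rate threshold, with failures collected
# as events for Steward review. All names/thresholds are illustrative.

REQUIRED_COLUMNS = {"asset_id", "valuation_eur", "valuation_date"}
NULL_RATE_THRESHOLD = 0.05  # gate fails above 5% nulls on the critical field

def ingestion_gate(rows):
    """Return (passed, events); events would be logged for Steward review."""
    events = []
    # Schema check: every row must carry the required columns.
    for i, row in enumerate(rows):
        missing = REQUIRED_COLUMNS - row.keys()
        if missing:
            events.append(f"row {i}: missing columns {sorted(missing)}")
    # Quality threshold: null rate on a critical field.
    nulls = sum(1 for r in rows if r.get("valuation_eur") is None)
    if rows and nulls / len(rows) > NULL_RATE_THRESHOLD:
        events.append(f"null rate {nulls / len(rows):.0%} exceeds threshold")
    return (not events, events)
```

Transformation and certification gates follow the same pattern at later pipeline stages, with the event log feeding the remediation loop between Engineers and Stewards.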
Collaboration With Cross-Functional Stakeholders
- Influence cross-departmental decision-making, advising senior stakeholders on data transformation logic, lineage integrity, and platform strategy.
- Represent Data Engineering in governance committees and contribute to firm-wide standards evolution.
- Work directly with Data Stewards to understand and implement business rules, transformation logic, and KPI definitions.
- Collaborate with Data Analysts to ensure downstream use cases are supported by accurate, complete, governed data.
- Provide pipeline mapping and technical lineage details for Stewards and Governance Officers.
- Influence cross-functional teams (Stewards, Architects, Analysts, Business Owners) by providing expert guidance on complex technical design decisions, data transformations, and platform optimization.
Lineage, Documentation & Metadata
- Own senior-level lineage governance, ensuring completeness across complex interconnected pipelines and driving structural improvements across domains.
- Support automated lineage monitoring scripts and ensure pipeline behavior matches documented lineage.
- Maintain metadata integrity across all pipeline layers.
Data Quality Monitoring & Remediation Support
- Detect anomalies through Data Quality gates and support remediation in collaboration with Stewards and Engineers.
- Lead root-cause analysis and resolution of major, cross-pipeline data quality issues, acting as a coordinator for Engineers and Stewards and ensuring sustainable structural fixes.
- Help maintain Data Quality KPI dashboards and event logs.
Operational Excellence & Continuous Improvement
- Drive engineering-wide improvements, promoting best practices, innovation, and automation across the full engineering lifecycle.
- Drive automation of checks and controls to improve efficiency and reduce manual work for Stewards and Analysts.
- Contribute to enhancements to pipeline reliability, performance, and observability.
Key Requirements
Technical Competencies
- Demonstrated ability to define and evolve engineering standards across teams and to influence architectural direction.
- Senior experience developing and optimizing data pipelines, transformations, and integration flows.
- Understanding of Databricks, Delta Lake layers (Bronze/Silver/Gold), and quality/validation frameworks.
- Ability to apply governance standards, lineage logic, schema enforcement, and automated DQ frameworks.
- Proficiency with SQL-based validation and transformation logic.
AI-Assisted Development Fluency
- Hands-on experience with AI coding tools (GitHub Copilot, Cursor, Claude Code, or similar) applied to data engineering tasks, and familiarity with Databricks-native AI capabilities, including Genie for natural-language data exploration and pipeline validation.
- Clear-eyed about where AI-generated output can be trusted and where it needs scrutiny.
Domain Competencies
- Advanced understanding of the business domain and competitive environment, with growing multidisciplinary knowledge.
- Understanding of data lifecycle management (ingestion → storage → transformation → consumption).
- Familiarity with governed data environments (ODS, MDM, Unity Catalog).
- Ability to implement and maintain business rules, KPIs, and transformation logic in pipelines.
Core & Behavioral Competencies
- Ability to collaborate with Stewards, Architects, Analysts, and Business Owners.
- Demonstrated autonomy, senior leadership, advanced problem-solving capabilities, and seasoned judgement; mentors colleagues, drives change, and champions the improvement of engineering standards across the team.
- Problem-solving mindset for complex tasks, with the ability to trace, diagnose, and remediate data issues.
- Ability to operate within governance frameworks and escalate appropriately.
Experience Requirements
- 8+ years of relevant professional experience, demonstrating senior domain ownership, leadership of complex workstreams, and consistent delivery of high‑quality outcomes.
- Hands‑on experience building data pipelines using technologies such as Python, PySpark, SQL, or Scala.
- Experience orchestrating workflows with tools like Airflow, Azure Data Factory, Databricks Workflows, or Prefect.
- Exposure to containerization (Docker) and understanding of CI/CD for data workloads.
- Experience working with at least one major cloud platform (Azure or AWS).
- Experience applying dimensional modeling (Star/Snowflake) or data vault patterns.
- Experience owning or supporting pipelines running in production environments, including handling incidents.
- Exposure to infrastructure‑as‑code (Terraform, ARM templates) or automated environment provisioning.
- Experience implementing or using CI/CD pipelines (GitHub Actions, Azure DevOps, GitLab CI).
Benefits
- Onboarding: A mentor, a buddy program and a global welcome event will help you get started.
- Learning: A large portfolio of continuous learning opportunities will help you stay relevant in your current role and grow into future workplace demands.
- Working place: An open and international working environment with a diverse and inclusive culture will inspire your everyday work. A working model that balances remote and office-based work will give you the flexibility to organize the way you work.
- Benefits: A company pension scheme, well-being initiatives, sports offers and other local benefits will help you achieve a positive work-life balance.
A leading global real estate investor and manager, PIMCO Prime Real Estate is a PIMCO company and part of the PIMCO real estate platform, focusing on the Core and Core+ segments of the market and managing the Allianz Group’s $89B real estate mandate.
We manage a global investment portfolio of c. $93.5B in AUM, with an international team of c. 455 employees working in 16 offices across Belgium, China, France, Germany, Italy, Japan, Singapore, Spain, Sweden, the UK and the U.S.
PIMCO’s real estate platform is one of the largest and most diversified in the world, with $174B+ in assets and a broad set of solutions that leverage decades of expertise across public and private equity and debt markets.
Allianz Group is one of the most trusted insurance and asset management companies in the world. Caring for our employees, their ambitions, dreams and challenges, is what makes us a unique employer. Together we can build an environment where everyone feels empowered and has the confidence to explore, to grow and to shape a better future for our customers and the world around us.
At Allianz, we stand for unity: we believe that a united world is a more prosperous world, and we are dedicated to consistently advocating for equal opportunities for all. The foundation for this is our inclusive workplace, where both people and performance matter, nurturing a culture grounded in integrity, fairness, inclusion and trust.
We therefore welcome applications regardless of ethnicity or cultural background, age, gender, nationality, religion, social class, disability or sexual orientation, or any other characteristics protected under applicable local laws and regulations.
Great to have you on board. Let's care for tomorrow.