Platform Engineer – Azure Data Engineering
We digitize decisions with data. Would you like to get involved? At paiqo, we develop modern lakehouse platforms on Azure. Our young team of data engineers, architects, and data scientists works without rigid hierarchies, with a high degree of personal responsibility and a passion for new technologies. Whether OneLake, Delta Lake, Azure Databricks, or Microsoft Fabric: with us, you orchestrate the latest cloud and AI services to provide reliable data for analysis, reporting, and AI at all times. Flexible working from home or from our offices is a given, as long as you live in Austria.

As a Data & AI Platform Engineer – Azure Data Engineering, you independently design and implement scalable data pipelines on Azure. You use Microsoft Fabric/OneLake, Azure Databricks, and Delta Lake technologies to provide reliable data for analytics and AI.

Tasks & Responsibilities:
• Design, development, and maintenance of complex pipelines (batch and streaming) with Data Factory, Databricks, and Delta Lake
• Integration of new Azure features such as mirroring (zero-ETL replication) for continuously synchronizing operational databases into OneLake, and Microsoft Fabric for a unified lakehouse architecture
• Implementation of DataOps: CI/CD pipelines with Azure DevOps/GitHub, monitoring via Azure Monitor, and automation with Delta Live Tables or Lakeflow
• Ensuring data quality, governance, and access control (Azure Purview, Unity Catalog)
• Collaboration with cross-functional teams and optimization of the platform in terms of performance and cost

What we offer you
• Flexible working models – work without fixed working hours, remotely or in the office; residence in Austria is a prerequisite
• Exciting projects – development of modern lakehouse platforms and data pipelines for well-known customers
• Latest Azure technologies – use of Microsoft Fabric, Databricks, Delta Lake, and mirroring to modernize data platforms
• Young, motivated team & open culture – short decision-making processes, mutual trust, and a high degree of personal responsibility
• Targeted further training – dedicated time for training and certifications; continuous learning is part of our work culture
• Scope for innovation – the opportunity to try out new data tools and frameworks and to contribute your own ideas
• Long-term customer relationships – projects with real added value and a lasting impact through multi-year contracts

If you want to take on responsibility, enjoy working with the latest Azure technology, and value an open, learning-oriented team culture, we look forward to receiving your application!

Your profile:
• Several years of experience in developing data platforms on Azure (Data Factory, Databricks, Microsoft Fabric)
• In-depth knowledge of SQL, Python/Spark, and Delta Lake technologies
• Familiarity with Azure Monitor, data catalogs (Purview), and CI/CD in Azure DevOps
• Strong problem-solving and communication skills; ability to work independently
• Very good German and English skills

Apply to this job
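
To give candidates a concrete picture of the pipeline work described under Tasks & Responsibilities, here is a minimal, illustrative sketch of a bronze-to-silver batch step with PySpark and Delta Lake. It is not part of the formal posting: the paths, the orders dataset, and the order_id column are hypothetical, and it assumes a local Spark session with the delta-spark package installed.

from delta import configure_spark_with_delta_pip
from pyspark.sql import SparkSession, functions as F

# Local Spark session with Delta Lake enabled (assumes `pip install delta-spark`).
builder = (
    SparkSession.builder.appName("orders_batch_pipeline")
    .config("spark.sql.extensions", "io.delta.sql.DeltaSparkSessionExtension")
    .config("spark.sql.catalog.spark_catalog",
            "org.apache.spark.sql.delta.catalog.DeltaCatalog")
)
spark = configure_spark_with_delta_pip(builder).getOrCreate()

# Bronze: ingest raw CSV files from a hypothetical landing zone and add an audit column.
bronze = (
    spark.read.option("header", "true").csv("/landing/orders/")
    .withColumn("ingested_at", F.current_timestamp())
)
bronze.write.format("delta").mode("append").save("/lake/bronze/orders")

# Silver: basic data-quality filter and deduplication before the data is used for analytics.
silver = (
    spark.read.format("delta").load("/lake/bronze/orders")
    .filter(F.col("order_id").isNotNull())
    .dropDuplicates(["order_id"])
)
silver.write.format("delta").mode("overwrite").save("/lake/silver/orders")

On Azure Databricks or in a Microsoft Fabric notebook, the preconfigured Spark session replaces the setup block, and the same read-transform-write pattern can be automated, for example with Delta Live Tables.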