We are looking for technically skilled and self-driven engineers with hands-on experience in Azure cloud services and Databricks to join our Data Layer & Data Transport team in Veghel.
The Data Transport team contributes to building and maintaining the Service Buses and integration connectors that enable reliable and secure data exchange between source and target systems. This includes developing new interfaces, optimizing existing ones, and ensuring robust monitoring and error handling. From a Data Layer perspective, we build a standardized layer that facilitates all operational communication between applications. The connectors between applications and the layer are built in C#, and the layer itself consists of a Postgres database.
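To give a flavour of the connector work described above, here is a minimal, illustrative C# sketch of a connector publishing an application event to Azure Service Bus using the Azure.Messaging.ServiceBus SDK. The queue name, event type, and payload shape are hypothetical placeholders, not the team's actual interfaces.

// Minimal illustrative sketch of a connector publishing an event to Azure Service Bus.
// The queue name ("order-created") and the OrderCreated payload are assumed examples only.
using System;
using System.Text.Json;
using System.Threading.Tasks;
using Azure.Messaging.ServiceBus;

public record OrderCreated(string OrderId, DateTimeOffset CreatedAt);

public class OrderCreatedPublisher
{
    private readonly ServiceBusSender _sender;

    public OrderCreatedPublisher(ServiceBusClient client)
    {
        // Hypothetical queue name; real connectors would follow the team's naming conventions.
        _sender = client.CreateSender("order-created");
    }

    public async Task PublishAsync(OrderCreated evt)
    {
        var message = new ServiceBusMessage(JsonSerializer.Serialize(evt))
        {
            ContentType = "application/json",
            MessageId = evt.OrderId // lets Service Bus duplicate detection absorb retries
        };
        await _sender.SendMessageAsync(message);
    }
}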
Your responsibilities
The Data Transport Module (DTM) is a core component within our data platform ecosystem, designed to facilitate secure, reliable, and efficient delivery of data from the platform to consuming applications. As organizations increasingly adopt a product-based approach to data, the DTM ensures that curated data streams and data products can be seamlessly transported to their intended destinations while preserving quality, lineage, and governance standards.
The primary business purpose of the Data Transport Module is to enable controlled and standardized movement of data products from the platform to applications that rely on them for decision-making, reporting, analytics, or operational processes.
By abstracting transport complexity, the module:
Ensures consistent data delivery across different applications and business domains.
Reduces integration effort and accelerates time-to-value for new data products.
Strengthens trust through governance, monitoring, and compliance with security standards.
You will play a key role in designing, developing, deploying, and integrating cloud-native applications and data solutions that support our business goals by:
Designing and developing scalable applications using Azure Serverless Functions, App Services, and .NET/C#.
Implementing messaging and event-driven architectures using Azure Service Bus and Event Grid (see the sketch after this list).
Deploying applications using Azure Container Apps or Docker.
Maintaining APIs using Azure API Management.
Developing and maintaining data pipelines and analytics solutions using Databricks, Python, PySpark, and Postgres.
Managing data governance and access using Unity Catalog.
Collaborating with cross-functional teams in an Agile environment.
Taking initiative and working independently to drive tasks to completion.
Collaborating with architects, DevOps engineers, and business stakeholders to ensure seamless and scalable data transport across the enterprise landscape.
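As an illustration of the messaging and data-layer responsibilities listed above, the following sketch shows a .NET isolated-worker Azure Function triggered by an Azure Service Bus queue that persists the incoming message to a Postgres table via Npgsql. The queue name, connection settings, and table schema are assumptions made for this example, not the actual platform configuration.

// Illustrative sketch: Service Bus-triggered Azure Function (isolated worker) writing to Postgres.
// Queue name, app settings, and the data_layer.events table are assumed for this example.
using System;
using System.Threading.Tasks;
using Microsoft.Azure.Functions.Worker;
using Microsoft.Extensions.Logging;
using Npgsql;

public class OrderCreatedHandler
{
    private readonly ILogger<OrderCreatedHandler> _logger;

    public OrderCreatedHandler(ILogger<OrderCreatedHandler> logger) => _logger = logger;

    [Function("OrderCreatedHandler")]
    public async Task Run(
        [ServiceBusTrigger("order-created", Connection = "ServiceBusConnection")] string messageBody)
    {
        _logger.LogInformation("Received message: {Body}", messageBody);

        // "PostgresConnection" is an assumed app setting name.
        var connectionString = Environment.GetEnvironmentVariable("PostgresConnection");
        await using var connection = new NpgsqlConnection(connectionString);
        await connection.OpenAsync();

        // Store the raw event payload as JSONB; downstream jobs can project it into domain tables.
        await using var cmd = new NpgsqlCommand(
            "INSERT INTO data_layer.events (payload) VALUES (@payload::jsonb)", connection);
        cmd.Parameters.AddWithValue("payload", messageBody);
        await cmd.ExecuteNonQueryAsync();
    }
}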
Job requirements
Your qualifications and skills
Azure Cloud:
Azure Service Bus
Azure Event Grid
Azure Serverless Functions
Azure API Management
Azure App Service
.NET/C# application development
Application deployment via Azure Container Apps or Docker
Databricks & Data Engineering:
Python
Postgres
Unity Catalog
Lakebase (optional but a plus)
Competencies
Other requirements:
Available to work on-site in Veghel at least Tuesday, Wednesday, and Thursday
Must haves:
Experience working in Agile teams
Strong sense of ownership and ability to work independently
Employment conditions
Company information
Our client is the global market leader in baggage handling systems (airports) and warehouse automation solutions (warehousing).
Its baggage handling systems move 3.2 billion pieces of luggage around the world per year, or 8.8 million per day. Its systems are active in 600 airports, including 17 of the world's top 25. More than 20 million parcels (300 packages per second) are sorted by its systems every day. These have been installed for a variety of customers, including the four largest parcel and postal companies in the world. In addition, 12 of Europe's top 20 e-commerce companies and many distribution firms have confidence in Vanderlande's efficient and reliable solutions.
The company focuses on the optimisation of its customers’ business processes and competitive positions. Through close cooperation, it strives for the improvement of their operational activities and the expansion of their logistical achievements. Vanderlande’s extensive portfolio of integrated solutions – innovative systems, intelligent software and life-cycle services – results in the realisation of fast, reliable and efficient automation technology.
Established in 1949, the company has over 4,000 employees at diverse locations on every continent, all committed to moving its customers' businesses forward. With a consistently increasing turnover of more than one billion euros, it has established a global reputation over the past six decades as a highly reliable partner for value-added automated material handling solutions.
Your department
The Central Data & Analytics department is tasked with extending, operating, and further maturing the data platform. They provide Data & Analytics products and services primarily catering to internal customers, while also supporting the company's digital service portfolio. Following a hub-and-spoke model, they establish centralized standards that are then adopted and implemented by decentralized teams within the business.
At the heart of their efforts lies the Data Platform, the core product they are diligently enhancing. This Azure cloud-enabled platform, based on Data Mesh principles, empowers them to deliver valuable Data Products by transforming, processing, combining, and enriching data from various source systems, ultimately creating a reliable source of truth for their customers. To enable efficient data analysis and visualization, they actively govern and maintain their Qlik Servers and Power BI Premium capacity. In addition, they provide templates, establish a clear workflow, and foster a community of experts.
Their work centers on their Federated Governance model, with the Central Data & Analytics Organization as the hub serving the spokes. Their scope encompasses Data Governance, Platform Operations, Core Platform Development, and Full-stack Development. The technologies they use span the Azure stack and the Databricks ecosystem, ensuring seamless integration and optimal utilization of resources. By applying advanced analytics techniques and the latest technology, their focus on innovation empowers them to continuously improve the Data Platform.
Naturally, this vacancy is open to everyone who recognises themselves in this profile.
Contact
Floris van Herpen
floris.van.herpen@yacht.nl
This is how applying via Randstad Professional | Yacht works. Discover how we can help you find a job.