- Hybrid
- Utrecht
- Netherlands
- 36 hours
- Full-time
Imagine
...You will get the opportunity to work in the beating heart of data within Rabobank. You will work together with people who have an insatiable curiosity about technology, data and self-development. You will invest heavily in your career. You will make memories.
You can make a difference
Within the Tribe Global Data & Analytics Platform you will work at the centre of the data-driven enterprise. The tribe contains four areas:
- Global Data Platform
- Analytics Platform
- Data Governance Platform
- Onboarding & Support
Around 40 squads are divided over these areas. We need your help to design, build and run groundbreaking solutions that are valuable to our 7 million customers.
You will be responsible for
You will work within the Global Data Platform (GDP) area, as a member of the squad that maintains the Apache Airflow application within Rabobank.
- Run, maintain and extend the Apache Airflow application on Kubernetes
- Build and run the Apache Airflow scheduling and orchestration tool and its supporting components, such as GitSync, the Windmill API and the notification app
- Build new features, such as new workflow sensors and interfaces with Azure Data Factory, Azure Synapse and Azure Databricks (a minimal sensor sketch follows after this list)
- Create Python-based applications that automate business processes. Examples include a REST API (Python, FastAPI) for creating new user accounts, an Azure Function that sends e-mails through the Microsoft Graph API, and an event-driven Azure Function that synchronises code between a Git repository and a target application (a minimal API sketch also follows after this list)
- Implement everything you do using CI/CD in Azure DevOps, including static code analysis, static security tests and automated unit and integration tests
- Manage the application in the Azure environment using Azure Kubernetes Service, Azure Functions and Azure PostgreSQL
- Monitor and run your own applications together with your team in a true DevOps fashion
- Standby is required for this position
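As an illustration of the workflow-sensor work mentioned in the list above, the following is a minimal sketch of a custom Apache Airflow sensor, assuming Airflow 2.x; the class name, endpoint and response format are hypothetical and not part of the actual Rabobank setup.

```python
# Minimal sketch of a custom Apache Airflow sensor (Airflow 2.x assumed).
# DatasetReadySensor, the endpoint and the "state" field are hypothetical.
import requests
from airflow.sensors.base import BaseSensorOperator


class DatasetReadySensor(BaseSensorOperator):
    """Pokes an HTTP endpoint until it reports that a dataset is ready."""

    def __init__(self, endpoint: str, dataset_id: str, **kwargs):
        super().__init__(**kwargs)
        self.endpoint = endpoint
        self.dataset_id = dataset_id

    def poke(self, context) -> bool:
        # Returning False makes Airflow re-poke later; True lets downstream
        # tasks (e.g. an Azure Data Factory trigger) continue.
        response = requests.get(
            f"{self.endpoint}/{self.dataset_id}/status", timeout=10
        )
        response.raise_for_status()
        return response.json().get("state") == "READY"
```

In a DAG, a sensor like this would typically run in reschedule mode (`mode="reschedule"`) so it does not occupy a worker slot while waiting.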
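The user-account REST API mentioned above could, in spirit, look like the following minimal FastAPI sketch; the route, request model and in-memory store are illustrative assumptions, not the actual service.

```python
# Minimal FastAPI sketch for a user-account endpoint.
# The route, model and in-memory store are illustrative assumptions.
from fastapi import FastAPI, HTTPException, status
from pydantic import BaseModel

app = FastAPI(title="User account service (sketch)")


class UserAccountRequest(BaseModel):
    username: str
    email: str


# Stand-in for a real persistence layer such as Azure PostgreSQL.
accounts: dict[str, UserAccountRequest] = {}


@app.post("/accounts", status_code=status.HTTP_201_CREATED)
def create_account(request: UserAccountRequest):
    if request.username in accounts:
        raise HTTPException(status_code=409, detail="Account already exists")
    accounts[request.username] = request
    return {"username": request.username, "created": True}
```

Locally such an app could be served with `uvicorn main:app --reload`; in this role a comparable service would be built and deployed through the Azure DevOps CI/CD setup described above.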
Experience
Above all, we are looking for new colleagues at a medior/senior level with an insatiable curiosity about data, technology and self-development.
- Proficiency in Python, Bash and PowerShell
- Excellent Debugging Skills
- Knowledge of frameworks
- Core Python concepts (data structures, exception handling, object-oriented programming (OOP), multithreading, packages, functions, upgrading versions, generators, iterators)
- Readable code with proper documentation
- Usage of the Python shell
- Familiar with ORM (Object Relational Mapper) libraries
- Decorators (a brief example follows after this list)
- Kubernetes experience, either as a developer or a system administrator (Certified Kubernetes Administrator (CKA) or Certified Kubernetes Application Developer (CKAD)); AKS is nice to have
- Azure Fundamentals (AZ-900)
- Good knowledge of CI/CD with Azure DevOps
- Familiarity with ITIL (ITSM) processes such as change management and incident management
- Produce design and operational documentation to a high standard
Understanding of the Red Hat Linux Operating System
- Familiarity with the operating system and its commands/utilities
- Configuring and managing software, storage, processes, and services
- Understanding best practices for permissions and authentication, firewalls, and file management.
- Scripting, containers, and automation.
- Managing Linux servers, querying SQL databases, and setting up repositories using technology such as Docker.
- Viewing system information, modifying network configuration, and starting/stopping key services and processes
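To indicate the level of the core Python concepts listed above (for example decorators and generators), here is a small, self-contained example; the function names are purely illustrative.

```python
# Small illustration of a decorator combined with a generator expression,
# two of the core Python concepts listed above. Names are illustrative.
import functools
import time


def timed(func):
    """Decorator that prints how long the wrapped function takes."""

    @functools.wraps(func)
    def wrapper(*args, **kwargs):
        start = time.perf_counter()
        result = func(*args, **kwargs)
        print(f"{func.__name__} took {time.perf_counter() - start:.4f}s")
        return result

    return wrapper


@timed
def sum_of_squares(n: int) -> int:
    # Generator expression: values are produced lazily, one at a time.
    return sum(i * i for i in range(n))


if __name__ == "__main__":
    print(sum_of_squares(1_000_000))
```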
Nice to have
- DP-200 - Implementing an Azure Data Solution
- DP-900 - Azure Data Fundamentals
- DP-203 - Azure Data Engineer Associate
Competences
- Strong communication skills
- Critical thinker
- Open communication
- Proactive
- Working together
- Providing feedback
- Willing to develop further in Azure
- Strong information/data analysis skills
- A customer-focused mindset and a structured way of working are key talents
- Quick learner
- Curiosity
What do we offer?
We would love to help you achieve this by focusing firmly on your growth and development, and by investing in an environment where you keep learning every day. We give you the space to innovate and take initiative. In this way, we offer you numerous opportunities to grow, to exceed your expectations, to do the right thing exceptionally well, and therefore to grow as a professional. In addition, with us (on the basis of a 36- or 40-hour working week), you can also expect:
- Based on your experience: up to € 6.130,64 gross per month (scale 9)
- Thirteenth month's salary and 8% holiday allowance
- An extra budget of 11% of your gross salary to be used at your discretion: buy extra holiday hours, add more to your pension savings, or have part of the extra budget paid out
- A personal development budget of € 1,400
- A combination of working from home and at the office
- Relocation is among the possibilities
This is a selection of the terms of employment for a DevOps Engineer based on a 36-hour working week. You can find all terms of employment on rabobank.jobs/en/conditions-of-employment.
You and the job application process
- We will hold the interviews through a video call.
- A security check is part of the process.
- We respect your privacy.
In 5 steps
We would like to get to know you better.
Step 1
Application
Great that you are applying! We review all CVs and cover letters. After the closing date you will receive a response as soon as possible.
Step 2
First interview
You will meet us online, usually with your future manager and a direct colleague. We want to know whether you fit the role and the team. And you will no doubt have plenty of questions for us too.
Step 3
Second interview
We would like to speak with you a second time. In this online interview we go deeper into the content of the role, and you will often also speak with another colleague.
Step 4
Offer
Are you the new colleague we are looking for, and are you also happy with us? Then you will receive a good offer by e-mail. For some positions you will first need to complete an assessment.
Step 5
Screening
During the screening we check whether you are trustworthy enough to work for Rabobank.
DevOps Engineer Python + Kubernetes