Amsterdam
Senior
40 hours per week
Start date: 1 May
Expired
137 days ago
KLM

Data Engineer

A full migration to Google Cloud Platform (GCP)

What are you going to do?

As a senior data engineer, you will architect and construct data pipelines, transitioning us from our legacy data warehouse systems to cutting-edge data lake solutions. You'll be a vital member of our cross-functional BI product team, serving diverse business units such as Safety, Security, and Crisis Management. Our KLM business partners rely on your expertise to deliver scalable, future-proof solutions, leveraging both structured and unstructured data. After crafting these solutions, you'll oversee their seamless operation and ensure their sustained performance. And naturally, you'll do all this within an agile framework, preparing us for an impending cloud migration. Your contributions will be crucial in powering data marts and providing actionable insights to our customers through dashboards.

Requirements

Your profile

We are looking for passionate and talented Software Engineers who specialize in Data Engineering. In everything you do, you keep the end goal in mind. You quickly analyse a problem, identify its root cause, and propose the best solution. You can also explain these issues clearly and understandably to stakeholders at every level of the organization. You coach your junior teammates on a technical level. You spot opportunities, turn them into action, and convince the decision makers.

Besides all of this you have:

  • Preferably a Bachelor's degree or higher in Computer Science, Software Engineering or another relevant field (less important in light of work experience)
  • At least 4 years of experience building production-grade data processing systems as a Data Engineer

(In-depth) knowledge of:

  • The Hadoop ecosystem
  • Building applications with Apache Spark, including Spark Streaming
  • Columnar storage solutions such as Apache HBase, including data modelling for columnar storage
  • Event streaming platforms such as Apache Kafka
  • Development on a cloud platform (Azure and GCP)
  • Two or more server-side programming languages, e.g. Python, Scala, Java
  • Common algorithms and data structures
  • Databases and SQL
  • Conceptual, logical and physical data modelling
  • Continuous integration/continuous deployment (CI/CD) techniques
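To give a flavour of the data-modelling and SQL fluency listed above, here is a minimal, hypothetical sketch of a star schema feeding a dashboard query. The table names, columns and figures are illustrative only and not KLM's actual model; it uses Python's built-in sqlite3 so it runs anywhere.

```python
import sqlite3

# Hypothetical star schema: one dimension table and one fact table,
# of the kind a BI data mart behind a dashboard might use.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()

cur.executescript("""
CREATE TABLE dim_airport (
    airport_id INTEGER PRIMARY KEY,
    iata_code  TEXT NOT NULL,
    city       TEXT NOT NULL
);
CREATE TABLE fact_flight (
    flight_id     INTEGER PRIMARY KEY,
    airport_id    INTEGER REFERENCES dim_airport(airport_id),
    delay_minutes INTEGER NOT NULL
);
""")

# Illustrative rows only, not real data.
cur.executemany("INSERT INTO dim_airport VALUES (?, ?, ?)",
                [(1, "AMS", "Amsterdam"), (2, "CDG", "Paris")])
cur.executemany("INSERT INTO fact_flight VALUES (?, ?, ?)",
                [(10, 1, 5), (11, 1, 25), (12, 2, 0)])

# Aggregate average delay per airport -- the kind of query a dashboard runs.
cur.execute("""
    SELECT a.iata_code, AVG(f.delay_minutes) AS avg_delay
    FROM fact_flight f
    JOIN dim_airport a USING (airport_id)
    GROUP BY a.iata_code
    ORDER BY avg_delay DESC
""")
rows = cur.fetchall()
print(rows)  # [('AMS', 15.0), ('CDG', 0.0)]
```

In production this fact/dimension split would live in a data lake or warehouse rather than SQLite, but the modelling idea (facts keyed to dimensions, aggregated for reporting) is the same.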
Screening
To assess the reliability and integrity of candidates, screening is part of the application procedure. For questions about the screening procedure, please contact the contact person listed for the relevant vacancy or interim assignment.