40 hours per week
Start date: 3 April
363 days ago

Senior Data Engineer

ASML is looking for a Senior Data Engineer with experience in Kubernetes, Docker, Airflow, Python, and Databricks.

Introduction to the job:

Do you love making complex issues simple? Do you have strong analytical capabilities, and do you enjoy the challenge of creating workflow tooling and reports to make our development process more effective? Then look no further and apply as DataOps Engineer for the Availability Project (AVP) to help build and migrate the backend of the AVP tool.

Job mission:

Maintain and develop data automation processes to support the AVP tool. Ensure the smooth running of semi-automated machine learning pipelines. Develop the parts of the pipeline needed for full automation, and help migrate existing Airflow pipelines into Databricks going forward.

Job description:

  • Assemble large and complex data sets, troubleshoot issues and maintain database architecture
  • Monitor and audit data quality
  • Set up and maintain automated data processes
  • Develop and optimize ETL pipelines and data streams from production equipment to centralized data warehousing, plus other data warehousing work as needed
  • Set and maintain database and data standards, including access control to sensitive information
  • Optimize data storage for live and historical reporting: tuning, diagnosing issues, automating monitoring and repetitive tasks, etc.
  • Guide and upgrade our methods to align with emerging best practices
  • Communicate with stakeholders and write/update relevant documentation


Education:

  • Bachelor's or Master's degree in a technical field of study, such as Data Engineering, Business Information Systems, Industrial Engineering, or Information Management


Experience:

  • Strong affinity with data sources, data analytics, reporting systems, and development processes
  • Full-stack capabilities: Kubernetes, Jenkins, Artifactory, Airflow, FTP servers, Docker, Grafana, Big Data
  • Experience with data processing languages: SQL and Python
  • Demonstrated competence in data warehousing (design and operations), ETL, and database design
  • Familiarity with Agile/Scrum concepts and experience working in an Agile environment
  • Track record of excellence in current and prior roles
  • Minimum of 2 years of experience in a comparable role
  • Able to work under pressure while managing competing demands and tight deadlines
  • Technologies and capabilities: Python, Linux, APIs, SQL
  • Keywords and experience: database security, database performance monitoring, database architecture, database administration (DBA); at least one of MySQL or PostgreSQL
  • Bonus: Airflow certification
  • Bonus: Databricks certification