Data Engineering, Analytics & Full Stack Development
General Information
- Location: Negotiable
- Start date: ASAP
- Duration: 12 months, extension possible
- Hours per week: 40
- ZZP (freelance): Yes
Job Mission
As a Full Stack Developer, you will be responsible for delivering end-to-end data solutions that support critical business processes and decision-making. You will work across the full data lifecycle, from gathering requirements and modelling data to engineering pipelines and developing reporting solutions.
Operating within a modern Microsoft-based data platform, you will contribute to building scalable, secure, and high-performing data solutions. You will collaborate closely with cross-functional teams in a global environment, ensuring that data products are reliable, accessible, and aligned with business needs.
Roles and Responsibilities
- Design and develop end-to-end data solutions from business requirements to production delivery
- Translate business needs into technical specifications, data models, and scalable architectures
- Develop and maintain data pipelines and data products using SQL, PySpark, and related technologies
- Design and implement data models using Data Vault and dimensional modelling techniques
- Build and maintain dashboards and reports in Power BI to deliver actionable insights
- Perform data ingestion, transformation, and integration within Microsoft Fabric and Azure Synapse
- Implement data quality controls, validation processes, and monitoring mechanisms
- Optimize performance of data pipelines, queries, and data workflows
- Implement data security and compliance measures in line with governance standards
- Collaborate with data architects, engineers, analysts, and business stakeholders
- Support user acceptance testing and ensure successful delivery of data products
- Document data engineering processes, architecture, and data flows
- Provide technical guidance and support to colleagues on data technologies and best practices
Education and Experience
Must Have
- 5+ years of experience in data engineering or full stack data development roles
- 3+ years of experience with Microsoft Fabric or Azure Synapse, and with Power BI
- Strong experience with data modelling, including Data Vault and dimensional modelling
- Proven experience delivering end-to-end data solutions and analytics products
- Proficiency in languages such as Python, SQL, or Java
- Strong experience with Apache Spark and distributed data processing
- Experience with data lake and lakehouse architectures
- Strong knowledge of Azure cloud architecture and data services
- Experience with CI/CD pipelines and DevOps practices for data workflows
- Experience working with large datasets and complex data pipelines
Nice to Have
- Knowledge of ERP systems, preferably D365 F&O or similar
- Understanding of finance or accounting data models
- Experience with big data technologies such as Hadoop or distributed storage systems
- Familiarity with data governance, data security, and privacy regulations
- Experience working in international or complex organizational environments
- Experience with cloud platforms such as AWS or GCP
- Familiarity with AI or machine learning concepts
Skills
- Strong analytical and problem-solving capabilities
- Excellent communication skills, bridging technical and business stakeholders
- Ability to work effectively in cross-functional and international teams
- High level of ownership and accountability for delivery quality
- Attention to detail in data modelling, engineering, and reporting
- Proactive mindset, identifying improvements and optimizing processes
- Ability to manage priorities in a fast-paced environment
- Collaborative approach to working with diverse teams
- Continuous learning mindset, staying current with data and cloud technologies
- Ability to translate complex data into actionable insights
Tech Stack
- Microsoft Fabric
- Azure Synapse Analytics
- Azure Data Factory
- Power BI
- SQL
- Python
- Apache Spark
- Data Vault, Dimensional Modelling
- Azure Cloud Services
- CI/CD pipelines, Azure DevOps
- Data Lake & Lakehouse architectures

