Hi, we are AH Technology, food retail reinventors
Albert Heijn goes way back. More than 130 years, to be exact. So, for millions of people, food shopping is synonymous with the Albert Heijn brand. We did not achieve this with a complacent, ‘hey, if it works today, it’ll work tomorrow’ approach, but by continually rethinking our propositions and staying plugged in to customer demands. Over 1,200 stores? Check. The leading online food retail platform? Check: we build, run, and love it. ;-)
So, what is next? Accelerating our transition into a hybrid Food & Tech company. We want to leverage data and technology to future-proof food shopping and make better food more easily accessible to everyone. At AH Technology, you team up with inspiring peers in the data, digital, and tech domains: a community of diverse individuals who share a common goal, to create data platforms and solutions that enable better decisions. Are you ready to reinvent the way millions of people buy and enjoy their food? We invest heavily in data, digital, and tech. And that includes investing in your career!
Job Description
As a data platform engineer, you will work on our data platform, named Thor, and our User Access Management solution. We are building a self-service data platform so that the platform team is not a bottleneck for development. This means we work hard to enable other data engineers to do as much as possible themselves, while we focus on improving and optimizing our pipelines and codebase.
A few cool new features we have recently implemented:
- The enablement of streaming data sets
- Custom-built User Access Management with a fully API-based back end
- Documentation that writes itself
Essential Experience Required
Databricks: Data Engineer Associate (certification itself, or an equivalent level of expertise)
Terraform: Associate Certification level (or equivalent)
Kafka
Python/PySpark skills: the ability to build a highly performant, simple, and clearly structured Python package that will be used in production
In-depth Azure knowledge (the data engineering services we use are listed below):
Event Hubs / streaming
Storage
Log Analytics
Data Explorer
Event Grid System Topic
DevOps, specifically to bridge the gap between development and IT:
Lean software development
Automate infrastructure and processes
Version control: mastery of Git (daily use, getting things done in Git, Git workflows, etc.)
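To give a feel for the kind of simple, clearly structured production Python mentioned above, here is a minimal sketch; the event schema and all names are hypothetical, not taken from the Thor platform:

```python
import json
from dataclasses import dataclass
from typing import Optional


@dataclass(frozen=True)
class SaleEvent:
    """One sales record as it might arrive from a streaming source (hypothetical schema)."""
    store_id: str
    amount_cents: int


def parse_event(raw: str) -> Optional[SaleEvent]:
    """Parse one JSON event, returning None for malformed input.

    Keeping parsing pure and side-effect free makes the function trivial to
    unit-test and easy to reuse inside a PySpark UDF or a plain batch job.
    """
    try:
        data = json.loads(raw)
        return SaleEvent(store_id=str(data["store_id"]),
                         amount_cents=int(data["amount_cents"]))
    except (json.JSONDecodeError, KeyError, TypeError, ValueError):
        return None
```

Small, typed, documented units like this are what "simple and clearly structured" means in practice: each function does one thing and can be tested in isolation.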
Desirable
Kafka optimizations
Databricks Unity Catalog
Building RESTful APIs
Further skill levels and/or certifications:
Databricks: Certified Data Engineer Professional (certification or equivalent skill level)
Python: PCAP™ – Certified Associate in Python Programming (certification or equivalent skill level)
Responsibilities
Extend our data platform to take it to the next level in data processing and data quality
Provide a platform that allows structuring of data into a scalable and easily understood architecture
Work in a multi-disciplinary team where you'll build a platform used by your data engineer peers to create business solutions that drive customer value
Implement and build methodologies, and understand how to scale them together with the business
Maintain good, current, and demonstrable knowledge of adjacent applications and market developments, both for inspiration and for benchmarking our concepts
Qualifications
5+ years of industry experience on large-scale data platform teams (building a data product) and in data engineering
Hands-on experience, including solid programming experience with data ingestion patterns (streaming)
Curious, proactive fast learner, able to quickly pick up new areas
Experience with agile methodologies
Excellent communication skills