The Job in short
You will work at the heart of our clients' engineering departments, designing and implementing data-intensive applications.
- BSc or MSc degree in Computer Science, Information Science, or another relevant field
- 5+ years of professional experience as a Data Developer on cloud-related projects (AWS, Google Cloud, or Azure)
- Mastery of data exploration, data modeling, and data visualization, and the ability to derive algorithmic insights
- Good skills in Python, R, and/or ETL tooling
- Deep understanding of SQL, NoSQL, and Hadoop, and when (not) to use each
- Experience setting up robust, high-performance distributed streaming data pipelines using Kafka or similar solutions
- Experience working in CI/CD-driven development environments, including setting up automated data quality checks and monitoring
- Thorough understanding of how to set up and consume APIs
- Good understanding of how data is used by neural networks, deep learning, machine learning, and/or AI
- Agile/Scrum experience
- Advanced English (spoken and written), with the ability to communicate effectively at different levels of an organization
- Understanding of how Data Warehouses and Data Lakes are built
- Experience with large-scale data environments that produce enormous volumes of data every second
- Experience with containerization (e.g., Docker) and container orchestration tooling (e.g., Kubernetes, Docker Swarm, or similar)