In this role, you'll lead and manage data engineering projects of moderate complexity. Your primary focus will be designing and building data pipelines to enhance data-informed decision-making across the business. You'll also help create a new big data solution to handle large volumes of data efficiently and support future growth. Your work will influence stakeholders and contribute to professionalizing the company's reporting platform.
Responsibilities:
- Lead and manage data engineering projects to build data pipelines for better decision-making.
- Design and implement a new big data solution to handle large volumes of data and support future growth.
- Stay up-to-date with the latest development, testing, and deployment techniques to enable quick releases of new data pipelines and data sources.
Skills Required:
Mandatory:
- A master's degree or equivalent in a relevant discipline.
- Proficiency in Python/R and object-oriented programming.
- Ability to design and write modular and extensible code.
- Experience with source code version control (Git/GitHub).
- Experience writing automated tests and ensuring code quality.
Nice to have:
- Prior experience with scientific computing, data science, and machine learning.
- Proficiency in statistical software ecosystems such as R and Python.
- Experience with technologies like PyQt, React, or similar.
Location: The Hague
Duration: 12 Months (Option to extend)
Michael Bailey International is acting as an Employment Business in relation to this vacancy.