Description
We currently have an opportunity for a Data Analytics Engineer (m/f/d) in Basel on a contract basis.
Start: ASAP / can wait for the right candidate
Duration: 15 months
Location: Basel and remote
Workload: 100%
We are looking for a data engineer to join our client's team to help implement new data sets and migrate existing data sets to the on-premises data lake, which will eventually move to the cloud.
Tasks & Responsibilities:
• Develop the code required for end-to-end data pipelines with optimal extraction, transformation, and loading of data from a wide variety of data sources using Spark, SQL, and other technologies. Incorporate business rules and functional capabilities, security, data retention requirements, etc.
• Translate functional and technical requirements into detailed designs
• Participate and contribute to overall data architecture and design
• Work with stakeholders, including business area sponsors, product owners, data architects, data engineers, project managers, and business analysts, to assist them with data-related technical issues and support their data needs
• Ensure effective design and development of system architectures through frequent product deliveries, employing sound governance methods for transparency and communication
• Remain up to date with industry standards and technological advancements that will improve the quality of your output
• Interact with the business to identify, capture, and analyse business requirements
Must haves:
• Experience in building data ingestion pipelines for data warehouse and/or data lake architectures (****)
• Hands-on development using open-source data technologies such as Hadoop, Hive, Spark, HBase, Kafka, Impala, ELK, etc., preferably with Cloudera (****)
• Strong experience in data modelling, design patterns, and building highly scalable applications (****)
• Experience with at least one major programming language: C#, Python, Java, etc. (****)
• Experience with relational SQL and NoSQL databases: SQL Server, Sybase IQ, Postgres, Cassandra, etc.
• Experience with data pipeline and workflow management tools: Airflow, RunDeck, NiFi, etc.
• Experience with stream-processing systems: Kafka, Spark-Streaming, etc.
• Experience with CI/CD pipelines and agile methodologies such as Scrum and Kanban
• Experience in Automated Testing, Test-driven Development, debugging, troubleshooting, and optimizing code
Interpersonal skills:
• Excellent verbal and written communication skills and ability to explain complex technical concepts in simple terms
Nice to have:
• Experience with cloud-based technologies such as Databricks, Snowflake, Azure Synapse
• Knowledge of microservices architecture and experience with API creation and management technologies (REST, SOAP, etc.)
• Experience supporting and working with cross-functional teams in a dynamic environment.
• Assist with and support proofs of concept as data technology evolves
If you are interested, please apply with your latest CV.
Michael Bailey International is acting as an Employment Business in relation to this vacancy.