This is an exciting opportunity for an experienced developer of large-scale data solutions. We are seeking someone with strong experience building solutions using Microsoft Azure services and a proven track record of delivering high-quality work to tight deadlines.
What You'll Be Responsible For
• Designing and implementing highly performant data ingestion pipelines from multiple sources using Azure Databricks
• Delivering and presenting proofs of concept of key technology components to project stakeholders
• Developing scalable and reusable frameworks for ingesting data sets
• Integrating the end-to-end data pipeline to take data from source systems to target data repositories, ensuring that data quality and consistency are maintained at all times
• Working with event-based/streaming technologies to ingest and process data
• Working with other members of the project team to support the delivery of additional project components (API interfaces, search)
• Evaluating the performance and applicability of multiple tools against requirements
• Working within an Agile delivery/DevOps methodology to deliver proof-of-concept and production implementations in iterative sprints
The successful candidate must have the following competencies:
• Strong communication
• Building trust
• Decision making / problem solving
• Delegating responsibility
• Customer/client focus
• Planning and organizing
• Managing stress at critical times
• Technical/professional knowledge
• Team player
• Troubleshooting skills in Azure Databricks
Why Join Us?
Please be aware that we will contact only candidates who best match the requirements of the position.