Job Description
CrossBoundary Access is recruiting a Data Engineer. The Data Engineer will be responsible for accessing raw data from diverse sources, transforming it into clean, actionable datasets, and supporting the analysis of these datasets for the Access team.
This individual will be setting up the data infrastructure and data processes for Access, leveraging the infrastructure already in place within CrossBoundary. They will apply software engineering best practices to design, build, and maintain data pipelines and models, working closely with business users to deliver transparent and effective data solutions.
The Data Engineer will be Access's first data hire and will report to the Commercial Director.
Who You Are
Self-starter who is passionate about creating lasting change in Africa
Pioneer who is excited to work with new technologies and innovations
Creative thinker with an ability to craft non-traditional and innovative solutions to practical problems
Can build strong relationships with clients and colleagues in stressful environments
Empathetic and thoughtful leader who can drive performance within a multi-disciplinary team
Comfortable with ambiguity; ability to operate effectively in a changing, challenging context
Primary Responsibilities
Design and implement data extraction, transformation, and loading processes
Build and manage a centralized data repository to support business intelligence and machine learning applications
Utilize tools such as SQL, Python, and cloud-based data infrastructure platforms (AWS Redshift, Azure, Google Cloud) to maintain data pipelines
Apply data modeling principles to prepare datasets for analysis
Collaborate with business users to deliver transparent and effective data analyses and solutions
Ensure high performance, availability, and quality of data systems through monitoring and optimization
Collaborate on data governance policies to ensure data compliance and security
Document and communicate data architecture and processes to Access leadership
Qualifications
Required skills and qualifications:
Bachelor's degree or equivalent professional certification in computer science, data, or a related field
2-4+ years' experience working in data engineering
Highly skilled in building and maintaining ELT processes for a range of complex data sources, including open-source APIs
Confident in designing and implementing industry-standard data modeling principles to prepare datasets for analysis
Experience working with data lake and/or lakehouse data architectures
Highly skilled in SQL
Skilled in data warehousing using the Kimball methodology
Proficient at working with cloud-based data tools; preference for experience with Snowflake, dbt, and Azure Data Factory
Experience applying software engineering best practices and guidelines to analytics code and data model development
Strong interest in Microsoft stack technologies
Passion for learning new technologies and tools
Extremely curious and flexible
Strong presentation skills, ability to communicate clearly and effectively across diverse audiences
The ideal candidate will also have the following skills and qualifications:
Python
dbt
Snowflake
Azure Data Factory
Microsoft Azure
Microsoft Dynamics
Ability to build strong relationships with partners and colleagues in challenging environments
Comfort with ambiguity and an ability to operate effectively in a changing context
French language proficiency