Data Architect

Southern Africa
Posted 6 months ago

Reference No. NKSR-DA-01

Our client is looking for a Data Architect who will be responsible for defining the standards and frameworks through which data is collected, stored, retrieved, archived, and transferred across applications. They will be expected to set and revise the data architecture principles and create data models to enable the implementation of end-to-end data solutions. The Data Architect is a senior technical role within the Data & Analytics Team, aimed at driving a standard common data vocabulary and outlining high-level integrated designs that meet business needs and align with the greater organisation strategy. Tasked with envisioning and designing the bank’s data architecture, the ideal candidate will be expected to have a strong presence and to contribute actively within the data engineering space to ease execution of the vision and development to specification.


In addition to being responsible for the conceptualization and visualization of data frameworks, the Data Architect is expected to have practical skills in a range of data management tools to enable data warehousing, data management, data modelling, and ETL processes, with a focus on the broader data strategy and the data governance needs of the company.


Responsibilities:


  • Translate business requirements into technical specifications.
  • Define and design data integrations, data warehouses, and data lakes.
  • Define and redefine the data architecture framework, standards, and principles, including the governance and security framework.
  • Define and redefine the end-to-end data flows, focusing on how data is generated and managed.
  • Collaborate with a wide range of technical stakeholders to ease implementation of data solutions.
  • Collaborate actively with leadership and management to devise and execute the bank’s data strategy to meet business and organizational goals and objectives.
  • Actively maintain a repository of all data architecture blueprints and artifacts.
  • Improve the scalability, security, performance, and reliability of the bank’s data architecture on an ongoing basis.
  • Design and assist with the building and maintenance of batch or real-time data pipelines in production.
  • Assist with the maintenance and optimization of the data infrastructure required for accurate extraction, transformation, and loading of data from a wide variety of data sources.
  • Lead the automation of data workflows such as data ingestion, aggregation, and ETL processing.
  • Drive the strategy for day-to-day tasks of data cleaning, data wrangling, and data preparation for internal data consumers such as Data Scientists, Data Analysts, Data Champions, and the Bank at large.
  • Partner with data scientists, functional leaders in sales/front office/business lines, marketing, and product to deploy machine learning models.
  • Build, maintain, and deploy data products for analytics and data science teams on on-premises and cloud platforms (e.g., AWS, Azure, GCP).
  • Leverage data controls to maintain data privacy, security, compliance, and quality for allocated areas of ownership.
  • Manage the data and analytics infrastructure (DB, ETL layer, reporting tools).
  • Propose solutions and strategies to business challenges.
  • Make recommendations to adapt existing business strategies.
  • Collaborate with the rest of the DnA team, IT, and product development teams to achieve the Bank’s goals and strategic objectives.


Requirements: Qualification and Skill


  • At least a bachelor’s degree in Computer Science, Engineering, Data Science, Statistical Sciences, or another quantitative field. NB: A master’s degree in a related field will be an advantage.
  • 6+ years of relevant working experience as a Data Engineer, BI Developer, Search Engineer, Technical Architect, Big Data Analyst, Solutions Architect, Data Warehouse Engineer, Data Science Software Engineer, or ETL Developer.
  • Advanced skills and experience with relational databases and non-relational databases.
  • Experience with Oracle, SQL Server, MySQL, and NoSQL databases such as MongoDB, Cassandra, and HBase.
  • Experience working with SSIS, SSAS, and SSRS solutions.
  • Experience working with on-premises and cloud Data Warehouse solutions (e.g., Snowflake, Redshift, BigQuery, Azure).
  • Experience working with data ingestion tools such as Fivetran, Stitch, or Matillion.
  • Working knowledge of Cloud-based solutions (e.g. AWS, Azure, GCP).
  • Experience building and deploying machine learning models in production.


If you wish to apply for the position, please send your CV to Nicole Koenig at nkoenig@caglobalint.com


Please visit www.caglobalint.com for more exciting opportunities.


Nicole Koenig


Recruitment Consultant


CA Global Finance


CA Global will respond to short-listed candidates only. If you have not had any response within two weeks, please consider your application unsuccessful; however, your CV will be kept on our database for other suitable positions.

Job Features

Job Category: IT
