Azure Data Architect


City: Toronto, ON, Canada
Title: Azure Data Architect
Category: IT
EmploymentType: Permanent
Description:

Staffmax Staffing is currently recruiting an experienced Azure Data Architect to join one of our financial institution clients.

Skills:

• Data Architecture experience in implementing Data Lake, Data warehouse or Hybrid Data Platform on cloud 

• Azure data platform - ADLS (data lake), Synapse, Purview, ADF, etc. 

• Exposure to Hadoop distributions (CDH and Hortonworks HDP) 

• Developing Conceptual models and semantic views 

• Experience in Synapse warehouse models and singleton data ingestion approach 

• Implementing data pipelines for retrieval / ingestion / presentation / semantics 

• Using data ingestion/data technologies and design patterns and producing new frameworks / framework variants 

• Skills in data modelling (both structured and unstructured data), working directly with the business and data scientists 

• Data Management using CDH and metadata repository technologies (Collibra/Purview) 

• Skills in data acquisition (landing, ingestion and metadata) of various data types, including Salesforce, XML, JSON, flat file systems (ISAM/VSAM) and relational data 

• Skills in data manipulation: Azure Data Factory and Databricks; eventing environments orchestrated by Kafka 

• Data presentation via visualisation technologies, e.g. Power BI; exposure to Tableau

Other Skills Required:

• Data management and analytics 

• Data discovery processes 

• Familiar with Scrum and working in an Agile team 

• Generation of data catalogues/models from the metadata repository and exposing that to various user communities 

• ELT technologies – Azure Data Factory, Sqoop, Syncsort/ETL in both full-load and CDC modes 

• Developing technical specifications corresponding with the lake architecture, based on business and functional analysis 

• Capturing and enhancing metadata from source systems and management of data drift 

• Security integration with Okta and fine-grained definition of security policies for lake resources; implementation of those policies in Sentry or Ranger, providing authorisation controls at the required granularity 

• Experience in large scale distributed processing architectures e.g. enterprise distributed caching, low latency data driven platforms 

• Expertise in fault-tolerant systems in Azure, including clustering and multi-AZ deployments 

• Working knowledge of installing, configuring and operationalising Big Data clusters, e.g. on Azure object storage or Data Lake Storage Gen2 

• Understanding of configuration management and DevOps technologies (e.g. GitLab / Jenkins / Octopus) 

• BI / Data Prep / Visualisation / ELT tools: Power BI / Tableau 

• Knowledge of the various flavours of SQL required by the above

Role:

• Responsible for working with our business users to develop Business Conceptual data models that are realised from the pipelines. 

• Secondly, define data classifications (e.g. security and personal), content access policies for CRUD, security and privacy (e.g. CBAC: who is a member of the “group” that can see this data), ownership/stewardship, history/auditing, and searchability/discoverability (since some data will be semi-structured or unstructured) 

• Analyse the data needs of the company and use coding skills to maintain secure databases 

• Collect and organise the data obtained 

• Use training in analytics and various programming tools to analyse information and draw conclusions from the findings 

• Use mathematical and statistical methods to present findings to management, which are then used to improve various initiatives within the company 

• Examine data sets to identify patterns and utilise that information to keep the employer's business competitive 

Salary:

  • $110K–$150K/yr
  • Bonus

Benefits:

  • Extended Health Care
  • Dental Care
  • Vision Care
  • Paid Vacation

Type of Work: 

  • Full-Time Permanent
  • Remote work, with eventual on-site work

*Must be fully vaccinated

Company:
Staffmax Staffing & Recruiting