Contract Duration: 18 months
Estimated Regular Hours/Week: 40.00
Work Location: Onsite
Skills Required:
Cloud Platforms: Deep understanding of the Azure ecosystem, including Azure Data Factory, Data Lake Storage, Blob Storage, Power Apps, and Azure Functions; in-depth understanding and implementation of API management solutions such as Apigee.
Big Data Technologies: Proficiency in Databricks, Spark, PySpark, Scala, and SQL.
Data Engineering Fundamentals: Expertise in ETL/ELT processes, data pipelines, data modeling, schema design, and data warehousing.
Programming Languages: Strong Python and SQL skills; knowledge of other languages such as Scala or R is beneficial.
Data Warehousing and Business Intelligence: Strong grasp of ERD concepts, designs, and patterns; understanding of OLAP/OLTP systems, performance tuning, database server concepts, and BI tools (Power BI, Tableau).
Data Governance: Strong understanding of RBAC/ABAC, data lineage, data leak prevention, data security, and compliance; deep understanding and implementation knowledge of audit and monitoring in the cloud.
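For illustration only (not part of the requirements): a minimal PySpark sketch of the kind of ETL work the skills above describe: read raw events from a lake path, apply a typed transform, and write a curated, partitioned table. The storage paths, column names, and app name are hypothetical placeholders, not details from this posting.

# Minimal, illustrative PySpark ETL sketch; all paths and columns are
# hypothetical placeholders.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("etl-sketch").getOrCreate()

# Extract: raw JSON events landed in the data lake (hypothetical path).
raw = spark.read.json("abfss://raw@examplelake.dfs.core.windows.net/events/")

# Transform: normalize types, drop malformed rows, derive a load date.
curated = (
    raw.withColumn("event_ts", F.to_timestamp("event_ts"))
       .withColumn("load_date", F.current_date())
       .dropna(subset=["event_id", "event_ts"])
       .dropDuplicates(["event_id"])
)

# Load: write a partitioned curated table for downstream BI consumption.
(curated.write
        .mode("overwrite")
        .partitionBy("load_date")
        .parquet("abfss://curated@examplelake.dfs.core.windows.net/events/"))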
Experience Required:
Seven (7) years applying Enterprise Architecture principles, with at least five (5) years in a lead capacity.
Five (5) years of hands-on experience with Azure Data Factory, Azure Databricks, API implementation and management solutions, and managing Azure resources.
Five (5) years of experience in each of the following: developing data models and pipelines using Python; working with Lakehouse platforms; building GitHub CI/CD pipelines and infrastructure automation, including Terraform scripting; and working with data warehousing systems, OLAP/OLTP systems, and integration of BI tools.
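Again for illustration only: a small Python sketch of the star-schema data modeling and OLAP-style querying referenced above, using SQLite from the standard library purely as a stand-in for a real warehouse engine. Table and column names are hypothetical.

# Illustrative star-schema sketch; SQLite stands in for a warehouse engine,
# and all table/column names are hypothetical.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE dim_customer (
    customer_key INTEGER PRIMARY KEY,   -- surrogate key
    customer_id  TEXT NOT NULL,         -- natural/business key
    region       TEXT
);
CREATE TABLE fact_sales (
    sales_key    INTEGER PRIMARY KEY,
    customer_key INTEGER NOT NULL REFERENCES dim_customer(customer_key),
    sale_date    TEXT NOT NULL,
    amount       REAL NOT NULL
);
""")

# A typical OLAP-style rollup that a BI tool (Power BI, Tableau) might issue.
rows = conn.execute("""
    SELECT d.region, SUM(f.amount) AS total_sales
    FROM fact_sales f
    JOIN dim_customer d ON d.customer_key = f.customer_key
    GROUP BY d.region
""").fetchall()

The surrogate-key split between dimension and fact tables shown here is the standard ERD pattern the warehousing requirement refers to.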
Education Required:
This classification requires a bachelor’s degree in an IT-related or engineering field.
Additional Information:
Onsite - Live Scan required