General Responsibilities
- Review business requirements; become familiar with and understand the business rules and the transactional data model
- Define conceptual, logical, and physical models, and map them from the data source to the curated model and data mart.
a) Analyze requirements and recommend changes to the physical model.
b) Develop scripts for the physical model; create the database and/or delta lake file structure (see the DDL sketch following item c).
c) Access Oracle DB environments and set up the tools needed to develop the solution.
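As an illustration of the physical-model scripting in item b, the following is a minimal Databricks SQL sketch for one curated delta lake table. All schema, table, column, and storage names are hypothetical, not taken from the ministry's actual model.

    -- Minimal sketch: physical model for one curated delta table (hypothetical names)
    CREATE SCHEMA IF NOT EXISTS curated;

    CREATE TABLE IF NOT EXISTS curated.client_case (
      case_id        BIGINT    NOT NULL,   -- business key from the source system
      client_id      BIGINT    NOT NULL,
      case_status    STRING,
      effective_from TIMESTAMP NOT NULL,   -- start of this record's validity window
      effective_to   TIMESTAMP,            -- NULL while the record is current
      is_current     BOOLEAN   NOT NULL
    )
    USING DELTA
    LOCATION 'abfss://curated@<storageaccount>.dfs.core.windows.net/client_case';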
- Implement data design methodologies, historical and dimensional models
a) Develop a curated model to store historical data captured incrementally from the source (see the MERGE sketch following item c)
b) Design dimensional data mart models, create source-to-target mapping documentation, and design and document the data transformations from the curated model to the data mart
c) Perform data profiling, assess data accuracy, design and document data quality and master data management rules
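A minimal sketch of the incremental historical capture described in item a, written in Databricks SQL against the hypothetical curated.client_case table above; staging.client_case_delta and its extract_ts column are likewise assumptions for illustration. Changed rows are closed out first, then new versions are inserted.

    -- Step 1: close the current version of any row whose attributes changed.
    MERGE INTO curated.client_case AS tgt
    USING staging.client_case_delta AS src
      ON tgt.case_id = src.case_id AND tgt.is_current = TRUE
    WHEN MATCHED AND tgt.case_status <> src.case_status THEN
      UPDATE SET effective_to = src.extract_ts, is_current = FALSE;

    -- Step 2: insert new and changed rows as the current version.
    INSERT INTO curated.client_case
    SELECT src.case_id, src.client_id, src.case_status,
           src.extract_ts,                -- effective_from
           CAST(NULL AS TIMESTAMP),       -- effective_to (open-ended)
           TRUE                           -- is_current
    FROM staging.client_case_delta AS src
    LEFT JOIN curated.client_case AS tgt
      ON tgt.case_id = src.case_id AND tgt.is_current = TRUE
    WHERE tgt.case_id IS NULL;            -- no open version remains after step 1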
- Conduct functionality reviews, data load reviews, performance reviews, and data consistency checks (a reconciliation sketch follows item b below).
a) Help troubleshoot data mart design issues
b) Review ETL performance with developers and suggest improvements
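One simple form the data consistency check can take is a reconciliation query comparing the staging layer with the loaded mart; the table and column names below are hypothetical.

    -- Row-count and amount reconciliation between layers (hypothetical names)
    SELECT 'staging'   AS layer, COUNT(*) AS row_cnt, SUM(payment_amt) AS amt_total
    FROM staging.payments
    UNION ALL
    SELECT 'data_mart' AS layer, COUNT(*), SUM(payment_amt)
    FROM mart.fact_payment;
    -- The two result rows should match; any difference flags dropped or duplicated records.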
- Participate in end-to-end integrated testing for Full Load and Incremental Load and advise on issues
- Plan for Go Live and production deployment.
a) Work with the system administrator, ETL developers, and the ministry team to define production deployment steps.
b) Configure parameters and scripts for Go Live; test and review the instructions.
c) Review release documentation
- Provide Go Live support and post-Go Live review.
a) Review data models, the ETL process, and tools; recommend ways to improve performance and reduce ETL timelines.
b) Review infrastructure and any performance issues for overall process improvement
- Proactively communicate with stakeholders on any changes required to the conceptual, logical, and physical models; communicate and review dependencies and risks.
- Transfer knowledge to ministry staff and develop documentation on the work completed.
a) Share documentation and walk through end-to-end working knowledge of the architecture, troubleshooting steps, configuration, and scripts.
b) Hand over documents and scripts, and review them with ministry staff.
Data Modeler Requirements:
- 7+ years of BI data architect experience in enterprise application and solution design/development, including data warehousing, data lake implementations, and dimensional modelling.
- Collect business-level questions and propose approaches to address business needs and provide data insights.
- Expand documentation and knowledge of business processes relative to the available data, providing contextual guidance for operational/project work, reporting, and insight generation.
- Ability to translate complex technical concepts into executable development work packages and articulate them clearly.
- Knowledge of BI tools for metadata modeling and report design (e.g. Power BI)
- MS SQL Server Technology, Azure Data Lake, Azure Databricks
- Expert knowledge of developing data warehouse solutions on the MS stack (Azure Data Lake, SQL, ADF, Databricks, Power BI) to store and retrieve centralized information. Experience designing data warehouses using dimensional and delta lake concepts.
- Create/maintain the enterprise data model and data dictionary. Help the development team optimize database performance. Coordinate with the integration department to identify future needs and requirements.
- Extensive knowledge of data modelling tools (e.g. SAP PowerDesigner, Visio)
- Review, install, and configure information systems to ensure functionality and security; analyze structural requirements for new data warehouses and applications
- Experience using Oracle Database server and tools (12c, 19c) and PL/SQL to develop business intelligence applications.
- Demonstrated skills in writing SQL stored procedures and packages for data marts and reporting (a T-SQL sketch follows).
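As a small illustration of this skill, here is a minimal T-SQL sketch of a stored procedure that refreshes a reporting aggregate in a data mart; all object names are hypothetical.

    -- Minimal sketch: refresh a monthly aggregate for reporting (hypothetical names)
    CREATE OR ALTER PROCEDURE mart.usp_refresh_monthly_payments
    AS
    BEGIN
        SET NOCOUNT ON;

        TRUNCATE TABLE mart.agg_monthly_payments;

        INSERT INTO mart.agg_monthly_payments (payment_month, program_id, payment_total)
        SELECT DATEFROMPARTS(YEAR(payment_date), MONTH(payment_date), 1),
               program_id,
               SUM(payment_amt)
        FROM mart.fact_payment
        GROUP BY DATEFROMPARTS(YEAR(payment_date), MONTH(payment_date), 1),
                 program_id;
    END;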
- Demonstrated experience in Azure DevOps
- Demonstrated experience in performance tuning of business intelligence applications, including data model and schema optimization (one common optimization is sketched below)
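One schema optimization that frequently comes up on the MS stack is switching a large fact table to columnstore storage; a minimal SQL Server sketch with a hypothetical table name:

    -- Clustered columnstore index on a large fact table (hypothetical name)
    CREATE CLUSTERED COLUMNSTORE INDEX cci_fact_payment
        ON mart.fact_payment;
    -- Columnstore storage compresses the table and typically reduces scan I/O
    -- for the aggregate-heavy queries common in BI workloads.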
Skills:
- 7+ years in data modelling and data warehouse design (Must Have)
- 2+ years Azure Data Lake and Azure Databricks SQL Warehouse (Must Have)
- 5+ years SQL (Must Have)
Assets:
- Knowledge of Curam IBM COTS solutions (Social Assistance…