Datamart Developer
Job in Riyadh, Riyadh Region, Saudi Arabia
Listing for: Master Works
Full Time position, listed on 2025-12-02
Job specializations:
- IT/Tech: Data Analyst, Data Engineer, Data Warehousing
Job Description
We are seeking an experienced Datamart / Semantic Layer Developer to build and implement business-oriented datamarts and semantic layers on the Teradata EDW, CDP Hive, and Trino platforms. The candidate must have strong SQL development skills, dimensional modeling knowledge, telecommunications domain expertise, and the ability to translate technical specifications into optimized analytics solutions.
Experience Required: 5+ years in datamart development and semantic layer implementation.
Core Responsibilities
Datamart Development:
- Develop and implement star schema and snowflake schema dimensional models on Teradata EDW
- Build subject-area datamarts (Customer, Revenue, Network, Product, Finance) based on design specifications
- Create and optimize fact tables, dimension tables, bridge tables, and aggregate tables
- Implement slowly changing dimension (SCD Types 1, 2, 3) logic and dimensional hierarchies (a brief sketch follows this subsection)
- Develop complex SQL queries, stored procedures, and views for datamart population
- Implement data transformation and aggregation logic for business metrics and KPIs
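To illustrate the SCD bullet above: a minimal SQL sketch of Type 2 expire-and-insert logic. DIM_CUSTOMER, STG_CUSTOMER, and all column names are illustrative assumptions (not part of this listing), and surrogate-key generation is omitted.

    -- Step 1: expire the open row when a tracked attribute changes.
    -- CURRENT_DATE - 1 uses Teradata-style date arithmetic.
    UPDATE DIM_CUSTOMER
    SET end_date = CURRENT_DATE - 1,
        current_flag = 'N'
    WHERE current_flag = 'Y'
      AND EXISTS (
        SELECT 1
        FROM STG_CUSTOMER stg
        WHERE stg.customer_id = DIM_CUSTOMER.customer_id
          AND stg.segment <> DIM_CUSTOMER.segment
      );

    -- Step 2: insert a new open row. After Step 1, both brand-new and
    -- just-expired customers lack an open row, so one NULL check covers both.
    INSERT INTO DIM_CUSTOMER (customer_id, segment, start_date, end_date, current_flag)
    SELECT stg.customer_id, stg.segment, CURRENT_DATE, DATE '9999-12-31', 'Y'
    FROM STG_CUSTOMER stg
    LEFT JOIN DIM_CUSTOMER dim
      ON dim.customer_id = stg.customer_id
     AND dim.current_flag = 'Y'
    WHERE dim.customer_id IS NULL;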
Semantic Layer Development:
- Develop semantic layers using TIBCO Data Virtualization on Teradata and CDP platforms
- Build semantic models using Trino for distributed query processing and data access
- Create virtual views, materialized views, and business-friendly data abstractions
- Implement business logic, calculated measures, KPIs, and derived metrics in semantic layer
- Develop data access policies, row-level security, and governance rules
- Optimize semantic layer performance through caching, indexing, and query optimization
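For instance, a business-friendly abstraction with a calculated measure might be exposed as a view along these lines (FACT_REVENUE, DIM_SUBSCRIBER, and the column names are assumptions, not part of this listing):

    -- Sketch: semantic-layer view exposing ARPU as a calculated measure.
    CREATE VIEW SEM_MONTHLY_ARPU AS
    SELECT f.bill_month,
           s.segment,
           SUM(f.revenue_amt) AS total_revenue,
           COUNT(DISTINCT f.subscriber_key) AS active_subscribers,
           SUM(f.revenue_amt)
             / NULLIF(COUNT(DISTINCT f.subscriber_key), 0) AS arpu
    FROM FACT_REVENUE f
    JOIN DIM_SUBSCRIBER s
      ON s.subscriber_key = f.subscriber_key
    GROUP BY f.bill_month, s.segment;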
Cross-Platform Development:
- Work across Teradata, CDP Hive, and Trino platforms for datamart and semantic layer implementation
- Develop HiveQL queries and tables in CDP (Cloudera Data Platform) environment
- Integrate data from Teradata EDW and CDP Hive through Trino for unified semantic access
- Create cross-platform queries and federated views using Trino connectors
- Implement partitioning, bucketing, and optimization strategies in Hive tables
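To make these cross-platform bullets concrete, here is a sketch of a partitioned, bucketed Hive table and a Trino query federating it with Teradata. The catalog, schema, table, and column names are deployment-specific assumptions, and the Teradata side presumes a configured Trino connector.

    -- HiveQL sketch: daily-partitioned, bucketed usage table in CDP.
    CREATE TABLE usage_events (
      subscriber_id BIGINT,
      cell_id       STRING,
      bytes_used    BIGINT
    )
    PARTITIONED BY (event_date DATE)
    CLUSTERED BY (subscriber_id) INTO 64 BUCKETS
    STORED AS ORC;

    -- Trino sketch: federated join across Teradata EDW and CDP Hive.
    SELECT d.customer_id,
           d.segment,
           u.event_date,
           SUM(u.bytes_used) AS daily_bytes
    FROM teradata.edw.dim_customer AS d
    JOIN hive.usage.usage_events AS u
      ON u.subscriber_id = d.customer_id
    GROUP BY d.customer_id, d.segment, u.event_date;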
ETL Development, Testing & Optimization:
- Translate design documents (HLD, LLD) and mapping specifications into SQL code
- Develop ETL/ELT processes to populate datamarts from EDW sources
- Optimize query performance using indexing (PI, SI, NUSI), statistics, partitioning, and aggregations
- Conduct unit testing, data validation, and reconciliation between source and target
- Debug and troubleshoot performance issues in datamarts and semantic layers
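A typical source-to-target reconciliation check might be sketched like this (table and column names are hypothetical):

    -- Compare row counts and a revenue total between EDW source and
    -- datamart target; any non-zero difference flags a load issue.
    SELECT src.row_cnt AS source_rows,
           tgt.row_cnt AS target_rows,
           src.rev_total - tgt.rev_total AS revenue_diff
    FROM (SELECT COUNT(*) AS row_cnt, SUM(revenue_amt) AS rev_total
          FROM EDW_REVENUE_TXN
          WHERE txn_date = DATE '2025-12-01') src
    CROSS JOIN
         (SELECT COUNT(*) AS row_cnt, SUM(revenue_amt) AS rev_total
          FROM FACT_REVENUE
          WHERE txn_date = DATE '2025-12-01') tgt;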
Collaboration & Support:
- Work closely with datamart designers, EDW developers, BI teams, and business analysts
- Implement business requirements and KPI calculations as per specifications
- Create technical documentation: SQL scripts, deployment guides, data lineage
- Support UAT activities and assist business users in validating data accuracy
- Provide production support and resolve data or performance issues
Required Skills & Qualifications:
- Teradata (Must Have): Advanced SQL development, stored procedures, performance tuning, utilities (BTEQ, TPT)
- Strong understanding of Teradata architecture, indexing (PI, SI, NUSI), partitioning, and statistics
- CDP Hive: HiveQL development, table creation, partitioning, bucketing, and optimization in a Cloudera environment
- Trino (Presto SQL): SQL development using Trino, federated queries, connector configuration
- Expert-level SQL across multiple platforms for complex queries and transformations
- Oracle SQL and PL/SQL development experience
- TIBCO: Hands-on development experience with TIBCO Data Virtualization for semantic layer implementation
- Experience creating virtual views, business views, and semantic models in TIBCO
- Understanding of data virtualization concepts and query federation
- Knowledge of BI tool integration with semantic layers
- Strong understanding of star schema and snowflake schema dimensional models
- Knowledge of fact table design, dimension design, and SCD implementations
- Ability to translate dimensional models into physical database objects
- Understanding of dimensional modeling best practices (Kimball methodology)
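As a minimal illustration of the star-schema pattern named above (Kimball-style; all object names are assumptions):

    -- One fact table joined to a surrogate-keyed dimension.
    CREATE TABLE DIM_PRODUCT (
      product_key  INTEGER NOT NULL PRIMARY KEY,  -- surrogate key
      product_code VARCHAR(20) NOT NULL,          -- natural/business key
      product_name VARCHAR(100),
      product_line VARCHAR(50)
    );

    CREATE TABLE FACT_SALES (
      date_key     INTEGER NOT NULL,  -- FK to DIM_DATE (not shown)
      product_key  INTEGER NOT NULL,  -- FK to DIM_PRODUCT
      customer_key INTEGER NOT NULL,  -- FK to DIM_CUSTOMER (not shown)
      sales_amt    DECIMAL(18,2),
      units_sold   INTEGER
    );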
- Understanding of telecom business processes, KPIs, and data flows
- OSS: Network performance, inventory, and fault management metrics
- BSS: Billing, customer analytics, revenue, churn, and product performance
- Telecom KPIs: ARPU, churn rate, CLTV, network utilization, revenue metrics
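For example, a monthly churn-rate KPI can be derived as subscribers churned during the month divided by subscribers active at the month's start; a sketch against a hypothetical monthly snapshot table:

    -- churn_rate = churned during month / active at month start.
    SELECT month_id,
           CAST(SUM(CASE WHEN churn_flag = 'Y' THEN 1 ELSE 0 END)
                AS DECIMAL(18,6))
             / NULLIF(COUNT(*), 0) AS churn_rate
    FROM FACT_SUBSCRIBER_MONTH        -- hypothetical snapshot table
    WHERE active_at_month_start = 'Y'
    GROUP BY month_id;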
- Full SDLC experience (Agile/Scrum, Waterfall)
- Strong analytical and debugging skills for performance troubleshooting
- Good communication skills for technical collaboration
- Unix/Linux scripting for automation (plus)
- Version control: Git, SVN
- Bachelor's degree in Computer Science, Information Technology, or related field
- Experience with data profiling and data quality tools
- Knowledge of ETL tools (Ab Initio, Informatica)
- Understanding of data governance and metadata management
- Experience with BI tools: Tableau, Power BI, Qlik
- Developed and…