Data Architect/Data Scientist
The Data Architect is responsible for designing, governing, and optimizing the organization's enterprise data architecture. The role ensures that data is reliable, secure, high-quality, and accessible to support business operations, analytics, AI/ML initiatives, and digital transformation goals. The Data Architect serves as the strategic technical lead for data platforms, data standards, and data governance frameworks.
Responsibilities
- Define the enterprise data architecture blueprint (data flow, storage, processing, and access layers).
- Develop data models (conceptual, logical, physical) for operational and analytical systems.
- Ensure alignment of data architecture with business strategy, digital transformation, and regulatory requirements.
- Define data domain boundaries and maintain the organization's data taxonomy.
- Design data integration patterns (ETL/ELT, streaming, API-based ingestion).
- Oversee the integration of data from multiple systems (ERP, CRM, IoT, SaaS platforms).
- Ensure adherence to interoperability standards (JSON/REST, XML, CSV).
- Define data exchange protocols with external entities and third‑party systems.
- Define and enforce data quality rules (accuracy, completeness, consistency, timeliness).
- Work with data stewards to ensure consistent data definitions, classification, and metadata management.
- Monitor and report data quality KPIs.
- Define and enforce data governance policies (ownership, lineage, cataloging).
- Support MDM initiatives and maintain enterprise data dictionaries.
Qualifications
- Bachelor's degree in Computer Engineering, Computer Science, Information Systems, or a related field.
- Master's preferred; certifications in data platform architecture (Azure, AWS, GCP) are a plus.
- 12+ years of experience in data architecture, data engineering, or enterprise architecture.
- Experience with cloud data platforms (Azure preferred for government environments) and with data management platforms such as Informatica or BigID.
- Experience with Azure Data Factory, Synapse, Databricks, and Data Lake Storage.
- Experience with ETL/ELT patterns, data warehouse design, and master data management (MDM).
- Knowledge of data governance (DAMA-DMBOK or DCAM frameworks).