
Data/Information Architect

Job in Tempe, Maricopa County, Arizona, 85285, USA
Listing for: Circle K
Full Time position
Listed on 2025-12-03
Job specializations:
  • IT/Tech
    Data Engineer, Cloud Computing
Job Description & How to Apply Below

Join to apply for the Data/Information Architect role at Circle K.

Key technologies:

  • Azure Databricks
  • Azure Data Lake
  • Ansible
  • SQL
  • Terraform
  • Kafka
  • Azure CLI
  • Databricks CLI
  • PowerShell
  • Bash
Responsibilities
  • Design end-to-end architecture of a unified data platform covering the full data lifecycle, from ingestion to consumption.
  • Lead key activities of the enterprise architecture practice for specific sectors.
  • Manage and mentor architecture talent.
  • Deliver architecture initiatives that demonstrate clear business efficiency aligned with strategy.
  • Develop architecture roadmaps and delivery blueprints.
  • Communicate effectively with customers and stakeholders.
  • Deliver architectural initiatives in digital, cloud, and open‑source technologies, including large transformational engagements.
  • Design and develop Databricks applications; maintain cloud technologies (Azure, AWS, and others).
  • Apply deep knowledge of Spark architecture (Core, SQL, DataFrames, Streaming, RDD caching, MLlib).
  • Use Python, SQL, and Spark/Scala.
  • Apply a strong understanding of data modeling across conceptual, logical, and physical models.
  • Maintain awareness of emerging technologies and their application.
  • Architect and design cloud‑centric solutions using IaaS, PaaS, and SaaS best practices.
  • Build and support mission‑critical components with disaster‑recovery capabilities.
  • Design multi‑tier systems and services for large enterprises.
  • Apply infrastructure and application security technologies and approaches.
  • Gather requirements through proven techniques.
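The first responsibility above, an ingestion‑to‑consumption data platform, is commonly structured as a bronze/silver/gold (medallion) flow on Databricks. Below is a minimal, cluster‑free sketch of that flow: on Databricks this logic would normally run on Spark DataFrames backed by Delta tables, but plain Python records are used here so the shape of each stage is easy to see. All table and field names are illustrative assumptions, not from the posting.

```python
from typing import Iterable


def to_silver(bronze_rows: Iterable[dict]) -> list[dict]:
    """Clean raw (bronze) records: drop rows missing a store_id and
    normalize the amount field to a float."""
    silver = []
    for row in bronze_rows:
        if not row.get("store_id"):
            continue  # skip/quarantine malformed ingestion records
        silver.append({"store_id": row["store_id"],
                       "amount": float(row["amount"])})
    return silver


def to_gold(silver_rows: list[dict]) -> dict[str, float]:
    """Aggregate cleaned (silver) records into a consumption-ready
    gold summary: total sales per store."""
    totals: dict[str, float] = {}
    for row in silver_rows:
        totals[row["store_id"]] = totals.get(row["store_id"], 0.0) + row["amount"]
    return totals


bronze = [
    {"store_id": "AZ-001", "amount": "19.99"},
    {"store_id": "AZ-001", "amount": "5.01"},
    {"store_id": None, "amount": "3.50"},  # malformed record, filtered out
]
gold = to_gold(to_silver(bronze))  # consumption-ready aggregate
```

Keeping each stage as a small, pure transformation is what makes the later requirement of unit and integration tests practical without a running cluster.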
Required Qualifications
  • Excellent coding skills in Python or Scala (preferably Python).
  • 10+ years of experience in architecture, design, implementation, and analytics, with at least 12 years in Data Engineering.
  • Designed and implemented 2–3 end‑to‑end Databricks projects.
  • 3+ years of experience with Databricks components:
    Delta Lake, Databricks Connect, Databricks API 2.0, SQL Endpoint (Photon engine), Unity Catalog, Databricks Workflows orchestration, security management, platform governance, and data security.
  • Ability to follow architectural principles to design best‑suited solutions.
  • Well‑versed in Databricks Lakehouse concept and enterprise implementation.
  • Strong understanding of data warehousing, governance, and security standards around Databricks.
  • Knowledge of cluster optimization and integration with cloud services.
  • Ability to create complex data pipelines and write unit and integration tests.
  • Strong SQL and Spark‑SQL skills.
  • Performance optimization expertise to improve efficiency and reduce cost.
  • Experience designing batch and streaming pipelines.
  • Extensive knowledge of Spark and Hive frameworks.
  • Experience with major clouds (Azure, AWS, GCP) and services (ADLS/S3, ADF/Lambda, cloud databases).
  • Excellent communication and cross‑platform collaboration.
  • Positive attitude toward learning and upskilling.
  • Ability to set best practices around Databricks CI/CD.
  • Understanding of composable architecture to leverage Databricks fully.
  • Experience with ML tools such as MLflow, Databricks AI/ML, Azure ML, and AWS SageMaker.
  • Ability to translate complex technical challenges into actionable decisions for stakeholders.
  • Experience coordinating system dependencies and interactions.
  • Experience delivering solutions using SAFe Agile, Waterfall, or Iterative methodologies.
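The requirements above pair complex pipeline construction with unit and integration testing. A common pattern is to keep transformation logic in plain, pure functions so it can be unit‑tested without a Databricks cluster. The deduplication rule below (keep the latest record per key, the same rule a Spark window function with `row_number` over a partition would express) is a hypothetical example, not taken from the posting.

```python
def dedupe_latest(rows: list[dict], key: str, version: str) -> list[dict]:
    """Keep only the newest record per key, where 'newest' is decided by
    the version field (e.g. an event timestamp)."""
    latest: dict = {}
    for row in rows:
        current = latest.get(row[key])
        if current is None or row[version] > current[version]:
            latest[row[key]] = row
    return sorted(latest.values(), key=lambda r: r[key])


# Unit test, runnable with pytest or as a plain assertion:
def test_dedupe_latest():
    rows = [
        {"id": "a", "ts": 1, "val": "old"},
        {"id": "a", "ts": 2, "val": "new"},
        {"id": "b", "ts": 1, "val": "only"},
    ]
    out = dedupe_latest(rows, key="id", version="ts")
    assert [r["val"] for r in out] == ["new", "only"]
```

An integration test would then run the same function on a small Spark DataFrame inside a Databricks job, keeping the logic under test identical in both environments.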
Preferred Qualifications
  • REST API knowledge.
  • Understanding of cost distribution.
  • Experience with migration projects for unified data platforms.
  • Knowledge of DBT.
  • DevSecOps experience, including Docker and Kubernetes.
  • Full‑life‑cycle software development knowledge.
  • JavaScript, PowerShell, and Bash scripting.
  • Data ingestion technologies:
    Azure Data Factory, SSIS, Pentaho, Alteryx.
  • Visualization tools:
    Tableau, Power BI.
  • Knowledge of industry trends and standards.
Seniority Level

Mid‑Senior level

Employment Type

Full‑time

Job Function

Information Technology

Industries

Retail

Circle K is an Equal Opportunity Employer.

The Company complies with the Americans with Disabilities Act (the ADA) and all applicable state and local disability laws. Applicants with disabilities may request reasonable accommodation under the ADA and state/local laws, unless doing so would impose an undue hardship.
