Data Engineer
Job in Tempe, Maricopa County, Arizona, 85285, USA
Listing for: Compunnel, Inc.
Full Time position, listed on 2025-12-01
Job specializations:
- IT/Tech: Data Engineer, Data Analyst, Data Warehousing
Job Description & How to Apply Below
We are seeking an experienced Data Engineer to collaborate with business stakeholders and technical teams to acquire and migrate data sources critical to business objectives. This role involves designing, developing, and managing robust ETL pipelines, working with both structured and unstructured data, and ensuring efficient data management across various platforms. The candidate will also be expected to build cross-platform data strategies, automate processes, and optimize data delivery.
Key Responsibilities:
Required Qualifications:
- Education: Bachelor's degree in Computer Engineering, Computer Science, or a related discipline (Master's Degree preferred).
Experience:
- 7+ years of experience in ETL design, development, and performance tuning using tools such as SSIS or Azure Data Factory (ADF) in a multi-dimensional Data Warehousing environment.
- 3+ years of experience using Python or SQL to build and operate data pipelines.
- 7+ years of advanced SQL programming experience (PL/SQL, T-SQL).
- 5+ years of experience in Enterprise Data & Analytics solution architecture.
- 3+ years of extensive hands-on experience with Azure, especially in data-heavy/analytics applications involving relational and NoSQL databases, Data Warehouses, and Big Data technologies.
- 3+ years of experience with Azure Data Factory, Azure Synapse Analytics, Azure Analysis Services, Azure Databricks, Blob Storage, Azure SQL DW/Synapse, and Azure Functions.
- 2+ years of experience in defining and enabling data quality standards for auditing and monitoring.
- Strong analytical abilities and intellectual curiosity.
- In-depth knowledge of relational database design, data warehousing, and dimensional data modeling concepts.
- Deep understanding of REST and API design principles.
- Excellent collaboration, communication, and teamwork skills.
- Self-starter with the ability to thrive in a fast-paced development environment.
- Agile experience is highly desirable.
- Proficiency with the development environment, including IDEs, database servers, Git, continuous integration, unit testing, and defect management tools.
- Strong experience with Python, Spark, and PySpark.
- Strong leadership capabilities.
Preferred Qualifications:
- 2+ years of experience with Big Data Management (BDM) for relational and non-relational data formats such as JSON, XML, Avro, Parquet, and Copybook.
- Knowledge of DevOps processes (CI/CD) and infrastructure as code.
- Experience with Master Data Management (MDM) and Data Quality tools.
- Familiarity with Kafka.
- Knowledge of key machine learning concepts and MLOps.
Technologies We Use: