
Senior SQL and ETL Engineer

Job in Downey, Los Angeles County, California, 90242, USA
Listing for: Tech Providers
Full Time position
Listed on 2026-01-10
Job specializations:
  • IT/Tech
    Data Engineer, Data Analyst, Database Administrator, Data Warehousing
Job Description & How to Apply Below

Overview

Title: Senior SQL and ETL Engineer
Location: Remote (California residents only)
Contract: 12+ Months

Position Details

Background Checks: Yes

Responsibilities
  • A Senior Programmer is responsible for leading and/or working on the design, documentation, development, modification, testing, installation, implementation, and support of the most complex new or existing applications software.
  • Plan, install, configure, test, implement and manage a systems environment in support of an organization’s IT architecture and business needs.
  • Engage in activities aligned with common programmer roles, including but not limited to analyzing requirements, translating requirements into prototypes, planning system architecture, writing and maintaining code, testing, and ensuring software quality and functionality.
  • Design user interfaces, work with customers to test applications, and document software and systems usage.
  • Ensure information security/information assurance policies are applied to the delivery of application software services.
  • For operating systems, analyze requirements, evaluate and validate system software environments, and monitor performance and security considerations.
Skills and Qualifications
  • Skills Required
    • Knowledge and experience in applications software development principles and methods.
    • Experience with operating systems installation and configuration procedures.
    • Understanding of software design principles, data structures, data modeling, and data warehousing.
    • Familiarity with government regulations, infrastructure requirements, and system performance considerations.
    • Experience with data quality processes, data security, and governance principles.
    • Ability to establish cooperative working relationships and communicate effectively.
  • Additional Skills Required
    • 1. Strong expertise in SQL, PL/SQL, and T-SQL with advanced query tuning and data modeling across Oracle, SQL Server, PostgreSQL, and MySQL.
    • 2. Proficiency in ETL/ELT tools including Azure Synapse Analytics, Azure Data Factory, and SSIS for scalable workflows.
    • 3. Data warehouse data modeling (star schema, snowflake, dimensional hierarchies) for analytics and large-scale reporting (see the sketch following this list).
    • 4. Data integration, validation, cleansing, profiling and end-to-end data quality processes.
    • 5. Experience with data warehouse architectures including staging areas, data marts, data lakes and cloud ingestion.
    • 6. Best practices for scalable ETL engineering, including metadata-driven design and automation.
    • 7. Proficiency in Python and PySpark (familiarity with Shell/Perl) for ETL automation and large datasets.
    • 8. Experience with structured/semi-structured data (CSV, JSON, XML, Parquet) and REST API ingestion.
    • 9. Knowledge of Azure data security and governance practices.
    • 10. Optimization of ETL and data warehouse performance (indexing, partitioning, caching, pipeline optimization).
    • 11. Familiarity with CI/CD using Git/GitHub Actions for ETL deployment across environments.
    • 12. Ability to collaborate with analysts and business stakeholders to translate requirements into datasets, KPIs, and reporting structures.
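As a rough illustration of items 3, 7, and 8 above, the following is a minimal PySpark sketch that ingests semi-structured JSON and shapes it into a simple star-schema dimension/fact pair. All paths, column names, and table names are hypothetical placeholders, not part of this posting's requirements.

# Minimal PySpark sketch: load semi-structured JSON and shape it into a
# simple star schema (one dimension, one fact). Paths and column names
# below are hypothetical placeholders.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("star_schema_sketch").getOrCreate()

# Ingest semi-structured source data (JSON here; CSV/XML/Parquet are handled similarly).
raw = spark.read.json("/landing/sales_raw.json")

# Dimension: distinct customers with a surrogate key.
dim_customer = (
    raw.select("customer_id", "customer_name", "region")
       .dropDuplicates(["customer_id"])
       .withColumn("customer_key", F.monotonically_increasing_id())
)

# Fact: measures keyed to the dimension's surrogate key.
fact_sales = (
    raw.join(dim_customer.select("customer_id", "customer_key"), "customer_id")
       .select("customer_key", "order_id", "order_date", "quantity", "amount")
)

# Write partitioned Parquet to the warehouse staging area.
dim_customer.write.mode("overwrite").parquet("/warehouse/dim_customer")
fact_sales.write.mode("overwrite").partitionBy("order_date").parquet("/warehouse/fact_sales")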
Experience

This classification requires a minimum of seven (7) years of experience in electronic data processing systems study, design, and programming. At least four (4) years of that experience must have been in a lead capacity.

Additional Experience Required
  • 1. 3 years of experience in the past 4 years developing and optimizing SQL, PL/SQL, and T-SQL logic across Oracle and SQL Server.
  • 2. 3 years in the past 4 years working with mainframe data, including extraction, mapping, and conversion into modern ETL/ELT pipelines.
  • 3. 3 years in the past 4 years designing, orchestrating, and deploying ETL/ELT pipelines using Azure Synapse Analytics, Azure Data Factory, SSIS, and Azure DevOps CI/CD.
  • 4. 3 years in the past 4 years building and maintaining enterprise data warehouses using Oracle, SQL Server, Teradata, or cloud platforms.
  • 5. 3 years in the past 4 years working with big data technologies such as Apache Spark, PySpark, or Hadoop.
  • 6. 3 years in the past 4 years integrating structured and semi-structured data and consuming APIs using Python/PySpark.
  • 7. 3 years in the past 4 years developing analytics-ready datasets and supporting BI platforms (Power BI or Cognos).
  • 8. 3 years in the past 4 years performing data cleansing, profiling, and data quality assurance for regulated environments (see the sketch following this list).
  • 9. 3 years in the past 4 years supporting production ETL operations and SLAs for reporting workloads.
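In the same spirit, the following is a minimal PySpark sketch of the data profiling and cleansing work described in item 8 above; the source path and column names are hypothetical, not taken from the posting.

# Minimal PySpark sketch of data profiling and cleansing.
# The source path and column names are hypothetical placeholders.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("dq_profile_sketch").getOrCreate()
df = spark.read.parquet("/staging/members")

# Profile: total row count plus null counts per column.
total_rows = df.count()
null_counts = df.select(
    [F.sum(F.col(c).isNull().cast("int")).alias(c) for c in df.columns]
)
null_counts.show()

# Cleanse: trim string fields, drop rows missing the business key, deduplicate.
cleaned = (
    df.withColumn("member_name", F.trim(F.col("member_name")))
      .na.drop(subset=["member_id"])
      .dropDuplicates(["member_id"])
)

print(f"rows before: {total_rows}, rows after cleansing: {cleaned.count()}")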
Education

This classification requires a bachelor’s degree in an IT-related or Engineering field. Additional qualifying experience may be substituted for the required education on a year-for-year basis.

Position Requirements
10+ years of work experience