Data Engineer
Listed on 2026-01-01
IT/Tech
Data Engineer, Database Administrator
Job Title and Location
Data Engineer
Location: Irvine, CA (Onsite for the first 90 days, then Hybrid)
We are looking for a Data Engineer who is hands-on, collaborative, and experienced with Microsoft SQL Server, Snowflake, AWS RDS, and MySQL. The ideal candidate has a strong background in data warehousing, data lakes, ETL pipelines, and business intelligence tools.
This role plays a key part in executing data strategy – driving optimization, reliability, and scalable BI capabilities across the organization. It’s an excellent opportunity for a data professional who wants to influence architectural direction, contribute technical expertise, and grow within a data‑driven company focused on innovation.
Key Responsibilities
- Design, develop, and maintain SQL Server and Snowflake data warehouses and data lakes, focusing on performance, governance, and security.
- Manage and optimize database solutions within Snowflake, SQL Server, MySQL, and AWS RDS.
- Build and enhance ETL pipelines using tools such as Snowpipe, DBT, Boomi, SSIS, and Azure Data Factory.
- Utilize data tools such as SSMS, Profiler, Query Store, and Redgate for performance tuning and troubleshooting.
- Perform database administration tasks, including backup, restore, and monitoring.
- Collaborate with Business Intelligence Developers and Business Analysts on enterprise data projects.
- Ensure database integrity, compliance, and adherence to best practices in data security.
- Configure and manage data integration and BI tools such as Power BI, Tableau, and Power Automate, as well as scripting languages (Python, R).
Qualifications
- Proficiency with Microsoft SQL Server, including advanced T-SQL development and optimization.
- 7+ years working as a SQL Server Developer/Administrator, with experience in relational and object-oriented databases.
- 2+ years of experience with Snowflake data warehouse and data lake solutions.
- Experience developing pipelines and reporting solutions using Power BI, SSRS, SSIS, Azure Data Factory, or DBT.
- Scripting and automation experience using Python, PowerShell, or R.
- Familiarity with data integration and analytics tools such as Boomi, Redshift, or Databricks (a plus).
- Excellent communication, problem‑solving, and organizational skills.
- Education: Bachelor's or Master's degree in Computer Science, Information Systems, Data Science, or a related field.
Key Skills
- SQL Server / Snowflake / MySQL / AWS RDS
- ETL Development (Snowpipe, SSIS, Azure Data Factory, DBT)
- BI Tools (Power BI, Tableau)
- Python, R, PowerShell
- Data Governance & Security Best Practices
Vaco by Highspring is an Equal Opportunity Employer and does not discriminate against any employee or applicant for employment because of race, color, sex, religion, national origin, age, disability, status as a veteran, union membership, ethnicity, gender, gender identity, gender expression, sexual orientation, marital status, political affiliation, or any other protected characteristics as required by federal, state or local law. Vaco by Highspring and its parents, affiliates, and subsidiaries are committed to full inclusion of all qualified individuals.
If reasonable accommodation is needed to participate in the job application or interview process, to perform essential job functions, and/or to receive other benefits and privileges of employment, please contact .