Data Engineer

Job in Santa Clara, Santa Clara County, California, 95053, USA
Listing for: Palo Alto Networks
Full Time position
Listed on 2026-01-02
Job specializations:
  • IT/Tech: Data Engineer
Position: Staff Data Engineer

Company Description

Our Mission

At Palo Alto Networks®, everything starts and ends with our mission:

Being the cybersecurity partner of choice, protecting our digital way of life.

Our vision is a world where each day is safer and more secure than the one before. We are a company built on the foundation of challenging and disrupting the way things are done, and we’re looking for innovators who are as committed to shaping the future of cybersecurity as we are.

Who We Are

We believe collaboration thrives in person. That’s why most of our teams work from the office full time, with flexibility when it’s needed. This model supports real‑time problem-solving, stronger relationships, and the kind of precision that drives great outcomes.

Job Description

Your Career

Our Data & Analytics group works with business owners and stakeholders across Sales, Marketing, People, GCS, Infosec, Operations, and Finance to solve complex business problems that directly impact the metrics used to track Palo Alto Networks’ progress. We leverage the latest technologies from the cloud and big data ecosystem to improve business outcomes, and we innovate through prototyping, proof-of-concept projects, and application development.

We are looking for a Staff IT Data Engineer with extensive experience in data engineering, SQL, cloud engineering, and business intelligence (BI) tools. The ideal candidate will be responsible for designing, implementing, and maintaining scalable data transformations and analytical solutions that support our business objectives. This role requires a strong understanding of data engineering principles, as well as the ability to collaborate with cross-functional teams to deliver high-quality data solutions.

This is an in-office role at our HQ (Santa Clara, CA).

Your Impact
  • Design, develop, and maintain data pipelines to extract, transform, and load (ETL) data from various sources into our data warehouse or data lake environment

  • Nice to have: an aptitude for proactively identifying and implementing GenAI-driven solutions that deliver measurable improvements in data pipeline reliability and performance, or that optimize key processes such as data quality validation and root cause analysis for data issues

  • Collaborate with stakeholders to gather requirements and translate business needs into technical solutions

  • Optimize and tune existing data pipelines for performance, reliability, and scalability

  • Implement data quality and governance processes to ensure data accuracy, consistency, and compliance with regulatory standards

  • Work closely with the BI team to design and develop dashboards, reports, and analytical tools that provide actionable insights to stakeholders

Qualifications

Your Experience
  • Bachelor’s or Master’s degree in Computer Science, Engineering, or a related field

  • 5 years of experience in data engineering, with a focus on building and maintaining data pipelines and analytical solutions

  • Nice to have: demonstrated readiness to leverage GenAI tools to improve efficiency across the typical stages of the data engineering lifecycle, for example by generating complex SQL queries, scaffolding initial Python/Spark scripts, or auto-generating pipeline documentation

  • Expertise in SQL programming and database management systems

  • Hands‑on experience with ETL tools and technologies (e.g. Apache Spark, Apache Airflow)

  • Familiarity with cloud platforms such as Google Cloud Platform (GCP) and experience with relevant services (e.g., Dataflow, Dataproc, BigQuery, stored procedures, Cloud Composer, etc.)

  • Experience with big data tools such as Spark, Kafka, etc.

  • Experience with object-oriented/functional scripting languages (e.g., Python, Scala)

  • Experience working with SFDC data objects (Opportunity, Quote, Accounts, Subscriptions, Entitlements) is highly desired

  • Experience with BI tools and visualization platforms (e.g. Tableau) is a plus

  • Strong analytical and problem‑solving skills, with the ability to analyze complex data sets and derive actionable insights

  • Excellent communication and interpersonal skills, with the ability to collaborate effectively with cross‑functional teams
