
Data Engineer - API Hub/Data Solutions; San Antonio, Austin, or Dallas

Job in San Antonio, Bexar County, Texas, 78208, USA
Listing for: H-E-B
Full Time position
Listed on 2026-01-01
Job specializations:
  • IT/Tech
    Data Engineer, Data Analyst
Job Description & How to Apply Below
Position: Staff Data Engineer - API Hub/Data Solutions (San Antonio, Austin, or Dallas)

Job Description: Since H-E-B Digital Technology’s inception, we’ve been investing heavily in our customers’ digital experience, reinventing how they find inspiration from food, make food decisions, and ultimately get food into their homes. This is an exciting time to join H-E-B Digital—we’re using the best available technologies to deliver modern, engaging, reliable, and scalable experiences to meet the needs of our growing audience.

As a Staff Data Engineer, you’ll use an advanced analytical, data-driven approach to drive a deep understanding of our fast-changing business and answer real-world questions. You’ll work with stakeholders to develop a clear understanding of data and data infrastructure needs, resolve complex data-related technical issues, and ensure optimal data design and efficiency. Once you’re eligible, you’ll become an Owner in the company, so we’re looking for commitment, hard work, and focus on quality and Customer service.

“Partner-owned” means our most important resources—People—drive the innovation, growth, and success that make H-E-B The Greatest Omnichannel Retailing Company.

What You’ll Do

Builds/supports more complex data pipelines, application programming interfaces (APIs), data integrations, and data streaming solutions. Designs data patterns that support creation of datasets for analytics; implements calculations, cleanses data, ensures standardization of data, and maps/links data from more than one source. Performs data validation and quality assurance on the work of senior peers and writes automated tests (see the sketch below). Maintains/streamlines/orchestrates existing data pipelines end to end.
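
To make the "automated tests" concrete, here is a minimal sketch of one such data-quality check, written with pytest and PySpark. The table path, column names, and constraints are assumptions for illustration, not details from this role.

    # Hypothetical data-quality test: asserts a curated dataset's key column
    # is unique and non-null before it is published for analytics.
    import pytest
    from pyspark.sql import SparkSession

    @pytest.fixture(scope="session")
    def spark():
        return SparkSession.builder.master("local[2]").appName("dq-tests").getOrCreate()

    def test_customer_ids_unique_and_non_null(spark):
        df = spark.read.parquet("/lake/curated/customers")  # hypothetical path
        total = df.count()
        # No null keys, and every key appears exactly once.
        assert df.filter(df.customer_id.isNull()).count() == 0
        assert df.select("customer_id").distinct().count() == total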

Builds large-scale batch and real-time data pipelines with big data processing frameworks. Designs/implements monitoring capabilities based on business SLAs and data quality. Designs/develops/maintains large data pipelines; diagnoses/solves production support issues. Uses/contributes to refinement of Digital Engineering-related tools, standards, and training. Designs/develops real-time streaming solutions using Structured Streaming from either Pub/Sub or Tibco into the data lake. Implements features to continuously improve data integration performance.
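
A compressed sketch of the kind of streaming pipeline described above: Spark Structured Streaming reading a message topic and appending to a data lake path. The broker, topic, schema, and paths are illustrative assumptions (Kafka is used here as a stand-in for the messaging layer).

    # Hypothetical streaming ingestion: Kafka -> Structured Streaming -> data lake.
    from pyspark.sql import SparkSession
    from pyspark.sql.functions import col, from_json
    from pyspark.sql.types import StringType, StructField, StructType, TimestampType

    spark = SparkSession.builder.appName("orders-stream").getOrCreate()

    # Assumed message schema for the example.
    schema = StructType([
        StructField("order_id", StringType()),
        StructField("status", StringType()),
        StructField("updated_at", TimestampType()),
    ])

    events = (
        spark.readStream.format("kafka")
        .option("kafka.bootstrap.servers", "broker:9092")  # assumed broker
        .option("subscribe", "orders")                     # assumed topic
        .load()
        .select(from_json(col("value").cast("string"), schema).alias("e"))
        .select("e.*")
    )

    # Append parsed events to the lake, with a checkpoint for exactly-once recovery.
    (
        events.writeStream.format("parquet")
        .option("checkpointLocation", "/lake/_checkpoints/orders")  # assumed path
        .outputMode("append")
        .start("/lake/raw/orders")                                  # assumed path
    )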

Implements infrastructure as code, security, and CI/CD for data pipelines. Engages/collaborates with external technical teams to ensure timely, high-quality solutions. Works closely with application/stakeholder teams to develop a clear understanding of the data. Performs the full SDLC process, including planning, design, development, certification, implementation, and support. Builds strong relationships with cross-functional teams to accomplish impactful results. Works with teams such as Data Engineering, application teams, and Product Managers.

Peer-reviews with team members; learns/adapts from peer review of own code. Contributes to overall design, architecture, security, scalability, reliability, and performance. Mentors/supports Senior Data Engineers. Builds more complex data models to deliver insightful analytics; ensures the highest standard of data integrity. Applies knowledge of machine learning concepts. Builds APIs using the FastAPI framework to deliver composite APIs to applications. Migrates API services from on-premises to the cloud.
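
As a rough sketch of a composite API in FastAPI: one endpoint fans out to two downstream services and merges the responses. The service URLs and payload shapes are invented for the example.

    # Hypothetical composite endpoint: aggregates product and inventory services.
    import asyncio

    import httpx
    from fastapi import FastAPI, HTTPException

    app = FastAPI()
    PRODUCT_SVC = "http://product-svc/products"    # assumed service URL
    INVENTORY_SVC = "http://inventory-svc/stock"   # assumed service URL

    @app.get("/composite/products/{product_id}")
    async def product_with_stock(product_id: str):
        # Call both backends concurrently and compose one response.
        async with httpx.AsyncClient(timeout=2.0) as client:
            product_resp, stock_resp = await asyncio.gather(
                client.get(f"{PRODUCT_SVC}/{product_id}"),
                client.get(f"{INVENTORY_SVC}/{product_id}"),
            )
        if product_resp.status_code != 200:
            raise HTTPException(status_code=404, detail="product not found")
        return {"product": product_resp.json(), "stock": stock_resp.json()}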

Takes ownership of data domains; improves data quality.

Who You Are

8+ years of hands-on data engineering experience developing data pipelines and APIs. Experience with advanced SQL and Python; knowledge of Java preferred. Proven experience with SQL, Spark, Databricks, AWS Lambda, S3, and data lakes. Experience ingesting data from a data lake into Elasticsearch/OpenSearch. Strong knowledge of messaging systems such as Kafka, GCP Pub/Sub, or Tibco EMS. Experience with infrastructure as code using Terraform.
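
One way the data lake-to-Elasticsearch/OpenSearch ingestion might look in Python: bulk-indexing a Parquet extract with the elasticsearch client. The cluster URL, index name, ID column, and file path are assumptions.

    # Hypothetical bulk load: Parquet extract from the lake into Elasticsearch.
    import pandas as pd
    from elasticsearch import Elasticsearch
    from elasticsearch.helpers import bulk

    es = Elasticsearch("http://localhost:9200")              # assumed cluster
    df = pd.read_parquet("/lake/curated/products.parquet")   # assumed extract

    # One index action per row, keyed on an assumed product_id column.
    actions = (
        {"_index": "products", "_id": row["product_id"], "_source": row.to_dict()}
        for _, row in df.iterrows()
    )
    ok, errors = bulk(es, actions, raise_on_error=False)
    print(f"indexed {ok} docs, {len(errors)} errors")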

Experience with DevOps tools such as GitLab CI/CD and Jenkins. Experience with orchestration tools such as Argo or Databricks Workflows. A solid understanding of big data and hybrid cloud infrastructure. Up to date on the latest technological developments; able to evaluate and propose new data pipeline patterns. You have an advanced understanding of SDLC processes. You have comprehensive knowledge of CS fundamentals: data structures, algorithms, and design patterns.

You have advanced knowledge of system architecture and design patterns. You can understand the architecture, design, and integration landscape of multiple H-E-B systems. You have experience with common software engineering tools such as Git, JIRA, and Confluence. You are highly comfortable with Lean Startup or Agile development methodologies. You have a related degree or work experience, preferably a Bachelor’s degree in a related field.

Excellent written and oral communication and presentation skills. A strong understanding of data engineering.

Bonus

  • DevOps certifications
  • Cloud certifications
