ETL DEVELOPER

Job in Billings, Yellowstone County, Montana, 59107, USA
Listing for: Kampgrounds Of America, Inc.
Full Time position
Listed on 2026-02-07
Job specializations:
  • Software Development: Data Engineer
Salary/Wage Range: 80,000 - 100,000 USD yearly
Job Description

Posted Thursday, February 5, 2026 at 7:00 AM

Kampgrounds of America, Inc. (KOA) is the world’s largest network of privately owned campgrounds and the leader in outdoor hospitality. KOA has 500+ locations across the United States and Canada including a mix of franchised and company-owned parks (OAK). Founded in 1962, the mission of KOA is “connecting people to the outdoors and each other,” and those who represent the brand share the values of being family-oriented, passionate, entrepreneurial, customer-focused, and innovative.

At KOA, we believe the outdoors is fun and for everyone. We are committed to having an environment where all are treated with dignity and respect. We strive to:

  • intentionally create a sense of community and belonging for our guests, employees and franchise partners
  • continually educate ourselves and expand our knowledge to foster an inclusive and supportive environment
  • sustain a culture that promotes diversity of thought and experiences
  • ensure everyone has the ability to experience the outdoors and that our facilities are accessible to all
  • drive change in our company and industry through action and implementation

Summary:

The ETL Developer (Extract, Transform, and Load) is responsible for designing, developing, and maintaining robust data integration solutions and transforming KOA data into a format that is consumable by end users and applications. This position builds scalable ETL pipelines, ensuring data accuracy, consistency, and efficient delivery from various sources into the data warehouse for analytics and reporting.

Essential Duties and Responsibilities:

  • Design and develop data aggregation pipelines that extract data from various sources, transform it into the desired format, and load it into the appropriate data storage systems using tools like Azure Data Factory, SSIS, or custom scripts.
  • Optimize data pipelines and data processing workflows for performance, scalability, and efficiency.
  • Monitor, test, and troubleshoot ETL jobs to ensure high availability, performance, and data quality.
  • Implement data quality checks and validations within data pipelines to ensure the accuracy, consistency, and completeness of the data.
  • Collaborate with data scientists and data analysts to optimize models and algorithms for data quality, security, and governance.
  • Write and optimize complex SQL queries and scripts for data manipulation.
  • Integrate data from different sources, including databases, data warehouses, APIs, and external systems.
  • Develop source-to-target mappings and detailed transformation logic.
  • Transform raw data into a usable format by applying data cleansing, aggregation, filtering, and enrichment techniques.
  • Ensure data quality and accuracy by performing data validation, monitoring, error handling, and cleaning as needed.
  • Identify and implement optimizations to continually enhance query performance, reduce processing time, and increase overall productivity.
  • Deliver well-defined, transformed, tested, documented, and code-reviewed datasets for analysis.
  • Collaborate with business analysts and stakeholders to understand data requirements and deliver effective data solutions.

Non-Essential Duties and Responsibilities:

  • This job description is not intended to cover or contain a comprehensive listing of activities, duties, or responsibilities. Other duties, responsibilities, and activities may change or be assigned at any time with or without notice.

Required Education and Experience:

  • Bachelor's degree in Computer Science, IT, or related field (or equivalent experience).
  • Proven experience in designing and building ETL pipelines.
  • Experience with cloud data platforms (Azure, AWS, GCP).
  • Strong SQL proficiency and database knowledge (e.g., MS SQL Server, Oracle).
  • Experience with at least one major ETL tool (Azure Data Factory, SSIS).
  • Programming skills in Python, Java, or similar.
  • Understanding of data warehousing concepts and dimensional modeling.
  • Excellent analytical, problem-solving, and communication skills.

Preferred Education and Experience:

  • Familiarity with CI/CD practices and data governance.
  • Experience with Power BI.

Physical Demands and Working Conditions:

  • Work is performed…