
Junior Data Engineer

Job in Fayetteville, Washington County, Arkansas, 72702, USA
Listing for: Arkansas Office of Skills Development
Full Time position
Listed on 2025-12-06
Job specializations:
  • IT/Tech
    Data Engineer, Data Analyst
Salary/Wage Range: USD 20.00 – 30.00 per hour
Job Description & How to Apply Below

Apprenticely is helping Walton Arts Center hire a Junior Data Engineer in Fayetteville, AR and Rogers, AR.

About the employer:

At Walton Arts Center/Walmart AMP, we strive to be the place where a diverse mix of talented people want to come, stay, and do their best work. We pride ourselves on bringing the best arts and entertainment to our audiences in Northwest Arkansas, and we know our organization runs on the hard work and dedication of our passionate and creative employees.

Walton Arts Center and the Walmart AMP aspire to be welcoming spaces where people can be their authentic selves. We value diversity and inclusivity in every aspect of our operations.

$20 - $30 an hour

The Junior Data Engineer at the Walton Arts Center and Walmart AMP is a critical role responsible for building and maintaining data systems, data warehouses, and data pipelines to enable data-driven decision-making. This role collaborates with the Senior Director of Technology Services and other teams to strategize and implement roadmaps for data management, ensuring data integrity, security, and accessibility across platforms.

Principal Responsibilities (Essential Functions)
  • Build and maintain reliable ETL/ELT pipelines that move data from core systems (ticketing, CRM, F&B, digital platforms, finance systems, etc.) into Snowflake.
  • Write clean SQL queries, views, and transformations across SQL Server, Snowflake, and PostgreSQL/MySQL in support of analytics, dashboards, and operational needs.
  • Perform routine data quality checks, validation steps, and error handling to ensure accuracy and trust in our data sources.
  • Assist in integrating cloud storage systems (Azure Blob, AWS S3) with Snowflake using COPY commands, file stages, and automated workflows.
  • Develop and schedule batch jobs and automated processes to keep data refreshed across systems.
  • Support Power BI dashboard development by preparing optimized datasets, modeling relationships, monitoring refresh schedules, and resolving performance issues.
  • Partner with business units to gather report requirements, translate them into technical specifications, and deliver clear and actionable insights.
  • Monitor daily pipelines, system performance, and data model health, escalating issues before they impact the business.
  • Troubleshoot and debug data failures, broken feeds, or mismatched numbers between source systems and Snowflake.
  • Document data flows, pipeline logic, table definitions, and reporting standards to maintain clarity and consistency across the Technology Services division.
  • Participate in data governance efforts, including privacy, security, and compliance alignment across platforms.
  • Support AI and automation initiatives by preparing datasets for analysis, assisting with feature engineering, and validating model outputs when needed.
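To illustrate the kind of routine data quality check described above, here is a minimal sketch in plain Python. The field names (`order_id`, `amount`) and the validation rules are hypothetical examples, not the employer's actual ticketing schema.

```python
# Minimal data-quality check for a batch of incoming feed records.
# Rules (illustrative only): every record needs a unique order_id
# and a non-negative numeric amount.

def validate_rows(rows):
    """Split a batch into (clean_rows, errors); errors are (index, reason) pairs."""
    seen_ids = set()
    clean, errors = [], []
    for i, row in enumerate(rows):
        if row.get("order_id") is None:
            errors.append((i, "missing order_id"))
        elif row["order_id"] in seen_ids:
            errors.append((i, "duplicate order_id"))
        elif not isinstance(row.get("amount"), (int, float)) or row["amount"] < 0:
            errors.append((i, "bad amount"))
        else:
            seen_ids.add(row["order_id"])
            clean.append(row)
    return clean, errors

if __name__ == "__main__":
    batch = [
        {"order_id": 1, "amount": 45.0},
        {"order_id": 1, "amount": 45.0},    # duplicate id
        {"order_id": 2, "amount": -5.0},    # negative amount
        {"order_id": None, "amount": 30.0}, # missing key
    ]
    clean, errors = validate_rows(batch)
    print(len(clean), len(errors))  # → 1 3
```

In a real pipeline, a check like this would run before loading into the warehouse, with the error list routed to logging or alerting rather than printed.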
Position Requirements

Technical Skills & Knowledge
  • Strong SQL skills, including SELECT/INSERT/UPDATE/DELETE, joins, GROUP BY, window functions, and query optimization.
  • Experience with Snowflake: data loading, COPY commands, warehouse usage, JSON parsing, tables, and views.
  • Familiarity with SQL Server, PostgreSQL/MySQL, and general relational database concepts.
  • Basic understanding of the Azure ecosystem (Blob Storage, Functions, Monitor) and AWS services (S3, Lambda).
  • Python skills for scripting, API calls, data manipulation with pandas/NumPy, file processing, and automation tasks.
  • Power BI fundamentals: data modeling, relationships, basic DAX, and report/dashboard building.
  • Strong Excel skills: advanced formulas, pivot tables, and data analysis.
  • Understanding of ETL/ELT concepts, pipeline design, and data modeling best practices.
  • Ability to work with multiple file formats, including CSV, JSON, XML, and Parquet.
  • Experience with version control (Git), branching strategies, and collaboration workflows.
  • Basic DevOps understanding: deployments, environments, monitoring, and resource troubleshooting.
  • Awareness of cost-optimization principles for cloud platforms and data operations.
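As a small, self-contained illustration of the window-function skill listed above, the snippet below runs a ranking and a running total against an in-memory SQLite database. The `ticket_sales` table and its data are invented for illustration; the same SQL pattern carries over to Snowflake or SQL Server.

```python
# Demonstrates RANK() and a cumulative SUM() window function on a toy
# ticket-sales table. Requires SQLite >= 3.25 (bundled with modern Python).
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE ticket_sales (event TEXT, sale_date TEXT, qty INTEGER);
    INSERT INTO ticket_sales VALUES
        ('Hamilton', '2025-01-01', 120),
        ('Hamilton', '2025-01-02', 95),
        ('Wicked',   '2025-01-01', 80),
        ('Wicked',   '2025-01-02', 110);
""")

# Rank each sale day within its event by quantity (highest first),
# and compute a per-event running total ordered by date.
rows = conn.execute("""
    SELECT event, sale_date, qty,
           RANK()   OVER (PARTITION BY event ORDER BY qty DESC)     AS day_rank,
           SUM(qty) OVER (PARTITION BY event ORDER BY sale_date)    AS running_qty
    FROM ticket_sales
    ORDER BY event, sale_date
""").fetchall()

for row in rows:
    print(row)
```

`PARTITION BY` restarts both the rank and the running total for each event, which is the typical shape of a per-show sales report.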
Professional Skills
  • Strong analytical and problem-solving skills, with the ability to debug and resolve data issues quickly.
  • Ability to communicate clearly with both technical and non-technical…