
Data Application Developer

Job in Oklahoma City, Oklahoma County, Oklahoma, 73116, USA
Listing for: Global Payments Inc.
Full Time position
Listed on 2025-12-21
Job specializations:
  • IT/Tech
    Data Engineer
Salary/Wage Range: 80,000 – 100,000 USD yearly
Job Description & How to Apply Below


Summary

As our Data Application Developer, you will monitor and control client operating systems to ensure efficient, continuous performance. You will advise on the technical aspects of information systems and products, including cost and the technical requirements needed to meet customer needs and the system environment; prepare flow charts, models, and procedures; and conduct feasibility studies to design possible system solutions. You will also design, develop, and maintain Python-based interactive data applications that solve business problems through automation, integration, and workflow optimization.

What Part Will You Play?

Primary Responsibilities (60% – Application Development)
  • Design, develop, and deploy interactive data applications using Python frameworks (Streamlit, Flask, Dash, Gradio) that solve specific business problems.
  • Build workflow automation applications that integrate with multiple systems via REST APIs, webhooks, and database connections.
  • Create self‑service data exploration tools that enable non‑technical users to interact with data without writing SQL.
  • Implement user input handling, form processing and dynamic content generation based on user interactions.
  • Develop API endpoints and integration layers to connect applications with external systems and data sources.
  • Design intuitive user interfaces and workflows that prioritize user experience and business value.
  • Deploy and maintain production applications with proper error handling, logging and monitoring.
  • Collaborate with business stakeholders to understand requirements and translate them into functional applications.
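The API-integration work described above can be sketched with Python's standard library alone; the route, payload fields, and "queued" status below are illustrative assumptions, not details from this posting:

```python
import json
import threading
import urllib.request
from http.server import BaseHTTPRequestHandler, HTTPServer

class WorkflowHandler(BaseHTTPRequestHandler):
    """Hypothetical integration endpoint: accepts a workflow event as JSON."""

    def do_POST(self):
        length = int(self.headers.get("Content-Length", 0))
        event = json.loads(self.rfile.read(length) or b"{}")
        # Minimal input validation and error handling, as production apps require.
        if "task" not in event:
            self.send_response(400)
            self.end_headers()
            return
        body = json.dumps({"task": event["task"], "status": "queued"}).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, *args):
        pass  # keep the demo quiet

def handle_event(port: int, event: dict) -> dict:
    """Client side of the integration layer: POST an event, parse the JSON reply."""
    req = urllib.request.Request(
        f"http://127.0.0.1:{port}/events",
        data=json.dumps(event).encode(),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())

if __name__ == "__main__":
    server = HTTPServer(("127.0.0.1", 0), WorkflowHandler)
    threading.Thread(target=server.serve_forever, daemon=True).start()
    print(handle_event(server.server_address[1], {"task": "refresh_report"}))
    server.shutdown()
```

In practice a framework such as Flask would replace the raw `http.server` handler, but the request/response contract shown is the same.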
Supporting Responsibilities (30% – Data Engineering)
  • Build and maintain data pipelines using SQL and Python to support application functionality.
  • Develop ETL/ELT processes to extract, transform and load data from various sources into Snowflake or equivalent platforms.
  • Write efficient SQL transformations including window functions, CTEs and complex joins.
  • Implement data quality validation and monitoring within pipelines.
  • Create and maintain data models that support application requirements.
  • Optimize query performance and troubleshoot data pipeline issues.
  • Work with data governance requirements including masking policies and access controls.
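The SQL patterns named above (CTEs, window functions, data-quality checks) can be illustrated against an in-memory SQLite database; the `orders` table and its columns are made-up examples, not a schema from this role:

```python
import sqlite3

# Illustrative table; names are assumptions for the demo only.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE orders (customer_id INTEGER, order_date TEXT, amount REAL);
INSERT INTO orders VALUES
  (1, '2025-01-05', 120.0),
  (1, '2025-02-10',  80.0),
  (2, '2025-01-20', 200.0);
""")

# CTE plus a window function: running total of spend per customer.
rows = conn.execute("""
WITH ranked AS (
    SELECT customer_id,
           order_date,
           SUM(amount) OVER (
               PARTITION BY customer_id
               ORDER BY order_date
           ) AS running_total
    FROM orders
)
SELECT customer_id, order_date, running_total
FROM ranked
ORDER BY customer_id, order_date
""").fetchall()

# Simple in-pipeline data-quality validation: no negative amounts allowed.
bad = conn.execute("SELECT COUNT(*) FROM orders WHERE amount < 0").fetchone()[0]
assert bad == 0

for row in rows:
    print(row)
```

The same query style carries over to Snowflake; SQLite simply makes the sketch self-contained.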
Strategic Responsibilities (10% – Architecture & Design)
  • Contribute to platform architecture decisions and data modeling strategies.
  • Evaluate new technologies and frameworks for application development.
  • Provide input on integration patterns and system design.
  • Participate in code reviews and technical documentation.
  • Influence best practices for application development and deployment.
  • Work with data scientists to build applications around machine learning models and inference endpoints.
  • Partner with business users to understand pain points and identify automation opportunities.
  • Collaborate with DevOps and infrastructure teams on deployment and monitoring strategies.
  • Support analysts and other users in leveraging applications effectively.
Minimum Qualifications
  • Bachelor's Degree in Computer Science, Information Systems, Data Science or related field (or equivalent practical experience).
  • Typically 6 years of relevant experience in application development, data engineering or related roles.
  • Demonstrated experience building and deploying web applications used in production environments.
Desired Skills and Capabilities
  • Python Development:
    Strong proficiency in Python with experience building web applications using frameworks such as Streamlit, Flask, Dash, or Gradio.
  • SQL Proficiency:
    Moderate to advanced SQL skills, including transformations, window functions, CTEs, and query optimization.
  • API Integration:
    Hands‑on experience working with REST APIs, including authentication, request handling and third‑party integrations.
  • Data Manipulation:
    Proficiency with Pandas for data processing, aggregations, joins and transformations.
  • Cloud Data Platforms:
    Experience with Snowflake, BigQuery, Redshift, Databricks, or equivalent modern data warehouse platforms.
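The Pandas skills listed above (aggregations, joins) can be sketched in a few lines; the frames and column names are invented for illustration:

```python
import pandas as pd

# Hypothetical source data, not from the posting.
orders = pd.DataFrame({
    "customer_id": [1, 1, 2],
    "amount": [120.0, 80.0, 200.0],
})
customers = pd.DataFrame({
    "customer_id": [1, 2],
    "segment": ["retail", "enterprise"],
})

# Aggregation: total spend per customer.
totals = orders.groupby("customer_id", as_index=False)["amount"].sum()

# Join: attach each customer's segment to the aggregate.
report = totals.merge(customers, on="customer_id", how="left")
print(report)
```

A left merge keeps every aggregated customer even if the dimension table is missing a row, which mirrors how application-facing reports usually tolerate incomplete reference data.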
Technologies You'll Work With
  • Data Platform:
    Snowflake (primary); experience with other cloud data warehouses…