
Programmer Analyst

Job in Falls Church, Fairfax County, Virginia, 22042, USA
Listing for: GeoLogics
Full Time position
Listed on 2025-12-19
Job specializations:
  • IT/Tech
    Data Engineer, Data Analyst, Database Administrator
Salary/Wage Range: $60 – $74 USD per hour
Job Description & How to Apply Below
Position: Programmer Analyst (US Citizenship or Green Card required)

Overview

TITLE:

Programmer Analyst (US Citizenship or Green Card required)

RATE RANGE: $60/hr – $74/hr W2 (no health benefits while on contract—an hour worked is an hour paid)

LOCATION:

Remote or Falls Church, VA

DURATION: 6 months

SCHEDULE:

9/80 (every other Friday off)

*** No C2C, we can NOT work with outside agencies/vendors, and we can NOT do 1099—US CITIZENSHIP or Green Card IS REQUIRED***

Key Responsibilities
  • Design and implement ETL (extract, transform, load) data pipelines to ingest and store large datasets from various sources
  • Build and maintain data warehouses, including data modeling, data governance, and data quality
  • Ensure data quality, integrity, and security by implementing data validation, data cleansing, and data governance policies
  • Optimize data systems for performance, scalability, and reliability
  • Collaborate with customers to understand their technical requirements and provide guidance on best practices for using Amazon Redshift
  • Work with cross-functional teams, including data scientists, analysts, and business stakeholders, to understand data requirements and deliver data solutions
  • Provide technical support for Amazon Redshift, including troubleshooting, performance optimization, and data modeling
  • Identify and resolve data-related issues, including data pipeline failures, data quality issues, and performance bottlenecks
  • Develop technical documentation and knowledge base articles to help customers and AWS engineers troubleshoot common issues
Requirements
  • Bachelor’s or Master’s degree in Computer Science or a related field, with at least 6 years of experience in Information Technology
  • Proficiency in one or more programming languages (e.g., Python, Java, Scala)
  • 8+ years of experience in data engineering, with a focus on designing and implementing large-scale data systems
  • 5+ years of hands-on experience in writing complex, highly-optimized queries across large data sets using Oracle, SQL Server and Redshift
  • 5+ years of hands-on experience using AWS Glue and Python/PySpark to build ETL pipelines in a production setting, including writing test cases
  • Strong understanding of database design principles, data modeling, and data governance
  • Proficiency in SQL, including query optimization, indexing, and performance tuning
  • Experience with data warehousing concepts, including star and snowflake schemas
  • Strong analytical and problem-solving skills, with the ability to break down complex problems into manageable components
  • Experience with data storage solutions such as relational databases (Oracle, SQL Server), NoSQL databases, or cloud-based data warehouses (Redshift)
  • Experience with data processing frameworks such as Apache Kafka and Fivetran
  • Experience in building ETL pipelines using AWS Glue, Apache Airflow, and programming languages including Python and PySpark
  • Understanding of data quality and governance principles and best practices
  • Experience with agile development methodologies such as Scrum or Kanban
Preferred Skills
  • Experience with Dataiku
  • Experience with building reports using Power BI and Tableau
  • Experience with Alteryx
  • Relevant cloud certifications (e.g., AWS Certified Data Analytics – Specialty)
  • Experience with AWS services and best practices
Experience Required
  • 8+ years of experience in data engineering, with a focus on designing and implementing large-scale data systems
  • 5+ years of hands-on experience in writing complex, highly-optimized queries across large data sets using Oracle, SQL Server and Redshift
  • 5+ years of hands-on experience using AWS Glue and Python/PySpark to build ETL pipelines in a production setting, including writing test cases
Education Preferred
  • Bachelor’s or Master’s degree in Computer Science or a related field, with at least 8 years of experience in Information Technology

If you would like to interview for this position, please send an updated resume to Dee Smith, Sr. Technical Recruiter.

Rates listed are not a guarantee of salary/rate. The rate offered at time of hire will depend on many factors, including education, experience, interview results, and skill level. GeoLogics is an Equal Opportunity/Affirmative Action Employer committed to hiring a diverse and talented workforce. EOE/Disability/Veteran
