
Senior Associate Data Engineering

Job in Atlanta, Fulton County, Georgia, 30383, USA
Listing for: Publicis Groupe Holdings B.V.
Full Time position
Listed on 2026-01-01
Job specializations:
  • IT/Tech
    Data Engineer, Data Analyst
Salary/Wage Range or Industry Benchmark: 80,000 to 100,000 USD per year
Job Description & How to Apply Below

Company description

Publicis Sapient is a digital transformation partner helping established organizations get to their future, digitally-enabled state, both in the way they work and the way they serve their customers. We help unlock value through a start-up mindset and modern methods, fusing strategy, consulting and customer experience with agile engineering and problem-solving creativity. United by our core values and our purpose of helping people thrive in the brave pursuit of next, our 20,000+ people in 53 offices around the world combine experience across technology, data sciences, consulting and customer obsession to accelerate our clients’ businesses through designing the products and services their customers truly value.

Overview

Employer:

Sapient Corporation

Job Title:

Senior Associate Data Engineering

Job Requisition: .6

Job Location:

384 Northyards Blvd. NW, Atlanta, GA 30313. Will work from the company office in Atlanta, GA, and various unanticipated client sites and Sapient offices nationally;
telecommuting is available on a hybrid basis at company discretion.

Job Type: Full Time

Rate of Pay: $80,000 to $100,000 per year

Duties:
Write and troubleshoot scripts for performance using relevant programming languages. Build a test automation framework for existing data ingestion code. Work with large data sets and real-time/near-real-time analytics on distributed big data platforms. Perform data engineering functions including, but not limited to, data extraction, transformation, loading, and integration in support of the data warehouse. Design and implement data ingestion, validation, and enrichment pipelines.
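
By way of example, a minimal sketch of the kind of check such a test automation framework might run, using pytest with PySpark; the dataset path, column names, and key field are hypothetical.

```python
# Hypothetical pytest checks for an ingested dataset; all names are illustrative.
import pytest
from pyspark.sql import SparkSession

@pytest.fixture(scope="session")
def spark():
    # Local session for test runs.
    return SparkSession.builder.master("local[2]").appName("ingest-tests").getOrCreate()

def test_orders_ingest_not_empty_and_deduplicated(spark):
    df = spark.read.parquet("/data/lake/orders/")  # hypothetical ingestion output
    assert df.count() > 0, "ingestion produced no rows"
    # Primary-key uniqueness: no duplicate order_id values.
    assert df.count() == df.select("order_id").distinct().count()

def test_orders_ingest_schema(spark):
    df = spark.read.parquet("/data/lake/orders/")
    expected = {"order_id", "customer_id", "amount", "order_ts"}
    assert expected.issubset(set(df.columns))
```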

Build a performance testing framework for existing data ingestion pipelines. Implement Transaction History batch processing on a daily/weekly schedule. Develop and implement ETL jobs using AWS Glue to extract, transform, and load data from various sources. Demonstrable experience with data platforms involving implementation of end-to-end data pipelines. Experience in data modelling, warehouse design, and fact/dimension implementations. Experience working with code repositories and continuous integration.
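
As an illustration of an AWS Glue ETL job of this kind, a minimal sketch using the awsglue library; the catalog database, table, field names, and S3 bucket are hypothetical, and the script only runs inside a Glue job environment.

```python
import sys
from awsglue.context import GlueContext
from awsglue.job import Job
from awsglue.utils import getResolvedOptions
from pyspark.context import SparkContext

args = getResolvedOptions(sys.argv, ["JOB_NAME"])
glue_context = GlueContext(SparkContext())
job = Job(glue_context)
job.init(args["JOB_NAME"], args)

# Extract: read from a (hypothetical) Glue Data Catalog table.
source = glue_context.create_dynamic_frame.from_catalog(
    database="sales_db", table_name="raw_transactions"
)

# Transform: drop a junk field and rename a column.
cleaned = source.drop_fields(["_corrupt_record"]).rename_field("txn_amt", "amount")

# Load: write Parquet to S3 for the warehouse layer.
glue_context.write_dynamic_frame.from_options(
    frame=cleaned,
    connection_type="s3",
    connection_options={"path": "s3://example-bucket/curated/transactions/"},
    format="parquet",
)
job.commit()
```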

Data modelling, querying, and optimization for relational, NoSQL, time series, data warehouse, and data lake stores. Experience in implementing data pipelines for both streaming and batch integrations.
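
A minimal sketch of the batch and streaming sides of such a pipeline in PySpark, assuming a hypothetical S3 bucket, Kafka broker, and topic:

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("batch-and-stream").getOrCreate()

# Batch integration: read a daily extract, aggregate, write to the curated layer.
batch = (spark.read.parquet("s3://example-bucket/raw/transactions/dt=2026-01-01/")
         .groupBy("customer_id")
         .agg(F.sum("amount").alias("daily_spend")))
batch.write.mode("overwrite").parquet("s3://example-bucket/curated/daily_spend/")

# Streaming integration: ingest the same feed from a Kafka topic in near real time.
stream = (spark.readStream.format("kafka")
          .option("kafka.bootstrap.servers", "broker:9092")
          .option("subscribe", "transactions")
          .load())
query = (stream.selectExpr("CAST(value AS STRING) AS payload")
         .writeStream.format("parquet")
         .option("path", "s3://example-bucket/stream/transactions/")
         .option("checkpointLocation", "s3://example-bucket/checkpoints/transactions/")
         .start())
query.awaitTermination()
```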


Qualifications
  • Design, develop, and manage data pipelines (both batch and streaming) using Python, PySpark, UNIX shell scripting, and SQL.
  • Integrate AWS services including Lambda, DynamoDB, EC2, RDS, S3, Athena, Data Pipeline, API Gateway, Glue, and EMR to extract, transform, and load data from various sources and to improve accessibility and efficiency.
  • Design, develop, and implement complex ETL processes using the Informatica PowerCenter ETL tool to pull sales/reporting/alerting data from source systems, applying transformations such as filtering, aggregation, sorting, routing, and joining.
  • Use Git and Bitbucket for code versioning and check-ins, enabling retrieval of any previous version of the code.
  • Automate ETL testing using Informatica, Python, and Robot Framework, integrating with a Jira dashboard to reduce manual testing effort.
  • Develop and optimize complex SQL queries in Oracle Database to improve performance of batch processes.
  • Manage enterprise job scheduling using TWS (Tivoli Workload Scheduler), AutoSys, Event Engine, and Control-M.
  • Build CI/CD pipelines using Jenkins to automate build, test, and deployment processes; integrate Jenkins with various tools including Git.
  • Integrate XL Release with Jenkins, Git, and other CI/CD tools for an efficient release process.
  • Develop and consume RESTful APIs using Python for system integrations and data exchange.
  • Build scalable Python APIs using frameworks like Flask and FastAPI for various data-driven applications.
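
By way of example, a minimal sketch of such a Python data API using FastAPI; the route, model, and in-memory store are hypothetical stand-ins for a real warehouse lookup (e.g. an Athena or Oracle query).

```python
# Hypothetical FastAPI service exposing one data endpoint.
from fastapi import FastAPI, HTTPException
from pydantic import BaseModel

app = FastAPI(title="customer-metrics")

class CustomerMetric(BaseModel):
    customer_id: str
    daily_spend: float

# Illustrative stand-in for a warehouse query.
FAKE_STORE = {"c-001": 412.50, "c-002": 87.25}

@app.get("/metrics/{customer_id}", response_model=CustomerMetric)
def get_metric(customer_id: str) -> CustomerMetric:
    if customer_id not in FAKE_STORE:
        raise HTTPException(status_code=404, detail="unknown customer")
    return CustomerMetric(customer_id=customer_id, daily_spend=FAKE_STORE[customer_id])
```

Such a service would typically be run with an ASGI server, e.g. `uvicorn app:app`.
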
Position Requirements

10+ years of work experience