
Data Engineer

Job in Morrisville, Wake County, North Carolina, 27560, USA
Listing for: TEKsystems
Full Time, Part Time position
Listed on 2026-01-18
Job specializations:
  • IT/Tech
    Cloud Computing, Data Engineer
Job Description & How to Apply Below

Top Skills' Details

* Data Engineering experience

* Experience with AWS/Public Cloud

* Strong Python experience

* Snowflake experience

* Data automation with AI (very important to this role): building agents using LangChain or LangGraph (the team uses LangGraph), and understanding how to deploy them on AWS
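
To illustrate the agent pattern named above: LangGraph models an agent as a graph of nodes that pass a shared state around until a finish condition is met. The sketch below is a minimal, dependency-free imitation of that node-and-edge control flow in plain Python; the node names and finish condition are hypothetical, and production code would use LangGraph's `StateGraph` rather than a hand-rolled loop.

```python
# Hand-rolled state-graph sketch mimicking the LangGraph node/edge pattern.
# "plan" chooses the next action, "act" applies it, and routing repeats
# until "plan" decides to stop (LangGraph's conditional edges play this role).

def plan(state: dict) -> dict:
    # Decide the next action: stop once the counter reaches the target.
    state["action"] = "stop" if state["count"] >= state["target"] else "increment"
    return state

def act(state: dict) -> dict:
    # Apply the chosen action to the shared state.
    if state["action"] == "increment":
        state["count"] += 1
    return state

def run_agent(state: dict, max_steps: int = 20) -> dict:
    # Route plan -> act -> plan -> ... with a step cap as a safety valve.
    for _ in range(max_steps):
        state = plan(state)
        if state["action"] == "stop":
            break
        state = act(state)
    return state

result = run_agent({"count": 0, "target": 3})
print(result["count"])  # 3
```

In real LangGraph the same shape would be declared with `StateGraph`, `add_node`, and conditional edges, and the nodes would typically wrap LLM calls rather than arithmetic.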

* Description
* Our client is currently seeking a Data Engineer with AI automation experience. This is a short-term role until funding is found to extend it, though the manager said that could take six months given budget constraints. The person placed in the role could also be considered for the permanent position.
These are the areas where they can utilize someone with experience in at least one of the following:

* AI engineer: experience building AI agents, MCP servers/tools, etc.

* Data/ETL engineer: experience with Apache Airflow, AWS Glue ETL, and Snowflake

* Python developer with AWS experience: building scripts for different use cases; strong AWS knowledge is important

* UI developer (Angular): a dashboard design is ready, and this person could finish that project

Ideally, a candidate would have experience in all four areas, but an SME in any one of them will be considered.
Data Engineer with AI Experience

What You Will Do:

* Design and implement AI Agents to monitor and optimize cloud resources based on findings and recommendations from Cloud Service Providers.

* Develop predictive models for drift detection, cost anomaly detection, and forecasting of public cloud resources and spend.

* Automate operational workflows using machine learning and intelligent scripting.

* Integrate AI-driven insights with Cloud Service Providers such as AWS, GCP, and Azure, as well as with existing data and tools.

* Conduct anomaly detection for security, cost optimization, and performance analytics.

* Design, build, and maintain scalable ETL pipelines using AWS Glue and other cloud-native services.

* Utilize AWS Athena for interactive querying of data stored in data lakes.

* Manage and optimize data storage and processing using Snowflake cloud data platform.

* Orchestrate complex workflows and data pipelines using Apache Airflow DAGs.

* Continuously evaluate emerging AI technologies and tools for operational improvements.

* Maintain documentation and best practices for AI/ML integration in cloud systems.
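
The anomaly-detection responsibilities above (cost anomaly detection, drift detection) can be illustrated with a simple statistical baseline. This sketch flags days whose spend deviates more than k standard deviations from a trailing-window mean; the window size, threshold, and data are hypothetical placeholders, not the client's actual model.

```python
import statistics

def flag_cost_anomalies(daily_spend, window=7, k=3.0):
    """Flag days whose spend deviates more than k sigma from the
    trailing `window`-day mean. Returns indices of anomalous days."""
    anomalies = []
    for i in range(window, len(daily_spend)):
        baseline = daily_spend[i - window:i]
        mean = statistics.mean(baseline)
        stdev = statistics.pstdev(baseline)
        if stdev == 0:
            # Flat baseline: any change at all is unusual.
            if daily_spend[i] != mean:
                anomalies.append(i)
        elif abs(daily_spend[i] - mean) > k * stdev:
            anomalies.append(i)
    return anomalies

spend = [100, 102, 99, 101, 100, 98, 101, 250, 100, 99]
print(flag_cost_anomalies(spend))  # [7] -- the 250 spike
```

A production system would likely use a learned forecasting model instead of a fixed z-score, but the rolling-baseline idea is the same.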

Our Minimum Requirements Include:

* Bachelor's or Master's degree in Computer Science, Data Science, or related technical field, or equivalent experience.

* Proven experience building and deploying ML models, with at least 2 years focused on cloud operations.

* Solid knowledge of cloud technologies (AWS, GCP, Azure, OCI).

* Experience with Python, PySpark, and ML libraries such as PyTorch, TensorFlow, or scikit-learn.

* Comfortable working with streaming data, APIs, and telemetry systems.

* Experience with AWS Glue ETL, AWS Athena, Snowflake, and Apache Airflow DAGs.

* Strong communication and multi-functional collaboration skills.

* Experience with Agile and DevOps operating models, including project tracking tools (e.g., Jira), Git (or other version control systems), and CI/CD systems (e.g., GitLab, GitHub Actions, Jenkins).

* Proficient in general-purpose programming languages (Python, Golang, Bash) and development platforms and technologies.

Preferred Qualifications:

* Understanding of Cloud Technologies and Services of one or more providers including AWS, GCP, Azure, Oracle, and Alibaba.

* Established record of leading technical initiatives, delivering results, and a commitment to fostering a supportive work environment.

* Hard-working, dedicated to providing quality support for your customers.

* Full-stack development experience with Angular for the frontend and Flask for the backend.

Additional Skills & Qualifications:
* Cloud management and optimization

* Cloud account provisioning

* Multi-cloud management

* Managed services

Public cloud infrastructure: the team has top-level governance of all public cloud accounts (AWS, GCP, Azure, etc.), rather than individual deployments, and holds administrative access to the clouds. They are responsible for security, governance, and cost management.

CloudX platform automation: this group of developers helps rebuild different platforms to support various business units. They want to present cost-optimization data in a meaningful way and enforce governance controls at scale through automation. Not all LLM models are approved for use; pipelines are built with only the custom models. The work includes building the data pipelines. Most deployments are done in AWS (public cloud); all data is aggregated in Snowflake, Glue pipelines run in AWS, and Python scripts handle the data.
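
As a minimal, dependency-free sketch of the Python scripting side of that pipeline (the record shape and service names below are hypothetical, and the real flow would land in Snowflake via Glue), a script might roll raw cost records up to per-service totals before loading:

```python
from collections import defaultdict

def aggregate_spend(records):
    """Roll raw cost records up to total spend per cloud service.
    Each record is a dict like {"service": "AmazonS3", "cost": 1.25}."""
    totals = defaultdict(float)
    for rec in records:
        totals[rec["service"]] += rec["cost"]
    return dict(totals)

records = [
    {"service": "AmazonS3", "cost": 1.25},
    {"service": "AmazonEC2", "cost": 10.0},
    {"service": "AmazonS3", "cost": 0.75},
]
print(aggregate_spend(records))  # {'AmazonS3': 2.0, 'AmazonEC2': 10.0}
```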

The team will dive deeper into AI in the coming months. Tooling for questions around cloud spend is already available; it was released using an agentic AI approach. They build MCP servers to support the current environment, along with AI pipelines for those MCP servers. Someone who knows data, data automation, and AI would be a good fit.

