COSMOS - Data Engineer IV
Job in Little Rock, Pulaski County, Arkansas, 72208, USA
Listing for: University of Arkansas
Full Time position
Listed on 2025-12-01
Job specializations:
- Software Development: Data Engineer
Job Description
* The candidate must have 4+ years of experience as a data engineer, software developer, software engineer, database administrator, or in another similar role; or a PhD in Computer Science, Information Science, or a related discipline;
* The candidate must have 2+ years of experience leading a team of data engineers.
* Lead a team of data engineers;
* Collect and analyze raw data from various sources, including social media platforms;
* Organize and maintain datasets;
* Improve data quality and process efficiency;
* Design and manage data ETL pipelines that move data from source to destination systems, processing 10 million+ data points daily and using Kafka for real-time data streaming and MongoDB for NoSQL storage across Kubernetes clusters (see the illustrative sketch after this list);
* Design and deploy scalable microservices in Python and Golang, leveraging Flask API, GraphQL, and Docker, ensuring sub-second response times and efficient concurrency with goroutines;
* Migrate large amounts of data from legacy databases to MongoDB to achieve sub-second access latencies and optimize storage for unstructured data through Elasticsearch integration;
* Set up and manage the infrastructure required for ingestion, processing, and storage of data;
* Evaluate model needs and objectives and interpret trends and patterns in data;
* Conduct complex data analysis and report on results;
* Prepare data for analysis and reporting by transforming and cleansing it;
* Combine raw information from different sources;
* Explore ways to enhance data quality and reliability;
* Identify opportunities for data acquisition;
* Develop analytical tools and programs;
* Collaborate with teams at COSMOS on several projects;
* Manage services and operational infrastructure for system reliability and resiliency;
* Create continuous integration/continuous deployment (CI/CD) pipelines with Jenkins and GitLab CI for automating service/system deployment;
* Integrate Prometheus for monitoring, Grafana for real-time dashboarding/visualization, and Kibana for analysis of logs sourced from Elasticsearch;
* Perform front-end development (HTML/CSS, JavaScript, Node.js, etc.);
* Train machine learning (ML) models on datasets;
* Deploy machine learning (ML) models;
* Enhance the system’s fault tolerance by incorporating alerting mechanisms;
* Develop with frameworks such as Spring Boot and React;
* Work on other tasks as assigned.
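For illustration only (not part of the official job description): a minimal sketch of the kind of Kafka-to-MongoDB ingestion step named in the ETL pipeline duty above. The broker, topic, database, and field names are hypothetical placeholders, and the kafka-python and pymongo client libraries are assumed to be available.

    import json
    from kafka import KafkaConsumer   # kafka-python client (assumed available)
    from pymongo import MongoClient   # pymongo client (assumed available)

    # Hypothetical broker, topic, and database names -- placeholders only.
    consumer = KafkaConsumer(
        "social-media-posts",
        bootstrap_servers=["localhost:9092"],
        value_deserializer=lambda raw: json.loads(raw.decode("utf-8")),
    )
    posts = MongoClient("mongodb://localhost:27017")["cosmos"]["posts"]

    # Stream records from Kafka, apply a light cleansing step, and store them
    # as unstructured documents in MongoDB.
    for message in consumer:
        record = message.value
        record["text"] = record.get("text", "").strip()
        posts.insert_one(record)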
Required skills:
* Expert proficiency level in working with data models, data pipelines, ETL processes, data stores, data mining, and segmentation techniques;
* Expert proficiency level in working with programming/scripting languages (e.g., Java and Python);
* Expert proficiency level in working with data integration platforms and SQL database design;
* Expert proficiency level in working with numerical, analytical, and data security skills;
* Expert proficiency level in collecting raw data from various social media platforms;
* Expert proficiency level in creating CI/CD pipelines;
* Expert proficiency level with front-end development (HTML/CSS, JavaScript, Node.js, etc.);
* Expert proficiency level with training and deploying machine learning (ML) models on datasets;
* Ability to lead a large team of data engineers (5+ members);
* Expert proficiency level with Kafka and MongoDB for NoSQL storage across Kubernetes clusters;
* Expert proficiency level with microservices, Python, Golang, Flask API, GraphQL, and Docker;
* Expert proficiency level with Elasticsearch, Grafana, Prometheus, and Kibana;
* Expert proficiency level in data modeling concepts (ERD, Dimensional Modeling, Data Vault) and data APIs (RESTful API);
* Expert proficiency level in data processing software (e.g., Hadoop, Spark, TensorFlow, Pig, Hive) and algorithms (e.g., Map…