Mid-Senior DataOps Engineer
Listed on 2025-12-09
IT/Tech
Data Engineer, Cloud Computing
About Us:
Perfict Global is a leading IT consulting services provider focused on delivering innovative and successful business workforce solutions to Fortune 500 companies. Our trained and experienced professionals constantly strive to bring together the best available technologies to manage clients' complex business and technology challenges, participate in implementation activities, and collaborate in new ways to meet client needs.
We provide excellent benefits such as medical, dental, and vision coverage, and we're a fun company to work for!
(Looking for a mid- to senior-level DataOps Engineer)
(At least two days on-site per week)
Job Description:
We are seeking a highly skilled and experienced Senior DataOps Engineer to join our Regulatory Reporting Platform Team. The ideal candidate will have a strong background in data operations, cloud infrastructure, and automation. This role involves designing, implementing, and managing process automation and data workflows that support our regulatory reports.
Responsibilities:
Develop and improve automation scripts and processes to enhance process and data quality and reduce delivery cycle time.
Create and maintain optimized data pipelines for data ingestion, processing, and storage.
Ensure data quality, integrity, and security across all data workflows.
Work closely with data & reporting application engineers, analysts, and other stakeholders to understand data/process requirements and deliver solutions.
Monitor data workflows and troubleshoot process issues, providing recommendations with identified benefits & risks.
Maintain comprehensive documentation of data workflows, processes, and infrastructure in the Platform Knowledge Base in Confluence.
Provide off-hours support as needed, coordinating with the data and application teams to lead production and non-production deployment and validation efforts.
Skills and Experience:
Extensive experience in building and managing data pipelines using tools like Apache Spark, Hadoop, or similar.
Proficiency in the AWS cloud platform.
Strong experience with automation tools and scripting languages (e.g., Python, Bash).
Expertise in SQL and NoSQL databases, specifically Snowflake and RDS.
Experience in designing and implementing ETL/ELT processes.
Hands-on experience with IICS for data integration and management.
Knowledge of data security best practices and tools. Experience with audit processes, producing end-to-end evidence, and review discussions.
Experience with batch scheduling (CA7, Airflow, IICS scheduler; event-based and time-based) and monitoring tools.
Skills:
SL Regulatory Reporting Tool.
Experience with DevOps practices and tools.
Experience in setting up and managing CI/CD pipelines and working with source code management tools (GitHub, Bitbucket).
Experience implementing scheduler and deployment automation solutions.
Skills in data visualization tools like Tableau or Power BI.
Working experience with SAFe Agile Scrum (and Kanban) methodologies.
Relevant certifications in cloud platforms, data engineering, or DevOps are a plus.
Strong analytical and problem-solving skills.
Excellent verbal and written communication and presentation skills; ability to interact with all levels of technology and business teams.
Ability to work collaboratively in a team environment.
Ability to adapt to new technologies and methodologies.