Data Operations Associate

Job in Raleigh, Wake County, North Carolina, 27601, USA
Listing for: Tiverton
Full Time position
Listed on 2025-12-31
Job specializations:
  • IT/Tech
    Data Engineer, Data Analyst
Salary/Wage Range: USD 60,000 per year
Job Description

TIVERTON is an investment firm exclusively focused on the food and production agriculture sector. The firm oversees more than $2.2 billion of assets across debt and equity strategies in the US. The team pairs deep agricultural operating experience with financial expertise to provide tailored, long‑term capital solutions to the space.

Position Summary

Tiverton is seeking a Data Operations Associate to support our investment process and portfolio operations through data engineering, analytics, and AI‑powered automation. This hybrid role combines data infrastructure development with investment analytics, working across deal sourcing, due diligence, portfolio monitoring, and LP reporting. The ideal candidate is a technically proficient generalist who enjoys building solutions across the full data stack—from pipeline engineering to business intelligence—and is excited to apply AI/ML tools to solve real‑world problems in agricultural private equity.

The role offers broad exposure to both the investment side (deal flow, due diligence, and fund analytics) and the operations side (portfolio company data, reporting automation, and other analytics). The successful candidate will be self‑motivated, energized by working with a group of thoughtful, smart, and skilled colleagues, and will enjoy being part of a young, hungry, and collaborative organization focused on becoming the pre‑eminent investment firm in US agriculture.

Primary Responsibilities
  • Data Infrastructure & Pipeline Engineering (40%)
    • Build and maintain ETL pipelines pulling data from internal and external sources into our Snowflake data warehouse (see the pipeline sketch after this list)
    • Develop Python and SQL automation scripts for recurring data processes
    • Manage Snowflake data warehouse—schema design, query optimization, and data modeling
    • Build API integrations for third‑party data sources (pricing data, B2B data providers, market intelligence)
    • Implement data quality checks, validation rules, and monitoring to ensure pipeline reliability
    • Create web scraping solutions for data collection from public sources
    • Maintain code repositories with proper version control and documentation
  • Investment Analytics & Deal Support (30%)
    • Support deal pipeline analytics and sourcing workflows in our CRM
    • Build models and analytics for sector trends (crop prices, land values, farm credit metrics)
    • Extract and analyze data from appraisal documents, financial statements, and industry reports
    • Develop due diligence analytical frameworks and data rooms for new investments
    • Create LP reporting dashboards and automated quarterly reporting processes
    • Support investment team with ad‑hoc analytical requests and data visualization
  • AI/ML Implementation & Automation (20%)
    • Leverage LLMs (OpenAI, Claude) to accelerate document analysis, data extraction, and research workflows (see the extraction sketch after this list)
    • Build AI‑powered automation for deal screening, document processing, and data enrichment
    • Implement intelligent solutions for pattern recognition, anomaly detection, and data quality
    • Use prompt engineering and AI coding assistants to rapidly prototype analytical tools
    • Develop RAG (Retrieval‑Augmented Generation) systems for knowledge management
  • Portfolio Company Support & Reporting (10%)
    • Support portfolio company reporting requirements and data requests
    • Build dashboards and reporting tools for portfolio operations teams
    • Troubleshoot data issues and provide technical support to portfolio companies
    • Partner with investment team to ensure clean, reliable data for portfolio monitoring
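For candidates gauging fit, here is a minimal sketch of the kind of pipeline work described above. Everything in it is illustrative: the API endpoint, table name, and credentials are placeholders, and the snowflake-sqlalchemy connection is an assumption, not a description of Tiverton's actual stack.

```python
"""Minimal ETL sketch: pull a public price series into Snowflake.

Illustrative only: the endpoint, credentials, and table are
placeholders, not a description of any real Tiverton system.
"""
import os

import pandas as pd
import requests
from sqlalchemy import create_engine  # assumes snowflake-sqlalchemy installed

API_URL = "https://example.com/api/v1/crop-prices"  # hypothetical source


def extract() -> pd.DataFrame:
    """Fetch raw records as JSON and fail loudly on HTTP errors."""
    resp = requests.get(API_URL, timeout=30)
    resp.raise_for_status()
    return pd.DataFrame(resp.json())  # assumes a list of flat records


def validate(df: pd.DataFrame) -> pd.DataFrame:
    """Basic data-quality gate: required columns present, no null prices."""
    required = {"commodity", "price", "as_of_date"}
    missing = required - set(df.columns)
    if missing:
        raise ValueError(f"missing columns: {missing}")
    return df.dropna(subset=["price"])


def load(df: pd.DataFrame) -> None:
    """Append validated rows into a Snowflake table via sqlalchemy."""
    engine = create_engine(
        "snowflake://{user}:{pw}@{account}/ANALYTICS/PUBLIC".format(
            user=os.environ["SF_USER"],
            pw=os.environ["SF_PASSWORD"],
            account=os.environ["SF_ACCOUNT"],
        )
    )
    df.to_sql("crop_prices", engine, if_exists="append", index=False)


if __name__ == "__main__":
    load(validate(extract()))
```

The same extract/validate/load pattern carries over whether the source is a REST API, a managed connector like Fivetran, or a scraped page; the validation gate is what keeps downstream dashboards trustworthy.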
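Similarly, a hedged sketch of the LLM-assisted document extraction mentioned under AI/ML Implementation, using the official OpenAI Python SDK. The model name, prompt, and field schema are assumptions for illustration, not the firm's actual workflow.

```python
"""Sketch: LLM-assisted field extraction from an appraisal excerpt.

The model name, prompt, and field schema are assumptions for
illustration; only the OpenAI SDK calls are real API surface.
"""
import json

from openai import OpenAI  # official SDK; expects OPENAI_API_KEY in the env

client = OpenAI()


def extract_fields(appraisal_text: str) -> dict:
    """Ask the model for a fixed JSON schema and parse the reply."""
    prompt = (
        "Extract the following fields from this appraisal excerpt and "
        "return JSON with keys acreage, value_per_acre, county, "
        "appraisal_date. Use null for anything not stated.\n\n"
        + appraisal_text
    )
    resp = client.chat.completions.create(
        model="gpt-4o-mini",  # placeholder model choice
        messages=[{"role": "user", "content": prompt}],
        response_format={"type": "json_object"},  # force parseable output
    )
    return json.loads(resp.choices[0].message.content)


print(extract_fields(
    "The 640-acre parcel in Wake County appraised at $9,500/acre "
    "on June 30, 2025."
))
```

The same pattern scales from one-off extraction to the deal-screening and RAG work described above; in practice the hard part is validating the model's output, not making the API call.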
Required Technical Skills
  • Strong proficiency in Python (pandas, requests, sqlalchemy) and SQL for data analysis and automation
  • Experience with data pipelines, ETL processes/tools (Fivetran, etc.), or data engineering workflows
  • Working knowledge of cloud data warehouses (Snowflake, Databricks, BigQuery, Redshift)
  • Proficiency in business intelligence tools (Power BI, Tableau, Sigma, or Looker)
  • Advanced Excel skills, including complex formulas, pivot tables, and data modeling
  • Experience with API integrations and web scraping (REST APIs, Beautiful Soup, or similar; see the scraping sketch after this list)
  • Comfortable with AI/ML tools: LangChain, OpenAI API, Claude API, or similar frameworks
  • Proficiency with Git version control
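To make the scraping requirement concrete, a small requests + BeautifulSoup example in the spirit of the skills above. The URL and the table layout are hypothetical placeholders.

```python
"""Sketch of the scraping profile above: requests + BeautifulSoup
pulling a two-column table from a public page. The URL and the
table layout are hypothetical placeholders.
"""
import pandas as pd
import requests
from bs4 import BeautifulSoup

URL = "https://example.com/land-values"  # hypothetical public source

html = requests.get(URL, timeout=30).text
soup = BeautifulSoup(html, "html.parser")

rows = []
for tr in soup.select("table tr")[1:]:  # skip the header row
    cells = [td.get_text(strip=True) for td in tr.find_all("td")]
    if len(cells) == 2:  # keep only well-formed rows
        rows.append({"county": cells[0], "value_per_acre": cells[1]})

print(pd.DataFrame(rows).head())
```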
Position Requirements
10+ years of work experience