
Data Developer

Job in Buffalo, Erie County, New York, 14266, USA
Listing for: HeronAI
Full Time position
Listed on 2025-12-02
Job specializations:
  • IT/Tech
    Data Engineer, Data Analyst
Job Description & How to Apply Below

We believe in a world where data analytics is not a privilege but a universal right, and simple enough that anyone can use it to drive the profitability of their business and transform the world. With industries like accounting and manufacturing facing massive workforce disruptions (e.g., 75% of the workforce retiring in the next 10 years), we’re positioned to step in and change how data drives decisions.

HeronAI (MIT / FinTech, SaaS) empowers mid-sized firms to streamline strategic growth and month-end advisory reporting. We use proprietary algorithms to help mid-sized companies connect all their disparate data in one place, analyze it, and create visual dashboards and action plans that drive strategic decision-making. Our goal is to reduce analysis and reporting time from 4-12 weeks to under 5 minutes.

Our goal is not just to be a tool, but to reshape how data drives decisions for over 10 million people in the next 10 years.

Our goal is to reshape how businesses use data by reducing reporting time from weeks to minutes, helping industries like accounting and manufacturing overcome challenges like workforce disruptions. As a Data Developer, you’ll play a critical role in optimizing data pipelines, integrating diverse data sources, and building the foundation for scalable, secure analytics.

What You’ll Do
  • Design, implement, and maintain scalable ETL pipelines for data ingestion and transformation from tools like QuickBooks, Excel, and other financial platforms.
  • Collaborate with engineering teams to ensure seamless data flow between backend APIs, databases, and visualization tools.
  • Optimize the performance of databases (e.g., Postgres, DynamoDB), ensuring efficient handling of large-scale, complex datasets.
  • Develop and maintain data models that align with analytics and reporting needs.
  • Work closely with the DevOps team to ensure data pipelines are secure, reliable, and cost-efficient.
  • Conduct data quality checks and implement automated validation processes to ensure accuracy and consistency.
  • Troubleshoot and resolve data-related issues, ensuring minimal downtime and disruption.
  • Contribute to the company’s SOC 2 compliance efforts, ensuring data security and privacy protocols are followed.
What Makes This Role Exciting
  • As one of our first hires, you’ll have the freedom to define how we build solutions.
  • Choose the methodologies and tools that best meet our customers’ needs.
  • Work on a high-growth platform backed by MIT, Harvard, Techstars, and Forbes-recognized innovation.
  • Join us at an exciting time—we’ve grown our waitlist from 250 to 1,700 users in 10 months and recently secured $1.5M in seed funding.
  • Solve critical gaps in the data analytics market and automate workflows for industries facing major workforce transformations.
Who You Are
  • A professional with 5+ years of experience in data engineering, ETL development, or a related field.
  • Proficient in Python or another programming language for data processing.
  • Skilled in designing, optimizing, and managing databases like Postgres or DynamoDB.
  • Experienced in building scalable ETL pipelines for data ingestion, transformation, and storage.
  • Knowledgeable about cloud platforms (AWS preferred) and tools like Lambda, S3, and Redshift.
  • Comfortable with data modeling and understanding the needs of analytics platforms.
  • Passionate about ensuring data accuracy, security, and reliability in fast-paced environments.
  • Familiar with SOC 2 compliance or other data security frameworks.
Exceptional Candidates Will Bring
  • Experience integrating data from financial systems (e.g., QuickBooks, Excel).
  • A strong understanding of data visualization requirements, including performance optimization for dashboards.
  • Familiarity with real-time or near-real-time data processing pipelines.
  • A history of contributing to high-growth startups or scaling data-intensive platforms.
What Good Looks Like
Q1 Targets:
  • Build ETL pipelines for QuickBooks and Google Sheets with robust data validation.
  • Deliver backend support for 5 pre-designed dashboard templates.
  • Ensure dashboards generate actionable insights with no data errors.
Q2 Targets:
  • Support incremental API integrations for Xero and HubSpot.
  • Optimize ETL performance to…