
Data Analytics & Engineering

Job in Manado, Indonesia
Listing for: PropHero
Full Time position
Listed on 2025-12-28
Job specializations:
  • IT/Tech
    Data Engineer, Data Analyst, Data Science Manager, Data Warehousing
Job Description

PropHero is an AI-driven marketplace transforming property investment. Backed by global VCs and founded by McKinsey alumni, we're expanding our team. Join our thriving, flexible culture, surrounded by ambitious individuals driving change. At PropHero, we're making property investment as simple as shares or ETFs. Be part of the future we're building!

We are values-driven!

🤝 BELIEVE: We have a contagious passion and entrepreneurial spirit.

🔍 CONNECT: We care for each other and create a "one-team" spirit.

📈 RAISING THE BAR: We push for exceptional performance and never settle for mediocrity.

🔥 OWN IT: We are owners no matter the circumstances.

🌐 DELIVER: We deliver meaningful, measurable outcomes driving a positive impact.

Do these values resonate with you? Keep reading!

How you will shape PropHero:

As a Senior Data & Analytics Engineer at PropHero, you will own the complete data lifecycle, from real-time ingestion to analytics-ready insights. You'll architect event-driven pipelines that stream data from external sources (HubSpot, APIs, webhooks) into PostgreSQL, then transform raw data into dimensional models and actionable dashboards. Working in an AWS ecosystem, you'll build the data infrastructure (Lambda, EventBridge, RDS) while also designing snowflake-modeled data marts and Metabase visualizations. This is a true end-to-end role with equal emphasis on data engineering (50%) and analytics engineering (50%): you'll bridge technical infrastructure and business intelligence, ensuring our teams have both reliable data pipelines and clean, business-ready datasets for property valuation models and market analysis.
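
As a purely illustrative sketch of the ingestion half of that stack, the following Lambda handler consumes a HubSpot webhook event routed through EventBridge and upserts it into PostgreSQL. The table name, payload fields, and use of psycopg2 are assumptions for illustration, not PropHero's actual code.

```python
import json
import os

import psycopg2  # assumed PostgreSQL driver


def handler(event, context):
    """Hypothetical Lambda target for an EventBridge rule carrying HubSpot
    webhook events; stores the raw payload for downstream transformation."""
    payload = event["detail"]  # EventBridge nests the source event here
    conn = psycopg2.connect(os.environ["DATABASE_URL"])
    try:
        with conn, conn.cursor() as cur:  # commits on success, rolls back on error
            cur.execute(
                """
                INSERT INTO raw_hubspot_events (object_id, payload, received_at)
                VALUES (%s, %s, now())
                ON CONFLICT (object_id)
                DO UPDATE SET payload = EXCLUDED.payload,
                              received_at = EXCLUDED.received_at
                """,
                (payload["objectId"], json.dumps(payload)),
            )
    finally:
        conn.close()
    return {"status": "ok"}
```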

  • Event-Based Data Streaming:
    Design and implement event-driven pipelines using AWS services (Lambda, EventBridge, Kinesis/MSK, SQS) to ingest data from external sources in real time.
  • HubSpot Integration:
    Build and maintain streaming data pipelines between HubSpot CRM and PostgreSQL, handling webhook events, API polling, and CDC patterns for sub-minute data freshness (the ingestion sketch above shows the basic webhook-to-Postgres path).
  • External API Integration:
    Develop robust connectors for third-party APIs, webhooks, and data sources, ensuring reliable data capture with proper error handling and retry logic (see the retry sketch after this list).
  • AWS Infrastructure Management:
    Deploy and manage AWS resources (Lambda, RDS, EventBridge, CloudWatch, S3) for scalable data solutions.
  • Monitoring & Alerting:
    Build comprehensive monitoring dashboards and alerting systems to track pipeline health, data freshness, and error rates.
  • Data Modeling (Snowflake Schema):
    Design and implement dimensional data models in PostgreSQL using a snowflake schema, creating efficient fact tables, dimension tables, and slowly changing dimensions (SCDs); a schema sketch follows this list.
  • Data Transformation Pipelines:
    Build SQL-based transformation workflows to convert operational database tables into analytics-ready data marts, ensuring data consistency and business logic integrity.
  • Data Marts Development:
    Create purpose-built data marts for different business domains (property valuation, customer analytics, market trends) optimized for analytical queries.
  • BI Development:
    Design, build, and maintain dashboards, reports, and visualizations in Metabase for self‑service analytics across the organization.
  • Analytics & Reporting:
    Perform data analysis to answer business questions, identify trends, and deliver actionable insights to product and leadership teams.
  • Metrics Definition:
    Partner with business stakeholders to define KPIs, metrics, and business logic; document metric definitions and calculation methods.
  • Data Quality & Validation:
    Implement schema validation, data type checking, and automated quality gates at both the ingestion layer and the transformation layer to ensure data accuracy and consistency (a minimal validation sketch follows this list).
  • SQL & Database Optimization:
    Write efficient, performant SQL queries; optimize query performance and database design through proper indexing, query structure, materialized views, and connection pooling (see the materialized-view sketch after this list).
  • Documentation & Collaboration:
    Maintain clear documentation of pipeline architecture, data flows, API integrations, data models, transformation logic, and metric definitions; work closely with distributed teams across different time zones.
  • End-to-End…
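
For the external API connectors above, a minimal retry sketch with exponential backoff and jitter might look like the following; the helper name, the use of the requests library, and the retry budget are all assumptions for illustration.

```python
import random
import time

import requests  # assumed HTTP client; any client exposing status codes works


def fetch_with_retries(url: str, max_attempts: int = 5, timeout: int = 10):
    """Hypothetical polling helper: retry on 429/5xx with backoff and jitter."""
    for attempt in range(1, max_attempts + 1):
        try:
            resp = requests.get(url, timeout=timeout)
            if resp.status_code == 429 or resp.status_code >= 500:
                # Treat rate limits and server errors as retryable.
                raise requests.HTTPError(f"retryable status {resp.status_code}")
            resp.raise_for_status()
            return resp.json()
        except requests.RequestException:
            if attempt == max_attempts:
                raise
            # Exponential backoff with jitter to avoid synchronized retries.
            time.sleep(min(2 ** attempt + random.random(), 60))
```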
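
To illustrate the snowflake-schema modeling and SCD work described above, here is a small PostgreSQL DDL sketch: a valuation fact table pointing at a property dimension, which is itself normalized out to a suburb dimension (the "snowflaking"), with SCD Type 2 validity columns. Every table and column name here is hypothetical.

```python
# Hypothetical snowflake-schema DDL; none of these names come from PropHero.
SNOWFLAKE_DDL = """
CREATE TABLE dim_suburb (
    suburb_key  serial PRIMARY KEY,
    suburb_name text NOT NULL,
    region_name text NOT NULL
);

-- dim_property references dim_suburb: the normalization that makes the
-- schema a snowflake rather than a star.
CREATE TABLE dim_property (
    property_key serial PRIMARY KEY,
    property_id  text NOT NULL,           -- natural key from the source system
    suburb_key   int  NOT NULL REFERENCES dim_suburb,
    bedrooms     int,
    valid_from   timestamptz NOT NULL,    -- SCD Type 2 validity window
    valid_to     timestamptz,             -- NULL while the row is current
    is_current   boolean NOT NULL DEFAULT true
);

CREATE TABLE fact_valuation (
    property_key int REFERENCES dim_property,
    valued_at    timestamptz NOT NULL,
    value_idr    numeric(18, 2) NOT NULL
);
"""
```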
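
For the quality-gate point above, a minimal ingestion-layer validation sketch, assuming records arrive as Python dicts; the required fields are invented, and a production pipeline might reach for a library such as pydantic or Great Expectations instead.

```python
REQUIRED_FIELDS = {"objectId": int, "occurredAt": str}  # illustrative only


def validate(record: dict) -> list[str]:
    """Return a list of violations; an empty list means the record passes."""
    errors = []
    for field, expected_type in REQUIRED_FIELDS.items():
        if field not in record:
            errors.append(f"missing field: {field}")
        elif not isinstance(record[field], expected_type):
            errors.append(f"{field}: expected {expected_type.__name__}")
    return errors


def quality_gate(records):
    """Split a batch into clean rows and rejects bound for a dead-letter store."""
    clean, rejects = [], []
    for record in records:
        errors = validate(record)
        if errors:
            rejects.append((record, errors))
        else:
            clean.append(record)
    return clean, rejects
```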
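
Finally, one common PostgreSQL pattern behind the materialized-views point above: precompute an expensive aggregate and index it for dashboard reads. The mart and source tables are the hypothetical ones from the schema sketch, not a real PropHero mart.

```python
# Hypothetical optimization: precompute a monthly valuation aggregate.
MART_SQL = """
CREATE MATERIALIZED VIEW mart_suburb_trends AS
SELECT d.suburb_key,
       date_trunc('month', f.valued_at) AS month,
       avg(f.value_idr)                 AS avg_value,
       count(*)                         AS n_valuations
  FROM fact_valuation f
  JOIN dim_property d USING (property_key)
 GROUP BY 1, 2;

-- A unique index both speeds up dashboard lookups and is required for
-- REFRESH ... CONCURRENTLY (which avoids locking readers out).
CREATE UNIQUE INDEX ON mart_suburb_trends (suburb_key, month);

REFRESH MATERIALIZED VIEW CONCURRENTLY mart_suburb_trends;
"""
```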