Posted: 3 April

Analytics Engineer - Kuala Lumpur

Company

Fasset

Fasset is a fintech company headquartered in Dubai, specializing in stablecoin-powered Islamic banking and financial services, targeting markets in Asia and Africa.

Remote Hiring Policy

Fasset supports a hybrid work model, with team members located in various regions including Dubai, Islamabad, Jakarta, and Bahrain. While some roles may require relocation, the company embraces flexibility in work arrangements.

Job Type

Full-time

Allowed Applicant Locations

Malaysia

Job Description

Location: Kuala Lumpur

Team: Data & Analytics

Reports to: Head of Data



Role Overview

We are looking for a Senior Analytics Engineer to lead the evolution of our data stack as we scale our data infrastructure through a period of high growth. You will be the primary driver in implementing dbt (Data Build Tool) to centralize our business logic and move us from a reactive service model to a scalable, "Data-as-a-Product" ecosystem.

Your goal is to build the foundational data layer that powers our Self-Service Data Platform and feeds our internal AI Assistants, ensuring that every metric is governed, tested, and reliable.



Key Responsibilities

  • Deploy & Lead dbt Implementation: Act as the internal champion for dbt. You will transition our logic out of siloed Metabase queries and into a modular, version-controlled, and tested transformation layer in AWS Redshift.
  • Architect the Single Source of Truth: Define the core data models that align the entire company. You’ll ensure that when a KPI changes, it updates everywhere simultaneously, eliminating "metric drift."
  • Fuel AI & Self-Service: Design the Gold Standard tables that will serve as the brain for our Self-Service Platform and AI Assistants, allowing stakeholders to get high-fidelity answers without manual intervention.
  • Build for History: Use dbt’s snapshotting and modeling capabilities to implement historical tracking (slowly changing dimensions, or SCDs), enabling us to accurately report on user states over time.
  • Pipeline Optimization: Collaborate with the Data Engineering team to optimize AWS Airflow DAGs, ensuring our dbt runs are performant, cost-effective, and resilient.
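
For context on the snapshotting responsibility above, a dbt snapshot that tracks user-state changes as a type-2 SCD might look like the sketch below. The model, schema, source, and column names are illustrative, not Fasset's actual code:

```sql
-- snapshots/user_states_snapshot.sql (hypothetical names)
{% snapshot user_states_snapshot %}

{{
    config(
      target_schema='snapshots',
      unique_key='user_id',
      strategy='timestamp',
      updated_at='updated_at'
    )
}}

-- Each time a user's row changes, dbt closes out the old record
-- (dbt_valid_to) and inserts a new one (dbt_valid_from),
-- preserving full history for point-in-time reporting.
select user_id, kyc_status, account_tier, updated_at
from {{ source('app', 'users') }}

{% endsnapshot %}
```

Running `dbt snapshot` then maintains the history table incrementally on each invocation.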

Technical Requirements

  • dbt Experience: You have a deep understanding of dbt (models, macros, seeds, and tests).
  • SQL: Expert-level SQL skills.
  • Dimensional Modeling: Experience designing dimensional models.
  • Modern Data Stack: Experience with Git/GitHub, Airflow (or similar orchestrators), and BI tools like Metabase or Looker.
  • Forward-Thinking: A passion for how clean data architecture enables LLMs and AI automation within the enterprise.
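
As a reference point for the dbt tests mentioned above, they are typically declared in YAML alongside the models. A minimal sketch, with hypothetical model and column names:

```yaml
# models/marts/schema.yml (illustrative)
version: 2

models:
  - name: fct_transactions
    columns:
      - name: transaction_id
        tests:
          - unique       # no duplicate transactions
          - not_null     # every row must have an id
      - name: user_id
        tests:
          - relationships:          # referential integrity check
              to: ref('dim_users')
              field: user_id
```

`dbt test` compiles each declaration into a SQL query that fails if any rows violate the constraint.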