Business Intelligence Engineer · Seattle

Hi, I’m Fani.

I build analytics products that turn messy, real-world data into clear decisions—dashboards people trust, and pipelines that don’t break when things change.

BI & Dashboards · SQL & Warehousing · ETL & Automation
Faniel Habte

What I focus on

  • Simple, self-serve metrics for busy stakeholders
  • Reliable data models + clear definitions
  • Automation that saves time without hiding the truth

My story

A quick snapshot of how I think, what I enjoy, and why I’m in this work.

How I got here

I started in operations and bookkeeping, where accuracy, communication, and clean records were essential. That foundation pushed me into analytics through Year Up and then Amazon, first as a Business Analyst Intern and now as a Business Intelligence Engineer.

As highlighted in my resume, my day-to-day work focuses on translating business questions into trusted metrics, decision-ready reporting, and scalable data workflows. I partner across teams to improve data quality, standardize KPI definitions, and deliver analytics that help leaders make faster, higher-confidence decisions.

Outside work

Outside of work, I build fun personal projects to learn new skills and simplify real workflows for people, teams, and developers.

  • People and community: Church Membership Portal simplifies membership, attendance, and contribution tracking.
  • Environmental and social reporting: World Prosperity Dashboard and USA Clean Energy Project support clearer reporting and better decision-making.
  • Developer productivity: Data Workbench, QueryQuest API Lab, and PulseLens Feedback Insight Engine simplify data exploration, API querying, and feedback analysis.

Community impact · Sustainability reporting · Developer tools

Timeline

A narrative timeline (high-level) — the detailed bullets live in my resume.

Career

Amazon

Seattle, WA, USA

Business Intelligence Engineer

Full-time · 2024 — Present

  • Led executive analytics for global sustainability operations with governed dashboards, production ETL pipelines, and KPI frameworks across regions.
  • Improved data reliability and decision speed through data quality monitoring, SLA governance, and self-service reporting used by senior leadership.

Business Analyst Intern

Internship · 2023 — 2024

  • Partnered with product and analytics teams to translate customer sentiment and engagement goals into measurable KPIs, analyses, and dashboard requirements.
  • Delivered recurring SQL-based insights for business reviews and improved reporting consistency through Tableau and QuickSight documentation.

Amazon Flex Driver

Part-time · 2022 — 2023

  • Delivered 100+ packages per day safely and on time by balancing speed, route efficiency, and customer-first service.
  • Used GPS and real-time route decisions to handle changing delivery conditions and improve daily completion efficiency.

Palma Trucking

Seattle, WA, USA

Bookkeeper & Operations

Full-time · 2022 — 2023

  • Managed daily accounts receivable/payable and transaction tracking to improve financial accuracy and speed up collections.
  • Coordinated with customers, brokers, and warehouse stakeholders while supporting dispatch and logistics continuity to keep operations and revenue flow on track.

Education

Western Governors University

Salt Lake City, UT, USA

B.S. Software Engineering

Degree Program · 2025 — Expected 2028

  • Building deeper software fundamentals in data structures, system design, and engineering practices to complement analytics and data engineering work.

Seattle Central College

Seattle, WA, USA

Year Up Technical Training Program

Technical Training · 2023 — 2024

  • Completed intensive training in software development, testing, and professional skills.
  • Built the technical and professional foundation to transition into analytics roles.

Skills

What I use confidently today and what I am actively learning next.

Analytics & BI

QuickSight · Tableau · Excel · KPI design · Executive reporting · Stakeholder storytelling

Data engineering

Data modeling · AWS S3 · Glue · Redshift · ETL · APIs · Data quality · Logging · Testing & reliability · System design

Programming

SQL · Python · PostgreSQL

Currently learning

PySpark · Databricks SQL · Databricks Volumes + Delta · Databricks Jobs · DuckDB + dbt workflows · Great Expectations · Incremental ETL design · Streamlit app development · LLM + analytics workflows

Personal projects

Real builds that show how I think. Each one includes what I built, what I learned, and what I'd do next.

World Prosperity Dashboard

Completed

A reproducible pipeline that turns raw geopolitical indicators into curated tables and narrative insights for trend and driver analysis.

  • Raw → curated transforms with snapshotting
  • Schema-first JSON outputs for insight generation
  • Logging + data-quality checks (outliers, missingness)

Python · PostgreSQL · Tableau · Streamlit
API ingestion + ETL · psycopg2 data loading · Data modeling · Dashboard storytelling · Data quality checks
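
The logging and data-quality checks above can be sketched in plain Python; the column names here are hypothetical, and the real pipeline runs these checks against PostgreSQL-backed tables:

```python
from statistics import mean, stdev

def missingness(rows, column):
    """Fraction of rows where the column is None or empty."""
    missing = sum(1 for r in rows if r.get(column) in (None, ""))
    return missing / len(rows)

def zscore_outliers(values, threshold=3.0):
    """Indexes of values more than `threshold` sample standard deviations from the mean."""
    mu, sigma = mean(values), stdev(values)
    if sigma == 0:
        return []
    return [i for i, v in enumerate(values) if abs(v - mu) / sigma > threshold]

indicator_rows = [
    {"country": "A", "gdp_per_capita": 51000},
    {"country": "B", "gdp_per_capita": None},  # missing indicator value
    {"country": "C", "gdp_per_capita": 48000},
]
gap = missingness(indicator_rows, "gdp_per_capita")  # 1 of 3 rows missing
```

Checks like these gate whether a raw snapshot is promoted to a curated table.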

USA Clean Energy Project

In progress

An end-to-end energy reporting pipeline built on EIA datasets, landing data from raw to curated layers and publishing analytics-ready tables behind a simple app/dashboard layer.

  • Ingest EIA datasets into Databricks volumes and raw tables
  • Create clean Delta models with quality checks + incremental snapshots
  • Publish curated marts for provider/fuel trend analysis and reporting

Python · Databricks SQL · Databricks Volumes · Delta tables · Streamlit · Tableau
Databricks Python library · API ingestion · Great Expectations · Logging · Incremental ETL · Databricks Jobs
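
The incremental-snapshot idea in the bullets can be sketched independently of Databricks; the record shape below is hypothetical, and in the real project the merge happens against Delta tables:

```python
def incremental_merge(current, incoming, key="plant_id"):
    """Upsert incoming records into the current snapshot; report inserts and updates."""
    snapshot = {r[key]: r for r in current}
    inserted, updated = [], []
    for rec in incoming:
        existing = snapshot.get(rec[key])
        if existing is None:
            inserted.append(rec[key])
        elif existing != rec:
            updated.append(rec[key])
        snapshot[rec[key]] = rec  # unchanged records are written back as-is
    return list(snapshot.values()), inserted, updated

current = [{"plant_id": 1, "fuel": "solar", "mwh": 120}]
incoming = [
    {"plant_id": 1, "fuel": "solar", "mwh": 135},  # changed reading, so an update
    {"plant_id": 2, "fuel": "wind", "mwh": 80},    # new plant, so an insert
]
rows, inserted, updated = incremental_merge(current, incoming)
```

Tracking inserts and updates separately is what makes each load auditable rather than a blind overwrite.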

Data Workbench

In progress

A lightweight data workbench where users upload datasets, run SQL-style analysis, and turn saved logic into reusable dbt-ready workflows.

  • Upload CSVs into a local analytical workspace
  • Run queries and save reusable views/tables
  • Generate starter dbt model scaffolds from saved logic

Python · DuckDB · dbt · SQL · Streamlit
Interactive data app prototyping · SQL querying and aggregation · Local analytics with DuckDB · dbt model design patterns · Reusable Python data workflows
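
The "saved logic to dbt scaffold" step is mostly string templating. A minimal sketch, with hypothetical model and table names (the real workbench runs the queries on DuckDB first):

```python
def dbt_model_scaffold(name, saved_sql, materialized="view"):
    """Turn a saved workbench query into a starter dbt model file body."""
    config = f"{{{{ config(materialized='{materialized}') }}}}"
    return f"-- models/{name}.sql (generated scaffold)\n{config}\n\n{saved_sql.strip()}\n"

model = dbt_model_scaffold(
    "daily_uploads",
    "select upload_date, count(*) as n_files from uploads group by 1",
)
```

The generated file is a starting point: the user still renames sources and adds tests, but the proven query logic carries over unchanged.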

QueryQuest API Lab

Research / R&D

A research-driven developer tooling project that explores a SQL-like query language for REST APIs, translating custom syntax into API calls and returning analysis-ready tabular outputs.

  • Design SQL-to-API syntax for auth, params, and pagination
  • Build parser/transpiler + execution engine with retries
  • Normalize JSON responses into queryable table-shaped results

Language design · Parser/Transpiler · REST APIs · SQL · Streamlit · Data normalization
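
A toy version of the translation step, to show the shape of the idea: the endpoint, parameter names, and base URL below are all hypothetical, and the real project also handles auth, pagination, and retries:

```python
import re
from urllib.parse import urlencode

QUERY = re.compile(
    r"SELECT\s+(?P<fields>[\w,\s*]+?)\s+FROM\s+(?P<endpoint>\S+)"
    r"(?:\s+WHERE\s+(?P<where>.+?))?(?:\s+LIMIT\s+(?P<limit>\d+))?$",
    re.IGNORECASE,
)

def to_request(query, base_url="https://api.example.com"):
    """Translate a SQL-like query into a GET request URL with query parameters."""
    m = QUERY.match(query.strip())
    if not m:
        raise ValueError(f"unparseable query: {query}")
    params = {}
    if m.group("where"):
        for clause in m.group("where").split(" AND "):
            k, v = (s.strip() for s in clause.split("=", 1))
            params[k] = v.strip("'\"")  # drop SQL-style quoting
    if m.group("limit"):
        params["per_page"] = m.group("limit")
    fields = [f.strip() for f in m.group("fields").split(",")]
    if fields != ["*"]:
        params["fields"] = ",".join(fields)
    return f"{base_url}/{m.group('endpoint')}?{urlencode(params)}"

url = to_request("SELECT id, name FROM users WHERE status='active' LIMIT 50")
```

The interesting work is in the gap this sketch skips: mapping one logical query onto many paginated API calls and stitching the JSON pages back into one table.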

PulseLens Feedback Insight Engine

Drafted

A text-to-insights workflow that extracts sentiment and action items from feedback and returns structured JSON for downstream reporting.

  • Prompting for deterministic structure
  • Confidence scoring patterns
  • Evaluation ideas (sampling + QA checks)

Redshift · SQL · Bedrock · LLM
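
The "prompting for deterministic structure" bullet can be sketched as a schema contract plus a strict validator; the schema keys here are hypothetical, the model response is simulated, and the real workflow targets Bedrock with results landing in Redshift:

```python
import json

SCHEMA_KEYS = {"sentiment", "action_items", "confidence"}

PROMPT_TEMPLATE = (
    "Analyze the feedback below. Respond with ONLY a JSON object with keys "
    '"sentiment" (positive|neutral|negative), "action_items" (list of strings), '
    'and "confidence" (0.0-1.0). No prose.\n\nFeedback: {feedback}'
)

def parse_insight(raw):
    """Validate a model response against the expected structure; reject anything off-schema."""
    data = json.loads(raw)
    if set(data) != SCHEMA_KEYS:
        raise ValueError(f"unexpected keys: {sorted(data)}")
    if data["sentiment"] not in {"positive", "neutral", "negative"}:
        raise ValueError("bad sentiment label")
    if not 0.0 <= data["confidence"] <= 1.0:
        raise ValueError("confidence out of range")
    return data

# Simulated model output; no LLM call is made in this sketch
raw = '{"sentiment": "negative", "action_items": ["fix checkout bug"], "confidence": 0.82}'
insight = parse_insight(raw)
```

Rejecting off-schema responses at the boundary is what keeps downstream reporting tables clean, and the confidence field supports the sampling-based QA checks mentioned above.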

Church Membership Portal

Drafted

A community-focused portal concept for managing member records, attendance, and contribution tracking with clear, role-based visibility.

  • Manage member profiles and attendance events
  • Support leader/admin roles with export-ready reporting
  • Roadmap includes payments integration and mobile-first UX

Web app · CRUD · Role-based access · Reporting · Dashboarding

Contact

Let’s connect — I respond fastest by email.

Collaboration summary

Open to thoughtful collaboration, feedback, and knowledge sharing across analytics, BI, and data engineering.