10+ years designing humanitarian data pipelines for IOM, WHO, and WFP. Now applying that domain depth to AI engineering — local LLMs, geospatial analysis, and production ETL systems.
End-to-end ETL system pulling live displacement and food security data from the HDX API into MS SQL Server, with automated ArcGIS Pro map generation and an interactive Streamlit dashboard.
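The transform step between the API pull and the SQL Server load looks roughly like this — a minimal sketch, where the field names (`admin1_code`, `idp_count`, `report_date`) are illustrative, not the actual HDX schema:

```python
from datetime import date

def normalize_records(raw_rows):
    """Normalize raw API rows into tuples ready for a parameterized
    INSERT into SQL Server. Field names here are illustrative."""
    cleaned = []
    for row in raw_rows:
        # Drop rows missing the admin-area code that maps and joins key on
        if not row.get("admin1_code"):
            continue
        cleaned.append((
            row["admin1_code"].strip().upper(),
            int(row.get("idp_count") or 0),          # treat missing counts as 0
            date.fromisoformat(row["report_date"]),  # ISO date string from the API
        ))
    return cleaned

rows = normalize_records([
    {"admin1_code": "so-bn", "idp_count": "1200", "report_date": "2024-05-01"},
    {"admin1_code": None, "idp_count": "50", "report_date": "2024-05-01"},
])
# rows → [("SO-BN", 1200, date(2024, 5, 1))]
```

Keeping validation in one pure function like this makes the pipeline easy to unit-test before anything touches the database.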
RAG pipeline over classified humanitarian PDF reports using Llama 3 + LangChain. Runs fully air-gapped — no data leaves the machine. Designed for privacy-sensitive field operations.
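The retrieval half of the pipeline can be sketched in a few lines. This is an assumption-laden toy: word-overlap scoring stands in for the local embedding index the real pipeline uses, and the chunk sizes are placeholders — but it shows the chunk-then-retrieve shape, fully offline:

```python
def chunk_text(text, size=200, overlap=40):
    """Split a report into overlapping character chunks so an answer
    that straddles a chunk boundary is still retrievable."""
    chunks = []
    step = size - overlap
    for start in range(0, max(len(text) - overlap, 1), step):
        chunks.append(text[start:start + size])
    return chunks

def retrieve(query, chunks, k=2):
    """Rank chunks by word overlap with the query. The real pipeline
    uses a local embedding index; overlap scoring keeps this sketch
    dependency-free and air-gap-friendly."""
    q = set(query.lower().split())
    scored = sorted(chunks,
                    key=lambda c: len(q & set(c.lower().split())),
                    reverse=True)
    return scored[:k]

chunks = [
    "IDP displacement rose sharply in May across three districts.",
    "Rainfall data for the river basin.",
    "Food security phase 3 declared.",
]
top = retrieve("displacement in May", chunks, k=1)
# top → the displacement chunk
```

The retrieved chunks are then stuffed into the Llama 3 prompt locally, so nothing ever leaves the machine.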
Custom high-performance matrix kernels in C++ — SIMD optimization, cache-aware blocking, and Python bindings via pybind11. Benchmarked against NumPy.
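The C++ kernels themselves aren't reproduced here, but the cache-aware blocking idea is simple enough to sketch in pure Python — tile the loops so each block of B stays hot in cache while it is reused, the same structure as the C++ version minus SIMD:

```python
def matmul_blocked(A, B, block=32):
    """Blocked (tiled) matrix multiply over plain nested lists.
    Iterating block-by-block keeps a small tile of B in cache across
    many updates, which is where the C++ version gets its speed."""
    n, m, p = len(A), len(B), len(B[0])
    C = [[0.0] * p for _ in range(n)]
    for ii in range(0, n, block):
        for kk in range(0, m, block):
            for jj in range(0, p, block):
                # Work within one (ii, kk, jj) tile
                for i in range(ii, min(ii + block, n)):
                    for k in range(kk, min(kk + block, m)):
                        a = A[i][k]
                        for j in range(jj, min(jj + block, p)):
                            C[i][j] += a * B[k][j]
    return C

result = matmul_blocked([[1, 2], [3, 4]], [[5, 6], [7, 8]])
# result → [[19.0, 22.0], [43.0, 50.0]]
```

In pure Python the blocking buys nothing (interpreter overhead dominates); the payoff appears once the inner loops compile to SIMD instructions in C++, which is exactly what the NumPy benchmark measures.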
I'm a data and AI engineer with a decade of field experience inside United Nations organizations — building the systems that track displacement, health clusters, and humanitarian response at national scale.
That background gives me something most engineers don't have: I understand why the data matters, not just how to move it. Now I'm applying the same rigor to AI engineering — local LLM deployment, geospatial pipelines, and high-performance systems.
I'm a lifelong learner and a genuine fan of algorithms and linear algebra — I see the math as the foundation that makes AI systems actually reliable, not just impressive in demos.