Systems Architect | Rust Engineer | Ph.D. Mathematician
I specialise in High-Performance Computing (HPC) and ML Infrastructure. My focus is replacing GIL-bound Python bottlenecks with highly optimised Rust and C++ kernels, leveraging SIMD, zero-copy data transfer (Apache Arrow), and in-process OLAP engines (DuckDB).
- Advanced Forecasting: Large-scale Hierarchical Time Series (HTS), Probabilistic Forecasting, and Handling Intermittent Demand (Sparse Data) for Supply Chains.
- Rigorous Statistics: Bringing statistics and ML to production without the Python overhead.
- Industrial AI: Anomaly Detection in manufacturing processes, Predictive Quality, and Root Cause Analysis using Causal Inference.
- Systems Programming: Porting interpretability-heavy Python logic to Rust/C++ (WASM/Native).
- GenAI Infrastructure: Building Model Context Protocol (MCP) servers and dependency-free inference engines for Foundation Models.
- Data Engineering: Designing zero-copy ETL pipelines using DuckDB, Polars, and Apache Arrow.
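A minimal sketch of the zero-copy pattern behind the Data Engineering point above, assuming the `duckdb` crate with its Arrow feature enabled: DuckDB scans a Parquet file in-process and hands results back as Arrow record batches, so nothing is copied row-by-row into the host language. The file path and query are placeholders.

```rust
use duckdb::{Connection, Result};
use duckdb::arrow::record_batch::RecordBatch;

fn main() -> Result<()> {
    let conn = Connection::open_in_memory()?;

    // Scan a Parquet file in-process; the path and schema are placeholders.
    let mut stmt = conn.prepare(
        "SELECT sku, SUM(quantity) AS total
         FROM read_parquet('sales.parquet')
         GROUP BY sku",
    )?;

    // query_arrow streams results as Arrow RecordBatches -- no per-row
    // conversion into native Rust types is needed downstream.
    let batches: Vec<RecordBatch> = stmt.query_arrow([])?.collect();
    for batch in &batches {
        println!("{} rows x {} columns", batch.num_rows(), batch.num_columns());
    }
    Ok(())
}
```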
- Architecture: Full re-implementation of the Chronos-2 time-series foundation model in pure Rust.
- Objective: Remove heavy PyTorch/Python dependencies for edge and high-throughput environments.
- Tech: Candle/Burn, WASM, Tokio.
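Not the Chronos-2 port itself, but a minimal Candle sketch of the dependency-free tensor layer it targets: a single linear projection standing in for one transformer block, with made-up shapes.

```rust
use candle_core::{Device, Result, Tensor};

fn main() -> Result<()> {
    let device = Device::Cpu;

    // Toy stand-in for one transformer layer: a (batch, seq_len, d_model) input
    // projected by a d_model x d_model weight. Shapes are illustrative only.
    let x = Tensor::randn(0f32, 1f32, (1, 8, 64), &device)?;
    let w = Tensor::randn(0f32, 1f32, (64, 64), &device)?;
    let y = x.broadcast_matmul(&w)?;

    println!("output shape: {:?}", y.dims()); // [1, 8, 64]
    Ok(())
}
```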
- Performance: Achieved a 2,900x speedup vs. statsmodels/pandas loops by moving logic to C++.
- Design: Hybrid architecture using DuckDB for parallelized data shuffling and Rust for vectorized statistical kernels.
- Tech: Rust, DuckDB C-API, OpenMP.
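For a feel of the Rust side of that split, here is a sketch of a vectorized statistical kernel (not the production code, and without the DuckDB C-API glue): a one-pass mean/variance over a contiguous column chunk, written as a tight fold so the compiler can auto-vectorize it.

```rust
/// One-pass mean and population variance over a contiguous slice.
/// A tight fold over f64s is the shape of loop LLVM auto-vectorizes well;
/// production kernels would likely prefer Welford's algorithm for stability.
fn mean_var(xs: &[f64]) -> (f64, f64) {
    assert!(!xs.is_empty(), "empty column chunk");
    let n = xs.len() as f64;
    let (sum, sum_sq) = xs
        .iter()
        .fold((0.0f64, 0.0f64), |(s, sq), &x| (s + x, sq + x * x));
    let mean = sum / n;
    (mean, sum_sq / n - mean * mean)
}

fn main() {
    let chunk: Vec<f64> = (0..1_000_000).map(|i| (i % 97) as f64).collect();
    let (mean, var) = mean_var(&chunk);
    println!("mean = {mean:.3}, variance = {var:.3}");
}
```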
- Implementation: Custom Rust-based servers implementing the MCP standard to inject dynamic context (DB schemas, API specs) into AI coding agents.
- Tech: axum, serde, async-trait.
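A minimal axum sketch of the idea rather than the actual MCP implementation: one hypothetical HTTP route serving a database-schema snippet as JSON, the kind of dynamic context an AI coding agent would pull in.

```rust
use axum::{routing::get, Json, Router};
use serde::Serialize;

// Illustrative payload only; a real MCP server exposes the protocol's
// resource/tool endpoints rather than this single ad-hoc route.
#[derive(Serialize)]
struct SchemaContext {
    table: &'static str,
    columns: Vec<&'static str>,
}

async fn db_schema() -> Json<SchemaContext> {
    Json(SchemaContext {
        table: "orders",
        columns: vec!["order_id", "sku", "quantity", "ordered_at"],
    })
}

#[tokio::main]
async fn main() {
    let app = Router::new().route("/context/db-schema", get(db_schema));
    let listener = tokio::net::TcpListener::bind("127.0.0.1:3000").await.unwrap();
    axum::serve(listener, app).await.unwrap();
}
```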
| Crate / Repo | Description | Stack |
|---|---|---|
| sipemu | 4+ utility crates for statistical computing. | Rust |
| AnoFox-Statistics | High-performance statistical extension for DuckDB. | Rust, DuckDB |
| Polars-Statistics | FFI bindings for high-speed statistics on Polars DataFrames. | Python, Rust, Polars |