Featured Projects
Bring-your-own-key AI analyzer: paste a Claude or Gemini key in the UI, upload a chat, and get a 12-section dashboard covering love and affection, reciprocity, conflict, psychological patterns, 0–100 ratings across eight dimensions, predictions, early warnings, word clouds, a health timeline, and a blunt verdict. Keys live only in localStorage and are sent straight to the provider; no backend of mine ever touches them. Includes a free demo mode with fictional "Alex & Sam" sample data.
A bring-your-own-token dashboard built on top of GitHub's Dependabot API. Paste a PAT with `repo` + `security_events` scopes, click scan, and it fans out across every repo you own in parallel, returning a unified view of critical/high/medium/low vulnerabilities with links to each advisory. Designed for the case where you have 20+ personal repos and don't want to tab-hop through GitHub to find what actually needs fixing.
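The fan-out is straightforward to sketch in Python against GitHub's documented Dependabot alerts endpoint. The dashboard itself is a web app, so treat this as a minimal server-side illustration; the function names, `per_page` value, and worker count are mine, not the project's.

```python
import json
import urllib.request
from concurrent.futures import ThreadPoolExecutor

API = "https://api.github.com"

def fetch_alerts(token, repo):
    """GET the open Dependabot alerts for one repo ("owner/name")."""
    req = urllib.request.Request(
        f"{API}/repos/{repo}/dependabot/alerts?state=open&per_page=100",
        headers={
            "Authorization": f"Bearer {token}",
            "Accept": "application/vnd.github+json",
        },
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)

def bucket_by_severity(alerts):
    """Collapse a repo's alerts into critical/high/medium/low counts."""
    buckets = {"critical": 0, "high": 0, "medium": 0, "low": 0}
    for alert in alerts:
        sev = alert["security_advisory"]["severity"]
        buckets[sev] = buckets.get(sev, 0) + 1
    return buckets

def scan(token, repos, workers=8):
    """Fan out across all repos in parallel; one unified severity view."""
    with ThreadPoolExecutor(max_workers=workers) as pool:
        pairs = pool.map(
            lambda r: (r, bucket_by_severity(fetch_alerts(token, r))), repos
        )
    return dict(pairs)
```

A thread pool is the natural fit here because the work is almost entirely network-bound waiting on the GitHub API.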
Built an autonomous AI platform for SOC teams: IMAP/SMTP monitoring, LLM-based phishing classification (Gemini, GPT-4, Claude 3 — switchable), automatic IOC extraction (malicious domains, URLs, IPs), Celery-based task queuing, JWT auth + rate limiting, and a real-time Next.js dashboard with threat visualization. Designed to feed existing SOC workflows via REST API.
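The IOC-extraction step can be sketched with a few regexes. The real pipeline pairs this with LLM classification; these patterns and the function name are illustrative, not the project's actual rules.

```python
import re

URL_RE = re.compile(r"https?://[^\s\"'<>]+", re.I)
IP_RE = re.compile(r"\b(?:\d{1,3}\.){3}\d{1,3}\b")
DOMAIN_RE = re.compile(r"\b(?:[a-z0-9-]+\.)+[a-z]{2,}\b", re.I)

def extract_iocs(text):
    """Pull candidate URLs, IPs, and domains out of an email body.
    Deliberately naive: no TLD validation or defanging handling."""
    urls = set(URL_RE.findall(text))
    ips = set(IP_RE.findall(text))
    # Skip anything that is actually an IP address, not a hostname.
    domains = {d for d in DOMAIN_RE.findall(text) if not IP_RE.fullmatch(d)}
    return {"urls": urls, "ips": ips, "domains": domains}
```

In the full system these extracted indicators would be handed to the SOC workflow over the REST API alongside the LLM's phishing verdict.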
Five-step detection pipeline: regex-based log parsing → unsupervised Isolation Forest over message length and keyword features → rule-based severity classification (HIGH/MEDIUM/LOW) tied to events like failed logins and brute-force attempts → GPT-generated incident summaries → console alerts + report export. Optional Flask dashboard adds severity filtering and JSON export.
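The first and third stages (regex parsing and rule-based severity) reduce to something like the sketch below. The field layout and keyword lists are assumptions for illustration; the feature vector mirrors the length + keyword inputs the Isolation Forest stage consumes.

```python
import re

# Assumed syslog-like layout: "<date> <time> <host> <message>"
LOG_RE = re.compile(r"^(?P<ts>\S+ \S+) (?P<host>\S+) (?P<msg>.*)$")

HIGH_KEYWORDS = ("failed password", "brute force", "invalid user")
MEDIUM_KEYWORDS = ("connection refused", "timeout")

def parse(line):
    """Stage 1: regex-based log parsing into a dict (or None)."""
    m = LOG_RE.match(line)
    return m.groupdict() if m else None

def severity(msg):
    """Stage 3: rule-based HIGH/MEDIUM/LOW classification."""
    low = msg.lower()
    if any(k in low for k in HIGH_KEYWORDS):
        return "HIGH"
    if any(k in low for k in MEDIUM_KEYWORDS):
        return "MEDIUM"
    return "LOW"

def features(msg):
    """Stage 2 input: [message length, keyword-hit count] per record."""
    hits = sum(k in msg.lower() for k in HIGH_KEYWORDS + MEDIUM_KEYWORDS)
    return [len(msg), hits]
```

Feeding `features(...)` vectors for a batch of lines into an unsupervised model like Isolation Forest then surfaces records that look unlike the rest of the log.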
Upload a procedural PDF (mining, healthcare, defence domains). A multi-agent pipeline — document ingestion → procedural extraction → domain validation → spatial-temporal layout → visual spec — produces an animated canvas showing who does what, where, and when. Uses Claude for semantic reasoning, AWS Textract for PDF parsing, OR-Tools for spatial constraint solving, and FAISS for vector validation. Output is compatible with SpaceDraft's rendering engine.
Built for the Visagio Hackathon 2025 under the Agentic AI track. Users describe symptoms in plain language; Claude assesses urgency and recommends the right care setting; the system finds nearby open providers via Google Places and HotDoc scraping; directions, hours, and booking links render in a multi-screen mobile-first flow. Designed for multi-channel distribution (GP sites, WhatsApp, SMS, voicemail).
Backend engine that ingests market data + news, runs feature engineering → ML prediction → signal generation → risk check → execution. Supports swing and long-term quant strategies, paper or live. Architecture uses abstract interfaces (DataProvider, Broker, SentimentAnalyzer, PredictionModel) so any component can be swapped. A Next.js dashboard shows portfolio metrics, positions, signals, and backtests with TradingView Lightweight Charts.
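The swappable-component idea can be sketched with Python ABCs. Only the interface names come from the project; `Signal`, `MomentumModel`, and the threshold are hypothetical stand-ins.

```python
from abc import ABC, abstractmethod
from dataclasses import dataclass

@dataclass
class Signal:
    symbol: str
    side: str        # "buy" or "sell"
    confidence: float

class DataProvider(ABC):
    @abstractmethod
    def bars(self, symbol: str, lookback: int) -> list[float]: ...

class PredictionModel(ABC):
    @abstractmethod
    def predict(self, prices: list[float]) -> float: ...

class Broker(ABC):
    @abstractmethod
    def submit(self, signal: Signal) -> str: ...

class MomentumModel(PredictionModel):
    """Toy swap-in model: expected return = window momentum."""
    def predict(self, prices):
        return (prices[-1] - prices[0]) / prices[0]

def generate_signal(model: PredictionModel, prices, symbol, threshold=0.01):
    """Signal-generation step: skip trades below the confidence threshold."""
    r = model.predict(prices)
    if abs(r) < threshold:
        return None
    return Signal(symbol, "buy" if r > 0 else "sell", abs(r))
```

Because `generate_signal` only depends on the `PredictionModel` interface, an ML model, a sentiment-driven model, or this toy momentum rule can be dropped in without touching the rest of the pipeline.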
Developed and optimised lightweight ML models for IoT intrusion detection using feature selection and model compression techniques, achieving high detection accuracy and validating deployment on constrained devices such as the Raspberry Pi.
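A minimal variance-threshold filter illustrates the flavour of feature selection used to shrink models for constrained devices; the threshold value and function names here are assumptions, not the project's configuration.

```python
def variance(col):
    """Population variance of one feature column."""
    m = sum(col) / len(col)
    return sum((x - m) ** 2 for x in col) / len(col)

def select_features(rows, threshold=0.01):
    """Keep indices of columns whose variance exceeds the threshold --
    a cheap filter method that drops near-constant features before
    training, cutting model size and inference cost on-device."""
    cols = list(zip(*rows))
    return [i for i, c in enumerate(cols) if variance(c) > threshold]
```

Dropping constant or near-constant sensor features shrinks both the input vector and the trained model, which matters when the target is a Raspberry Pi-class device.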
Connects to Gmail via OAuth, scans incoming mail for recruiter messages, and classifies them into application states (Applied, Interview, Offer, Rejected) using OpenAI + Anthropic. Presents everything in a searchable dashboard with manual overrides, file uploads for resumes/cover letters, and keyword-based auto-apply rules.
Full-stack health tracker with a personalised onboarding flow that captures fitness goals and daily schedule preferences. Users log workouts, meals, and daily activities and see unified progress metrics (calories, protein, steps, water). Includes a drag-and-drop day scheduler, two meal plan variants (gym days vs. rest days), and achievement badges.
Ingests vessel mooring-line sensor data, validates and smooths it, and displays live hook status on a dashboard. The data-quality pipeline: range checks → rate-of-change spike detection → cross-sensor outlier detection → temporal completeness monitoring. Includes multiple signal filters (SMA, EMA, median, dual-EMA), sensor health states (OK/DEGRADED/FAILED), and a 0–1 confidence score fed to both the risk engine and dashboard.
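The EMA smoothing filter, the rate-of-change spike check, and the 0–1 confidence score can be sketched as below; the alpha, the rate limit, and the exact confidence formula are illustrative assumptions.

```python
def ema(values, alpha=0.2):
    """Exponential moving average: one of the pipeline's smoothing filters."""
    out = [values[0]]
    for v in values[1:]:
        out.append(alpha * v + (1 - alpha) * out[-1])
    return out

def spike_flags(values, max_rate):
    """Rate-of-change check: flag samples that jump more than
    max_rate from the previous reading."""
    return [False] + [abs(b - a) > max_rate for a, b in zip(values, values[1:])]

def confidence(flags):
    """0-1 confidence score: fraction of samples that passed the check,
    fed to both the risk engine and the dashboard."""
    return 1.0 - sum(flags) / len(flags)
```

In the real pipeline this sits alongside range checks, cross-sensor outlier detection, and completeness monitoring, each contributing to the sensor's OK/DEGRADED/FAILED health state.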
Yarn-workspace monorepo with Next.js 14 frontend and Express + Prisma backend. OAuth into each provider, automatic 5-minute background sync across all integrations, real-time WebSocket updates, intelligent caching for repeat lookups, and integration status monitoring. Search by email or user ID to see everything about one customer in one place.
Upload a raw fixture CSV (Game Date, Game Type, Grade, Teams), pick which grades to include, configure per-game duration and metadata options, preview the transformed output in the browser, and download the calendar-ready CSV. Next.js 14 frontend with a Python/Flask backend handling the heavier CSV parsing via pandas.
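The core transform is simple enough to sketch. The real backend uses pandas, but the stdlib `csv` module shows the same grade-filter-and-reshape logic; the output column names and the duration default are assumptions.

```python
import csv
import io

def transform(fixture_csv, grades, duration_minutes=40):
    """Filter a raw fixture CSV (Game Date, Game Type, Grade, Teams)
    to the selected grades and emit calendar-ready rows."""
    out = []
    for row in csv.DictReader(io.StringIO(fixture_csv)):
        if row["Grade"] not in grades:
            continue
        out.append({
            "Subject": f'{row["Teams"]} ({row["Grade"]})',
            "Start Date": row["Game Date"],
            "Duration": duration_minutes,
        })
    return out
```

With pandas the same step collapses to a boolean-mask filter plus a column rename/assign, which is why the heavier parsing lives on the Flask side.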
Companion backend to the Mooring Portal: simulates and analyzes real-time tension measurements from mooring hooks and bollards at a port, predicts when tension will breach safety thresholds, and alerts crew visually. Four-tier alert system (safe / caution / warning / critical) with outlier detection and calibration drift monitoring.
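The four-tier classification plus a naive breach prediction can be sketched as below. The MBL-ratio thresholds and the linear extrapolation are illustrative, not the project's calibrated limits.

```python
def alert_tier(tension, mbl, levels=(0.5, 0.7, 0.85)):
    """Map a tension reading to safe/caution/warning/critical as a
    fraction of the line's minimum breaking load (MBL).
    Thresholds are illustrative."""
    ratio = tension / mbl
    if ratio < levels[0]:
        return "safe"
    if ratio < levels[1]:
        return "caution"
    if ratio < levels[2]:
        return "warning"
    return "critical"

def minutes_to_breach(samples, mbl, level=0.85):
    """Naive linear extrapolation over recent per-minute samples:
    estimated minutes until tension crosses the warning threshold.
    Returns None when tension is flat or falling."""
    slope = (samples[-1] - samples[0]) / (len(samples) - 1)
    if slope <= 0:
        return None
    return max(0.0, (level * mbl - samples[-1]) / slope)
```

A production predictor would also account for the outlier detection and calibration drift monitoring the backend performs, so a single spiked sample doesn't trigger a false breach forecast.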
Two-part hackathon system: a Python CLI that produces synthetic oceanographic readings (temperature, salinity, pressure, wave height, current speed) with configurable drift and HTTP retry with exponential backoff, and a Node.js/Express dashboard that receives the data and streams it via Socket.io from an in-memory 1000-record buffer.
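The CLI's retry-with-exponential-backoff behaviour can be sketched as below, with `send` standing in for the real HTTP POST to the dashboard; the base delay and attempt count are assumptions.

```python
import time

def post_with_retry(send, payload, attempts=5, base=0.5, sleep=time.sleep):
    """Call `send(payload)`, retrying on OSError with exponential
    backoff (base, 2*base, 4*base, ...). Re-raises after the final
    attempt. `sleep` is injectable so tests don't actually wait."""
    for attempt in range(attempts):
        try:
            return send(payload)
        except OSError:
            if attempt == attempts - 1:
                raise
            sleep(base * (2 ** attempt))
```

Making `sleep` a parameter keeps the function testable; production code would likely also add jitter so a fleet of senders doesn't retry in lockstep.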
Early-stage Next.js app that authenticates with Spotify via OAuth and requests scopes for playback history, top tracks, current playback, and private playlists. The scaffold handles login/logout and stores access tokens for API calls; feature pages for displaying listening data are still placeholder.
Prior version of my personal portfolio — pirate-themed dark interface with animated hero ("Sailing the Grand Line of Cybersecurity & Networking"), interactive 3D island rendered with Three.js + React Three Fiber via glTF, Formspree-backed contact form, and an admin panel for replying to submissions. PWA manifest for mobile app-like experience.
Node.js + Express web app that accepts a CSV upload plus a region name and returns computed statistics: the minimum- and maximum-population countries among those with positive net change, average and standard deviation, population density rankings, and the Pearson correlation between population and area. A Python reference implementation is included alongside the web version.
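The Pearson correlation step, in the spirit of the included Python reference implementation (a from-scratch sketch, not the project's exact code):

```python
from math import sqrt

def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length series,
    e.g. country populations vs. land areas."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sqrt(sum((x - mx) ** 2 for x in xs))
    sy = sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)
```

A value near +1 means population scales with area across the region's countries; near 0 means no linear relationship.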
Single-file Python solution that reads a CSV of countries and regions and returns two dictionaries: regional summary (standard error of population, cosine similarity between population and land area) and per-country detail (population, net change, regional percentage, density, rank within region). The coursework brief prohibited all imports, so everything is implemented from scratch.
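With imports banned, even square roots fall back to exponentiation. The cosine-similarity and standard-error pieces look roughly like this (a sketch consistent with the brief, not the submitted code):

```python
def cosine_similarity(a, b):
    """Cosine of the angle between two vectors -- dot product over the
    product of magnitudes. No imports, so sqrt is `** 0.5`."""
    dot = sum(x * y for x, y in zip(a, b))
    mag_a = sum(x * x for x in a) ** 0.5
    mag_b = sum(y * y for y in b) ** 0.5
    return dot / (mag_a * mag_b)

def standard_error(values):
    """Standard error of the mean: sample stddev / sqrt(n)."""
    n = len(values)
    mean = sum(values) / n
    var = sum((v - mean) ** 2 for v in values) / (n - 1)
    return (var / n) ** 0.5
```

Everything else in the assignment (sorting for ranks, grouping by region) falls out of the same pattern: comprehensions and dict building with no library help.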
Academic assignment analysing historical healthcare cyber-breach data. Three shell scripts: cyber_breaches.sh (max incidents by US state or year), preprocess.sh (data cleaning + year normalization), breaches_per_month.sh (median + MAD statistics to detect anomalies). Zero package manager, zero dependencies — just Bash, awk, and the TSV.
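The scripts do this in awk, but the median + MAD anomaly rule is easy to state in Python for clarity; the 3-MAD cutoff is an assumed convention, not necessarily the one breaches_per_month.sh uses.

```python
def median(values):
    """Middle value of a sorted copy (mean of the two middles if even)."""
    s = sorted(values)
    n = len(s)
    mid = n // 2
    return s[mid] if n % 2 else (s[mid - 1] + s[mid]) / 2

def mad(values):
    """Median absolute deviation -- a spread measure robust to outliers,
    unlike the standard deviation."""
    m = median(values)
    return median([abs(v - m) for v in values])

def anomalies(counts, k=3):
    """Flag monthly breach counts sitting more than k MADs from the
    median; k=3 is an illustrative cutoff."""
    m, d = median(counts), mad(counts)
    return [c for c in counts if d and abs(c - m) / d > k]
```

Median + MAD is the right choice for breach counts precisely because a single catastrophic month would drag a mean/stddev threshold upward and hide itself.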