Summary
End-to-end analytics pipeline for collecting, cleaning, and visualizing foreign trade data: research and product-style work that turns fragmented foreign trade sources into one structured pipeline for analysis and reporting.

Problem
Foreign trade data often lives in scattered sources, inconsistent formats, and structures that are not directly comparable. That slows research and makes reporting overly manual.
Approach
I built a Python-based collection and cleaning layer, a SQL model for queryable analysis, and a Streamlit interface for decision-support output. The goal was not just scraping, but turning raw data into a reusable analytical system.
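The clean-then-load core of such a pipeline can be sketched as below. This is a minimal illustration, not the project's actual code: the record fields, table name, and cleaning rules are assumptions, and SQLite stands in for whatever SQL backend the project uses. In the real system, the Streamlit app would query the same table.

```python
import sqlite3

# Hypothetical raw records, standing in for scraped foreign trade sources
# (field names and values are invented for illustration).
RAW = [
    {"country": " Germany ", "year": "2023", "exports_usd": "1,200,000"},
    {"country": "germany", "year": "2023", "exports_usd": None},
]

def clean(records):
    """Normalize country names and numeric types; drop rows missing key figures."""
    rows = []
    for r in records:
        if r["exports_usd"] is None:
            continue  # incomplete row: no export figure to analyze
        rows.append((
            r["country"].strip().title(),               # " Germany " -> "Germany"
            int(r["year"]),                             # "2023" -> 2023
            float(r["exports_usd"].replace(",", "")),   # "1,200,000" -> 1200000.0
        ))
    return rows

def load(rows, conn):
    """Load cleaned rows into a queryable SQL table."""
    conn.execute(
        "CREATE TABLE IF NOT EXISTS trade (country TEXT, year INTEGER, exports_usd REAL)"
    )
    conn.executemany("INSERT INTO trade VALUES (?, ?, ?)", rows)

conn = sqlite3.connect(":memory:")
load(clean(RAW), conn)
total = conn.execute("SELECT SUM(exports_usd) FROM trade").fetchone()[0]
print(total)  # 1200000.0
```

Keeping collection, cleaning, and storage as separate steps is what makes the workflow repeatable: new sources only need a `clean`-style adapter, and everything downstream queries one consistent table.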
Outcome
The repeatable workflow shortens the path from raw data to decision-support reporting, and it establishes a base that can be extended to additional datasets later.
Why it matters
This project combines data engineering, analytical thinking, and economic context in one piece of work. It is one of the clearest examples of the site’s data-driven positioning.