Kağan Samet Durmuş
Case Study

Foreign Trade Data Analytics & Web Scraping Project

Research and product-style work that turns fragmented foreign trade sources into one structured pipeline for analysis and reporting.

End-to-end analytics pipeline for collecting, cleaning, and visualizing foreign trade data.

Problem

Foreign trade data is often scattered across sources, published in inconsistent formats, and structured in ways that are not directly comparable. That slows research and makes reporting needlessly manual.

What was built

I created a Python-based collection and cleaning layer, a SQL model for queryable analysis, and a Streamlit interface for decision-support output. The goal was not scraping alone, but turning raw data into a reusable analytical system.
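The project's actual code is not shown here, but the layered design can be sketched as plain functions with stable boundaries: each stage takes and returns simple records, so stages can be tested and swapped independently. The function and field names below are illustrative assumptions, not the real ones.

```python
def collect() -> list[dict]:
    # In the real pipeline this is the scraping layer; stubbed here
    # with one hypothetical raw record.
    return [{"country": " Germany ", "exports": "1200"}]

def clean(raw: list[dict]) -> list[dict]:
    # Normalize whitespace and types before the data reaches SQL.
    return [
        {"country": r["country"].strip(), "exports": int(r["exports"])}
        for r in raw
    ]

def run_pipeline() -> list[dict]:
    # Collection and cleaning compose into one repeatable call;
    # the SQL and Streamlit layers would consume this output.
    return clean(collect())
```

Keeping the cleaning step as its own function is what makes the system reusable: a new source only needs a new `collect`, while everything downstream stays unchanged.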

Technology stack

  • Python for data collection and transformation
  • SQL for queryable structure and filtering logic
  • Streamlit for fast reporting interfaces
  • Web scraping to pull fragmented sources into a common shape
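As a minimal sketch of the scraping-and-standardizing idea, the snippet below parses a table out of HTML using only the standard library. The sample page, column names, and number format (dotted thousands separators, common in Turkish sources) are assumptions for illustration; the project's real sources are not shown here.

```python
from html.parser import HTMLParser

# Hypothetical source page; real sources and fields are not public.
SAMPLE_PAGE = """
<table>
  <tr><th>Country</th><th>Exports (USD)</th></tr>
  <tr><td>Germany</td><td>1.234.567</td></tr>
  <tr><td>France</td><td>987.654</td></tr>
</table>
"""

class TradeTableParser(HTMLParser):
    """Collects table cell text row by row."""
    def __init__(self):
        super().__init__()
        self.rows, self._row, self._in_cell = [], [], False

    def handle_starttag(self, tag, attrs):
        if tag in ("td", "th"):
            self._in_cell = True
        elif tag == "tr":
            self._row = []

    def handle_endtag(self, tag):
        if tag in ("td", "th"):
            self._in_cell = False
        elif tag == "tr" and self._row:
            self.rows.append(self._row)

    def handle_data(self, data):
        if self._in_cell and data.strip():
            self._row.append(data.strip())

def scrape_table(html: str) -> list[dict]:
    parser = TradeTableParser()
    parser.feed(html)
    header, *body = parser.rows
    # Standardize: dotted thousands separators -> plain integers,
    # and named keys instead of positional cells.
    return [
        {"country": row[0], "exports_usd": int(row[1].replace(".", ""))}
        for row in body
    ]
```

The point of the standardization step is that every source, however it publishes its figures, ends up as the same list-of-dicts shape before cleaning and loading.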

Data / workflow

  • Collect data from source systems
  • Normalize and clean the raw dataset
  • Prepare it for analytical querying in SQL
  • Generate reporting and visualization outputs
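The "queryable in SQL" step above can be sketched with an in-memory SQLite database; the country/year/exports schema is a simplifying assumption, not the project's actual model.

```python
import sqlite3

# Hypothetical cleaned output of the earlier pipeline stages.
cleaned_rows = [
    ("Germany", 2022, 1_500_000),
    ("Germany", 2023, 1_750_000),
    ("France",  2023,   900_000),
]

conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE trade (country TEXT, year INTEGER, exports_usd INTEGER)"
)
conn.executemany("INSERT INTO trade VALUES (?, ?, ?)", cleaned_rows)

# Filtering and aggregation live in SQL, so a report is just a query.
total_2023 = conn.execute(
    "SELECT SUM(exports_usd) FROM trade WHERE year = 2023"
).fetchone()[0]
```

A reporting layer such as Streamlit would then render query results like `total_2023` instead of re-deriving them from raw files each time.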

Outcome and impact

The repeatable workflow shortens the path from raw data to decision-support reporting. It also establishes a base that can be extended to additional datasets later.

Why it matters

This project combines data engineering, analytical thinking, and economic context in one piece of work. It is one of the clearest examples of the site’s data-driven positioning.