OSC-Coach

Python · Streamlit · Pandas · SQLite | 2024

The Problem

First-time contributors struggle to make their first open source contribution. They don't know where to start, what repositories accept beginners, or how to structure a pull request. This barrier prevents talented developers from entering the open source ecosystem.

My Approach

Built a Streamlit web application that provides:

  • Curated practice questions organized by difficulty and topic
  • Real-time filtering using Pandas for personalized learning paths
  • SQLite database for persistent user progress tracking
  • Step-by-step guides for submitting quality pull requests
  • Links to beginner-friendly repositories actively seeking contributors

Technical Implementation

The application uses Streamlit for the frontend, providing an interactive interface without requiring JavaScript. Pandas handles all data filtering and transformation; users can filter by programming language, difficulty level, and contribution type in real time.
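The filtering logic can be sketched as a pure Pandas function that the Streamlit widgets would feed. This is a minimal illustration, not the actual code: the column names and the `filter_questions` helper are assumptions.

```python
import pandas as pd

# Hypothetical question catalog; columns are illustrative, not the real schema.
questions = pd.DataFrame([
    {"title": "Fix a typo in docs", "language": "Python", "difficulty": "easy", "type": "docs"},
    {"title": "Add a unit test", "language": "Python", "difficulty": "medium", "type": "tests"},
    {"title": "Refactor the parser", "language": "JavaScript", "difficulty": "hard", "type": "code"},
])

def filter_questions(df, language=None, difficulty=None, contribution_type=None):
    """Apply the selected filters; None means 'no filter' (an 'All' option)."""
    mask = pd.Series(True, index=df.index)
    if language:
        mask &= df["language"] == language
    if difficulty:
        mask &= df["difficulty"] == difficulty
    if contribution_type:
        mask &= df["type"] == contribution_type
    return df[mask]

# In Streamlit, these arguments would come from st.selectbox widgets.
easy_python = filter_questions(questions, language="Python", difficulty="easy")
```

Because the function is pure, it can be unit-tested without spinning up Streamlit, and rerunning it on each widget change is what makes the filtering feel real-time.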

SQLite stores user progress persistently, allowing students to track which questions they've completed and resume from where they left off. The database schema is simple but effective: users, questions, and progress tables with proper foreign key relationships.
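The three-table schema described above might look like the following. The column names are assumptions; note that SQLite only enforces foreign keys when the pragma is enabled per connection.

```python
import sqlite3

# Schema sketch based on the description above; column names are illustrative.
conn = sqlite3.connect(":memory:")
conn.execute("PRAGMA foreign_keys = ON")  # SQLite requires this on every connection
conn.executescript("""
CREATE TABLE users (
    id       INTEGER PRIMARY KEY,
    username TEXT NOT NULL UNIQUE
);
CREATE TABLE questions (
    id         INTEGER PRIMARY KEY,
    title      TEXT NOT NULL,
    difficulty TEXT NOT NULL
);
CREATE TABLE progress (
    user_id      INTEGER NOT NULL REFERENCES users(id),
    question_id  INTEGER NOT NULL REFERENCES questions(id),
    completed_at TEXT,
    PRIMARY KEY (user_id, question_id)  -- one progress row per user/question pair
);
""")
```

The composite primary key on `progress` is what lets a returning user resume: a simple join against `questions` yields everything they have not yet completed.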

Deployed as a web application accessible to students across multiple universities. The lightweight architecture means fast load times and minimal hosting costs.

What I Learned

  • User-centric design: Iterated based on feedback from actual users, not my assumptions about what they needed
  • Database design: Learned to normalize schemas properly and write efficient SQLite queries
  • Data processing: Became proficient with Pandas operations—filtering, grouping, and transforming data efficiently
  • Deployment considerations: Managed state in a web application and handled concurrent user sessions

Impact

Secured the second runner-up position at the UIU CSE Project Showcase among 19 competing projects. Currently used by students at United International University to prepare for their first open source contributions.


Data Processing Pipeline

Python · Pandas · NumPy · SQL | Ongoing

The Problem

Raw data from multiple sources needs to be cleaned, validated, transformed, and loaded into a database for analysis. Manual processing is error-prone and doesn't scale. The goal: an automated pipeline that handles errors gracefully and runs on a schedule.

My Approach

Building an ETL (Extract, Transform, Load) pipeline that:

  • Extracts data from CSV files, APIs, and databases
  • Validates data quality and handles missing values
  • Transforms data using Pandas and NumPy operations
  • Loads processed data into SQL database with proper schemas
  • Logs errors and sends notifications when issues occur

Technical Implementation

Modular Python scripts organized by function: extraction, transformation, validation, and loading. Each module has clear inputs/outputs and can be tested independently.
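The modular structure might be sketched as one function per stage, each with explicit inputs and outputs so it can be tested in isolation. The stage names follow the description above; the record shapes are hypothetical.

```python
# Hypothetical stage functions; each takes plain data in and returns plain data out.
def extract():
    """Stand-in for reading CSVs/APIs; returns raw string records."""
    return [{"value": "42"}, {"value": "oops"}]

def validate(records):
    """Split records into valid and rejected instead of failing outright."""
    good = [r for r in records if r["value"].isdigit()]
    bad = [r for r in records if not r["value"].isdigit()]
    return good, bad

def transform(records):
    """Type conversion happens here, after validation guarantees it is safe."""
    return [{"value": int(r["value"])} for r in records]

def load(records, store):
    """Stand-in for a database insert; returns the number of rows written."""
    store.extend(records)
    return len(records)

def run_pipeline(store):
    good, rejected = validate(extract())
    loaded = load(transform(good), store)
    return loaded, rejected
```

Keeping each stage a plain function makes the "can be tested independently" claim concrete: any stage can be exercised with hand-built records and no I/O.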

Using Pandas for data manipulation—handling joins, aggregations, and reshaping operations. NumPy for numerical computations. SQL for schema design and query optimization.
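A typical transform step in that style, joining raw records to a reference table and aggregating with vectorized arithmetic. The data and column names are made up for illustration.

```python
import pandas as pd

# Illustrative transform: join raw orders to a reference table, then aggregate.
orders = pd.DataFrame({
    "product_id": [1, 1, 2, 3],
    "quantity":   [2, 1, 5, 3],
    "unit_price": [9.99, 9.99, 4.50, 12.00],
})
products = pd.DataFrame({
    "product_id": [1, 2, 3],
    "category":   ["books", "toys", "books"],
})

merged = orders.merge(products, on="product_id", how="left")   # keep unmatched orders
merged["revenue"] = merged["quantity"] * merged["unit_price"]  # vectorized, no Python loop
by_category = merged.groupby("category", as_index=False)["revenue"].sum()
```

The `how="left"` join preserves orders with no matching product so data-quality checks can catch them later, rather than silently dropping rows.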

Error handling at each stage with proper logging. If extraction fails for one data source, other sources continue processing. Failed records are logged for manual review.
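The per-source isolation described above can be sketched like this. The extractor functions are placeholders; the point is that one failing source is logged and skipped while the rest continue.

```python
import logging

logging.basicConfig(level=logging.INFO)
log = logging.getLogger("pipeline")

# Hypothetical extractors; each returns a list of records or raises.
def extract_csv():
    return [{"id": 1}]

def extract_api():
    raise ConnectionError("API unreachable")

def run_extraction(sources):
    """Run every extractor; one failure does not stop the others."""
    results, failures = {}, {}
    for name, extract in sources.items():
        try:
            results[name] = extract()
        except Exception as exc:  # log the failure, keep processing other sources
            log.error("extraction failed for %s: %s", name, exc)
            failures[name] = exc
    return results, failures

results, failures = run_extraction({"csv": extract_csv, "api": extract_api})
```

Returning the failures alongside the results is what makes the "logged for manual review" step possible; a notification hook could iterate over that dict.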

What I'm Learning

  • Data quality: How to validate data, handle edge cases, and deal with inconsistent formats
  • Pipeline architecture: Designing systems that are maintainable, debuggable, and extensible
  • Performance optimization: Making Pandas operations faster by understanding vectorization
  • Error handling: Building resilient systems that fail gracefully and provide useful error messages

Portfolio Website

HTML · CSS · JavaScript | 2024

The Problem

Needed a professional online presence for hackathon applications and job opportunities. Requirements: fast loading, mobile-responsive, easy to update, and no framework overhead.

My Approach

Built from scratch using semantic HTML, custom CSS, and vanilla JavaScript. No dependencies, no build process—just clean code that loads instantly.

Designed with mobile-first principles, then enhanced for desktop. Used CSS Grid for layouts, Flexbox for components, and CSS custom properties for theming.

Technical Implementation

Semantic HTML5 structure with proper heading hierarchy and ARIA labels for accessibility. CSS follows BEM naming conventions for maintainability. JavaScript handles navigation, smooth scrolling, and dynamic content updates.

Hosted on GitHub Pages with custom domain. Optimized images, minified CSS, and achieved perfect Lighthouse scores for performance and accessibility.

What I Learned

  • Web fundamentals: Deep understanding of HTML, CSS, and JavaScript without framework abstractions
  • Responsive design: Building layouts that work across devices using modern CSS techniques
  • Performance: Optimizing load times and achieving high Lighthouse scores
  • Deployment: Using Git for version control and GitHub Pages for hosting

Automation Scripts

Python · Git | Ongoing

The Problem

Repetitive tasks in my workflow—renaming files, organizing directories, backing up data—waste time and are prone to human error. Need automated solutions that run reliably.

My Approach

Building a collection of Python scripts that automate common tasks. Each script solves one problem well: file organization, data backup, batch renaming, report generation.

Scripts are designed to be composable—output from one can be input to another. Using command-line arguments for flexibility and logging for debugging.
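One such script might look like the sketch below: batch renaming with `argparse`, `pathlib`, and `logging`, split into a pure planning function plus a thin CLI. The flag names and the dry-run-by-default behavior are illustrative choices, not the actual scripts.

```python
import argparse
import logging
from pathlib import Path

logging.basicConfig(level=logging.INFO, format="%(message)s")

def plan_renames(directory, prefix):
    """Return (old, new) path pairs; planning separately keeps the script safe to compose."""
    files = sorted(p for p in Path(directory).iterdir() if p.is_file())
    return [
        (p, p.with_name(f"{prefix}_{i:03d}{p.suffix}"))
        for i, p in enumerate(files, start=1)
    ]

def main(argv=None):
    parser = argparse.ArgumentParser(description="Batch-rename files in a directory.")
    parser.add_argument("directory", type=Path)
    parser.add_argument("--prefix", default="file")
    parser.add_argument("--apply", action="store_true",
                        help="actually rename (default: dry run)")
    args = parser.parse_args(argv)
    for old, new in plan_renames(args.directory, args.prefix):
        logging.info("%s -> %s", old.name, new.name)
        if args.apply:
            old.rename(new)

if __name__ == "__main__":
    main()
```

Because `plan_renames` does no I/O beyond listing the directory, its output can be piped into another script or inspected before `--apply` touches anything.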

What I'm Learning

  • Python standard library: Using os, pathlib, shutil, argparse, logging effectively
  • Error handling: Writing robust code that handles edge cases gracefully
  • Documentation: Creating clear README files and inline comments for future reference
  • Unix philosophy: Building small, focused tools that do one thing well

What's Next

Currently working on projects involving API development, cloud deployment, and more complex data engineering challenges. Each project is an opportunity to learn new tools and patterns while solving real problems.

Preparing technical projects specifically for hackathon season—building with speed and clarity in mind. Looking for opportunities to collaborate with technical teams who ship fast and iterate based on feedback.