Here, you'll find some of the papers I've published in the past as well as some side projects I've pursued.
This page collects the projects I've pursued that have reached at least some form of MVP. Projects or experiments I tinker with that don't end up in a reasonably complete or usable state may instead just get a blog post.
It's this site!
I was toying around with the idea of having a place to share my thoughts and ideas, and my existing website design didn't really support that, at least, not without significant changes. So I took the opportunity to revamp my website and add my own tiny corner where I can share those thoughts and ideas. It was also a good refresher on frontend development, particularly with this more modern stack compared to my v1 website.
A GenAI-powered learning application that helps you learn faster through AI-powered note-taking, RAG-based question answering, and pedagogical agents for reviewing. It's like Cursor for learning, or an open-source NotebookLM.
I learn a lot of new things regularly, and though I find a lot of tools like Obsidian and Claude Desktop quite helpful, I've always wanted to try building my own GenAI-powered application that I can personally use to accelerate my learning curve. This is that application.
I wanted to build an end-to-end RAG application for myself, one that could also serve as a reference repository for quickly bootstrapping other RAG-dependent projects. It consolidates a quick Streamlit frontend, a LangChain-enabled backend, Dagster pipelines, and a Weaviate vector database.
Some of the code here is over-engineered for what it is, but I find it easier to just remove stuff I don't need. Most importantly, it's a great way for me to quickly ask a board game rules question without having to scour BoardGameGeek forums for edge cases while I'm playing.
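The retrieve-then-generate flow behind an app like this can be sketched without the real stack (Streamlit, LangChain, Dagster, Weaviate). Below is a minimal, standard-library-only illustration of the retrieval step, using toy term-frequency vectors as a stand-in for real embeddings; the function names and the rules corpus are hypothetical.

```python
import math
import re
from collections import Counter

def embed(text):
    # Toy "embedding": a term-frequency vector over lowercase word tokens.
    # A real pipeline would call an embedding model here instead.
    return Counter(re.findall(r"[a-z]+", text.lower()))

def cosine(a, b):
    # Cosine similarity between two sparse term-count vectors.
    dot = sum(a[t] * b[t] for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def retrieve(query, corpus, k=1):
    # Rank documents by similarity to the query and return the top k.
    q = embed(query)
    ranked = sorted(corpus, key=lambda doc: cosine(q, embed(doc)), reverse=True)
    return ranked[:k]

# Hypothetical board game rules corpus.
rules = [
    "Each player draws two cards at the start of their turn.",
    "A player may trade resources only during their own turn.",
    "The game ends when the deck is exhausted.",
]
top = retrieve("when can players trade resources", rules)
```

In the actual app, the corpus would live in the Weaviate index and the retrieved snippets would be passed to an LLM as grounding context for answering the rules question, rather than returned directly.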
The first time I decided to make a personal website, I made it using the stack I used day-to-day for work at the time: React.js + Material UI.
I've always wanted to make my own website, and being able to direct people to a URL with my name in it came with a sense of fulfillment.
A full-stack single-page application (SPA) blog implemented in React+Redux (focusing on hooks) with Express and PostgreSQL. This sporadically growing project serves as a sandbox for implementing various libraries and services.
As a teenager, I longed to build the next big social media site. I would write scrap HTML and CSS just to make something that resembled a home page. However, since I had very little programming experience then, none of those projects ever came to fruition.
In a way, this project is a small materialization of that dream. Here, I apply or experiment with some of the insights I've gained from coding with various React+Redux, Express, and PostgreSQL libraries in the industry.
A kind of mini-game I made during college. As with Coin Apocalypse, the game was developed with AS3 and Flixel, and I made the pixel art myself using GIMP and Paint.
I made this project as an entry to Game Jolt's Peace, Love, and Jam game jam in 2014.
A fun passion project that I did during my early college years. The game was developed mainly using an AS3 library called Flixel in the FlashDevelop IDE. All the pixel art was also made by me using GIMP and Paint.
This game taught me a lot in my early programming years about code organization and object-oriented programming. It was also my first dive into thinking critically about UI/UX and game design.
These are papers based on the dissertations I worked on with wonderful people during my university years, which I presented at international conferences.
(Best Student Paper Awardee)
A deep learning paper on using social media, specifically Twitter, to determine dengue activity in the Philippines. The paper describes a gated recurrent unit (GRU) based neural network that classifies tweets as being related to one or more dengue-centered topics. The results from analyzing these online interactions were collated and compared against actual epidemiology data, showing a significantly high correlation. The developed model showed immense potential for quickly assessing dengue activity to inform emergency response needs.
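For readers curious about the gating mechanism that makes a GRU work, here is a minimal NumPy sketch of a single GRU step applied over a short token sequence. The dimensions, random weights, and five-token "tweet" are all illustrative placeholders, not the trained model from the paper.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def gru_step(x, h, params):
    """One GRU step: update gate z, reset gate r, candidate state h_tilde."""
    Wz, Uz, bz, Wr, Ur, br, Wh, Uh, bh = params
    z = sigmoid(Wz @ x + Uz @ h + bz)               # update gate
    r = sigmoid(Wr @ x + Ur @ h + br)               # reset gate
    h_tilde = np.tanh(Wh @ x + Uh @ (r * h) + bh)   # candidate hidden state
    return (1.0 - z) * h + z * h_tilde              # blend old and new state

rng = np.random.default_rng(0)
d_in, d_h = 8, 4  # hypothetical embedding and hidden sizes
shapes = [(d_h, d_in), (d_h, d_h), (d_h,)] * 3      # (W, U, b) for z, r, h
params = [rng.standard_normal(s) * 0.1 for s in shapes]

h = np.zeros(d_h)
for x in rng.standard_normal((5, d_in)):  # 5 token embeddings of a "tweet"
    h = gru_step(x, h, params)
# The final hidden state h would feed a per-topic sigmoid classification layer.
```

In the paper's setting, the token embeddings would come from the tweet text and the classification head would output one probability per dengue-related topic.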
This research describes an intelligent agent that automatically extracts the who, when, where, what, and why from news articles in the Philippines. The agent was developed using a combination of rule-based techniques and traditional machine learning models. The results showed promising performance, especially for the who, when, and what.
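To give a flavor of the rule-based side of such an agent, here is a toy sketch of "when" and "where" extraction using regular expressions. The patterns and the example sentence are purely illustrative; the actual agent combines a much richer rule set with traditional machine learning models.

```python
import re

MONTHS = ("January|February|March|April|May|June|July|"
          "August|September|October|November|December")

def extract_when(sentence):
    # "when": look for an explicit month-day(-year) date mention.
    m = re.search(rf"\b(?:{MONTHS})\s+\d{{1,2}}(?:,\s*\d{{4}})?", sentence)
    return m.group(0) if m else None

def extract_where(sentence):
    # "where": naive rule, capitalized word(s) following "in" or "at".
    m = re.search(r"\b(?:in|at)\s+([A-Z][a-z]+(?:\s+[A-Z][a-z]+)*)", sentence)
    return m.group(1) if m else None

# Hypothetical news lead for illustration.
lead = "Officials confirmed the outbreak in Manila on July 14, 2019."
```

A real extractor would layer many such rules with part-of-speech and named-entity features, with ML models handling the harder "what" and "why" slots.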