Machine Learning Home Price Index for Allegheny County
An exploratory case study building a machine learning model with XGBoost to predict sale prices of homes in Allegheny County, PA, and to construct a Home Value Index covering years from the 1800s through 2021. This work was part of an interview process for a data scientist position at Fidelity Investments.
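As an illustrative sketch (not the project's actual code), once a model such as XGBoost has produced predicted sale prices, a home value index can be formed by taking the median predicted price per year and normalizing it to a base year (index = 100). The function name and the price data below are made up for illustration.

```python
from statistics import median
from collections import defaultdict

def home_value_index(predictions, base_year):
    """predictions: iterable of (year, predicted_price) pairs (hypothetical input).

    Returns {year: index}, where the base year is normalized to 100.
    """
    by_year = defaultdict(list)
    for year, price in predictions:
        by_year[year].append(price)
    # Median per year is robust to outlier sales.
    medians = {year: median(prices) for year, prices in by_year.items()}
    base = medians[base_year]
    return {year: 100.0 * m / base for year, m in sorted(medians.items())}

# Example with made-up predicted prices:
preds = [(2019, 150_000), (2019, 170_000), (2020, 180_000),
         (2020, 200_000), (2021, 210_000), (2021, 230_000)]
print(home_value_index(preds, base_year=2019))
```

A real pipeline would feed model predictions (one per sale record) into `predictions`; the normalization step is what turns raw prices into a comparable index across years.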
Introduction to ABAQUS input file
This document provides step-by-step instructions on how to use and interpret an ABAQUS input file. We work through a simple example input file and describe the function and syntax of each part.
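As a brief taste of the file format discussed in the document, the fragment below sketches the typical keyword-based structure of an ABAQUS input file: keyword lines begin with `*`, followed by comma-separated data lines. This is a hypothetical single-element example, not the document's worked example; keyword options vary by analysis type.

```
*HEADING
Single-element plane-stress example (illustrative only)
*NODE
1, 0.0, 0.0
2, 1.0, 0.0
3, 1.0, 1.0
4, 0.0, 1.0
*ELEMENT, TYPE=CPS4, ELSET=ALL
1, 1, 2, 3, 4
*MATERIAL, NAME=STEEL
*ELASTIC
210000.0, 0.3
```

Here `*NODE` defines nodal coordinates, `*ELEMENT` connects nodes into a 4-node plane-stress element, and `*ELASTIC` gives Young's modulus and Poisson's ratio for the material.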
GPU Lattice Boltzmann Method implemented using OpenCL
This project page contains a detailed description of the formulation of the Lattice Boltzmann Method, along with several specific boundary conditions. It also includes a working, general two-dimensional code for arbitrary binary obstacles. There are three versions of the code, implemented in Python, Cython, and OpenCL. The code and all further information can be found on our project website.
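As a minimal sketch of the method's core (not the project's OpenCL code), the snippet below performs a single BGK collision step for one D2Q9 lattice cell in plain Python. The weights, lattice velocities, and second-order equilibrium distribution are the standard textbook D2Q9 values; the relaxation time `tau` is an assumed parameter.

```python
# D2Q9 lattice: 9 discrete velocities e_i with the standard weights w_i.
W = [4/9, 1/9, 1/9, 1/9, 1/9, 1/36, 1/36, 1/36, 1/36]
E = [(0, 0), (1, 0), (0, 1), (-1, 0), (0, -1),
     (1, 1), (-1, 1), (-1, -1), (1, -1)]

def equilibrium(rho, ux, uy):
    """Second-order D2Q9 equilibrium distribution for density rho, velocity (ux, uy)."""
    usq = ux * ux + uy * uy
    feq = []
    for w, (ex, ey) in zip(W, E):
        eu = ex * ux + ey * uy
        feq.append(w * rho * (1 + 3 * eu + 4.5 * eu * eu - 1.5 * usq))
    return feq

def bgk_collide(f, tau=0.6):
    """One BGK collision: relax distributions f toward local equilibrium."""
    rho = sum(f)                                        # local density
    ux = sum(fi * ex for fi, (ex, _) in zip(f, E)) / rho
    uy = sum(fi * ey for fi, (_, ey) in zip(f, E)) / rho
    feq = equilibrium(rho, ux, uy)
    return [fi - (fi - fq) / tau for fi, fq in zip(f, feq)]

# Collision conserves mass and momentum at each cell:
f0 = equilibrium(1.0, 0.05, 0.0)
f1 = bgk_collide(f0)
```

A full solver alternates this collision step with a streaming step that shifts each distribution to its neighboring cell, plus boundary handling at obstacles; the OpenCL version parallelizes the per-cell work across the GPU.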
Win Probability Prediction and Player Performance Evaluation for NBA Games Using Machine Learning
We built a model to predict the win probability (WP) of a game at any point in time, based on play-by-play data and team stats. Our predictions match the true WP of the games to within ±5%. From the WP prediction model, we generated a WP curve as a function of time remaining for each game. Building on that, we evaluated players across the league by their ability to add WP to their teams: we fit a linear regression of the predicted WP on the on-court player data, and extracted the model's coefficients as indicators of each player's contribution to their team's WP.
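As a toy sketch of the player-evaluation step (not the project's actual pipeline), the snippet below fits an ordinary least-squares regression of WP changes on on-court indicator features, solved via the normal equations with a small Gaussian elimination. The data, feature encoding, and scale of the example are made up for illustration.

```python
def solve(A, b):
    """Solve A x = b by Gaussian elimination with partial pivoting (small systems)."""
    n = len(A)
    M = [row[:] + [bi] for row, bi in zip(A, b)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(col + 1, n):
            factor = M[r][col] / M[col][col]
            for c in range(col, n + 1):
                M[r][c] -= factor * M[col][c]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][c] * x[c] for c in range(r + 1, n))) / M[r][r]
    return x

def player_wp_coefficients(X, y):
    """OLS via normal equations: X rows are on-court indicators (1 = on court),
    y is the observed WP change over the corresponding stretch of play."""
    n = len(X[0])
    XtX = [[sum(row[i] * row[j] for row in X) for j in range(n)] for i in range(n)]
    Xty = [sum(row[i] * yi for row, yi in zip(X, y)) for i in range(n)]
    return solve(XtX, Xty)

# Made-up example: player 0 tends to add WP, player 1 tends to lose it.
X = [[1, 0], [0, 1], [1, 1], [1, 0], [0, 1]]
y = [0.02, -0.01, 0.01, 0.02, -0.01]
coef = player_wp_coefficients(X, y)
```

Each fitted coefficient is then read as that player's average marginal effect on the team's WP while on the court, which is the "contribution" idea described above.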
Automatic Differentiator with Built-in Optimizers for Fitting Neural Networks via Backpropagation
This package provides an easy-to-use interface for performing automatic differentiation. Automatic differentiation quickly and efficiently computes machine-precision derivatives of linear and nonlinear, complex vector-valued functions. The package includes three built-in optimizers and root-finding algorithms that can efficiently optimize or solve such functions: Newton's method, gradient descent, and the generalized minimal residual method (GMRES).
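The package's internals are not shown here, but the core idea can be sketched with forward-mode dual numbers: each value carries its derivative alongside it, so derivatives come out exact to machine precision rather than approximated by finite differences. The class and function names below are illustrative, not the package's actual API; the sketch ends by using the AD derivative inside Newton's method.

```python
class Dual:
    """Forward-mode AD value: carries f(x) and f'(x) together."""
    def __init__(self, val, der=0.0):
        self.val, self.der = val, der

    def __add__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        return Dual(self.val + other.val, self.der + other.der)
    __radd__ = __add__

    def __sub__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        return Dual(self.val - other.val, self.der - other.der)
    def __rsub__(self, other):
        return Dual(other) - self

    def __mul__(self, other):
        # Product rule: (fg)' = f'g + fg'
        other = other if isinstance(other, Dual) else Dual(other)
        return Dual(self.val * other.val,
                    self.der * other.val + self.val * other.der)
    __rmul__ = __mul__

def newton(f, x0, tol=1e-12, max_iter=50):
    """Newton's method using the exact AD derivative instead of finite differences."""
    x = x0
    for _ in range(max_iter):
        y = f(Dual(x, 1.0))       # seed the input's derivative with 1
        step = y.val / y.der
        x -= step
        if abs(step) < tol:
            break
    return x

# Root of f(x) = x*x - 2 is sqrt(2):
root = newton(lambda x: x * x - 2.0, 1.0)
```

Reverse-mode AD (backpropagation, as used for neural networks) propagates derivatives in the opposite direction and is more efficient for functions with many inputs and few outputs, but the dual-number picture captures the "machine-precision derivative" property the package relies on.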