How I won my 45th bronze medal on Kaggle’s Playground Series, Season 4, Episode 1
In my last blog post, I discussed how I made highly accurate predictions on Kaggle’s Playground Series, Season 4, Episode 1 using sklearn’s Random Forest model. That post can be found here: https://medium.com/@tracyrenee61/does-jaxs-linear-regression-model-outperform-sklearn-s-predict-proba-in-a-binary-classification-b3cf036e649b
In order to try out different models, however, I also made predictions on the competition dataset using Google’s research library, JAX. Imagine my pleasant surprise when I received an email from Kaggle saying that I had been awarded a bronze medal for the Jupyter Notebook I had created using a linear regression model written in JAX!
I decided to use a regression model rather than a classification model because I had previously read on the Stack Overflow website that sklearn’s predict_proba method is essentially a regression methodology rather than a classification one, since it outputs a continuous probability instead of a class label.
I wrote the program to predict the probability that a customer would churn in Kaggle’s free online Jupyter Notebook environment and saved it to my Kaggle account.
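To give a flavour of the approach, here is a minimal sketch of the idea, written from scratch for this post with synthetic placeholder data rather than the actual competition files, so it is not the exact code from the medal-winning notebook: a plain linear regression fitted with JAX gradient descent, with the raw output clipped to the [0, 1] range so it can be read as a churn probability.

```python
# Minimal sketch (assumption: synthetic data standing in for the competition files):
# fit a plain linear regression with JAX gradient descent and read the clipped
# output as a churn probability.
import jax
import jax.numpy as jnp

key = jax.random.PRNGKey(0)
X = jax.random.normal(key, (200, 5))             # placeholder features
true_w = jnp.array([0.4, -0.2, 0.1, 0.0, 0.3])
y = (X @ true_w + 0.1 > 0).astype(jnp.float32)   # placeholder 0/1 churn labels

def predict(params, X):
    w, b = params
    return X @ w + b                              # raw linear output

def mse_loss(params, X, y):
    return jnp.mean((predict(params, X) - y) ** 2)

params = (jnp.zeros(X.shape[1]), 0.0)
grad_fn = jax.jit(jax.grad(mse_loss))

for _ in range(500):                              # plain full-batch gradient descent
    grads = grad_fn(params, X, y)
    params = jax.tree_util.tree_map(lambda p, g: p - 0.1 * g, params, grads)

# Clip the linear output to [0, 1] so it can be submitted as a probability
probs = jnp.clip(predict(params, X), 0.0, 1.0)
print(probs[:5])
```

The point of the sketch is simply that a linear model trained on a 0/1 churn target already produces a continuous score, which is why a regression model can stand in for a probability prediction in this competition.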