How I achieved my 39th bronze medal in a Kaggle competition (season 3, episode 23) using Jax
A couple of days ago I received an email from Kaggle telling me that I had won my 39th bronze medal, this time in their playground competition, season 3, episode 23. I stopped writing posts about my bronze medals a while ago because I would like to achieve a silver medal, but since the code I used is based on the Jax library, which I am studying at the moment, I decided it would be a good idea to write about it.
Jax is a library developed by Google Brain in 2018, so it is newer than numpy and tensorflow. Jax's API has been written to closely mirror numpy's, but the correspondence is not exact. For example, Jax arrays are immutable, so different techniques must be devised to work around this particular characteristic of the library.
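To make that concrete, here is a minimal sketch of my own (not code from the competition) showing the difference: item assignment raises an error on a Jax array, and the functional .at[...].set(...) syntax is used to produce an updated copy instead.

import jax.numpy as jnp

x = jnp.arange(5)
# x[0] = 10 would raise a TypeError, because Jax arrays are immutable
y = x.at[0].set(10)  # returns a new array with the update applied
print(x)  # [0 1 2 3 4] - the original array is unchanged
print(y)  # [10 1 2 3 4]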
Just by coincidence, I had to work around the standard code for normalisation to make it work in Jax.
The normalisation code that I would normally use when working with pandas or numpy is:-
X = (X - X.min()) / (X.max() - X.min())
This precise methodology does not work in Jax, however, so I had to make a few modifications, as seen in the code below:-
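As a rough sketch of the kind of modification involved (my own illustration, assuming the features have already been converted to a jax.numpy array and that the minima and maxima are taken column-wise; the exact competition code may differ):

import jax.numpy as jnp

def min_max_normalise(X):
    # Column-wise minima and maxima; every jnp operation returns a new
    # array, so no in-place mutation of X is ever attempted
    X_min = X.min(axis=0)
    X_max = X.max(axis=0)
    return (X - X_min) / (X_max - X_min)

# Example: normalise a small feature matrix
X = jnp.array([[1.0, 200.0], [2.0, 400.0], [3.0, 600.0]])
print(min_max_normalise(X))  # each column now runs from 0.0 to 1.0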