How I won my 43rd bronze medal in a Kaggle competition concerning smoking
I was pleasantly surprised to learn that I have so far won three bronze medals in Kaggle's Playground Series, Season 3, Episode 24. This is perhaps the first time I have earned multiple bronze medals on a single competition, and I believe it is because I used Google's JAX library to complete the work.
Google JAX is a machine learning framework for transforming numerical functions. It brings together a modified version of Autograd (automatic differentiation of native Python and NumPy functions) and XLA (Accelerated Linear Algebra, the compiler that originated in TensorFlow). It is designed to follow the structure and workflow of NumPy as closely as possible and interoperates with existing frameworks such as TensorFlow and PyTorch. The primary transformations in JAX are listed below; a short sketch of how they fit together follows the list.
- grad: automatic differentiation
- jit: just-in-time compilation via XLA
- vmap: automatic vectorization over a batch axis
- pmap: SPMD (single-program, multiple-data) parallelism across devices
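To make these concrete, here is a minimal, self-contained sketch of the first three transformations (pmap is omitted since it requires multiple accelerator devices). The loss function and data are hypothetical stand-ins for illustration, not taken from my competition notebook.

```python
import jax
import jax.numpy as jnp

# Hypothetical scalar loss used only to demonstrate the transformations.
def loss(w, x):
    return jnp.sum((w * x - 1.0) ** 2)

# grad: build a function that returns d(loss)/dw.
grad_loss = jax.grad(loss)

# jit: compile that gradient function with XLA.
fast_grad = jax.jit(grad_loss)

# vmap: map over a batch of x values without writing a Python loop.
batched_grad = jax.vmap(fast_grad, in_axes=(None, 0))

w = 2.0
xs = jnp.array([1.0, 2.0, 3.0])
print(fast_grad(w, xs))     # one gradient for the whole batch
print(batched_grad(w, xs))  # one gradient per batch element
# pmap has the same shape as vmap but splits the work across devices.
```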
I have to say that I have tried the more advanced JAX transformations in Kaggle competitions, but they tended to crash the programs I was building. For that reason, I stayed with the mostly NumPy-compatible parts of the library to work through the program.
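To illustrate what I mean by the NumPy-compatible parts, here is a small sketch with a made-up feature matrix standing in for the competition data; it shows jax.numpy acting as a near drop-in replacement for NumPy, with the main difference being that JAX arrays are immutable and operations run through XLA.

```python
import numpy as np
import jax.numpy as jnp

# Hypothetical feature matrix standing in for the competition data.
X = np.random.default_rng(0).normal(size=(100, 5))

# jax.numpy mirrors the NumPy API, so familiar code ports directly.
X = jnp.asarray(X)
X_std = (X - X.mean(axis=0)) / X.std(axis=0)
print(X_std.shape)  # (100, 5)
```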