How I used a JAX regression model to predict the hardness of minerals

Crystal X
6 min read · Nov 20, 2023

One of the great things about Kaggle competitions is that they enable a person to develop their data science skills. The most recent Kaggle Playground competition, Season 3 Episode 25, is a regression problem in which competitors predict the hardness of minerals. The competition can be found here: https://www.kaggle.com/competitions/playground-series-s3e25/overview

Although there are other models I could have used, I decided to solve the problem with a linear regression model written in JAX, in order to show the different ways that JAX can be used. JAX is essentially a Just-In-Time (JIT) compiler focused on harnessing the maximum number of FLOPs to generate optimised code while retaining the simplicity of pure Python, and JIT compilation is one of its most salient features.
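
To give a flavour of what that looks like, here is a minimal sketch of my own (not code from the competition notebook) showing jax.jit wrapped around a simple linear prediction function on toy data:

```python
import jax
import jax.numpy as jnp

# jax.jit compiles a pure Python function with XLA, so repeated calls
# run as optimised machine code rather than interpreted Python.
@jax.jit
def predict(weights, bias, features):
    # Simple linear model: y_hat = X @ w + b
    return jnp.dot(features, weights) + bias

key = jax.random.PRNGKey(0)
X = jax.random.normal(key, (8, 3))   # toy feature matrix
w = jnp.ones(3)                      # toy weights
b = 0.5                              # toy bias
print(predict(w, b, X))              # first call triggers compilation
```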

I have not used JIT in this problem because the dataset is not large. One thing I have found is that, on Kaggle, larger datasets can crash the kernel when JAX's more sophisticated functions are used. Therefore, to keep things simple, I have used the JAX functions that mirror NumPy. Although the jax.numpy API is similar to NumPy's, it is not an exact duplicate, so some alterations have to be made to NumPy code to make it compatible with JAX.
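
One common alteration, shown in the small sketch below (my own example, not the competition code), is that JAX arrays are immutable, so NumPy-style in-place assignment has to be rewritten with the functional .at[...] syntax:

```python
import numpy as np
import jax.numpy as jnp

x_np = np.zeros(5)
x_np[0] = 1.0                 # fine in NumPy: arrays are mutable

x_jnp = jnp.zeros(5)
# x_jnp[0] = 1.0              # raises an error: JAX arrays are immutable
x_jnp = x_jnp.at[0].set(1.0)  # JAX equivalent returns a new array

print(x_np, x_jnp)
```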


Written by Crystal X

I have over five decades of experience in the world of work, spanning fast food, the military, business, non-profits, and the healthcare sector.
