
Make Predictions on the California House Price dataset using a Flax and JAX MLPRegressor

Crystal X
5 min read · Jan 13, 2024


In my last post I discussed how to make predictions on the Boston House Price dataset; that blog post can be read here: https://medium.com/@tracyrenee61/predict-on-the-boston-house-price-dataset-using-a-mlpregressor-made-from-jax-and-flax-90b08e36474c

In this blog post I am going to illustrate how another dataset with similar features can require a completely different optimiser in order to train successfully. Optimisers are algorithms or methods that adjust the attributes of a neural network, such as its weights and learning rate, in order to reduce the loss.

The optimiser used to make predictions on the Boston House Price dataset was optax's sgd, a canonical implementation of stochastic gradient descent. Stochastic gradient descent is an iterative method for optimising an objective function with suitable smoothness properties. It can be regarded as a stochastic approximation of gradient descent, since it replaces the actual gradient, computed over the whole dataset, with an estimate computed from a randomly chosen subset of it.
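The "stochastic approximation" idea can be demonstrated in a few lines of plain NumPy: each step estimates the gradient from a random mini-batch rather than the full dataset, yet the weight still converges. The synthetic data, batch size, and learning rate below are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic linear data: y = 3x + small noise (toy problem for illustration).
X = rng.normal(size=1000)
y = 3.0 * X + rng.normal(scale=0.1, size=1000)

w, lr, batch_size = 0.0, 0.1, 32
for _ in range(500):
    idx = rng.integers(0, len(X), size=batch_size)  # random mini-batch
    xb, yb = X[idx], y[idx]
    grad = np.mean(2.0 * (w * xb - yb) * xb)        # gradient ESTIMATE from the batch
    w -= lr * grad                                  # the SGD update rule

print(w)  # close to the true slope of 3.0
```

Because each gradient is only an estimate, the trajectory is noisy, but on average the steps point downhill, which is what makes SGD both cheap per step and sensitive to the learning rate.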

Unfortunately, optax's sgd optimiser did not work on the California House Price dataset, even though that dataset is not too dissimilar from the Boston House Price dataset. I therefore had to experiment with other optimisers, and decided to try the adam…
