
Predict on the Boston House Price dataset using an MLPRegressor made with Jax and Flax

Crystal X
5 min read · Jan 12, 2024


I have written several blog posts introducing the Jax library to aspiring data scientists in the field of machine learning. Jax, as I have previously written, is a numerical computing library with much more power and functionality than numpy: it layers automatic differentiation, just-in-time compilation, and hardware acceleration on top of a numpy-like API.

The Flax library is built on top of Jax, so a person needs some familiarity with Jax before they can use Flax effectively. In addition, a working knowledge of deep learning techniques is needed to use Flax confidently, because the library exists to let the data scientist build neural networks.

A multilayer perceptron (MLP) is the traditional, if somewhat misleading, name for a modern feedforward artificial neural network: fully connected neurons with nonlinear activation functions, organised in at least three layers, and notable for being able to distinguish data that is not linearly separable.
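The three-layer structure described above can be sketched in plain numpy. The layer sizes here (13 inputs, one per Boston feature, 16 hidden units, 1 output) are illustrative assumptions, not the article's actual model:

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative layer sizes: 13 inputs, 16 hidden units, 1 output.
W1 = rng.normal(scale=0.1, size=(13, 16))  # input -> hidden weights
b1 = np.zeros(16)
W2 = rng.normal(scale=0.1, size=(16, 1))   # hidden -> output weights
b2 = np.zeros(1)

def relu(z):
    # Nonlinear activation: without it, stacked fully connected
    # layers collapse into a single linear map.
    return np.maximum(0.0, z)

def mlp_forward(x):
    hidden = relu(x @ W1 + b1)  # fully connected hidden layer
    return hidden @ W2 + b2     # linear output layer (regression)

batch = rng.normal(size=(4, 13))  # four fake samples
preds = mlp_forward(batch)
print(preds.shape)  # (4, 1): one predicted value per sample
```

The nonlinear hidden layer is what lets the network fit relationships that a plain linear regression cannot.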

Modern feedforward networks are trained using the backpropagation method and are colloquially referred to as "vanilla" neural networks.
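A minimal sketch of backpropagation training, using a toy regression task (learning y = x² with a 1-16-1 network) as a stand-in for the house-price data; the sizes, learning rate, and iteration count are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy nonlinear regression task: learn y = x^2 on 1-D inputs.
X = rng.uniform(-1.0, 1.0, size=(64, 1))
y = X ** 2

W1 = rng.normal(scale=0.5, size=(1, 16)); b1 = np.zeros(16)
W2 = rng.normal(scale=0.5, size=(16, 1)); b2 = np.zeros(1)
lr = 0.1

losses = []
for _ in range(500):
    # Forward pass.
    z1 = X @ W1 + b1
    h = np.maximum(0.0, z1)        # ReLU activation
    pred = h @ W2 + b2
    losses.append(float(np.mean((pred - y) ** 2)))

    # Backward pass: apply the chain rule layer by layer.
    d_pred = 2.0 * (pred - y) / len(X)  # gradient of MSE loss
    dW2 = h.T @ d_pred
    db2 = d_pred.sum(axis=0)
    d_h = d_pred @ W2.T
    d_z1 = d_h * (z1 > 0)               # ReLU gradient mask
    dW1 = X.T @ d_z1
    db1 = d_z1.sum(axis=0)

    # Plain gradient-descent update.
    W1 -= lr * dW1; b1 -= lr * db1
    W2 -= lr * dW2; b2 -= lr * db2

print(losses[0], losses[-1])  # the loss should drop substantially
```

In practice, Jax's automatic differentiation computes these gradients for you, which is precisely the machinery Flax builds on.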

The first perceptron was introduced in 1958 by Frank Rosenblatt in his paper "The Perceptron: A Probabilistic Model for Information Storage and Organization in the Brain". The network has evolved significantly over the nearly seven decades that data scientists have been experimenting with it. While the first…

