Explaining ReLU as a Switch - another way


Figure: in-going weights w of the ReLU neuron's layer; out-going weights w' of the ReLU neuron (part of the next neural network layer's weights).


A (ReLU) neuron has in-going weight connections and out-going weight connections (some of the weights in the next layer).

When the output of a ReLU neuron is zero, those out-going weights are disconnected from the system. They contribute nothing.

When the ReLU(x) function is active (x > 0), ReLU(x) = x. It's as if the ReLU function wasn't there, and the out-going weights of the neuron are connected directly to the weighted sum formed by the in-going weights.
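Here is a minimal single-neuron sketch of that idea (my own illustration, not from the original post; the weights and inputs are made up): when the ReLU is active the out-going weight sees the raw weighted sum, and when it is off the out-going weight contributes nothing.

```python
import numpy as np

w = np.array([0.5, -1.0, 2.0])   # in-going weights of one ReLU neuron
w_out = 1.5                      # one out-going weight to the next layer

def contribution(x):
    s = w @ x                    # weighted sum of the inputs
    return w_out * max(s, 0.0)   # ReLU, then the out-going weight

x_on = np.array([1.0, 0.0, 1.0])    # gives s = 2.5 > 0: switch closed
x_off = np.array([0.0, 1.0, 0.0])   # gives s = -1.0 < 0: switch open

print(contribution(x_on), w_out * (w @ x_on))  # 3.75 3.75 (pure pass-through)
print(contribution(x_off))                     # 0.0 (disconnected)
```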

You are dealing with a switched linear system.
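A small sketch of the whole-layer version of that statement (again my own illustration, with assumed weight matrices W1 and W2): for any fixed input, the ReLU network's output equals one plain linear map, picked out by a diagonal 0/1 "switch" matrix that records which neurons are on.

```python
import numpy as np

rng = np.random.default_rng(0)

W1 = rng.standard_normal((4, 3))    # in-going weights of the ReLU layer
W2 = rng.standard_normal((2, 4))    # out-going weights (next layer)
x = rng.standard_normal(3)          # one input vector

pre = W1 @ x                        # weighted sums feeding the ReLU neurons
out_relu = W2 @ np.maximum(pre, 0)  # ordinary forward pass through ReLU

# The "switch" view: a diagonal 0/1 matrix records which neurons are active.
D = np.diag((pre > 0).astype(float))
out_switched = (W2 @ D @ W1) @ x    # a single linear map for this input

print(np.allclose(out_relu, out_switched))  # True: same result
```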

That frees you up to think of other ways of switching blocks of weights in and out.
