Abstract: Activation functions are pivotal in neural networks, determining the output of each neuron. Traditionally, functions like sigmoid and ReLU have been static and deterministic. However, the ...
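As a concrete illustration of the "static and deterministic" baseline the abstract refers to, the sketch below (an illustrative Python/NumPy example, not taken from the paper) applies sigmoid and ReLU to a neuron's pre-activation; for a given input, both always return the same output and involve no learned or random parameters.

```python
import numpy as np

def sigmoid(x):
    # Static, deterministic activation: the same input always yields the same output.
    return 1.0 / (1.0 + np.exp(-x))

def relu(x):
    # ReLU is likewise fixed: max(0, x), with no stochastic or adaptive component.
    return np.maximum(0.0, x)

# A neuron's output is the activation applied to its weighted input (pre-activation).
z = np.array([-2.0, 0.0, 3.0])   # example pre-activations
print(sigmoid(z))  # approx. [0.1192, 0.5, 0.9526]
print(relu(z))     # [0., 0., 3.]
```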