• Neuronal firing is often understood as a fundamentally binary process, because a neuron either fires an action potential or it does not. This is often referred to as the “all-or-none” principle. Once the membrane potential of a neuron reaches a certain threshold, an action potential fires; if the threshold is not reached, it does not. There is no such thing as a “partial” action potential.

        Frequency Modulation: Even though an individual neuron’s action potential can be considered binary, neurons encode the intensity of a stimulus in the frequency of their action potentials: a stronger stimulus causes the neuron to fire more rapidly. Again, the spike itself is binary in nature, not analog (see the sketch below).
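
        As a rough illustration of both points, here is a toy integrate-and-fire sketch in Python. The threshold, step count, and input values are arbitrary placeholders rather than biological parameters; it only shows that each spike is identical while the firing rate tracks input strength.

        ```python
        # Toy integrate-and-fire neuron (illustrative values only, not a
        # biologically accurate model): the membrane potential builds up until
        # it crosses a threshold, then the neuron emits one identical spike
        # and resets. The spike is all-or-none; only the rate changes.

        def count_spikes(input_current, steps=1000, threshold=1.0):
            v = 0.0        # membrane potential
            spikes = 0
            for _ in range(steps):
                v += input_current      # integrate the stimulus
                if v >= threshold:      # all-or-none: full spike or nothing
                    spikes += 1
                    v = 0.0             # reset after firing
            return spikes

        for current in (0.01, 0.05, 0.1):   # stronger stimulus -> more spikes
            print(f"input={current}: {count_spikes(current)} spikes")
        ```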

        • floofloof@lemmy.caOP · 6 months ago

          Neuronal firing is often understood as a fundamentally binary process, because a neuron either fires an action potential or it does not. This is often referred to as the “all-or-none” principle.

          Isn’t this true of standard multi-bit neural networks too? This seems to be what a nonlinear activation function achieves: translating the input values into an all-or-nothing activation.
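
          For instance, a hard-threshold (step) activation makes that all-or-nothing behaviour explicit. The sketch below is purely illustrative (not from any particular library); note that the weights feeding the neuron are still full precision.

          ```python
          import numpy as np

          # Toy artificial "neuron": a weighted sum followed by a hard-threshold
          # (step) activation, which collapses any graded input into 0 or 1.
          # Values are made up for illustration.

          def step_neuron(inputs, weights, bias=0.0):
              pre_activation = np.dot(inputs, weights) + bias   # graded, multi-bit value
              return 1.0 if pre_activation >= 0.0 else 0.0      # all-or-nothing output

          x = np.array([0.3, -0.7, 0.9])
          w = np.array([0.5,  0.2, 0.8])    # the weights themselves are full precision
          print(step_neuron(x, w))          # -> 1.0
          ```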

          The characteristic of a 1-bit model is not that its activations are recorded in a single bit but that its weights are. There are no gradations of connection weights: they are just on or off. As far as I know, that’s different from both standard neural nets and from how the brain works.
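
          To make the contrast concrete, here is a rough sketch of sign-based weight binarization (a generic scheme in the spirit of BinaryConnect; actual 1-bit models may quantize differently, e.g. to {-1, 0, +1} with a scale factor):

          ```python
          import numpy as np

          # Generic weight binarization sketch: full-precision weights are replaced
          # by their sign, so each connection carries only one bit of information.
          # Specific 1-bit models may use a different quantization scheme.

          full_precision = np.array([0.73, -0.12, 0.004, -0.98])
          scale = np.abs(full_precision).mean()     # one scalar kept in full precision
          one_bit = np.sign(full_precision)         # only the sign survives: -1 or +1

          print("full precision:", full_precision)
          print("1-bit weights: ", one_bit)
          print("approximation: ", scale * one_bit) # what the layer effectively uses
          ```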

        • ItsMeForRealNow@lemmy.world · 6 months ago

          So what you are saying is that they are discrete in time and pulse-modulated, which can encode far more information than the way NNs work on a processor.
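
          As a toy illustration of that point: two spike trains can share the same average rate yet differ in their timing, so timing alone can carry extra information. The trains below are hypothetical, chosen only for the example.

          ```python
          # Two hypothetical spike trains with the same firing rate (4 spikes in
          # 10 time bins) but different timing. A purely rate-based readout cannot
          # tell them apart; a timing-sensitive readout can.

          train_a = [1, 0, 0, 1, 0, 0, 1, 0, 0, 1]   # evenly spaced spikes
          train_b = [1, 1, 1, 1, 0, 0, 0, 0, 0, 0]   # a burst, then silence

          print(sum(train_a), sum(train_b))   # identical rates: 4 and 4
          print(train_a == train_b)           # False: the timing differs
          ```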