Neuroscience makes extensive use of information theory to describe neural communication, among other purposes to calculate the amount of information transferred in neural communication and to attempt to crack its coding. There are fierce debates about how information is represented in the brain and during its transmission within the brain. Neural information theory attempts to adopt the assumptions of electronic communication, despite experimental evidence that neural spikes carry information about non-discrete states, that their communication speed is very low, and that the timing precision of the spikes matters. Furthermore, in biology the communication channel is active, which imposes an additional power and bandwidth limitation on neural information transfer. The paper revises the notions needed to describe information transfer in technical and biological communication systems. It argues that biology applies Shannon's theory outside its range of validity and introduces an adequate interpretation of information. In addition, the presented time-aware approach to information theory reveals evidence for the role of processes (as opposed to states) in neural operations. The generalized information theory describes both kinds of communication, and the classic theory is a particular case of the generalized theory.