Neuromuscular function is typically evaluated by recording the surface electromyogram
(sEMG) during different motor tasks. Linear stochastic signal analysis is widely employed
to extract information from the sEMG and from motor unit activities, even though there are
well-known nonlinearities in the neuromuscular system. Measures based on information
theory have been used in neuroscience. These measures are model-independent and
can capture both linear and nonlinear features of time series. Here, we used Shannon's
entropy to quantify the neural information present in (1) single motor unit spike
trains, (2) compound spike trains (a population signal), and (3) the sEMG envelope (a
gross measure of muscle electrical activity). Participants performed isometric force
control tasks at two contraction intensities (2.5% and 5% of maximum voluntary contraction).
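For reference, the Shannon entropy underlying these estimates can be written in its discrete form as

\[ H = -\sum_{i=1}^{N} p_i \log_2 p_i , \]

where \(p_i\) is the estimated probability of the \(i\)-th symbol (e.g., a spike-count or amplitude bin) and \(H\) is expressed in bits when the base-2 logarithm is used; the specific binning and estimator are not stated in this abstract and are given here only as an assumed example.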
We also computed the correlation (via a linear regression model) between the entropy and
either the coefficient of variation of muscle force or the mean motor unit firing rate.
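As a rough, illustrative sketch only (not the authors' actual pipeline), these quantities could be estimated along the following lines; the histogram-based entropy estimator, bin count, toy data, and the use of scipy.stats.linregress are all assumptions:

```python
import numpy as np
from scipy.stats import linregress

def shannon_entropy(samples, n_bins=32):
    """Shannon entropy (in bits) of a 1-D signal via a histogram estimate."""
    counts, _ = np.histogram(samples, bins=n_bins)
    p = counts / counts.sum()
    p = p[p > 0]                    # drop empty bins to avoid log(0)
    return -np.sum(p * np.log2(p))

def coefficient_of_variation(signal):
    """Coefficient of variation: standard deviation divided by the mean."""
    return np.std(signal) / np.mean(signal)

# Toy data standing in for binned spike counts and force traces (20 trials).
rng = np.random.default_rng(0)
spike_counts = rng.poisson(lam=3.0, size=(20, 1000))   # spikes per time bin
force = 5.0 + 0.2 * rng.standard_normal((20, 1000))    # force in % MVC
mean_rates = spike_counts.mean(axis=1)                  # mean firing rate proxy

entropies = np.array([shannon_entropy(trial) for trial in spike_counts])
force_cv = np.array([coefficient_of_variation(trial) for trial in force])

# Linear regression of entropy against mean firing rate and against force CV.
fit_rate = linregress(mean_rates, entropies)
fit_cv = linregress(force_cv, entropies)
print(f"r(entropy, mean firing rate) = {fit_rate.rvalue:.2f}")
print(f"r(entropy, force CV)         = {fit_cv.rvalue:.2f}")
```

Note that the histogram bin count strongly affects entropy estimates, so any real analysis would require a principled binning choice or a bias-corrected estimator.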
Results showed that entropy increases with force level and is higher for the spike
trains. In addition, spike train entropy was moderately correlated with the mean motor
unit firing rate. These results are promising, as they show that information-theoretic
measures can be applied to signals from motor unit activity to differentiate
contraction intensities and, with further analysis, to investigate the nonlinear
properties of the neuromuscular system.