Redundant trees are commonly used for protection
and restoration in communications networks. Zhang et al. presented a linear time algorithm
to compute node-redundant trees in 2-node-connected networks, which has become widely
cited in the literature. In this paper, we show that it is difficult to implement
this algorithm so that it is both correct and runs in linear time.
Therefore, we present a revised algorithm with strict linear time complexity. Moreover,
we generalize the concept of node-redundant trees from 2-node-connected networks to
arbitrary topologies, a crucial development since real networks cannot always satisfy
2-connectedness, especially after a failure.