Intuitively I think as long as your activation function is sufficiently expressive (e.g. not a step function or something), you should be good to go in theory, since that's what you're feeding back. Might take a while, though.
The word I was looking for is "robust". Any realistic model must be able to tolerate tiny perturbations (like Gaussian noise) of its state vectors, since that's how floating-point arithmetic works. An RNN can't be robustly Turing complete.
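To make the robustness point concrete, here's a toy sketch (my own construction, just for illustration, not from any particular paper): the usual Turing-completeness arguments stash an unbounded stack in the fractional digits of a single hidden unit. With exact rational arithmetic the stack round-trips perfectly; add ~1e-9 Gaussian noise to the state at each step and the bottom half of the stack is unrecoverable.

```python
import random
from fractions import Fraction

# Toy stack-in-a-scalar encoding: each bit is stored as a base-4 digit
# (digit = 2*bit + 1), with the top of the stack as the most significant digit.

def push(state, bit):
    # Prepend a bit as the new most significant base-4 digit.
    return (state + 2 * bit + 1) / 4

def pop(state):
    # Read and strip the top digit: bit 1 lands in [0.75, 1), bit 0 in [0.25, 0.5).
    bit = 1 if state >= 0.5 else 0
    return bit, 4 * state - (2 * bit + 1)

def pop_all(state, n):
    out = []
    for _ in range(n):
        bit, state = pop(state)
        out.append(bit)
    return out

random.seed(0)
bits = [random.randint(0, 1) for _ in range(30)]

# Exact rational arithmetic: every bit comes back (in reverse push order).
exact = Fraction(0)
for b in bits:
    exact = push(exact, b)
assert pop_all(exact, 30) == bits[::-1]

# Same pushes, but perturb the state by ~1e-9 Gaussian noise after each step:
# the top of the stack survives, the deeper bits come back as garbage.
noisy = 0.0
for b in bits:
    noisy = push(noisy, b) + random.gauss(0, 1e-9)

print("pushed (reversed):", bits[::-1])
print("recovered        :", pop_all(noisy, 30))
```

Roughly, noise of size eps gets amplified by 4 with every pop, so anything buried deeper than about log4(1/eps) digits is lost, which is the sense in which the unbounded-precision construction isn't robust.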