Commutativity is a much weaker claim, because it only covers the case where one operation happens entirely before or after the other. E.g. AB may commute with C, so ABC = CAB, but it is not necessarily the case that this equals ACB. With asynchrony you are guaranteed ABC = ACB = CAB. (There may be an existing mathematical term for this, but I don't know it.)
I'm not talking about a universe where all elements commute; I'm talking about a situation in which A, B, and C do not necessarily commute but (AB) and C do. For a rigorous definition: given X and Y from some semigroup G, say X and Y are asynchronous if, for any finite decompositions X = Z_{a_1}Z_{a_2}...Z_{a_n} and Y = Z_{b_1}Z_{b_2}...Z_{b_m} (with the Z's in G), every permutation c_1, ..., c_{n+m} of a_1, ..., a_n, b_1, ..., b_m that preserves the ordering of the a's and the ordering of the b's satisfies XY = Z_{c_1}Z_{c_2}...Z_{c_{n+m}}. I make the following claim: if G is commutative then all elements are asynchronous, but for a noncommutative G there can exist elements X and Y that commute (i.e. XY = YX) but are not asynchronous.
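Here's a quick brute-force check of that claim, as a sketch in Python/numpy (the shear matrices and helper names are my own choices): take X = Y = AB for non-commuting A and B, so X and Y commute trivially, then test every order-preserving interleaving of their factors against XY.

```python
from itertools import combinations
import numpy as np

def order_preserving_interleavings(xs, ys):
    """Yield every merge of xs and ys that keeps each list's internal order."""
    n, m = len(xs), len(ys)
    for slots in combinations(range(n + m), n):  # positions taken by xs
        merged, xi, yi = [], 0, 0
        for i in range(n + m):
            if i in slots:
                merged.append(xs[xi]); xi += 1
            else:
                merged.append(ys[yi]); yi += 1
        yield merged

def product(factors):
    out = np.eye(factors[0].shape[0])
    for f in factors:
        out = out @ f
    return out

A = np.array([[1.0, 1.0], [0.0, 1.0]])  # two non-commuting shears
B = np.array([[1.0, 0.0], [1.0, 1.0]])

X = [A, B]           # X = AB
Y = [A, B]           # Y = AB, so X and Y commute trivially: XY = YX
XY = product(X + Y)  # ABAB

for merged in order_preserving_interleavings(X, Y):
    if not np.allclose(product(merged), XY):
        print("X and Y commute but are not asynchronous")  # AABB != ABAB
        break
```

The interleaving A, A, B, B respects both internal orders but gives AABB ≠ ABAB, so this X and Y commute without being asynchronous.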
To give a concrete example: matrix multiplication is not commutative in general (AB ≠ BA), but multiplication by the identity matrix is (AI = IA). So AIB = ABI, but neither generally equals BAI.
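This is easy to verify numerically (numpy again; A and B are arbitrary non-commuting matrices of my choosing):

```python
import numpy as np

A = np.array([[1, 1], [0, 1]])
B = np.array([[1, 0], [1, 1]])
I = np.eye(2, dtype=int)

print(np.array_equal(A @ I @ B, A @ B @ I))  # True:  I slides past B
print(np.array_equal(A @ I @ B, B @ A @ I))  # False: A and B don't commute
```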
Or, applied to the programming example: two statements can produce the same final state whichever one runs first, yet still misbehave once their individual steps are interleaved.
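For instance (a sketch of my own; the concrete statements are an illustration, not quoted from above), two threads each running "x = x + 1":

```python
# The two statements commute as whole blocks -- either order leaves x == 2 --
# but interleaving their internal read/write steps does not preserve that.
x = 0

def read():          # step 1: load x
    return x

def write(value):    # step 2: store back
    global x
    x = value

# Sequential, in either order: x ends up 2.
write(read() + 1)
write(read() + 1)
assert x == 2

# One bad interleaving of the same four steps (both reads before both writes):
x = 0
a = read()           # thread T reads 0
b = read()           # thread U reads 0
write(a + 1)         # T writes 1
write(b + 1)         # U writes 1 -- the other increment is lost
assert x == 1
```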
I agree. T and U being async with respect to each other means at least that T and U can be broken down into tasks t1, t2, t3, ..., tn and u1, u2, ..., um such that the tasks can be interleaved in any order, though we typically still require that the t tasks execute in their sequential order (and likewise the u tasks). The divisions between the tasks are where they give up control, e.g. as they wait for data to be loaded into memory or on a network call.
Even this is still a special case of what we mean by async with respect to each other, because depending on the interleaving at each step and, e.g., the data already loaded into memory, the number of tasks may change. But the idea is that they still eventually terminate in a correct state.
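That picture maps directly onto coroutines; here's a minimal asyncio sketch (names illustrative) where each await is a point at which a task gives up control:

```python
import asyncio

async def task(name, steps):
    for i in range(1, steps + 1):
        print(f"{name}{i}")
        await asyncio.sleep(0)  # yield control, as if waiting on a load or network call

async def main():
    # T's steps stay in order t1..t3 and U's in order u1..u3,
    # but the two sequences interleave at the await points.
    await asyncio.gather(task("t", 3), task("u", 3))

asyncio.run(main())
```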