When you show the result in memory order, both architectures give the same bytes. It's the operation on the string that matters, not how you print the byte-order-dependent hex value. Both become 02 00 01 06.
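A minimal sketch of that byte-wise view (illustrative, assuming ordinary ASCII digits): subtract '0' from each character and dump the buffer in memory order, never reinterpreting it as a wider integer, so the output is identical on either byte order.

```c
#include <stdio.h>

int main(void) {
    const char *s = "2016";                 /* bytes 32 30 31 36 in memory */
    unsigned char digits[4];

    for (int i = 0; i < 4; i++)
        digits[i] = (unsigned char)(s[i] - '0');   /* per-byte subtraction */

    /* Dump in memory order: prints "02 00 01 06" on both little- and
       big-endian machines, since no wider integer is ever formed. */
    for (int i = 0; i < 4; i++)
        printf("%02x ", digits[i]);
    printf("\n");
    return 0;
}
```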
But we're trying to convert the string "2016" to the integer 2016.
We want to turn the sequence [0x32, 0x30, 0x31, 0x36] (the same on both architectures) into [0x00, 0x00, 0x07, 0xe0] on a big-endian machine or [0xe0, 0x07, 0x00, 0x00] on a little-endian one. You can't simply perform the same procedure on both architectures, since it would result in a reversed sequence on one of them...
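To make that target concrete, here is a rough sketch of the plain digit-by-digit conversion: it works on values rather than on memory layout, produces the single value 2016 (0x7e0) everywhere, and the CPU then stores those bytes as 00 00 07 e0 or e0 07 00 00 on its own.

```c
#include <stdio.h>

int main(void) {
    const char *s = "2016";
    unsigned int value = 0;

    /* Multiply-by-ten accumulation: operates on values, not bytes,
       so it yields 2016 (0x7e0) regardless of endianness. */
    for (int i = 0; s[i] != '\0'; i++)
        value = value * 10u + (unsigned int)(s[i] - '0');

    printf("%u = 0x%x\n", value, value);    /* 2016 = 0x7e0 */
    return 0;
}
```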
On a little-endian machine, the string read as a 32-bit word is 0x36313032, and 0x36313032 - 0x30303030 = 0x06010002.
How is 0x02000106 (the result of the same subtraction on a big-endian machine) the same as 0x06010002?
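They are the same in the sense of the first comment: the two numeric values are just the two byte orders' readings of the identical memory bytes 02 00 01 06. A rough sketch of the word-level subtraction (using memcpy to load the word, to stay within the language rules):

```c
#include <stdio.h>
#include <string.h>
#include <stdint.h>

int main(void) {
    const char *s = "2016";
    uint32_t word;
    memcpy(&word, s, 4);        /* 0x36313032 on LE, 0x32303136 on BE */

    word -= 0x30303030u;        /* 0x06010002 on LE, 0x02000106 on BE */

    /* Dump the word in memory order: prints "02 00 01 06" either way,
       because the differing values describe the same four bytes. */
    unsigned char bytes[4];
    memcpy(bytes, &word, sizeof word);
    for (int i = 0; i < 4; i++)
        printf("%02x ", bytes[i]);
    printf("\n");
    return 0;
}
```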