Cryptography people tend to be good about providing test vectors, especially in the IETF CFRG. Unfortunately, it's not as common a practice as one would like.
I was thinking more about the lack of testsuites for everything related to network protocols, such as DNS, HTTP, WebSockets, or WebRTC.
The RFCs related to DNS are an endless list of deprecations that you have to "merge" in your head until you actually know what is allowed, what is deprecated, and what was extended; especially with EDNS and all its options, which are scattered somewhere on the IANA website.
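To give a rough idea of what even a trivial check looks like, here is a minimal sketch that probes whether a resolver answers with an EDNS OPT record at all. It assumes the dnspython library and uses a public resolver and query name as placeholders; neither is tied to any particular testsuite.

    # Sketch: probe a resolver for EDNS support (dnspython assumed installed).
    # Resolver address and query name are placeholders for illustration only.
    import dns.message
    import dns.query

    query = dns.message.make_query("example.com", "A", use_edns=0, payload=1232)
    response = dns.query.udp(query, "8.8.8.8", timeout=2.0)

    # Message.edns is the EDNS version in the answer, or -1 if the
    # response carried no OPT record.
    if response.edns >= 0:
        print("EDNS version", response.edns, "advertised payload", response.payload)
    else:
        print("resolver answered without EDNS")

Even a check this small already surfaces differences between resolvers, which is exactly the kind of behaviour a shared testsuite would pin down.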
Before that, I realized that not a single server implementation handles HTTP's 206 Partial Content and/or Transfer-Encoding as specified; and lots of servers even reply with wrong buffer sizes when multiple Content-Ranges are requested.
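A minimal probe for that behaviour could look like the sketch below. The URL is a placeholder, and per RFC 7233 a server may legitimately fall back to 200 with the full body; the interesting cases are the ones that claim 206 but don't return a proper multipart/byteranges response for a multi-range request.

    # Sketch: send a multi-range request and inspect how the server labels
    # its answer. The URL is a placeholder, not a real endpoint.
    from urllib.request import Request, urlopen

    url = "https://example.org/some-static-file"
    req = Request(url, headers={"Range": "bytes=0-3,8-11"})

    with urlopen(req, timeout=5) as resp:
        ctype = resp.headers.get("Content-Type", "")
        body = resp.read()
        print("status:", resp.status)
        print("content-type:", ctype)
        print("body length:", len(body))
        if resp.status == 206 and not ctype.startswith("multipart/byteranges"):
            print("server ignored the second range or mislabeled the response")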
When reading through the Chromium and Firefox codebases, there are always implementation-specific dirty hacks, and there isn't any end-to-end, network-only testsuite that verifies the network states and behaviours.
For my own Browser Stealth [1] I had to create a testsuite because I couldn't find one that isn't related to known SSL attack vectors. Due to its peer-to-peer concept I decided to test network behaviours wherever possible.
The network protocols themselves (when speaking of RFCs) are just not tested, and are interpreted at will, even in older projects like apache, caddy, curl, libaria and others. When reading the curl codebase, you'll soon realize that it is a huge collection of hacks its author had to implement just to make things work with servers that don't comply with the specifications.