I spent about five minutes just now--on my iPhone, to kill time before falling asleep, so I notably didn't even have a keyboard--barely glancing at your code on GitHub, and found a buffer overflow :( in your data channel DCEP notification implementation (and then spent like a half hour verifying it on my iPhone and typing this comment on my iPhone, so I am sadly now much more awake than when I started ;P).
You have MAX_DATA_CHANEL_NAME_LEN (which is misspelled; I was very confused when the GitHub search couldn't find it ;P) set to 255 and in RtcDataChannel you allocate name with that size (already suspicious, as in most other places you + 1 for a NUL byte; this is what caused me to search for usages: in samples/Common.c you incorrectly assume it is terminated, btw); but, in handleDcepPacket you get a 16-bit value for the length which you pass through dataChannelOpenFunc to onSctpSessionDataChannelOpen... and then it is that length (as pNameLen) which gets passed to STRNCPY (which I verified is a #define for strncpy) instead of the size of the buffer.
I had momentarily considered "maybe this field fundamentally can't be larger than 255 for some other reason I don't see here" (as I honestly couldn't remember what the length limitations for these things are), but the standard even makes clear that these fields can go up to the full 16-bit length:
> The DATA_CHANNEL_OPEN message contains two variable-length fields: the protocol and the label. A receiver must be prepared to receive DATA_CHANNEL_OPEN messages where these fields have the maximum length of 65535 bytes.
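To make the fix concrete, here is a minimal C sketch of the safe pattern (the names `MAX_NAME_LEN` and `copy_label` are my own illustrative stand-ins, not the SDK's actual identifiers): the copy length must be clamped to the destination buffer's capacity rather than trusting the attacker-controlled 16-bit length from the wire.

```c
#include <assert.h>
#include <stdint.h>
#include <string.h>

#define MAX_NAME_LEN 255  /* hypothetical stand-in for the SDK's constant */

/* Clamp the copy to the destination buffer (leaving room for a NUL)
 * instead of trusting the 16-bit length taken off the wire. */
static void copy_label(char dst[MAX_NAME_LEN + 1], const char *src, uint16_t wireLen)
{
    size_t n = wireLen < MAX_NAME_LEN ? (size_t)wireLen : MAX_NAME_LEN;
    memcpy(dst, src, n);
    dst[n] = '\0'; /* always terminate, unlike strncpy on truncation */
}
```

With this shape, a maximum-length (65535-byte) label is silently truncated to fit, instead of writing past the end of a 255-byte allocation.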
(This then made me do a quick search for STRNCPY in your code, and I am noticing you are using the wrong length constants for most of the calls in the SDP parser; this happens to work, because they are all defined to be the same number--255--but is still highly suspicious and would make me want to do a much more thorough audit. Really, my recommendation would be to never call STRNCPY directly: abstract it behind a macro that enforces you pass a direct array and an offset so it can do sizeof() to get the right size at compile time.)
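One hedged sketch of the macro suggested above (`SAFE_STRNCPY` is a made-up name, not anything in the SDK): by taking the address of the destination, it refuses to compile unless `dst` is a genuine array, so `sizeof(dst)` is guaranteed to be the real buffer size at compile time.

```c
#include <assert.h>
#include <string.h>

/* Only compiles when `dst` is a real char array: &(dst) must convert to
 * a pointer-to-array of exactly sizeof(dst) chars, which fails for a
 * plain char*. sizeof(dst) is then the true buffer size. */
#define SAFE_STRNCPY(dst, src)                                        \
    do {                                                              \
        char (*check_)[sizeof(dst)] = &(dst); /* array-ness check */  \
        (void)check_;                                                 \
        strncpy((dst), (src), sizeof(dst) - 1);                       \
        (dst)[sizeof(dst) - 1] = '\0'; /* strncpy may not terminate */\
    } while (0)
```

This also papers over `strncpy`'s other trap: it does not NUL-terminate when the source fills the buffer, so the macro terminates unconditionally.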
> We were trying to scheme a way to convince you to use Pion :p
Honestly, I guess I am just of the firm belief that this is 2020, and no one should be writing code in "Pure C" under any circumstance :/. You might totally find a place where I made a mistake like this in my code (as I am, after all, still using C++, and I have very few fundamental guarantees), but when I fix it it will fix that entire class of bug for all of my code everywhere, as I have templated abstractions for things like buffer management (where I have even been trying to slip in some pseudo-dependent types to elide runtime checks) and am able to use destructors to manage my resources; my biggest issue is dealing with asynchronous coroutines running code in objects as they get deallocated, but even there I have been able to design a mostly type-safe abstraction (that if nothing else should catch the coding mistake at runtime and safely terminate the process). Can I maybe convince you to rewrite this in C++20 or Rust? :(
> We were trying to scheme a way to convince you to use Pion :p
FWIW, I just determined that "Pion" was actually referring to yet another implementation of WebRTC you have done, this one in Go, and so is actually quite interesting from a security perspective (for the very reasons why I believe this one to be dangerous). I had not seen it before.
My biggest concern is that I only have 15MB of memory available on iOS in the Network Extension (an absolutely brutally small amount of memory), and so I have a hard time using anything written in Go (I actually tried this before, linking parts of Ethereum written in Go, and then had to come up with an alternative plan), but I will add it to my list of projects to evaluate. If it can manage to only use a megabyte or two of RAM as overhead I can seriously consider it (I have infinite text segment, though).
(My current intention is to rewrite a lot of Orchid's code in Rust at some point; their async/await support wasn't quite ready when I started this, and I had a deadline, but it has since been released. I actually feel bad that I wrote as much new security code in C++, but I have gone to great lengths to make what I am doing as safe as possible and I allocate a good amount of my time to safety engineering... and even then I will note that it would likely be obvious how to make that coroutine issue impossible to even express in Rust.)
(Also: a friend of mine has noted that I come off as arrogant in my previous comment. I don't agree with a lot of his advice for it, but I do agree that I come off as arrogant. Part of it is probably that I am a bit arrogant, which I acknowledge and will admit sucks. I made the comment about the misspelled identifier as I thought it would be a light and humorous moment in an otherwise dark comment, not because I wanted to rub anything in; and I accept that "this didn't take me long to find" makes it seem like I am showing off, but I really really really wanted to make it clear that I didn't spend half the day analyzing the code to find one bug. I dislike Rust and I dislike the Rust brigade, but they are actually "correct", and to have an SDK for an AWS product that I actually have been wanting to use on another of my projects come out marketing how it is written in "Pure C"--along with all the bugs that one would expect from a project written in "Pure C"--is extremely disappointing and should be held up as an example of why I likely shouldn't even be tolerating C++ and maybe should code everything in Rust no matter the tradeoffs.)
> If it can manage to only use a megabyte or two of RAM as overhead I can seriously consider it
Let's see if we can make this happen! If you are interested, I would love to help. I have been looking at making Pion WebRTC work with TinyGo; then it would work really well!
> I come off as arrogant in my previous comment
You are fine! Everyone has different communication styles. You bring up valid points, and I really appreciate your enthusiasm. Earlier in my career it would have hurt, but I have been through much worse. I will work on adding a spell checker to Travis; not sure how else to stop something like that from happening again.
> "Pure C"--is extremely disappointing
I walk two very distinct paths in life. In one, I work on Pion and have very idealistic goals. I am proud that it builds fast, is accessible, community-owned, and safe code. The unfortunate reality is that many people are never going to use it, and those that do usually aren't in a position to pay. The majority of my working hours in the last two years have been on it.
There is a demand for the C SDK. People have things they want to build, and I want to empower them. I agree that C has issues, but all I can do is try to fight that (sanitizers, fuzzing, code coverage). I am excited about what people are going to build with it. I try to be pragmatic, and my paying work lets me go and accomplish things like Pion.