My first React app initially took about 10 seconds to render changes on an iPhone 5s and a 1st gen Moto X. A user with a Chromebook also complained that it degraded other apps. I did some digging and found that many React developers use some form of immutability with the pureRender mixin. This is a huge boost to performance. Doing that alone made it slow but usable, and then I batched up changes to state and split parts of the application into different tabs to improve things even more.
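For context, the pureRender optimization boils down to a shallow comparison of props and state. This is a hand-rolled sketch of that check (the real thing lived in the react-addons-pure-render-mixin package), not React's actual source:

```javascript
// Sketch of the shallow comparison pureRender performs: two objects are
// "equal" if they have the same keys and each value is reference-identical.
function shallowEqual(a, b) {
  if (a === b) return true;
  const keysA = Object.keys(a);
  const keysB = Object.keys(b);
  if (keysA.length !== keysB.length) return false;
  return keysA.every(key => a[key] === b[key]);
}

// A shouldComponentUpdate-style check: skip the re-render entirely when
// neither props nor state changed by this shallow test.
function shouldComponentUpdate(prevProps, nextProps, prevState, nextState) {
  return !shallowEqual(prevProps, nextProps) || !shallowEqual(prevState, nextState);
}
```

This is why it pairs with immutable data: identity checks are only meaningful when updates always produce new references.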
To this day, I still wouldn't be done with the application had I used vanilla JS, so even with the performance tuning it was worth it, but it was not without cost.
i've been doing react for almost two years (since before it was publicly released) and i've never seen anything render that slowly, and i basically never use pureRender
if you share that code with me i will find the actual problem for you, just because i'm curious. it isn't react.
For what it's worth, I also work on a fairly large React application and haven't run into organically strange performance dips. If we're designing a component to do interesting things with unbounded data sources, we absolutely consider the vanilla JS approaches before trying to model the problem in terms of composable React components. But compared to every other approach I've experienced building web applications, React does give us pretty ergonomic and efficient building blocks by default.
It wasn't just React; I was definitely doing unneeded state changes, and I've now gotten it to the point where there's no noticeable delay.
One big issue was that I wanted dependent fields to update as-you-type, which also meant validating as-you-type. I added a timer to delay that so it wasn't all running on every single keystroke; nothing would update until you stopped typing.
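That timer trick is the usual debounce pattern. A minimal sketch (hypothetical helper names, not the commenter's actual code):

```javascript
// Debounce: wrap a function so it only fires after `wait` ms of silence.
// Repeated calls within the window reset the timer, so expensive
// validation runs once after the user stops typing, not per keystroke.
function debounce(fn, wait) {
  let timer = null;
  return function (...args) {
    clearTimeout(timer);
    timer = setTimeout(() => fn.apply(this, args), wait);
  };
}

// Usage sketch: recompute dependent fields ~300ms after the last keystroke.
// const handleChange = debounce(value => validateAndUpdate(value), 300);
```

Libraries like lodash ship a more featureful `debounce` (leading/trailing edges, cancel), but the core idea is just the timer reset above.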
Reducing the number of dependent values in the DOM tree at any one time helped a lot too, and there were logical categories to split them into. My validation code was not particularly optimized either, since it was originally designed for batch processing.
I really don't get the huge concerns people have with performance. I find Chrome's profiling tools primitive compared to what I'm used to, but they were more than adequate for me to steadily improve performance.
There are probably still some improvements to be had, but right now there are no issues with responsiveness on mobile, so I stopped looking.
I can totally empathize with the large form woes, having worked on a couple of similar client projects with similarly unusual requirements. Angular (the first version, I hear awesome things about the latest), in particular, fell down hard when managing a very large number of semi-dependent input fields. We ended up customizing large swaths of its collection diffing code to gain minor performance increases before designing a pure JS solution that was very performant, if very expensive and time-consuming to implement. I have no reason to think that React would have fared any better in either case.
I was starting from a CRUD application with forms for inputting all the fields to store in a database, from which a PDF could then be generated. So I already had code to calculate all of the dependent values from the inputs. Turning it into React took about one day's work to get the poorly performing prototype, and fully switching to ImmutableJS took another few hours.
Also, I know approximately nothing about web browsers. It's possible that using something like Bootstrap or Angular would have given me a big boost here as well, but the article is comparing to vanilla JS, not to other frameworks.