Well, you're running both the iOS development tools (Xcode, iOS Simulator) and the Android development tools (Gradle, Android emulator, and maybe Android Studio too). These add up.
16GB might be possible, though.
(Skip itself doesn't take much memory. If you run it headlessly as a SwiftPM plugin, you wouldn't need nearly that much.)
Do you have to run both at the same time? My flow with React Native is to focus on one platform at a time; I don't try to run everything in one shot.
No, you can configure it to just build and launch for iOS or Android separately. But we do recommend iterating on both in parallel for most of the UI work, just to make sure that everything stays in sync.
For framework/library development, you can of course build and test separately for each platform.
And if we're talking Expo, that's only for prebuilds of course; once you've got the native app installed, you can absolutely code and see updates in near real time on both Android and iOS devices.
Likely because it uses both the iOS and Android toolchains, plus its own transpiler (with Skip Lite) or other overhead (with Skip Fuse). iOS alone is already challenging with 16GB. Don't blame Skip for this; it's on Apple and Google for not shipping memory-efficient tooling, which shouldn't be a surprise if you've used their software.
Well, you always hope there is some overlap between those who need to hear it, those you can reach, and those for whom it will make the tiniest bit of difference.
If it truly brings you joy, we have that covered: it's a simple enough hobby!
The actual issue is that you still need something that makes money. I think that's fairly unproblematic for a programmer too, at least for the foreseeable future: all those agents will need direction. Anyone can do that up to some level of complexity on their own, sure, but it is simply hard for humans to structure requirements and reason about big enough systems, and I don't see demand for that decreasing.
> But there were benefits to knowing as, ld and ar, and there still are today.
This is trivially true. The constraint on anything you do in your life is the time it takes to learn something.
So the far more interesting question is: at what level do you want to solve problems – and is it likely that you need knowledge of as, ld, and ar over anything else that you could learn instead?
Knowledge of as, ld, ar, cc, etc. is only needed when setting up (or modifying) your build toolchain, and in practice you can just copy-paste the build script from some other, similar project. Knowledge of these tools has never been needed.
Knowledge of cc has never been needed? What an optimist! You must never have had headers installed in a place where the compiler (or Makefile author) didn’t expect them. Same problems with the libraries. Worse when the routine you needed to link was in a different library (maybe an arch-specific optimized lib).
The library problems you described are nothing that can't be solved using symlinks. A bad solution? Sure, but it works, and doesn't require me to understand cc. (Though when I needed to solve this problem, it only took me about 15 minutes and a man page to learn how to do it. `gcc -v --help` is, however, unhelpful.)
"A similar project" as in: this isn't the first piece of software ever written, and many previous examples can be found on the computer you're currently using. Skim through them until you find one with a source file structure you like, then ruthlessly cannibalise its build script.
I think I know what they mean; I share a similar experience. It has changed: 3.5 couldn't even attempt to solve non-trivial tasks, so it was a 100% failure rate; now it's 70%.
Dear lord, what?