The whole idea with "NoSQL" is that you pick the best tool for the job. Most people who push RDBMS solutions try to shoehorn an RDBMS into every possible scenario. For modeling the majority of business data, an RDBMS is fine. There are many problems, such as graphs, logs, and lossy real-time data, that CAN be modeled in an RDBMS but shouldn't be.
I largely agree with this, but in my mind using an RDBMS for a sub-optimal task is not always shoehorning; it can be a powerful hedge. In an early-stage startup you don't know what the final business model is going to be, or how your data will need to scale. Spending time researching and setting up optimized data stores for things that change drastically or never end up being important is a tremendous waste of time.
I suppose it could also be said that spending time designing relational schemas that will change drastically, then migrating data every time they do, is wasteful. Many non-relational databases allow quite a bit of flexibility in terms of incrementally changing structure as you learn about your business needs.
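For what it's worth, here is a minimal sketch of the kind of flexibility I mean, assuming a document store like MongoDB accessed through pymongo (the database, collection, and field names are made up purely for illustration):

    from pymongo import MongoClient

    # Connect to a local MongoDB instance; "example_db" and "users" are
    # hypothetical names used only for this sketch.
    client = MongoClient("mongodb://localhost:27017")
    users = client.example_db.users

    # Early on, the product only tracks a name and an email.
    users.insert_one({"name": "Ada", "email": "ada@example.com"})

    # Later the business adds paid plans. New documents simply carry the
    # extra field; no ALTER TABLE, no migration of existing records.
    users.insert_one({"name": "Grace", "email": "grace@example.com", "plan": "pro"})

    # Old and new documents live side by side and are read back unchanged.
    for doc in users.find():
        print(doc.get("name"), doc.get("plan", "free"))

Whether that flexibility stays a blessing once the schema settles down is, of course, a separate debate.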
Not that I am in one camp or the other. We use both relational and non-relational databases where it makes sense. The more of each (both in choices and users), the better, so far as I am concerned.
That's a really good point. Framing NoSQL as picking the right tool for the job makes it easy to dismiss. There are lots of programmers out there who just refuse to look at any tools beyond the very small set they already know. I think it would be beneficial to change that attitude across the spectrum of programming tools.
Both are in nearly every dictionary around. And given the base word "ubiquitous", I prefer adding "-ness" to denote a quality, which is extremely common in English and can even be applied to words that don't normally accept it, so it's an easy cognate: something being cheesy == having cheesiness. "Ubiquity" is more of a hacked form, since it removes part of the base word.
Though I admit it's shorter, and thus may eventually achieve ubiquity over ubiquitousness.
Not to continue the content-free let-me-fix-that-for-ya sniping you're answering, but "ubiquity" dates back to 1579. It's a nice, old word, not a hack or neologism.
Nonetheless, both words are entirely proper, acceptable, and clear.
>Most people who push RDBMS solutions try to shoehorn an RDBMS into every possible scenario.
I don't think that's true for well-informed developers in a space where NoSQL is acceptable.
It might be true generally only in the sense that most developers don't know or care about NoSQL and don't like the idea of learning something new. That's to be expected; most of these guys will never leave .NET/Java and the blessed toolkits associated with each. They are corporate programmers and they don't really count.
A well-informed person might make some good social arguments against NoSQL adoption, like the lack of experienced available developers, or the relative immaturity of the respective codebase, or other issues that surround emerging platforms. These arguments will sometimes hold merit.
Otherwise, I don't know why one would be averse to using a "NoSQL" datastore for information like logs. I know that I've always hated having logs in the relational db.
I haven't met many experienced, decent developers who shun NoSQL when the technical and organizational environment is a good fit for it. I don't think it's a widespread thing.