
>We've been doing CRUD in our industry for decades. How can we not just say "this is how you do CRUD, we're done w/ that now"

As an analyst, can you explain this bit?

I keep hearing things like "that's not actually a software development job, just CRUD", "we're done with doing CRUD", etc. But it seems like all the CRUD is taken care of between the application and the DBA, so wouldn't the developer just work on the application itself? And isn't saying "we don't do CRUD anymore" somewhat akin to saying "we don't do [+-*/] anymore"? How can you have persistent data without CRUD? I must be missing a piece of the puzzle in this discussion.



It's a reductive, dismissive way of thinking, like saying that everything is ones and zeros, or that we're just copying protobufs around.

The data that we manipulate has business meaning and there are consequences for the users that arise from how we model things. Consider the genre of articles like "Falsehoods Programmers Believe About Names" [1]. There is ridiculous complexity here, for those willing to see it, but some people get tired of it.

[1] https://www.kalzumeus.com/2010/06/17/falsehoods-programmers-...


Not OP, but my take: When people talk about "CRUD" in the way you describe, they're usually talking about one of two separate (but related) things.

The "it's not _actual_ development" framing is usually directed at applications which "only" allow users to perform basic actions on some data, basically UIs for manipulating a database. It is absolutely real development (in my view), but less sexy than AI/ML, big data, etc, etc.

You are correct that every application (with some sort of data persistence) needs CRUD. But how CRUD is implemented, for better or for worse, depends on the requirements of the application storing the data. For (most) relational databases, the low-level "how do I CRUD" is well defined: standard SQL queries. But if I use NoSQL, flat files, or something else, it changes.
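To make that concrete, here's a minimal sketch (Python standard library only; the users schema is hypothetical) of the same "create" against SQLite versus a flat JSON file. The intent is identical; the mechanics are not:

    import json
    import os
    import sqlite3

    # "Create" against a relational store: the how is a SQL statement.
    def create_user_sql(db_path, name, email):
        with sqlite3.connect(db_path) as conn:
            conn.execute("CREATE TABLE IF NOT EXISTS users "
                         "(id INTEGER PRIMARY KEY, name TEXT, email TEXT)")
            cur = conn.execute("INSERT INTO users (name, email) VALUES (?, ?)",
                               (name, email))
            return cur.lastrowid

    # "Create" against a flat file: the how is read-modify-write of a document.
    def create_user_flatfile(path, name, email):
        users = []
        if os.path.exists(path):
            with open(path) as f:
                users = json.load(f)
        user = {"id": len(users) + 1, "name": name, "email": email}
        users.append(user)
        with open(path, "w") as f:
            json.dump(users, f)
        return user["id"]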

The definition of CRUD also varies depending on the layer of abstraction within an application, or the perspective of the user/developer. For example: from a DBA's perspective, CRUD is SQL queries; from the UI's perspective, it might be a JSON API or GraphQL endpoint; from a server-side application's perspective, it might be a specific ORM library.
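A rough sketch of that layering in Python (UserRepo and the handler are illustrative names, not any particular library's API):

    import json
    import sqlite3

    # Storage layer (the DBA's view): CRUD is a literal SQL statement.
    def update_email_sql(conn, user_id, email):
        conn.execute("UPDATE users SET email = ? WHERE id = ?", (email, user_id))

    # Application layer (the server's view): CRUD is a method on a
    # repository/ORM-ish object; the SQL becomes an implementation detail.
    class UserRepo:
        def __init__(self, conn):
            self.conn = conn

        def update_email(self, user_id, email):
            update_email_sql(self.conn, user_id, email)
            self.conn.commit()

    # Transport layer (the UI's view): CRUD is a JSON body sent to an
    # endpoint, e.g. PATCH /users/42. This handler stands in for a framework.
    def handle_patch_user(repo, user_id, body):
        payload = json.loads(body)  # e.g. '{"email": "new@example.com"}'
        repo.update_email(user_id, payload["email"])

    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE users (id INTEGER PRIMARY KEY, email TEXT)")
    conn.execute("INSERT INTO users (id, email) VALUES (42, 'old@example.com')")
    handle_patch_user(UserRepo(conn), 42, '{"email": "new@example.com"}')

Same operation all the way down; each layer just calls it something different.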


Yeah, CRUD is a solved problem but you still have to do it.

Mapping state to the database is to web dev what applying paint to the canvas is to painting. It’s how you do it that counts. Saying otherwise is overly reductionist.

Frameworks exist that abstract CRUD away. But you end up sacrificing UX and/or flexibility.
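As an illustration of that trade-off, here's a sketch of a hypothetical generic-CRUD helper (not any real framework): one implementation gives you all four verbs for any table, until a requirement stops fitting the pattern:

    import sqlite3

    # A generic CRUD wrapper: one implementation for any table. Table and
    # column names are trusted developer input here, never user input.
    class CrudTable:
        def __init__(self, conn, table, columns):
            self.conn, self.table, self.columns = conn, table, columns

        def create(self, **values):
            cols = ", ".join(values)
            marks = ", ".join("?" for _ in values)
            cur = self.conn.execute(
                f"INSERT INTO {self.table} ({cols}) VALUES ({marks})",
                tuple(values.values()))
            return cur.lastrowid

        def read(self, row_id):
            cur = self.conn.execute(
                f"SELECT {', '.join(self.columns)} FROM {self.table} WHERE id = ?",
                (row_id,))
            return cur.fetchone()

        def update(self, row_id, **values):
            sets = ", ".join(f"{c} = ?" for c in values)
            self.conn.execute(
                f"UPDATE {self.table} SET {sets} WHERE id = ?",
                (*values.values(), row_id))

        def delete(self, row_id):
            self.conn.execute(f"DELETE FROM {self.table} WHERE id = ?", (row_id,))

    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE users (id INTEGER PRIMARY KEY, name TEXT, email TEXT)")
    users = CrudTable(conn, "users", ["id", "name", "email"])
    uid = users.create(name="Ada", email="ada@example.com")

    # Works until the first requirement that isn't row-shaped: soft deletes,
    # audit trails, cross-table transactions, per-field permissions...

At that point you're fighting the abstraction instead of using it.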


Picking the right level and nature of abstraction for the problem at hand is something of an art. Too high and you'll straitjacket yourself. Too low and you'll spend most of your time maintaining ugly boilerplate.
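For a feel of the "too low" end, a minimal sketch (hypothetical orders schema, sqlite3-style connection assumed):

    # The "too low" end: every entity gets its own hand-rolled copy of the
    # same functions, differing only in table and column names.
    def create_order(conn, customer_id, total):
        cur = conn.execute(
            "INSERT INTO orders (customer_id, total) VALUES (?, ?)",
            (customer_id, total))
        return cur.lastrowid

    def read_order(conn, order_id):
        return conn.execute(
            "SELECT id, customer_id, total FROM orders WHERE id = ?",
            (order_id,)).fetchone()

    # ...then update_order, delete_order, and the same four again for
    # customers, products, invoices: boilerplate to keep in sync by hand.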

One of the many reasons why CRUD is way harder than its reputation suggests.


I suspect their point is more: if it's a solved problem, why do we keep making new ways to do it?


CRUD is looked down upon because it's time-consuming and repetitive when you do it with poorly designed tools, and because it's the most common role.

I think it's mostly a class thing though. Test automation is similarly looked down upon even though it is often much harder to do right than regular coding.

There is a definite pecking order when it comes to programmer roles and it's not necessarily related to difficulty (although it correlates very strongly with pay).



