A word used only by Postgres developers
286 points by ccleve on March 10, 2022 | 156 comments
I came across a word in the Postgres source code that I'd never seen before: "frammish".

https://github.com/postgres/postgres/blob/master/src/backend... :

> Therefore, they offer both exclusive and shared lock modes (to support read/write and read-only access to a shared object). There are few other frammishes. User-level locking should be done with the full lock manager --- which depends on LWLocks to protect its shared state.

It sort of makes sense in context, as a "feature" or a "flourish". It also appears on the pg_hackers mailing list:

> There has been some talk of separating the power to create new users from the power of being superuser (although presumably only a superuser should be allowed to create new superusers). If the planned pg_role rewrite gets submitted before the 8.1 feature freeze, I might look at adding that frammish into it.

and here, from 19 years ago:

> And we get ragged on regularly for the non-SQL-standard features we've inherited from Berkeley Postgres (eg, the implicit-FROM frammish that was under discussion yesterday).

No amount of googling turns up a formal definition or usage outside of the Postgres community. "frammish.org" doesn't seem to be related.

Are Postgres developers starting to evolve their own dialect? Should we call an anthropologist?



I recently caught someone out trying to hide their deep knowledge of Postgres when, in a moment of weakness after I mentioned one of the benefits of TOAST, he replied "I guess that's why they call it 'the best thing since sliced bread'".

And that's when I knew.

Because that's an exact quote from the docs.

> This section provides an overview of TOAST (The Oversized-Attribute Storage Technique).

> PostgreSQL uses a fixed page size (commonly 8 kB), and does not allow tuples to span multiple pages. Therefore, it is not possible to store very large field values directly. To overcome this limitation, large field values are compressed and/or broken up into multiple physical rows. This happens transparently to the user, with only small impact on most of the backend code. The technique is affectionately known as TOAST (or “the best thing since sliced bread”). The TOAST infrastructure is also used to improve handling of large data values in-memory.

https://www.postgresql.org/docs/current/storage-toast.html
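
For the curious, here's a minimal sketch of poking at TOAST from SQL. The table and column names are made up; pg_class.reltoastrelid and the SET STORAGE options are standard Postgres:

  -- Hypothetical table with a potentially very wide column.
  CREATE TABLE docs (id int PRIMARY KEY, body text);

  -- Each such table gets a companion TOAST table; its OID is recorded in pg_class.
  SELECT reltoastrelid::regclass FROM pg_class WHERE relname = 'docs';

  -- Per-column TOAST strategy: PLAIN, MAIN, EXTERNAL (out-of-line, uncompressed),
  -- or EXTENDED (out-of-line, compressed; the default for text).
  ALTER TABLE docs ALTER COLUMN body SET STORAGE EXTERNAL;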


Why would anyone try to "hide their deep knowledge of Postgres"? Like, I could easily see someone having to hide their deep knowledge of MongoDB--lest they be branded forever as "damaged"--but I've been under the impression that PostgreSQL skills are considered a really good thing this past decade or so... were they just really hoping to avoid becoming a database engineer, or were they maybe under threat of becoming an "on call" asset?


If people ever start seeing you as “the database guy”, you’ll be pigeonholed into that slot as long as you remain in that team.

Or maybe their manager was just a really big fan of Oracle or something. I don’t know.


> you’ll be pigeonholed into that slot as long as you remain in that team

That's a good thing: it gives you great leverage to ensure good retention-pay.

-------

During your 1:1s/"connects"/perf-evals, don't frame it as "I hate being the only one here who knows SQL, I'm quitting/gimme-a-raise"; instead frame it as "I'm proud of the great accomplishments that my unique and deep understanding of both database theory and SQL in practice has brought to the company; given the significant responsibilities I now have in the team/company, I'm sure you'll "recognize" my now quite-significant leading role...."

(for best effect make the "money-gesture" with your fingers during the last part).

...I wish I had the gall to do that when I was younger.


It also creates problems that normally wouldn't exist. I was "the regex guy" at my last company. Every couple of days someone would be trying to solve a problem that would much more reasonably be handled by a few string.split() or indexOf() calls and some logic in their application's language. Instead they would message me "I need to do blahblahblah, can you give me a regex for that?" Or even worse "I've written this regex but it doesn't work on a few cases, can you help me fix it?" and then share some 600 character long regex monstrosity.

90% of the time my answer was simply "Yeah... That's a bad use-case for a regex. You can use this Java snippet to accomplish the same thing."

At the start I would actually solve their request with some hairy regex, but it generally did not perform well, and when requirements changed they'd be unable to edit it and would find me again to change it.



Yep! When I first heard of this it was phrased as "the chainsaw problem".

"Hey can I borrow your chainsaw?"

"Sure, you need the gas powered one or the electric one?"

"Oh I'm not sure, which one would be best for cutting my hair?"


Did the company ever use the "internal market" paradigm? If so, keep track of your time spent on other people's problems, record it in a spreadsheet, and invoice other people's managers accordingly :)


That sounds like a quick way to shut down collaboration across the team...


...and get shit done. I mean come on, who asks a person in another team to help with a regular expression? There has to be a line somewhere. You can be a go-to person for an obscure technology or some complex framework, but being a go-to person for regular expressions? What's next? Go-to person for iteration?


When I joined it was about 15 employees. When I left it was about 150 (but still under 30 devs) so this wouldn't really be reasonable.

But part of the problem was they did know simple enough regex for what they wanted to accomplish. Having a few simple regex and some application logic around them would have done it. But they'd often think "that sounds messy, squeaky-clean can probably solve this in a single regex." And while I probably could, it would not be good code nor would it be more performant.


How are you supposed to get work done on a team, if any question asked to another team member is immediately seen as "unproductive"? It's not just about regexes. I'm having trouble debugging a system, who do I ask? Nobody, apparently, because if I do then I might incur a "bill" from my own coworkers!


You wouldn't be logging time spent on stuff that's part of your actual responsibilities, and work for your own team (under the same manager) is fine - the whole "internal market" stuff applies to inter-team work, especially work/asks/tasks that don't go through the chain-of-command (i.e. from IC1 in team A to IC2 in team B without going through either Team A's manager or Team B's manager).


I think in some cases, like the ones people mentioned above, it's no longer collaboration but more of a consultation. The benefit only goes in one direction: to the person being helped. In that case, you need some way to make the nature of that relationship explicit and valued. Otherwise these consultants begin to feel they're being put in a position of working for someone else without management recognizing that and compensating it as such.


When I joined it was still a pretty early stage startup so that wouldn't have been reasonable. When I left it was pretty solidly in the mid-sized company category, so maybe something like that would have been doable (invoices are pretty rude though). I did enjoy helping cross-team though, it's fun working with people you don't work with on a daily basis, and they were/are my friends.

I wouldn't even have minded being seen as the "string pattern/parsing guy". But I was specifically regex-guy in their minds.

Also story for another day and another thread, I was also the MongoDB-guy. Similar to the regex stuff though, in 90% of the questions they'd send to me MongoDB was not the correct choice.

Overall it's more nuanced than could ever fit into an HN comment (yes, I brought it up with management, and there were other issues), but there are a number of reasons I quit. I quit a couple of months before a company buyout we knew was going to happen, even though I had equity. I felt like it was probably the right choice then. I recently got some beers with some of my friends who stayed through the buyout and had equal amounts of equity, and learned I did make the correct choice.


I know enough people who have gotten pigeonholed as "the database person" to believe that's not an enviable place to be. You end up getting a lot of tedious, fiddly work dumped on you by people who have no idea what they're doing and need you to clean up their messes.

No amount of retention pay would raise my spirits in that situation.


> people who have no idea what they're doing and need you to clean up their messes

Not even this. It's worse when you end up with a whole bunch of tedious, yet very basic work dumped on you by people who have no desire to know what they're doing because "the database guy" can just do it instead.

Note that it's not only "databases" where this happens.


I became one of the 'version control' guys at a company once. It was nice to be a go-to person for certain crises, but it had its down-sides for sure.


Worse than that, you can get saddled with OKRs set by people with no context and then be judged because you didn’t meet them. Then your expertise is discounted.


But if you truly find it boring it’s still going to be difficult to frame it that way. Picking up too much work you don’t want to do can make you resent a job very quickly.


idiotsecant, other people having worse jobs doesn't mean you should be happy with the same (or even better). Should we limit ourselves to being content with doing better than 90%? Where would the Torvalds, Stroustrups, and Dijkstras come from then?

Being pigeonholed into boring work just because you're "the guy" is crap, no matter how good the salary is. I quit my last job because of this, and would encourage anyone feeling the same to do so.


Most of the world works at jobs that are repetitive, demeaning, dangerous, and insecure. And all that for the payoff of barely continuing to survive. It's truly a point of privilege to turn up your nose at a top-10% first-world salary because shuffling things around in a database is distasteful to your sensibilities.


And much of the world has food insecurity, but I still don't eat bananas...


Same with anything to do with ops or deployment - better to hide that knowledge.


I used to work with a guy who was amazing with makefiles. After being burned once, he made damn sure nobody at his new job knew about it.


You know that you've matured in this business when you can feel no guilt when denying knowledge about something that you're an expert in.


Yeah, it sucks. I have strong opinions about how to improve our ops, and the knowledge/skills to do it, but there is no way I'm going to get stuck being the ops guy.


Unless that's the job they hired you to do, and it's in your job title...


As a fulltime ops/deployment person, I haven't written any real software in over 4 years and long to go back to it.

Of course I'm solving real and challenging problems, but the skills that got me here in the first place are dying on the vine.

Also, on top of that, I'm "the cloud/networking/debugging guy" for about 150 engineers. It's annoying. I want to turn off Slack.


That goes without saying doesn’t it?


Can confirm from experience. It took me years to shake that off and become known as a developer who could database. It's a function of whatever the team lacks. Once upon a time, DBA was a thing - luckily I did manage to stay out of that deep pigeonhole.


DBA is absolutely still a thing, they fix things that developers make that deal with databases :)


From my limited interactions with DBAs... they rarely did anything outside of the database other than tell the developers their queries were inefficient or deploy new tables/databases/upgrades.

I understand my view of DBAs may be limited, but are you serious? Have I been unlucky and only run into this weird subcategory?


No, you experienced the average DBA, which is probably someone who started as a Junior SysAdmin, futzed around with databases, and liked that they could silo into that and do nothing else.

The problem is that that is basically where all advancement stopped for them. They learned SQL well enough, they might even know how to do transaction rollbacks, but god help you if you have a real systemic problem that takes detailed knowledge of the whole system to unwind and fix without major data loss.

I'm in the process of trying to convince management to hire a real Postgres expert (both inside and outside the database), because we are currently on a bad, checkbook-driven path that is moving tons of our managed RDS Postgres databases to self-managed (and poorly architected) clusters on another cloud. I have neither the time nor the inclination to become a deep Postgres expert.


Exactly. I was at one point an org's "GDPR expert", which was fine work-wise but not exactly what you want to be known for.


> were they maybe under threat of becoming an "on call" asset

You got it. A previous role as a database firefighter was something this individual did not want to continue at their new workplace.


I’ve definitely done it for exactly this reason, and others mentioned in the thread: a strong desire to not suddenly become the ______ guy.

Additional responsibility with no additional authority absolutely sucks when you get (a) pigeonholed, (b) saddled with every single request ever about _______ in addition to your other work or (c) some unholy combination of both.

Especially when ______ only has enough “buy in” from decision makers to make the decision that you need to keep ______ alive because the business “needs” it, but apparently not enough to properly source and acquire the necessary resources it needs compared to other business initiatives.

Because “why do we need to do that? I thought you knew about ______ “

Go figure. I’m quite done volunteering myself like that.


Heartily agreed. Additional responsibility without the authority to match just sets you up for failure. And rarely do you get extra pay for this additional responsibility anyway.


A few other scenarios:

In an interview I don’t want to be intimidating, and feigning ignorance can give the candidate an opportunity to shine. If they go deep, I can keep up and keep pushing; if they veer into bullshitting, I can tell and gracefully conclude without offending anyone.

As a manager, not disclosing depth lets me ask stupid questions more frequently and in more contexts (for the benefit of others, for when I forget something or don’t understand something, as a Socratic teaching method, to help set the culture of asking questions, etc)

Playing dumb is also a good way to avoid responsibility for something you hate, if a bit passive-aggressive.


Sometimes being seen as deeply knowledgeable will do nothing but create endless unpaid work for you. It is often the case that this work comes from people who don't know and don't want to know whatever it is you're good at.

Surely everyone here can relate to the classic "computer guy" everyone takes advantage of. It turns out this exact same thing can happen in actual technology companies. All you have to do to become "that guy" is have knowledge of some vital technology that people don't actually want to care about. People will happily send anything related to it directly your way, increasing your workload and responsibilities with zero additional compensation.

It seems databases are to many professionals what computers are to laymen. They depend on this technology but they don't fully understand how it works or how to fix problems. They still want to define the database schema and make other key decisions. If there's any problem, they don't want the responsibility, they want to be able to offload it to some "database guy". Who wants to be that guy?


> Why would anyone try to "hide their deep knowledge of Postgres"?

> or were they maybe under threat of becoming an "on call" asset?

This is likely, especially if the person they're talking to tends to leech on other people's skills & time.


If you're specializing in something else but have knowledge of some other, unrelated thing, the latter is often at least accidentally concealed because it rarely comes up. Further, one might deliberately conceal it, to avoid having work assigned that distracts from the thing one is trying to focus on (for career development, personal preference, whatever reason).

I've been known to pretend not to know a damn thing about WordPress, for instance. Even though I do.


"Oh, you're a computer programmer? Can you help me with my printer?"

But difficult, and with consequences.


Hehe... I was once asked to retrieve someone's mail account (he lost his password)... sigh. People have all kinds of ideas about what you can do if you are a programmer.

Told him to contact the provider.


Printers can be harder, especially if you have driver trouble.


We could never get the damn things working in our office

This was when I worked for HP


I once fixed a bug caused by a faulty printer driver from HP. The driver changed the floating point control word and didn't change it back. Our program crashed while doing a completely innocuous operation, but only if you had printed to an HP printer earlier in the session.


I remember that, because I worked on the MS FoxPro team at the time and report printing was crashing for a lot of users, but not all of them. Took forever to finally pin it down because, as parent comment points out, the culprit could have long left the building by the time the crash happens. Stupid driver sets the FPU to say "math errors like divide-by-zero are software's problem now, not mine" without telling software. IIRC (and this was over 20 years ago), operations had to be wrapped in/with the one line of code that flipped it back.

But here's the thing: it wasn't just HP, it was a lot of print drivers. I suspected that there was some printer driver boilerplate out there, possibly even published by MS, that included this bug.

(And, wow, did I swerve sharply into the off-topic lane for story time. Sorry.)


That's funny, we had a very diverse user base but never saw the problem with any printer but HP. And 20 years ago sounds about right for my time frame too. I think that was about the point where Windows started providing more isolation between drivers and user programs.


"Oh, you're a hacker? Can you help me with my printer?"

I once extracted a PPD from a very expensive printer's firmware because the vendor didn't officially support Linux.


Similarly I know a lot about software management because I saw that it affects everything I do, and I needed to be able to push back on bad management. Doesn't mean I want to be a project manager. No, I don't, and please stop asking.

And there are tasks you took on at old jobs because they needed to get done and nobody else would do it, so you got stuck. You did them. Maybe you even did them well. But they aren't on your resume, because you don't want to do it again. And if you mention them as anecdotes, you are careful where and when you bring them up.


The competitive fighting game scene's term for this is "Hiding your power level".


In my home, I would want to be the dumb guy whose only knowledge of computers is how to turn them on.

In my work, I would want to be the know it all, so that I can jump between good projects.

In my consultation business I would want to be the master in the topic I work with.

Depends on the context.


Why would someone be branded damaged because they have knowledge of a tool? That's ludicrous.

I've used many databases including mongo. They are all tools like any other, with pros and cons, and having experience with multiple across domains is a boon.


It may be ludicrous, but it's not an uncommon point of view.

"... teaching of BASIC should be rated as a criminal offence: it mutilates the mind beyond recovery"

- Edsger W. Dijkstra

https://www.cs.utexas.edu/users/EWD/transcriptions/EWD08xx/E...


Though it's rare these days, there is still such a thing as being modest


Sometimes it is better to be the student than the teacher.


OK, but toast being the best thing since sliced bread is also a somewhat antiquated vernacular expression. Obviously "the best thing since sliced bread" is often used without toast as the thing being referred to, but it was also often the case that toast was what was meant: https://www.theatlantic.com/health/archive/2012/02/how-the-p...


Yeah, maybe the guy just has a sense of humor - something that your average software developer would mistake for being a reference to documentation


This is wholly unnecessary.


What's unnecessary, a sense of humor?

;)


Toast predates machine-sliced bread, which is what "sliced bread" refers to. Sliced bread made toast better (easier to automate), but that doesn't make toast the best thing since sliced bread.

The article you linked explains that "sliced bread" is the best thing, to which all later (not just bready) inventions are compared.


Isn't the joke that the particular piece of toast always postdates the particular slice of bread it was made from? I.e., toast is the best thing since sliced bread because first we had sliced bread, and then we toasted it. It's a dumb dad joke


Are you really under the impression that vernacular expressions are always perfectly logical, and thus that showing how an expression would not be logical proves it was never actually in common use?


I am not familiar with PostgreSQL.

I would make that joke.


I'm a little bit confused by the wording "This happens transparently to the user." I assume they mean that the user is unaware of this workaround? Or do they mean it's obvious how it works (transparent) to the user?


In this context, it means "this workaround happens without the user having to perform any additional steps"; not necessarily that they're unaware, but that it happens without their involvement and it doesn't seem that it's any different from the surface-level.


It's an old software thing! "Transparent to the user" means "without the user knowing". Strange, right? I prefer "hidden from" to "transparent to" but I definitely use the latter a lot and it's quite clearly wrong!


It's not wrong. Glass is transparent to you. It precisely means that you don't see it.


You are right! Consider me convinced!


Huh, curious. It clearly didn't originate with Postgresql developers, as one can find uses of the word sprinkled around here and there in old literature. One neat way to see uses of it is to search on Google Books:

https://www.google.com/search?tbm=bks&hl=en&q=%22frammish%22

That said, I don't see anything that purports to give a definitive definition of the word - everybody who uses it seems to assume that everyone else knows it. And at least at first blush, I don't see anything that attempts to explain the origin / etymology of the word either.

It's the kind of thing you'd almost expect to see in the "Jargon File" but it doesn't appear to be there either.

http://catb.org/jargon/html/go01.html


Barely worth mentioning but from your Google Books link, the Chaucer and the City result is an OCR error; clicking the preview shows that it is 'rammish and somehow that was interpreted as frammish.

EDIT: Same with the result from The Royal Dictionary Abridged

EDIT: All the results I can find pictures for are actually `rammish or |rammish or similar. I would assume any results w/out scans available are similarly suspect.


Yeah, I noticed that about the Chaucer book, but I didn't bother checking most of the others. Interesting. So maybe Postgres did invent the word!


The Raku language (formerly Perl 6) community also has its share of idioms, abbreviations (not just technical) and phrases. I'm not sure if they are exclusive to this community, or also used elsewhere.

Examples:

In the early days, many features were specified but Not Yet Implemented, so NYI became its own term.

Error messages are meant to be awesome, so anything "Less Than Awesome" (LTA) was considered a bug.

Larry Wall didn't like the term "void context", so he invented "sink context" instead (with all the puns related to it, of course).

There were many more, though I have a hard time coming up with a longer list...


Perl has a history of this due to Larry Wall's background in linguistics. His Christian faith is also an influence. In Perl you can "bless" a reference to reify it into a class. The name Perl also comes from the Bible, apparently. Though you probably know this based on your username...

Fortunately Perl was reasonable. On the other side of the spectrum you have the impenetrable and pretentiously obnoxious Urbit[1]

[1] https://urbit.org/docs/glossary/moon


Oh, hey, they have a glossary now. That's at least a nice affordance. Back when I last looked at Urbit you were expected to figure everything out like you were reading a cyberpunk novel.

...actually, now that I say it, "cyberpunk enthusiast" kinda resonates with the whole design of Urbit.


Urbit's documentation is like the Codex Seraphinianus but less understandable.


This is my first visit to Urbit... I feel like I stumbled into a parallel dimension. Currently transfixing me: The "Hoon syntax" page gives one-syllable pronunciations for all the ASCII punctuation chars. Wat.

https://urbit.org/docs/hoon/hoon-school/hoon-syntax


Geez, this seems as if Timecube had been open-sourced on GitHub. Is it a serious project or a parody?


It's as serious as Perl, but where joining the Perl community turns you into Ned Flanders, it's meant to turn you into a pro-monarchy NRX guy.

You might need therapy after, but whenever I've met someone who can't stop saying things like "grok" and "less than awesome" I've always thought they needed whatever the opposite of therapy is. They're so well adjusted you can't stand being near them.


NRX = neo-reactionary, for those not up to date with the latest political affiliation acronyms.


thanks, you saved me an urbandictionary visit. :P


> In this lesson we introduce the notion of a vane, which should be thought of as a kernel module for Arvo (if you don't know what that means yet, just keep reading)

> Following this lesson, we will introduce Gall, a vane used to build user space apps. Then we have a walkthrough where we construct an egg timer as a Gall app that interacts with Behn.

nods Gall is a vane that uses Behn, which is an Arvo kernel module. Got it. Crystal clear.

> Here we see that wind produces a wet gate that takes in two molds, which for the move type for Behn are notes and gifts. When a vane needs to request something of another vane, it %passes a note. When a vane produces a result that was requested, it %gives a gift to the callee.



It predates this; it's from earlier versions of Mac OS. This was a "Mac OS" tool before it was a Mac OS X CLI tool.


I thought the etymology of "NYI" was more-or-less the same as "NIH" (not invented here) - namely, that it came out of Bell Labs somewhere along the line

...or, at least, that's what my CS prof claimed in the late 90s :)


Likely a variant spelling of frammis; https://en.wiktionary.org/wiki/frammis


Could frammish be a portmanteau then? Condensing "frammis-ish" to frammish - as in gizmo-like, maybe the thing you need to build which may or may not be a distinct component?


Interestingly, several of the Google Books results for "frammis" are Joe Celko's SQL books; wonder if there's a connection.


Maybe I’ll use that when I finally get around to writing a UI framework. ButtonFrammis, CheckboxFrammis, LabelFrammis, TextInputFrammis, …


Thought it was a Carrollian invention: sort of like beamish, uffish, frumious...


Something, generally a device, for which one does not know the proper term


Frammis, frammistat, frob, frobnicate, and a few others are "hacker" words that I learned from downloading text files on bulletin boards in the 80s. I only ever heard them from that source, until I had a manager from that era, except he was all military software background. (Yeesh, he was awful.)


It's also used in libjpeg: https://github.com/kornelski/libjpeg/blob/master/libjpeg.doc...

Maybe it's used as a sort of shibboleth and accidentally escaped internal communication into source code?

It does not seem to be spreading very much, though.


The original developer of libjpeg is Tom Lane, who is the same lead developer who uses this term at PostgreSQL.

https://handwiki.org/wiki/Biography:Tom_Lane_(computer_scien...


Showing up on HN is a bit of a superspreader event.


Username checks out.


Clearly there are no Andromeda Strain fans here.


Good call -- that's exactly where I got the name!


My favorite like this is "impunge" from Gluster. It's used during repairs after a node has gone down and come back up. Files that are present but should have been deleted are expunged. Files that are absent but should be present are impunged from surviving replicas.


Betting it comes from the incantation "Frammin at the jim-jam, frippin in the krotz", repeated frequently in the US comic strip "Wizard of Id" by Brant Parker and Johnny Hart.


"allballs" is another potential candidate: https://www.postgresql.org/message-id/20050124200645.GA6126%...


That's slang from the Apollo missions: https://www.history.nasa.gov/afj/ap11fj/cm-107_graffiti.html (search for "all balls")


The link mentions that it's military slang. I doubt it originated with postgresql devs.


I've noticed that the docs sometimes use the verb "spell" in an interesting way, e.g. "IN GROUP is an obsolete spelling of IN ROLE."
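
For context, the two "spellings" from that doc line are interchangeable options of CREATE ROLE (the role names here are made up):

  CREATE ROLE admins;

  -- Current spelling:
  CREATE ROLE alice LOGIN IN ROLE admins;

  -- Obsolete spelling, still accepted, same effect:
  CREATE ROLE bob LOGIN IN GROUP admins;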


That’s “spelling” the noun, not a verb.


True, but to be fair it's the gerund formed from the verb.


The verb form is where the atypical usage originates, so I think it's fair to focus on the verb as the point of interest despite including an example that happens to be a gerund.

Anyway, I think there's an even larger set that "spell" is a member of: metaphorical usage to highlight anything unwanted or erroneous; "code smell" is another.

It's wrong, therefore it smells. It's wrong, therefore it's misspelled. It's wrong, therefore it wants to be different (personification metaphor).


And here I thought `"spelling" the noun` was a reference to either Tori or Aaron


apparently no one here watches TV


I think that usage is a little more common, e.g. I believe I've seen it in Perl docs and in the occasional StackOverflow answer.


I’ve heard this before, e.g. Go’s “while” loop is spelled “for”


Yeah, I have as well. I think it's both to be cute and to signify that they're equivalent. "'IN GROUP' is an obsolete version of 'IN ROLE'" might mean that it was a subset of the latter with different behavior.


I'm working on seeding `backcompat` as a portmanteau of "backwards compatibility". Help me out, HN.


I'm not sure how much seeding is needed - I've heard it used for years, and it seems to have become a bit of a term for the Xbox previous-generation game compatibility stuff.


I've seen and used that term for at least a decade. I don't think it needs seeding.


For example quirks mode in Internet Explorer: https://developer.mozilla.org/en-US/docs/Web/API/Document/co...

  if (document.compatMode == "BackCompat") {
    // in Quirks mode
  }


That’s using backcompat in source code, like “CompatMode”. Thankfully dictionaries aren’t full of all the variable names that exist in source code.


I have a meeting with a couple of junior devs in about an hour. I'll use it in conversation and we'll see what happens. :)


I’d go for backpat. ;)


Or ytilibitapmoc.


Arguably that assumes that the operator each letter denotes is self-inverse.

In the general case: y⁻¹ t⁻¹ i⁻¹ l⁻¹ i⁻¹ b⁻¹ i⁻¹ t⁻¹ a⁻¹ p⁻¹ m⁻¹ o⁻¹ c⁻¹


Bah! The Postgres gang didn't invent this word. I know a bunch of devs at Apollo Computer Inc. who used the word "framis" for a similar purpose back in 1989.


"Are Postgres developers starting to evolve their own dialect? Should we call an anthropologist?"

This is a "jargon" term: https://linguistics.stackexchange.com/questions/2812/argot-v...


What group doesn't develop their own words over twenty years?


"It's not a bug, it's a frammish."


Domain frammish.com is free. Let's see for how long after posting this comment.


Wait until you hear what "cluster" means in Postgres-speak.



No, unfortunately PG does not have clustered indexes, much to my dismay.


No doubt lots of relational database software share this quirk.


Please go on.


> PostgreSQL uses the term cluster to refer to a “cluster” of databases, as opposed to the usual notion of a group of servers or VMs working in a co-ordinated fashion.

https://www.opsdash.com/blog/postgresql-cluster.html

Note that the "databases" above are logical databases, not database hosts.
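
A minimal illustration from SQL (the extra database name is hypothetical): the "cluster" you're connected to simply holds multiple logical databases:

  -- List the logical databases in the cluster you're connected to.
  SELECT datname FROM pg_database;

  -- Creating another logical database adds it to the same cluster
  -- (no extra servers or VMs involved).
  CREATE DATABASE reporting;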


Thank you for this. I didn't realise it, and for the past few years I've been thinking I was the only one running just a single database host, since everyone else seems to have "clusters". I never realised this can just mean multiple databases.


So according to this usage, one database instance with a few databases would be a cluster?


I'm not an expert, but as far as I can tell, even a single database instance with a single database (or maybe no databases at all?) could constitute a "cluster".


Clustering is, besides all the other definitions you're seeing here, a table property. A clustered table is a table that has, on disk, been aligned with a certain index[1]. Each table can only be clustered on a single index, and it essentially means that row retrieval via that specific index is much more efficient; it basically gets you one covering index for free.

1. https://www.postgresql.org/docs/current/sql-cluster.html
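
A minimal sketch of that command (the table and index names are made up):

  CREATE TABLE orders (id int PRIMARY KEY, customer_id int);
  CREATE INDEX orders_by_customer ON orders (customer_id);

  -- Physically rewrite "orders" in the order of that index.
  CLUSTER orders USING orders_by_customer;

  -- The ordering is not maintained for later writes; re-run CLUSTER
  -- periodically if you want to keep the benefit.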


A Postgres cluster is, roughly speaking, a server instance. That is, if you run two copies of Postgres on the same box (one on the default 5432 port, another on, say, 6543) and have each of those copies manage its own independent config, data, etc., then each of those instances is what Postgres calls a cluster.


Or schema!


The PostgreSQL usage of schema is consistent with the ANSI SQL definition. MSSQL is the same.


I think most people surprised at the PostgreSQL model of databases and schemas are coming from a MySQL/MariaDB background where the terms are synonymous. PostgreSQL matches the model of basically every non-MySQL database when it comes to these concepts.

Though if we talk about database weirdness, I never liked Oracle DB's insistence that databases and users are the same thing. Glad I haven't used it for well over a decade now :)


I was actually surprised that you can query across databases in MySQL, but that makes sense once you understand that there's only one database and the "databases" are just schemas.


Coming from MSSQL I was equally surprised that you can't query across databases in Postgres even if they are on the same server. Yeah, there is the Foreign Data Wrapper thing, but the DBA was very reluctant to enable it or whatever.
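
For reference, the Foreign Data Wrapper route looks roughly like this; the server name, credentials, and schema/table names below are all made up:

  CREATE EXTENSION IF NOT EXISTS postgres_fdw;

  -- Point at the other logical database on the same host.
  CREATE SERVER other_db FOREIGN DATA WRAPPER postgres_fdw
    OPTIONS (host 'localhost', port '5432', dbname 'other_db');

  CREATE USER MAPPING FOR CURRENT_USER SERVER other_db
    OPTIONS (user 'app_user', password 'secret');

  -- Mirror the remote tables locally as foreign tables, then query as usual.
  CREATE SCHEMA remote_public;
  IMPORT FOREIGN SCHEMA public FROM SERVER other_db INTO remote_public;

  SELECT count(*) FROM remote_public.some_table;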


Hmm, care to expand? The word schema, in Postgres terms, seems to mean exactly what I expect it to.


Normally, schema means "shape/structure of data". In Postgres it's roughly a container for tables.


> Normally, schema means "shape/structure of data". In Postgres it's roughly a container for logical databases.

No, it's a namespace within a database, not a container for databases; this is not Postgres specific, it is part of the SQL standard and widely used in other implementations.


Yeah, you caught me between my post and my edit. I misspoke and said "logical databases" instead of "tables". I'm sure that's not precisely correct either, however.


It also seems under-used for how core it is (ex. everything in public/dbo).


Ah I see. Postgres uses the word in the sense that it's used in the SQL spec, though, and it means pretty much what you're describing. The key difference is that I think you're talking about the human-readable file (which in the SQL world is written in SQL DDL), whereas the standard means it as closer to being the internal/runtime representation of that "shape/structure of data".

It's helpful to think of "the database" as the actual physical storage, and the schema is what the db uses to make sense of how to manipulate/query that data. From that perspective, the SQL DDL is a scripting language to manipulate those schema objects ("objects" in the OOP sense; "schema objects" has an actual specific meaning in SQL).


I think people think schema means "a list of tables and indexes" but it's an actual object in postgres ("DROP SCHEMA public"). These do not diverge too far; that's what the object represents of course.


Most people think schema means "shape/structure of data", not "list of tables and indexes". In Postgres (or maybe SQL more broadly) it roughly means "a container for tables".


Not just tables, but everything else as well. The shape and structure of your data is always defined within a schema, and a single database may have more than one.

A PostgreSQL server can contain multiple databases; they are independent and you can't access data in one database while connected to another (without dblink or something similar)

As far as I know pretty much every database (and the SQL standard) except MySQL has schemas as an explicit database object and calls them that. What MySQL calls "databases" are actually schemas; they're just containers for database objects and you can query across them (and CREATE SCHEMA is an alias for CREATE DATABASE)

EDIT:

There's a fun trick you can do with multiple schemas that illustrates why they are schemas and not just "containers of things":

You have a "data" schema that contains your table definitions; your actual, real data and indices etc. go here. Only privileged users can access this schema directly.

Then you have an "interface" schema that contains views and functions used by people; they can refer to the data schema, and with some clever view definitions you can arrange it so that people can only access the data through the views and functions in your interface schema.

At some point, you could create an "interface_v2" schema that provides better (or more) methods for accessing your data that's backwards incompatible. Old applications can continue using the "interface" schema by setting their schema search path to "interface" (which would be the default), but new applications can "overlay" the schemas by setting their search path to "interface_v2,interface" and opt-in to new functionality. The "structure" of your data is changed simply by opting in to the new schema.

It's pretty rare for people to do this (they understand versioned web APIs better than versioned database APIs), but it's a thing you can do.
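
A tiny sketch of the pattern (all schema, table, and view names here are hypothetical):

  -- Privileged-only schema holding the real tables.
  CREATE SCHEMA data;
  CREATE TABLE data.users (id int PRIMARY KEY, name text, email text);

  -- Public-facing "API" schema exposing only views/functions.
  CREATE SCHEMA interface;
  CREATE VIEW interface.users AS SELECT id, name FROM data.users;

  -- A later, backwards-incompatible revision.
  CREATE SCHEMA interface_v2;
  CREATE VIEW interface_v2.users AS SELECT id, name, email FROM data.users;

  -- Old clients keep the default:
  SET search_path = interface;
  -- New clients opt in, falling back to v1 objects where v2 has none:
  SET search_path = interface_v2, interface;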


> The shape and structure of your data is always defined within a schema

Maybe we're already agreed on this, but for clarity my point is that the common notion of a schema is strictly "the shape of the data" and not "a container for the data, the shape of the data, and a bunch of other stuff". I agree that this latter definition is probably shared across many relational databases and not just Postgres.


> but for clarity my point is that the common notion of a schema is strictly "the shape of the data" and not "a container for the data, the shape of the data, and a bunch of other stuff".

Yeah, I see what you are saying, I just disagree. Most people who know either usage, in the context of RDBMSs, know both and resolve the ambiguity by context. This is fairly normal; it's very common for words to have multiple common definitions.


In my experience, this is fairly "advanced" knowledge. Lots of people who grind out SQL queries all day as analysts or vanilla software engineers don't know the SQL sense of the term.

> Most people who know either use, in the context of RDBMSs, know both, and resolve the ambiguity by context. This is fairly normal, it's very common for words to have multiple common definitions.

The RDBMS domain alone doesn't suffice to resolve the ambiguity because "structure of data" and "container of tables/etc" are both relevant. I would definitely contend that application developers and operators (though perhaps not DBAs) need to talk about "structure of data" a lot more than they need to talk about "container of tables/etc".


Using schemas to effect private implementations and public interfaces was my favorite trick when I was a DBA. This was for a large company so we definitely needed fences like this.



