If you're interested in the Ruby language too, check out this PoC gem for an "operator-less" pipe syntax that uses regular blocks/expressions just like any other Ruby DSL:
"https://api.github.com/repos/ruby/ruby".pipe do
URI.parse
Net::HTTP.get
JSON.parse.fetch("stargazers_count")
yield_self { |n| "Ruby has #{n} stars" }
Kernel.puts
end
#=> Ruby has 15120 stars
Services would commit messages to their own databases along with the rest of their data (with the same transactional guarantees), and those messages would then be replicated in "realtime" (with all of logical replication's features and guarantees) to the receiving service's database, where its workers (e.g. https://github.com/que-rb/que, SKIP LOCKED polling, etc) are waiting to respond by inserting messages into their own database to be replicated back.

Throw in a trigger to automatically acknowledge/cleanup/notify messages and I think we've got something that resembles a queue? Maybe make that same trigger match incoming messages against a "routes" table (based on message type, certain JSON schemas in the payload, etc) and write matches to the que-rb jobs table instead, for some kind of distributed/replicated work queue hybrid?
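Roughly, the consuming side might look like this - a minimal sketch using the pg gem, assuming a hypothetical "messages" table (id, payload, processed_at) that replication has already populated:

require "pg"

conn = PG.connect(dbname: "service_inbox") # hypothetical database name

loop do
  conn.transaction do |tx|
    # FOR UPDATE SKIP LOCKED lets many workers poll concurrently without
    # blocking each other on the same unprocessed row.
    row = tx.exec(<<~SQL).first
      SELECT id, payload
        FROM messages
       WHERE processed_at IS NULL
       ORDER BY id
       FOR UPDATE SKIP LOCKED
       LIMIT 1
    SQL
    next unless row

    puts "handling #{row['payload']}" # real work would happen here

    # Mark the row done; an AFTER UPDATE trigger here could insert an
    # acknowledgement message to be replicated back to the sender.
    tx.exec_params("UPDATE messages SET processed_at = now() WHERE id = $1", [row["id"]])
  end

  sleep 0.5 # simple polling; LISTEN/NOTIFY could wake workers instead
end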
I'm looking to poke holes in this concept before sinking any more time into exploring the idea. Any feedback/warnings/concerns would be much appreciated - thanks for your time!
Sounds a bit like change data capture (CDC), e.g. via Debezium [1] which is based on logical decoding for Postgres, sending change events to consumers via Kafka? In particular when using explicit events for downstream consumers using the outbox pattern [2]. Disclaimer: I work on Debezium.
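For context, the outbox pattern boils down to writing the business row and the event row in one transaction and letting CDC fan the event out. A minimal sketch (the column layout follows the common outbox convention, but the identifiers here are illustrative):

require "pg"
require "json"
require "securerandom"

conn = PG.connect(dbname: "orders") # hypothetical database

order_id = SecureRandom.uuid
payload  = { order_id: order_id, total_cents: 10_000 }

conn.transaction do |tx|
  # Business write and event write commit (or roll back) atomically.
  tx.exec_params("INSERT INTO orders (id, total_cents) VALUES ($1, $2)",
                 [order_id, 10_000])
  tx.exec_params(
    "INSERT INTO outbox (id, aggregate_type, aggregate_id, type, payload) " \
    "VALUES ($1, $2, $3, $4, $5)",
    [SecureRandom.uuid, "order", order_id, "OrderCreated", payload.to_json]
  )
end

# A CDC tool tails the WAL and publishes outbox rows to consumers; the
# outbox table itself can stay small by deleting rows right after insert.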
Just as Kafka isn't a database, PostgreSQL isn't a queue/broker. You can use it that way, but you'll spend a lot of time tweaking it, and I suspect you'll find it's too slow for non-trivial workloads.
Skype at its earlier peak used Postgres as a queue at huge scale (PgQ, from the SkyTools suite). It took a few tweaks and, sure, it's an anti-pattern, but it can work. It's certainly handy if you're already using Postgres and want to maintain a small stack.
Interesting idea. I think one issue is that the write throughput of a single master instance is very limited.

But if the working set fits in memory on the instances processing the writes, you could use PostgreSQL logical replication to easily update materialized views on other servers.

But then it starts looking like Amazon Aurora or the Datomic database.
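The wiring for that is pleasantly small - a minimal sketch via the pg gem, with hypothetical host/database/table names throughout:

require "pg"

primary = PG.connect(host: "primary.example.com", dbname: "app")
primary.exec("CREATE PUBLICATION orders_pub FOR TABLE orders")

replica = PG.connect(host: "replica.example.com", dbname: "app")
replica.exec(<<~SQL)
  CREATE SUBSCRIPTION orders_sub
    CONNECTION 'host=primary.example.com dbname=app'
    PUBLICATION orders_pub
SQL

# With rows flowing in, the replica can serve read-optimized views, e.g.
# a scheduled: REFRESH MATERIALIZED VIEW CONCURRENTLY orders_summary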
Definitely agree regarding the strange semantics introduced recently, like the pipeline[0] and `.:` method reference[1] operators. I do think those kinds of functional features would be handy if they behaved correctly and had a more Ruby-like syntax instead of introducing foreign-looking operators.

Luckily this new pattern matching feature appears to behave as expected and feels pretty natural - just a new `in` keyword and some variable binding syntax to familiarize ourselves with (which we already kind of use with statements like `rescue`).

A while back we experimented with an alternative Ruby pipe operator proof of concept[2] that is "operator-less" and looks just like regular old Ruby blocks with method calls inside. Maybe there's still a chance for something like this now that those other implementations have been reverted!
# regular inverted method calls
JSON.parse(Net::HTTP.get(URI.parse(url)))

# could be written left to right
url.pipe { URI.parse; Net::HTTP.get; JSON.parse }

# or top to bottom for even more clarity
url.pipe do
  URI.parse
  Net::HTTP.get
  JSON.parse
end
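Under the hood the idea reduces to folding a value through a chain of callables. Here's a minimal sketch of that core (using a hypothetical pipe_through method rather than the gem's actual constant-intercepting block implementation):

require "json"
require "net/http"
require "uri"

class Object
  # Fold self through each callable, left to right.
  # Hypothetical name - not the gem's API.
  def pipe_through(*callables)
    callables.reduce(self) { |value, fn| fn.call(value) }
  end
end

url = "https://api.github.com/repos/ruby/ruby"
stars = url.pipe_through(
  URI.method(:parse),
  Net::HTTP.method(:get),
  JSON.method(:parse)
).fetch("stargazers_count")
puts "Ruby has #{stars} stars"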
tldr: We're automating the loan origination process (applying, underwriting, servicing, investing, etc)
LendingHome is reimagining the mortgage process from the ground up by combining innovative technology with an experienced team. Our goal is to create a seamless, transparent process that transforms and automates the mortgage process from end to end. We've raised $167MM in venture capital with a team of over 300 people and have been featured on the Forbes Fintech 50 list for two years running! LendingHome is uniquely positioned to become the next great financial services brand powered by the most advanced mortgage platform in the world.
Open positions:
* Engineering Manager
* Senior Data Scientist
* Software Engineer (Frontend or Backend)
* Design/Finance/HR/Marketing/Operations/Product/Sales/etc
https://github.com/lendinghome/pipe_operator#-pipe_operator