I don't see many local companies using Immutability and FP, but my tech scene is smaller than Silicon Valley's. I mainly get my dose of these subjects from the Internet through blogs and videos.
I think both FP and Immutability are tools and, like any tool, they can be used correctly or incorrectly. So the important questions to ask about any tool are:
* When should I use this tool?
* When should I NOT use this tool?
I can't speak too much about FP as I don't practice it and I don't see anyone locally doing the same thing (I do prefer a functional style: reduce, map, iterators, etc., but I see that as distinct from FP). I can speak a bit about immutability, though: I'm going to attempt to convince you that it isn't a fad, but can be a useful tool when modeling certain problems.
I often find that immutability is used to describe two related but different things: const/immutable bindings and immutable data structures. Usually, trying to make small items constant isn't much of a problem. The problem arises when we start making multiple copies of items containing large data structures. Enter immutable data structures.
Immutable data structures try to provide an answer to the inefficiency of copying large data structures by using structural sharing. In the past, these were more typically known as persistent data structures. Tarjan did a lot of work on this topic and the wikipedia page[1] is quite informative.
Basically, a persistent data structure allows access to previous versions of itself. I like to think of the version as an additional dimension that one can use to solve a problem. Usually this dimension is just time, but it can sometimes be other things[2].
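A minimal sketch of that versioning idea in Scala (my own toy example, not taken from any of the linked material): prepending to a persistent list allocates one new cell that points at the old list, so the previous version stays intact and is shared rather than copied.

    // A tiny persistent singly linked list: every "update" returns a new
    // version while old versions remain valid and fully shared.
    sealed trait PList[+A] {
      // Prepend in O(1): the new cell points at `this`, nothing is copied.
      def prepended[B >: A](x: B): PList[B] = PCons(x, this)
    }
    case object PNil extends PList[Nothing]
    final case class PCons[+A](head: A, tail: PList[A]) extends PList[A]

    object PListDemo extends App {
      val v1 = PNil.prepended(3).prepended(2).prepended(1) // version 1: 1, 2, 3
      val v2 = v1.prepended(0)                             // version 2: 0, 1, 2, 3
      // v1 is untouched and still usable -- it is literally the tail of v2.
      val shared = v2 match {
        case PCons(_, tail) => tail eq v1
        case PNil           => false
      }
      println(shared) // true: structural sharing, no copying
    }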
So okay, back to the original questions:
* When should I use this tool?
Immutability attempts to make copying the data structure fast. For example, RRB trees[3] use immutability to provide a sequence data structure that can be concatenated quickly; I believe these are the default sequences in Scala and Clojure (see the short Scala sketch below these answers). You should use them if you require structural sharing or when you require the operations/complexities they provide.
* When should I NOT use this tool?
Most of these data structures are implemented as trees, so they are likely not to be as hardware-friendly as a flat array/vector, for example. They add a lot of complexity, and I have found bugs in libraries that provide these structures. If you don't require the operations, or your problem space isn't sufficiently large, you shouldn't use them.
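To make both answers a bit more concrete, here's what this looks like with Scala's standard immutable Vector (tree-backed, with structural sharing); this is just illustrative, not a benchmark:

    // "Updating" an immutable Vector never mutates the original; it builds
    // a new tree that shares most of its nodes with the old one.
    object VectorDemo extends App {
      val v1 = Vector(1, 2, 3, 4, 5)
      val v2 = v1.updated(2, 99)   // new version with index 2 changed
      val v3 = v1 ++ Vector(6, 7)  // concatenation also yields a new version

      println(v1) // Vector(1, 2, 3, 4, 5) -- unchanged
      println(v2) // Vector(1, 2, 99, 4, 5)
      println(v3) // Vector(1, 2, 3, 4, 5, 6, 7)
    }

The downside from the second answer is visible in the representation: each version is a tree of small node arrays rather than one contiguous block of memory, so element access chases pointers and is less cache-friendly than a plain array.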
Anyway, hope I've managed to convince you that immutable data structures aren't a fad and can be useful in many situations.
We used Datadog at my last place to do monitoring/alerting type things. It's pretty pricey in my opinion and the amount you get out of it just wasn't worth it for us, but it might be for you. Another option is to host Prometheus for this.
Datadog is good for metrics, but I'm more curious about the deployment side of things. For example, let's say I just made an update to a service's API that requires deploying new versions of 3 microservices. I want to (1) be able to locally deploy and test the 3 new versions, and (2) deploy all 3 versions at the same time. This ends up being super tedious because I have to do a lot of manual work just to push a (potentially minor) update.
Not that I agree with the content, but you don't have to be an industry insider to have an opinion on the tech industry, especially an opinion about how it relates to society as a whole. It's also a podcast show; the page shows a 68-minute podcast, which is where the actual content is. As for it being partisan, I'm not even sure how you decided that.
Yes, immutable data structures can be used for capturing a temporal aspect without actually needing to model it explicitly in the underlying data.
Here's a document[1] explaining how to make tree-like data structures persistent, along with a motivating example of their use (at the end). The Wikipedia page[2] for the problem also goes into other techniques and has some diagrams too.
tl;dr: a naive approach to the point location query problem achieves O(N^2) space and O(log N) time per query, while simply making the data structure persistent achieves O(N) space while retaining O(log N) time per query.
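As a rough sketch of the simplest such technique, path copying (my own toy example, not code from the linked document): an insert copies only the nodes on the root-to-leaf search path and shares every untouched subtree, so every intermediate version remains queryable.

    // Persistence by path copying on an unbalanced BST: insert returns a new
    // root, copying only the search path and sharing all other subtrees.
    sealed trait Tree
    case object Leaf extends Tree
    final case class Node(key: Int, left: Tree, right: Tree) extends Tree

    def insert(t: Tree, k: Int): Tree = t match {
      case Leaf                       => Node(k, Leaf, Leaf)
      case Node(key, l, r) if k < key => Node(key, insert(l, k), r) // copy node, reuse r
      case Node(key, l, r) if k > key => Node(key, l, insert(r, k)) // copy node, reuse l
      case node                       => node                       // key already present
    }

    def contains(t: Tree, k: Int): Boolean = t match {
      case Leaf => false
      case Node(key, l, r) =>
        if (k < key) contains(l, k) else if (k > key) contains(r, k) else true
    }

    object PathCopyDemo extends App {
      // versions(i) is the tree after the first i inserts; all stay queryable.
      val versions = List(50, 30, 70, 20, 60).scanLeft(Leaf: Tree)(insert)
      println(contains(versions(2), 70)) // false: 70 only exists from version 3 on
      println(contains(versions(3), 70)) // true
    }

Each insert allocates only the copied path, so keeping all N versions costs on the order of O(N log N) nodes with this naive scheme on a balanced tree; the more refined node-copying technique in the linked material is what brings that down to the O(N) figure quoted above.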
Most transactions require an OTP to complete successfully, and you also get notifications whenever a login to your account is performed.
I think it would probably be a good idea to have some sort of separate 2FA device linked at home, but I doubt they'll ever implement it. You would want it separate from your phone because, if your wallet and phone get stolen, you can still log in to the online banking account and deactivate your stolen cards without having to go to the bank.
If the phone has a PIN or similar (I realize not everyone does) and the 2FA app has a PIN/password, then that does seem like a reasonable level of security.
No, because getting your phone and wallet stolen (they are likely both to be on your person, so both would likely be stolen at the same time) means you couldn't then log on to online banking and deactivate your credit cards (which you would want to do as soon as possible).
Edit: Just to clarify a bit more: most cards here have a tap-and-go function requiring no PIN up to a certain amount. Although the amount is small, I'd still rather no one spent my money.
I use said bank and they are generally pretty good. Also note that they even said in the tweet that they acknowledge the role of password managers, so I think you may have read a bit too much into it. Almost every time I log on to their online banking site, I get a page detailing the latest scams and what to look out for.
I also agree with their statement for the most part. The general public, at least here in SA, aren't too discerning when it comes to tech matters and will probably download any random app from the Play Store. If you don't trust pretty much anyone with your credentials, why trust a probably-unknown 3rd party with them?
I think the best idea in this case is to choose a strong password and try to remember it, or write it down and store it in a safe.