As the article mentions, a lot of the accounts are probably just using VPN services. Remember when Elon exposed the US government in cahoots with Meta and Twitter, actively shadow banning people for free speech on legitimate topics such as covid, covid vaccines, masks, and the woke agenda?
The "Twitter Files" absolutely did not expose anything like that, despite what Lord Genius Elon tried to imply. At best they exposed internal discussions and policies that may have suppressed posts related to Covid, but showed that the Biden administration was largely (largely doesn't mean never) uninvolved with that. It seems like the most they asked for was regarding the "Hunter Biden laptop" issue, and that was largely about posts showing nudes of the women he was sleeping with, not even posts that were just nude Hunter.
Elon then turned on Matt Taibbi and banned him from Twitter when he wouldn't go along with his lying and spin.
I am fascinated with 3D/gaming programming and watch a few YouTubers stream while they build games[1]. Honestly, it feels insanely more complicated than my wheelhouse of webapps and DevOps. As soon as you dive in: pixel shaders, compute shaders, geometry, linear algebra, partial differential equations (PDEs). Brain meld.
I looked at using fck-nat, but decided it was honestly easier to build my own Debian Trixie packer images. See my comment below[1]. How has your experience been with fck-nat?
Unfortunately, ETRADE for example does not make exporting all transactions easy. Last time I looked at their API, it involved manually authing via an HTTP flow, signing into your ETRADE account to get a temporary token that expires. Not exactly a flow that can be used for long-polling account activity.
I haven't researched Robinhood or Coinbase much, but I suspect they have much better APIs. That's where a plugin system would be awesome: something like Plaid, but for brokerages and crypto exchanges only.
I build my own NAT instances from Debian Trixie with Packer on AWS. AWS built-in NAT Gateways use an absurdly outdated and end-of-life version of Amazon Linux and are ridiculously expensive (especially traffic).
The bash configuration is literally a few lines:
# Enable IPv4 forwarding persistently
cat <<'EOF' | sudo tee /etc/sysctl.d/99-ip-forwarding.conf > /dev/null
net.ipv4.ip_forward=1
EOF
sudo sysctl --system

# NAT outbound traffic and allow forwarding through the public interface
sudo iptables -t nat -A POSTROUTING -o ens5 -j MASQUERADE
sudo iptables -F FORWARD
sudo iptables -A FORWARD -i ens5 -m state --state RELATED,ESTABLISHED -j ACCEPT
sudo iptables -A FORWARD -o ens5 -j ACCEPT

# Persist the rules across reboots (requires the iptables-persistent package)
sudo iptables-save | sudo tee /etc/iptables/rules.v4 > /dev/null
Replace ens5 with your instance's network interface name. Also, VERY IMPORTANT: you must set source_dest_check = false on the EC2 NAT instances.
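For reference, disabling the source/destination check can also be scripted. A minimal sketch with boto3, assuming configured AWS credentials; the region and instance ID are placeholders:

```python
# Sketch: disable the EC2 source/destination check so the instance can
# forward traffic that is not addressed to itself (required for NAT).
# The region and instance ID below are placeholders.
import boto3

ec2 = boto3.client("ec2", region_name="us-east-1")
ec2.modify_instance_attribute(
    InstanceId="i-0123456789abcdef0",
    SourceDestCheck={"Value": False},
)
```

Terraform's `source_dest_check = false` on the `aws_instance` resource accomplishes the same thing declaratively.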
Also, don’t assign an EIP to your EC2 NAT instances (unless you absolutely must persist a given public IP), as that counterintuitively routes traffic through the public network. Just use an auto-assigned public IP (no EIP).
NAT instance with EIP
- AWS routes it through the public AWS network infrastructure (hairpinning).
- You get charged $0.01/GB regional data transfer, even if in the same AZ.
> Also, don’t assign an EIP to your EC2 NAT instances (unless you absolutely must persist a given public IP), as that counterintuitively routes traffic through the public network. Just use an auto-assigned public IP (no EIP).
Could you point me to somewhere I can read more about this? I didn't know there was an extra charge for using an EIP (other than for the EIP itself).
That's what you did before AWS had the "NAT Gateway" managed service. It's literally called "NAT Instance" in current AWS documentation, and you can implement it in any way you wish. Of course, you don't have to limit yourself to iptables/nftables etc. OPNsense is a great way to do a NAT instance.
I believe the NAT instance AMIs are also based on a super old, end-of-life Amazon Linux. I prefer Debian Trixie built with Packer on EC2 instances, with no EIP. The most secure, performant, and cost-effective setup possible.
> NAT AMI is built on the last version of the Amazon Linux AMI, 2018.03, which reached the end of standard support on December 31, 2020 and end of maintenance support on December 31, 2023.
Sure, that's one case, though you might be able to give out a hostname instead of an IP for others to whitelist. Then you just set a low TTL and update the DNS record when the IP changes.
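The record update can be automated. A hypothetical sketch with boto3 and Route 53; the hosted zone ID, hostname, and IP below are all placeholders:

```python
# Sketch: upsert an A record with a low TTL so a changed IP propagates
# quickly. Zone ID, hostname, and IP below are placeholders.
import boto3

route53 = boto3.client("route53")
route53.change_resource_record_sets(
    HostedZoneId="Z0000000EXAMPLE",
    ChangeBatch={
        "Changes": [{
            "Action": "UPSERT",
            "ResourceRecordSet": {
                "Name": "nat.example.com",
                "Type": "A",
                "TTL": 60,  # low TTL so clients pick up changes fast
                "ResourceRecords": [{"Value": "203.0.113.10"}],
            },
        }]
    },
)
```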
While obvious, a huge performance bump can be had by using cachetools on functions. cachetools is much more feature-rich than lru_cache, with support for TTL.
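A minimal sketch of the TTL support, assuming `pip install cachetools`; the function and the call counter are made up to show the cache working:

```python
# Memoize a function with a 60-second TTL so entries expire over time,
# which functools.lru_cache cannot do on its own.
from cachetools import TTLCache, cached

calls = 0

@cached(cache=TTLCache(maxsize=1024, ttl=60))
def get_rate(pair: str) -> float:
    global calls
    calls += 1      # count real invocations to demonstrate caching
    return 1.0      # stand-in for a slow API or DB call

get_rate("EURUSD")
get_rate("EURUSD")  # second call is served from the cache; calls stays 1
```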
Would love to see it added. I’m running a Flask app using raw sqlite3 without an ORM. I believe I could hack up a Sentry span solution to track queries, something like:
    import sqlite3
    from sentry_sdk import start_span

    def execute_query(conn: sqlite3.Connection, sql: str, params: tuple = ()):
        with start_span(op="db", description=sql) as span:
            span.set_data("db.system", "sqlite")
            cursor = conn.execute(sql, params)
            return cursor
Telebugs isn’t meant to be a full Sentry replacement. It focuses purely on error tracking. The goal is to avoid feature creep and do one thing exceptionally well, much like classic UNIX tools. That said, you’re not the first to ask about APM features, and it’s motivating me to consider building a separate APM product.
His track record since “The Big Short” has been horrific. That’s the problem with being a perma-bear: eventually the market dips, but you missed out on extraordinary gains vastly outweighing your negative thesis. It’s a helluva lot easier to be bullish on American companies than to try and time drawdowns. Doesn’t make any sense.
Yes, but I haven't seen anything from AI technology to suggest it's going to live up to the hype in the short term. I'm not saying Burry is completely accurate in this case, but the drawdown could be quite big.
GPT-4 was released in early 2023. Back then AI maximalists were saying AGI is near. We're approaching early 2026 and we obviously aren't anywhere close to anything any reasonable person would consider AGI. But what do we have? "Agents" that are mostly useless. Image and video clip content generators that are pretty much only good for social media memes and spam. We do have better software development tools, but that's not a life changing advancement.
It seems like in order for all this speculation and all these massive build-outs to pay off, we're going to need AI to redefine how we work and live within the next 3-5 years. Even if AI development doubles or triples what it's been able to do in the last 3-5 years, I don't see this happening.
So, when this does not happen, when the AI hype does not live up to the promises by a long shot, what will happen to the markets?
Have you tried claude code? I despise AI to my bones but even I can’t say claude code is not impressive.
If any Anthropic reps read this: I think you guys, while probably better than OpenAI and Meta, and possibly Google, are delusional and more likely to destroy the world than create infinite human life.
I have, and it is. But I did acknowledge that in the previous post. I just don't think software development tools like Claude Code, while great (and I wouldn't want to go back to life without them), are going to recoup all this investment. We need like 10 Claude Codes for different aspects of work and life. Then we're getting somewhere...
> His track record since “the big short” has been horrific.
You would expect that with low-probability, highly leveraged bets, which shorts largely are. You are wrong most of the time and then make a giant pile of money when you are right. People should definitely understand that strategy, though, and not just follow him blindly into investments without the expectation that they will probably lose their money almost every time.
- In late 2020, Scion sold its entire stake in GameStop. Scion missed out on the GameStop short squeeze, which occurred only a few months later. Its 5.3% stake would have been worth over $1.5 billion at its height.
- In May 2021, Scion disclosed it acquired put options on Tesla shares.
- In August 2023, it was reported Scion anticipated a stock market crash and acquired $1.6 billion worth of put options to bet against the ETFs that tracked the S&P 500 and the Nasdaq-100.
- Scion also was noted to have held a large put option against the iShares Semiconductor ETF.
However, nothing about any of those points indicates his performance has been horrific.
All that matters are his returns against his reference index. That's the only relevant measure.
EDIT: I did manage to find his returns via ChatGPT, and the OP is correct that they haven't been great in some periods, but his last 5-year average is +85%, which isn't bad. Not great, but not bad.
He is also up about 10% over the past year, so not great and not terrible, he's mid as the kids say.
A crisis is when money flow stops in the market. In The Big Short, he bet on the fact that cash flow would stop, and he won. The stock market works similarly to a company’s cash flow: when money is pumped into a stock, its value increases; when money is pulled out, the value drops.
The news is essentially about making a bet, as the title suggests, with no real evidence or information on who will pull the money out or how it will happen. Just because the price is high doesn’t necessarily mean there will be an outflow. It’s more like gambling, based on speculation rather than solid facts.
Nice to know, though I wonder how many companies are really using all private images? I've certainly had a client running their own Harbor instance, but almost all others pulled from Docker Hub or Github (ghcr.io).
Lots of medical and governmental organisations are not allowed to run in public cloud environments. It's part of my job to help them get set up. However, in reality that often boils down to devs whining about adding a registry to Harbor as a cache; nobody is going to recompile base images and read through millions of lines of third-party code.
A lot of security is posturing and posing to legally cover your ass by following an almost arbitrary set of regulations. In practice, most end up running the same code as the rest of us anyway. People need to get stuff done.
Please describe their system for us, including system throughput, the hardware they're on, networking constraints, and how many people are needed to operate it.
The Public Sector and anyone concerned with compliance under the Cyber Resilience Act should really use their own private image store. Some do, some don't.