I'd like to know that too; I've been considering doing some Zigbee tinkering, and battery power would be a requirement. I've read in another comment that nRF would be much better in that regard. Need to do some googling for numbers...
Digikey has the modules for under 4 EUR in unit quantities, but they aren't the friendliest to integrate, since they only have pads on the bottom.
I also found some boards with the bare chip for just over 4 EUR there, you can also find similar ones on AliExpress.
There are ways to prevent it:
- Freeze all code after an update through file permissions
- Don't make most directories writable
- Don't allow file uploads, or limit uploads to media
There are a few plugins that do this, but vanilla WP is dangerous.
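For the permissions part, here's a rough sketch of what "freeze all code after an update" could look like, in plain Python. The install path, modes, and the choice to leave only wp-content/uploads writable are my assumptions, not what any particular plugin actually does:

    # Hypothetical sketch: lock down a WordPress tree after an update so the
    # web server user can no longer modify PHP at runtime. Paths and modes are
    # illustrative; hardening plugins typically also set DISALLOW_FILE_MODS /
    # DISALLOW_FILE_EDIT in wp-config.php.
    from pathlib import Path

    WP_ROOT = Path("/var/www/wordpress")           # assumed install location
    WRITABLE = WP_ROOT / "wp-content" / "uploads"  # the only tree left writable

    DIR_MODE = 0o755   # rwxr-xr-x: directories stay traversable
    FILE_MODE = 0o444  # r--r--r--: code files become read-only

    for path in WP_ROOT.rglob("*"):
        if path.is_relative_to(WRITABLE):
            continue  # leave the media uploads tree writable
        path.chmod(DIR_MODE if path.is_dir() else FILE_MODE)

Run it as the file owner (or root) right after an update, and loosen the modes again before the next one.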
Love the article - you may want to lock down your chat API endpoint, maybe with a CAPTCHA? I was able to use it to send whatever prompts I wanted. An open endpoint to the OpenAI API is a gold mine for scammers; I can see others exploiting it nefariously on your dime.
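To make "lock it down" concrete, here's a minimal sketch of per-IP rate limiting in front of a chat proxy. Flask, the /api/chat route, and the in-memory sliding window are my assumptions, not the article's actual stack; a real deployment would also verify a CAPTCHA token and keep state somewhere like Redis:

    # Hypothetical sketch: per-IP sliding-window rate limit applied before
    # anything is forwarded to the paid upstream LLM API. Names and limits
    # are illustrative only.
    import time
    from collections import defaultdict, deque

    from flask import Flask, abort, jsonify, request

    app = Flask(__name__)

    WINDOW_SECONDS = 60
    MAX_REQUESTS = 5
    _recent = defaultdict(deque)  # client IP -> timestamps of recent requests

    def allow(ip: str) -> bool:
        """Allow at most MAX_REQUESTS per WINDOW_SECONDS per client IP."""
        now = time.time()
        q = _recent[ip]
        while q and now - q[0] > WINDOW_SECONDS:
            q.popleft()
        if len(q) >= MAX_REQUESTS:
            return False
        q.append(now)
        return True

    @app.post("/api/chat")
    def chat():
        if not allow(request.remote_addr):
            abort(429)  # Too Many Requests
        prompt = (request.get_json(silent=True) or {}).get("prompt", "")
        # only here would the prompt be forwarded to the upstream API
        return jsonify({"reply": f"(stub, {len(prompt)} chars received)"})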
The implication is that the users that are being constantly presented with CAPTCHAs are experiencing that because they are unwittingly proxying scrapers through their devices via malicious apps they've installed.
Or just that they don't run Windows/macOS with Chrome like everyone else, and that's "suspicious".
I get Cloudflare CAPTCHAs all the time with Firefox on Linux... (and I'm pretty sure there's no such app on my home network!)
When a random device on your network gets infected with crap like this, your network becomes a bot egress point, and anti-bot networks respond appropriately. Cloudflare, Akamai, even Google will start showing CAPTCHAs for every website they protect once your network starts hitting random servers with scrapers or DDoS attacks.
This is even worse behind CG-NAT, where many households share one public IP, unless you have IPv6 to sidestep the problem.
I don't think the data they collect is used to train anything these days. Cloudflare is using AI generated images for CAPTCHAs and Google's actual CAPTCHAs are easier for bots than humans at this point (it's the passive monitoring that makes it still work a little bit).
This isn't really something you'd ship in a car though. It's cool that we have such a rich ecosystem of devices that this can be made "off-the-shelf" - but for production use in a car? Not really practical.
Did people just... do this by hand (in software), transistor by transistor, or was it laid out programmatically in some sense? As in, were segments created algorithmically, then repeated to obtain the desired outcome? CPU design baffles me, especially considering there are 134 BILLION transistors or so in the latest i7 CPU. How does the team even keep track of, work on, or even load the files to WORK on the CPUs?
It's written in an HDL; IIRC both Intel and AMD use Verilog. A modern core is on the order of a million lines of Verilog.
Some of that will be hand-placed; quite a bit will just be thrown at the synthesizer. Other parts, like SRAM blocks, will have their CAD generated directly from a macro and a description of the block in question.
To expound on this further: ASIC design (as for AMD's CPUs) is a lot like software work. The engineers who create most of the digital logic aren't dealing with individual transistors; instead they say "give me an accumulator for this section of the code" and the HDL provides it. The definition of that module exists elsewhere and is shared throughout the system.
This is how the complexity can be wrangled.
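To make the "give me an accumulator" idea concrete, here's a toy sketch in Amaranth (a Python-embedded HDL) rather than the Verilog/SystemVerilog flows Intel and AMD actually use; the module and signal names are made up:

    # Toy sketch of HDL-style module reuse. The block is described once,
    # behaviorally, and instantiated wherever it's needed; synthesis and
    # place-and-route turn it into gates and transistors.
    from amaranth import Elaboratable, Module, Signal

    class Accumulator(Elaboratable):
        """Registered accumulator: adds `addend` to `total` when `en` is high."""
        def __init__(self, width=32):
            self.en = Signal()
            self.addend = Signal(width)
            self.total = Signal(width)

        def elaborate(self, platform):
            m = Module()
            with m.If(self.en):
                m.d.sync += self.total.eq(self.total + self.addend)
            return m

    class Datapath(Elaboratable):
        """Two independent accumulators instantiated from one shared definition."""
        def elaborate(self, platform):
            m = Module()
            m.submodules.acc_a = Accumulator(width=32)
            m.submodules.acc_b = Accumulator(width=64)
            return m

Nobody hand-places the resulting flip-flops; that's the synthesizer's and the backend tools' job, which is the point being made here.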
Now, MOST of the work is automated for digital logic. However, we live in an analog world, so there is (as far as I'm aware) still quite a bit of work for analog engineers to bend analog reality into digital. In the real world, changing currents create magnetic fields, which means you need rules limiting voltages and defining how close signal lines can be to each other to avoid crosstalk. Square waves are hard to come by, so there's effort in timing and voltage bands to make sure you aren't registering a "1" when it should have been a "0".
Several of my professors were Intel engineers. From what they told me, the employment ratio was something like 100 digital engineers to 10 analog engineers to 1 physicist/materials engineer.
They use EDA (Electronic Design Automation) software; there are only a handful of vendors, the largest probably being Mentor Graphics, now owned by Siemens. So, yes, they use automation to algorithmically build the design and to track and resolve refactors as they design CPUs. CPUs are /generally/ block-type designs these days, so particular functions get repeated identically in different places and can be somewhat abstracted away in your EDA tool.
It's still enormously complex, and way more complex than the last time I touched this stuff more than 15 years ago.
The average vape has more processing power than Voyager, and the iPhone is orders of magnitude more complex. With that said, it takes skilled engineers to squeeze perfectly crafted code into such a tiny platform from the 70s.
I understand what you're getting at, but the 'average' vape pen is essentially a disposable battery and temperature sensor with no additional inputs or features.
After reading some details about the Voyager, I have my doubts that a disposable vape has more compute power [1]. Maybe the higher-end devices with programmable displays and temperature settings?
A Pinecil (digital soldering pen) is probably a better example. Its BL706 MCU is described as "a low-power, high-performance IoT chip that supports BLE/Zigbee wireless networking, ... BL702 has built-in RISC-V 32-bit single-core processor with FPU, the clock frequency can reach 144MHz, has 132KB RAM / 192KB ROM / 1Kb eFuse storage resources, supports external Flash, and optional embedded pSRAM."
Either way, it's clear that we (well, JPL) can build extremely powerful and sophisticated systems with relatively small computers, suggesting that resource constraints can sometimes be a source of stability and creativity.
yep, I had fun watching the Ben Eater videos where he builds a retrocomputer out of them. I even bought some of the chips to build a simple 4-bit counter with up/down buttons. It was a real revelation to understand that concept, and I ended up looking at an Apple I motherboard (https://en.wikipedia.org/wiki/Apple_I#/media/File:CopsonAppl...) and noticed the regular array of 74xx chips connected by some elegantly laid-out wires.
This silicon and lithium could be put to much better use, and recycled as well, instead of being thrown away after a few puffs of a substance only meant to make people addicted.
The TM-V71A isn't an HT, nor is the TM-D710G. For a new ham who's scraped together some coins and gotten an HT, the only option I'm aware of that will work is the TH-D72.
AFAIK the only options are TH-D72 and Wouxun Dual.
It's good there are third-party and OSS solutions such as this, but on principle, if you pay way over the odds for Apple hardware, you really shouldn't have these problems in the first place.