They enable a lot of crazy defence products.
A well-known German example is the IRIS-T from Diehl Defence. Highly accurate and exceptional engineering. But I guess FPGAs are in most defence products nowadays. I think the biggest reason is that you can build and verify your own hardware without having to go through expensive ASIC manufacturing.
Edit: I just realized that these are some literal killer apps. That wasn't even intentional, lol.
It also probably makes it easier to prevent adversaries from delidding/reverse engineering products. When using FPGAs you don't even need to have the firmware/gateware on or near the device until it's in use, which helps prevent sensitive trade secrets from making it into the wrong hands.
Any sensor that captures a ton of data that needs realtime processing to 'compress' it before it can be forwarded to a data accumulator. Think MRI or CT scanners, but industrially there are thousands of applications.
If you need a lot of realtime processing to drive motors (think industrial robots of all kinds), FPGAs are preferred over microcontrollers.
All kinds of industrial sorting systems are driven by FPGAs because the moment of measurement (typically with a camera) and the sorting decision are less than a millisecond apart.
There are many more; it's a very 'industrial' product nowadays, but sometimes an FPGA will pop up in a high-end smartphone or TV because they allow adding certain features late in the design cycle.
They enable a bunch of niches (some of which do have a large impact), as opposed to having a few high-volume uses. Basically anything where you really need an ASIC but don't have the volume to justify one (and do have the large margins required for such a product to be viable). Custom RF protocols, the ASIC development process itself, super-low-latency but complex control loops in big motor drives, that kind of thing. You'll almost never see them in consumer products (outside of maybe some super-tiny ones which aren't useful for compute but just do 'glue logic') because they're so expensive.
What you're describing is correct for the top-end FPGA products (they're in every 5G base station, and almost every data centre has thousands of them rerouting information), but the low-end ($10 or less) 2k LE FPGAs are in a hell of a lot of products now too. They're fantastic for anything where you need a lot of logic that executes immediately/concurrently (vs sequentially, as it would on a microcontroller) in a tiny package. Think medical devices, robotics, comms devices, instrumentation, or power controllers.
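To make that concurrent-vs-sequential point concrete, here's a toy Python model (signal names are made up) of why an MCU's worst-case reaction time grows with the number of things it polls, while fabric logic reacts in a fixed single cycle no matter how many monitors you bolt on:

    # Toy model only: FPGA logic reacts in one clock, an MCU polls in turn.

    inputs = {"overcurrent": False, "overtemp": False, "estop": False}

    # MCU-style: checks run one after another, so worst-case reaction time
    # grows with the number of checks (plus whatever else the loop does).
    def mcu_loop_iteration(inputs):
        for name, asserted in inputs.items():  # sequential polling
            if asserted:
                return f"fault: {name}"
        return "ok"

    # FPGA-style: every monitor is its own piece of hardware, so all of
    # them are evaluated on every clock edge; latency stays one cycle
    # regardless of how many monitors you add.
    def fpga_clock_edge(inputs):
        faults = [name for name, asserted in inputs.items() if asserted]
        return f"fault: {', '.join(faults)}" if faults else "ok"

    inputs["estop"] = True
    print(mcu_loop_iteration(inputs))  # reaction depends on polling order
    print(fpga_clock_edge(inputs))     # all monitors fire in the same cycle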
I'm pretty sure there's an FPGA in most consumer devices now, but as you say they're there for some sort of glue logic - but that's a killer niche unto itself. Schematics can shift and change throughout a design cycle, and you only need to rewrite some HDL rather than go hunting for a different ASIC that's fit for purpose. It's a growing field again as their cost has come right down. They're in the Apple Vision headset, the Steam Deck, modern TVs, and a host of small form factor consumer computing products.
The ReefShark ASIC sits alongside an FPGA which acts akin to an IPU. I know only because I played my own small part in the design. It was originally meant to be entirely FPGA-based, but they got hit with severe supply constraints from Intel and Xilinx, which is why cost keeps getting discussed. Prices have dropped back to stable levels since the middle of last year, but at the time ASICs ended up being more affordable at the volume they're doing (demand spiked mid-project due to the removal of Huawei networking equipment).
We (outside Wireless) heard the Intel silicon didn't perform/yield and the original designs became infeasible, prompting a sudden mad scramble. I didn't realise it was originally planned to be FPGA-based. Interesting, thanks.
> I'm pretty sure there's an FPGA in most consumer devices now,
I can’t think of the last time I saw an FPGA on a mainstream consumer device. MCUs are so fast and have so much IO that it’s rare to need something like an FPGA. I’ve seen a couple tiny CPLDs, but not a full blown FPGA.
I frequently see FPGAs in test and lab gear, though. Crucial for data capture and processing at high speeds.
Low-latency (e.g. less than 20 lines) video switchers/mixers. There's a huge amount of data (12 Gbps for 4K/UHD) per input, with many inputs and outputs, all with extremely tight timing tolerances. If you loosen the latency restrictions you can handle a small number of inputs on regular PCs (see OBS Studio), but at some point a PC architecture will not scale easily anymore and it is much more efficient to just use FPGAs that do the required logic in hardware. It's such a small market that for most devices an ASIC is not an option.
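For a sense of scale, here's the rough line-time arithmetic behind that 20-line budget, assuming the 2250-total-line UHD raster at 60 fps (treat these as back-of-the-envelope numbers, not a spec reference):

    # Rough latency budget for "less than 20 lines" at UHD 2160p60.
    fps = 60
    total_lines = 2250                     # active + blanking lines per frame
    line_time_us = 1e6 / (fps * total_lines)
    budget_us = 20 * line_time_us          # the "20 lines" latency budget

    print(f"one line ~= {line_time_us:.1f} us")  # ~7.4 us
    print(f"20 lines ~= {budget_us:.0f} us")     # ~148 us end to end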
Blackmagic's whole gear line is based on Xilinx FPGAs. Whatever product of theirs you tear down, it will almost always be nothing more than SerDes chips, I/O controllers, and FPGAs.
Anything where you wish you could have an ASIC but don't have the budget for a custom one, and where using smaller chips either makes for a worse bill of materials or takes up more space.
They are used everywhere, including some very small ones I've seen used purely for power sequencing on motherboards - usually a very small FPGA with embedded memory that "boots" first on standby voltage and contains simple combinational logic that controls how other devices on the motherboard are powered up, faster than any MCU could do it, while taking less space than discrete components.
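As a rough sketch of what that kind of logic looks like (rail names invented for illustration), the whole job can be a purely combinational function of the power-good inputs:

    # Hypothetical power-sequencing rule of the sort such a tiny FPGA
    # might hold: each rail enables only once its predecessors report
    # power-good. Signal names are made up.

    def sequencer(standby_ok, pg_3v3, pg_1v8):
        """Purely combinational: outputs depend only on current inputs."""
        en_3v3 = standby_ok                 # first rail gated by standby power
        en_1v8 = standby_ok and pg_3v3      # next rail waits for 3.3 V good
        en_core = standby_ok and pg_3v3 and pg_1v8
        return {"en_3v3": en_3v3, "en_1v8": en_1v8, "en_core": en_core}

    print(sequencer(standby_ok=True, pg_3v3=True, pg_1v8=False))
    # {'en_3v3': True, 'en_1v8': True, 'en_core': False}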
Glue logic, custom I/O systems (including high-end backplanes in complex systems), custom devices (often combined with "hard" components in the chip, like the ARM CPUs in Zynq-series FPGAs), specialized filters that can be updated at runtime.
They're used in places that require real-time processing (DSP) of raw digital signals at very high rates (several hundred MHz and more), where you cannot afford to miss a sample because of latency from a microcontroller (uC). I think even some PCI devices use them for this reason, and it lets you update the firmware, whereas an ASIC doesn't.
A while back I wrote an entire FPGA pipeline to recalibrate signals from optical sensors before they were passed on to the CPU. Doing this allowed processing to keep up with acquisition, so it was real time. A lot of FIR filters and FFTs. But my proudest achievement was a linear interpolation algorithm, which is fairly high-level and tricky to implement on an FPGA; the platform is more geared towards simpler DSP algorithms like FIR filters and the FFT (not simpler, but so much effort has gone into making IP blocks for it that it effectively is, because you don't have to implement it yourself).
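For the curious, here's roughly what such an interpolator looks like before it gets turned into HDL: a bit-exact fixed-point model, since fabric has no cheap floating point. The Q8.8 format below is my assumption, not the original design:

    # Fixed-point linear interpolation, modeled bit-exactly in integers.
    FRAC_BITS = 8  # Q8.8: 8 fractional bits, so 256 steps between samples

    def lerp_q8_8(y0, y1, frac):
        """y0, y1: integer samples; frac: 0..255 position between them."""
        # One multiply, one add, one shift: maps cleanly onto a DSP slice.
        return y0 + (((y1 - y0) * frac) >> FRAC_BITS)

    # Resample at 1/3-sample offsets, the way a recalibration pipeline
    # might correct sensor timing skew.
    samples = [0, 100, 50, 200]
    frac = int(0.333 * (1 << FRAC_BITS))  # fractional position as integer
    resampled = [lerp_q8_8(a, b, frac) for a, b in zip(samples, samples[1:])]
    print(resampled)  # [33, 83, 99]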
But other than that, for raw bulk compute GPUs are kicking their butts in most domains.
To give you an example, these are often used in CNC machines.
Before, you had to have:
A. A PLC that ran logic with real-time guarantees to tie everything together. The PLC is often user-modified to add more logic.
B. Decoders that processed several-MHz encoder feedback signals, somewhere between 3 and 10 of these.
C. Something that decides what to do with the data from B.
D. Encoder and motor-drive outputs, also running at several MHz (somewhere between 3 and 10 of these as well).
Among other tasks.
These were all separate chips/boards/things that you tried to synchronize/manage. Latency could be high. But if you are moving 1200 inches per minute (easy), 100 milliseconds of latency is equivalent to losing track of things for 2 inches. Enough to ruin anything being made.
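Sanity-checking that math in a couple of lines:

    # How far does the machine travel during a given control latency?
    feed_ipm = 1200                 # inches per minute ("easy" feed rate)
    latency_s = 0.100               # 100 ms of loop latency

    feed_ips = feed_ipm / 60        # 20 inches per second
    error_in = feed_ips * latency_s
    print(f"{error_in:.1f} inches of blind travel")  # 2.0 inches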
Nowadays it is often just an FPGA hooked up to a host core.
(or at a minimum, a time-synchronized bus like ethercat)
Products with PCIe (PCI Express) and high-speed interfaces like 10G Ethernet, SATA, HDMI, USB 3.0 and higher, or Thunderbolt.
Most of the ASICs with these SerDes interfaces are not for sale on the open market, only to OEMs who buy MOQs in the millions.
Take, for example, the Raspberry Pi SBCs. The Raspberry Pi only got PCIe very late (Compute Module 4); Jeff Geerling unlocked it with a lot of difficulty (https://pipci.jeffgeerling.com), but you still can't buy these cheap microprocessors from Broadcom.
The reason is that no cheap PCIe chips are available to hobbyists and small-company buyers (below a million dollars).
'Cheap' FPGAs starting at $200+ were, and still are, the only PCIe devices for sale to anyone. If you want to nitpick, a few low-speed SerDes are available in $27 ECP5 FPGAs, but nothing at 10 Gbps and higher.
Another example: I sell $130 switches with 100 Gbps switching speeds, PCIe 4 x8, and QSFP28 optics. But you can't buy the AWS/Amazon ASIC chips on this board anywhere, nor their competitors' chips from Broadcom, Intel, MicroSemi/Microchip, or Marvell.
I went as high as Intel's vice president and their highest-level account-manager VPs and still got no answer on how to buy their ASIC switches or FPGAs.
The core of modern oscilloscopes is often an FPGA that reads out the analog-to-digital converters at ~gigasamples/s and dumps the result into a RAM chip. Some companies (Keysight, Siglent) use custom chips for this, but FPGAs are very common.
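Conceptually, that capture path is just a circular buffer that the ADC drains into every cycle until a trigger freezes the window. A toy model, with an absurdly small memory depth for demonstration:

    # Rough model of a scope's acquisition loop: one write per ADC clock
    # into a circular buffer; a trigger freezes the record. Real scopes
    # buffer vastly more samples.
    from collections import deque

    DEPTH = 8                      # sample memory depth (tiny, for demo)
    buf = deque(maxlen=DEPTH)      # circular buffer: oldest samples fall off

    def on_adc_sample(sample, triggered):
        """Called once per ADC clock; returns the frozen record on trigger."""
        buf.append(sample)
        return list(buf) if triggered else None

    record = None
    for t, s in enumerate([3, 5, 2, 9, 12, 7, 1, 4, 8, 6]):
        record = on_adc_sample(s, triggered=(t == 9)) or record
    print(record)  # last DEPTH samples up to the trigger: [2, 9, 12, 7, 1, 4, 8, 6]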
From a consumer-facing perspective, FPGAs have enabled a golden age of reasonably affordable and upgradeable hardware for relatively niche tech hobbies.
* Replacement parts for vintage computers
* Flash cartridges and optical drive emulators for older video game consoles
* High-speed, high quality analog video upscalers
Many of these things aren't produced at a scale where bespoke chips would be viable. Using an FPGA lets you build your product with off-the-shelf parts, and lets you squish bugs in the field with a firmware update.
There is also MiSTer, an open source project to re-implement a wide range of vintage computer hardware on the Terasic DE10-Nano FPGA.
Lower-volume specialty chips for interfaces (lots of I/O pins), such as adapters for an odd interface, custom hardware designs for which there isn't an existing chip, etc.
For instance, audio, video or other signal processing can be done by putting the algorithm "directly" into the hardware design; it will run at a constant predictable speed thereafter.
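A tiny sketch of why the speed is constant and predictable: in a hardware FIR filter, all the taps compute in parallel, so every output costs exactly one clock regardless of tap count (the coefficients below are arbitrary):

    # Per-sample FIR model: in fabric, all multiply-adds below would
    # happen simultaneously, giving one output per clock, always.
    coeffs = [1, 3, 3, 1]            # filter taps (fixed in hardware)
    delay_line = [0] * len(coeffs)   # shift register of recent samples

    def fir_clock(sample):
        """One 'clock edge': shift in a sample, produce one output."""
        delay_line.insert(0, sample)
        delay_line.pop()
        return sum(c * x for c, x in zip(coeffs, delay_line))

    print([fir_clock(s) for s in [1, 0, 0, 0, 2]])  # [1, 3, 3, 1, 2]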
I think low latency is the main thing. In most cases, to get an FPGA that's faster in terms of compute than a GPU/CPU you're going to have to spend probably hundreds of thousands (which the military do, e.g. for radar and that sort of thing).
But even a very cheap FPGA will beat any CPU/GPU on latency.
In the past, I’d have tried to use Achronix’s FPGAs for secure processors like the Burroughs B5000, Sandia Secure Processor, SAFE architecture, or CHERI. One could also build I/O processors with advanced IOMMUs, crypto, etc. Following the trusted/untrusted pattern, I’d put as much non-security-critical processing as possible into x86 or ARM chips, with secure cores handling what had to be secure.
High-risk operations could run the most critical stuff on these CPUs. That would reduce the security effort from who knows how many person-years to basically spending more per unit and recompiling. Using lean, fast software would reduce the performance gap a bit.
CHERI is now shipping in ASICs, works with CPUs that fit in affordable FPGAs, and so this idea could happen again.
One particular use of FPGAs (and ASICs) is operating on bit-oriented rather than byte-oriented data. Certain kinds of compression and encryption algorithms can be implemented much more efficiently on custom chips. These are generally limited to niche applications, though, because the dominance of byte-oriented general-purpose CPUs and microcontrollers selected against such algorithms for more common applications.
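To illustrate the mismatch: pulling 5-bit fields out of a byte stream costs a CPU shifts and masks per field, while an FPGA simply wires each field where it needs to go. A sketch (the field width is arbitrary):

    # Unpacking consecutive 5-bit fields from bytes: cheap wiring in an
    # FPGA, shift-and-mask work on a byte-oriented CPU.
    FIELD_BITS = 5

    def unpack_fields(data, count):
        """Pull `count` consecutive 5-bit values out of a byte string."""
        acc = int.from_bytes(data, "big")   # whole buffer as one integer
        total_bits = len(data) * 8
        out = []
        for i in range(count):
            shift = total_bits - (i + 1) * FIELD_BITS
            out.append((acc >> shift) & ((1 << FIELD_BITS) - 1))
        return out

    print(unpack_fields(b"\xfa\x50", 3))  # 0b11111 0b01001 0b01000 -> [31, 9, 8]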
It can be of use in anything that handles a lot of data throughput but isn't built in large enough numbers to justify producing an ASIC. The first example that comes to mind is an oscilloscope, but FPGAs can be used almost anywhere (from retro game consoles to radars).
Broadly speaking, anything that does either a lot of reasonably specialized logic or medium-to-high-performance broad work will have an FPGA in it (unless it's made in very high volumes, in which case it may be an ASIC; ditto for very high-performance things).
Some FPGAs are absolutely tiny, e.g. you might just use one as a fancy way of turning a stream of bits into something parallel for a custom bit of hardware you have; other FPGAs are truly enormous and might be used for a semi-custom CPU so you can do low-latency signal processing or high-frequency trading and so on.
I think we’re going to see greatly increased use of FPGAs in AI applications. They can be very good at matrix multiplication. Think about an AI layer that tunes the FPGA based on incoming model requirements. Need an LPU like Groq's? Done. I would bet Apple Silicon gets some sort of FPGA in the neural engine.
But ASICs perform way faster and more efficiently. I doubt the gain you would get from "retuning" the FPGA would be enough to outweigh the benefit of a general-purpose processor, GPU, or ASIC.
They are useful for products which do video encoding, decoding, and microwave receive and transmit of video data. They are useful for TCP/IP insertion and extraction of packet data, e.g., in video streams.
Some older video game consoles have been "emulated" in FPGA. You just map out the circuitry of a device and voila, you get native performance without the bugs of a software implementation.