When is an FPGA a better choice than an ASIC in a mass produced product?


Right off the assembly line
Looking at teardowns of things like oscilloscopes, enterprise equipment, and even very high-end cameras, a lot of them use an FPGA as part of their design. However, since these products are mass produced and FPGAs are some of the most expensive integrated circuits you can buy, wouldn't it be cheaper in the long run to roll some ASICs, or even use a suitable microprocessor? I can't imagine these devices change their internal chip designs very often, or at all, once released. At what point would you want to spring for an FPGA in a mass-produced product as opposed to an ASIC or microprocessor, and what use cases outside of development or prototyping would require one?


Super Moderator
To actually make an ASIC, you need skilled people who can do the physical layout, and then the company has to buy wafers from a foundry for manufacturing.

Plus, if there are any problems in your logic, you can't really fix them in an ASIC unless you somehow have a "microcode"-based design.

FPGAs give you the flexibility to patch those problems, and they're cheap for smaller firms to buy.


Cyborg Agent
It gets clearer when we compare the development cycle of an FPGA-based design versus an ASIC.

Implementing something like an oscilloscope on an FPGA may take you about 1-2 months, depending on the functions, processing, etc. If something goes wrong, you should be able to rectify it and test the next revision in a couple of weeks.

When you're using an FPGA, you can be pretty sure that the logic is going to run as you programmed it to (given that manufacturer guidelines are followed). One less thing to worry about.

For a similar ASIC, a single cycle will take at least about a year (including fabrication time, testing, etc.).
If something is wrong, even after the plethora of simulations one needs to complete for an ASIC, you'll have to make the necessary changes, and then the whole process repeats.
Getting your chip fabricated also requires a lot of money. The amount varies depending on the process (the node and technology, e.g. 45nm CMOS), your fabrication house, die size, packaging, etc.
The engineers you need to employ to design an ASIC are far more expensive as well.

I've used durations quite loosely above, and let's be clear: these are just back-of-the-envelope figures. Real times depend on design complexity, the engineers, research prospects, etc. The durations are relative.

In general, companies go for ASICs when the product cycle is really long term (roughly 10 years minimum), and an FPGA/discrete-IC approach otherwise. Quantity plays a major role too: if you're going to sell a couple million units of some system, ASICs present a really good profit margin even if the product cycle is under 10 years.
The total cost of production can be estimated/projected to figure out which is the better way. This includes development costs, employee salaries, software, fabrication, testing equipment or test-facility rent, etc.
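As a rough illustration of that projection, the trade-off boils down to the ASIC's large one-time (NRE) cost against the FPGA's higher per-unit cost. A minimal sketch (all dollar figures are made-up placeholders, not real quotes):

```python
import math

def break_even_units(asic_nre, asic_unit_cost, fpga_unit_cost):
    """Smallest unit count at which total ASIC cost drops below total FPGA cost.

    ASIC total = asic_nre + asic_unit_cost * n
    FPGA total = fpga_unit_cost * n
    """
    if fpga_unit_cost <= asic_unit_cost:
        raise ValueError("FPGA must cost more per unit for a break-even to exist")
    return math.ceil(asic_nre / (fpga_unit_cost - asic_unit_cost))

# Hypothetical numbers: $2M NRE, $5/unit ASIC vs. a $150/unit FPGA
n = break_even_units(2_000_000, 5, 150)
print(n)  # 13794 -- above this volume the ASIC works out cheaper
```

This ignores real-world factors like respin risk, engineer salaries, and per-wafer yield, but it shows why low-volume products usually stay on FPGAs.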

Even after all this, you'll see some companies making ASICs for systems they're going to sell fewer than about 1,000 units of (I think).
Take, for example, that 110GHz oscilloscope from Keysight; The Signal Path has a teardown video of it on his YouTube channel. It has some ASICs, and so do quite a few other instruments. The accuracy and specs expected there are simply too much to ask of an FPGA/discrete-IC design. Custom chips are almost essential to the existence of such systems.

I was left amazed by the amount of engineering and the intricate design of the subsystems that go into an instrument like that. It's just so soothing.

I think this is too much for a post, but I'll let this be.
