Cypress PSoCs [1] were super fun to use. Visually interconnecting PWM, ADC, DAC, and op-amp/comparator blocks and then accessing the "pins" in C code... It's like Arduino, but you also get to compose the analog side. Back then it was a 24 MHz 8-bit controller; now it looks like they have some muscle in them.
PSoCs are very useful for things that interface with the world -- for example, they were used in the iPod touch wheel ... I've used them to make:
* hearing research devices (some filters in the analog side of the device)
* wearable DJ system (MIDI, ADC/DAC for controls, some PWM, some analog synth, capacitive sensing)
* smart sex toys (ADC/DAC/PWM, some filtering, capacitive sensing)
Microchip has been copying Cypress in its newer ATSAM* microcontrollers, putting op-amps and comparators on the chip. I think this is about as far as programmable analog circuits will get in mainstream applications: adjust your signal to a good amplitude for the ADC, and do everything else digitally.
See also https://anadigm.com/ (which styles itself as "The FPAA Company", where 'FPAA' is Field Programmable Analog Array) and https://www.dialog-semiconductor.com/products/greenpak (GreenPAK™ originated with Silego Technology, which Dialog bought in 2017; Dialog itself became part of Renesas in 2021).
The GreenPAK line includes a variety of ICs; some of them are more like traditional FPGAs, and some have analog components too.
The reasoning laid out in the article boils down to: Analog doesn't compose.
There is no fundamental element in the real world that you can combine to build arbitrary analog circuits, while you can do just that with digital circuits.
You can kinda-sorta compose things out of op-amps, and people did, but for almost all practical purposes the best thing to do is to wire the input straight into an ADC and do all your computation in the digital domain.
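A minimal sketch of that "straight into an ADC, compute digitally" approach: a one-pole IIR low-pass standing in for what would once have been an analog RC filter. The sample rate and cutoff here are made-up illustration values, not anything from the article.

```python
import math

def one_pole_lowpass(samples, cutoff_hz, fs_hz):
    """Digital replacement for an analog RC low-pass:
    y[n] = y[n-1] + a * (x[n] - y[n-1]), with the coefficient
    derived from the RC time constant at the given sample rate."""
    a = 1.0 - math.exp(-2.0 * math.pi * cutoff_hz / fs_hz)
    y, out = 0.0, []
    for x in samples:
        y += a * (x - y)
        out.append(y)
    return out

# A DC input settles to its own value; a signal alternating at
# Nyquist is attenuated toward its mean (zero).
dc = one_pole_lowpass([1.0] * 1000, cutoff_hz=100, fs_hz=48000)
alt = one_pole_lowpass([1.0, -1.0] * 500, cutoff_hz=100, fs_hz=48000)
```

Once the signal is in the digital domain, changing the filter is a code edit rather than a board respin, which is most of the argument against programmable analog.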
Yeah, pretty much. Programmable digital is already niche, and it's made of the same types of components repeated: NAND gates (or similar) combined into cells. Analog circuits generally have a much higher diversity of components -- caps, resistors, and inductors can easily vary 1e3 to 1e6 times in size. The interconnects are far more diverse too: where digital is usually a flow of data from one port to another, analog has many more feedback points.
Another issue is that analog is typically used in media where it can interfere with other systems if misconfigured (e.g. radio systems or other networks), whereas digital more or less applies to systems like math, searching, or databases, where errors won't throw everyone else off the network.
These aren't unsolvable problems, but they probably mean poor utilization of die space. And since, like I said, programmable digital is already inefficient, programmable analog will be even more so. :(
That being said, there are some possible applications where I'd like to see programmable analog, such as motor controllers and SEPIC regulators. Advantages there would be less RF noise, less CPU load, and less time spent programming precise timing.
Op-amps, resistors, and (switched) capacitors (and optionally inductors, though I believe those are infeasible to integrate) provide universal linear elements, allowing the (approximate) reproduction of any linear system. Analog stages are usually linear, so that's not a huge issue. The key word is approximate: in reality, analog systems are precision-critical and have demands like low noise and subtle nonlinearities that apparently (per the article) make integrating an analog network unattractive (because you can't control those parameters so easily).
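To make the switched-capacitor element concrete: a capacitor toggled between two nodes at a clock frequency behaves like a resistor, and a filter built from it depends only on a capacitor ratio and the clock, which is why this trick integrates well. The component values below are illustrative, not from the thread.

```python
import math

def switched_cap_resistance(c_farads, f_switch_hz):
    """A capacitor C toggled between two nodes at f_sw moves
    charge C*V each cycle -> average current C*V*f_sw,
    i.e. an equivalent resistance R = 1 / (f_sw * C)."""
    return 1.0 / (f_switch_hz * c_farads)

def sc_lowpass_cutoff(c_switched, c_integrating, f_switch_hz):
    """Cutoff of an RC low-pass built from the equivalent resistor:
    f_c = f_sw * C1 / (2*pi*C2). It depends only on a capacitor
    *ratio* (well controlled on-chip) and the clock, not on an
    absolute R or C value (poorly controlled on-chip)."""
    r_eq = switched_cap_resistance(c_switched, f_switch_hz)
    return 1.0 / (2.0 * math.pi * r_eq * c_integrating)

# 1 pF switched at 1 MHz behaves like a 1 Mohm resistor.
r = switched_cap_resistance(1e-12, 1e6)
fc = sc_lowpass_cutoff(1e-12, 10e-12, 1e6)
```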
It's really easy to achieve 8-10 bits, even 12 bits, but 16 bits or more is hard.
For example, manufacturers now use large matrices with lots of spare parts; before the chip goes into its package, it's precisely measured and metal fuses are trimmed to set very precise parameters.
So in reality, modern analog chips ARE one-time-programmable analog: one-time analog programming is cheap enough, but rewritable is prohibitively expensive.
This is kind of a cool computer, but it's not a serious approach for most problems, and doesn't have much to do with the original article. Analog computers have very limited precision due to SNR, and this kills their usefulness for almost everything aside from niche simulations and machine learning.
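The SNR-to-precision link can be put in numbers with the standard quantization-noise relation SNR = 6.02·N + 1.76 dB; the example SNR figures are illustrative.

```python
def enob(snr_db):
    """Effective number of bits for a given signal-to-noise ratio,
    inverting the standard relation SNR = 6.02*N + 1.76 dB."""
    return (snr_db - 1.76) / 6.02

# A very good analog signal path at ~74 dB SNR is only worth
# about 12 bits; 16 effective bits needs roughly 98 dB.
bits_12 = enob(74.0)
bits_16 = enob(98.08)
```

This is why an analog computer's "word length" tops out around 10-12 bits in practice: every extra bit demands another ~6 dB of noise headroom across the whole signal chain.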
Edit: Also, the host of this YouTube video doesn't sound like he knows what he's talking about.
It's like how people have been saying Lisp is the future of coding.
I feel most engineers (including myself!) become enamored enough by beautiful ideas that they want to apply them in places they don't really work.
Doesn't mean I have to stop dreaming of a plan9 flavored 100% Lisp OS though!
To expand a bit, that parent site is about programmable analog computers while the original article is about programmable analog circuits.
Analog computers are indeed built with analog circuits but analog computers are a niche tool for solving differential equations[0] while analog circuits are used literally everywhere.
The "programmability" occurs on two different levels; in the digital realm you program an FPGA by changing the logic of its gates and their interconnections. You program a computer by using Java to build a terrible issue tracker.[1]
The former corresponds to the original article; the latter is about the parent. There is much overlap between the two ideas but that's the gist.
[0] And other tasks like producing music. Analog synthesizers are really just special-purpose analog computers.
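As a toy illustration of the differential-equation point: the classic analog-computer patch for x'' = -x wires two integrators in a loop, and the same patch can be mimicked digitally with two accumulators. This is my own sketch, not something from the thread.

```python
import math

def harmonic_oscillator(t_end, dt):
    """Digitally mimic the two-integrator analog-computer patch
    for x'' = -x: the first integrator accumulates -x into the
    velocity v, the second accumulates v into the position x
    (semi-implicit Euler, which keeps the oscillation stable)."""
    x, v = 1.0, 0.0                # initial conditions x(0)=1, x'(0)=0
    for _ in range(int(t_end / dt)):
        v += -x * dt               # integrator 1: v = integral of -x
        x += v * dt                # integrator 2: x = integral of v
    return x

# After one full period (2*pi), x should come back near 1.0.
x = harmonic_oscillator(2 * math.pi, 1e-4)
```

On a real analog computer the two integrators run in continuous time and "dt" is physics, not a loop variable; the digital version just trades that for step count.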
Universal analog chips are cool, but extremely expensive, and their analog performance is only moderate.
- Their analog performance is enough for simple hobbyist projects, but commercial use needs much more precision: 18 bits or more.
That's what differentiates analog computers from digital: in analog, ~10-12 bits is easily achievable, but more is expensive;
in digital, you can trade speed for precision -- for example, calculating 64-bit values on 32-bit hardware, which is at least 4x slower (in ideal cases; in reality the difference is more than 4x), but the double precision is worth it.
Even more: in the digital world you can calculate 32-bit or even 64-bit values on 8-bit hardware (really done in the microcontroller world), which is extremely slow, but fast enough for many cases.
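That precision-for-speed trade is purely mechanical. A minimal sketch of how a compiler lowers 64-bit addition onto 32-bit operations (Python masks stand in for 32-bit registers here):

```python
MASK32 = 0xFFFFFFFF  # stand-in for a 32-bit register width

def add64_on_32bit(a_hi, a_lo, b_hi, b_lo):
    """Add two 64-bit numbers using only 32-bit operations, the
    way it's done on narrow hardware: add the low words, then
    propagate the carry into the high words. Each operand is a
    (hi, lo) pair of 32-bit words."""
    lo = (a_lo + b_lo) & MASK32
    carry = 1 if lo < a_lo else 0      # low word wrapped -> carry out
    hi = (a_hi + b_hi + carry) & MASK32
    return hi, lo

# 0x00000001_FFFFFFFF + 1 = 0x00000002_00000000
hi, lo = add64_on_32bit(0x1, 0xFFFFFFFF, 0x0, 0x1)
```

There is no analog equivalent of this trick: you can't chain two 10-bit analog stages into a 20-bit one, because their noise adds instead of their word lengths.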
This is also the reason analog computers were extremely popular before microcomputers:
- the first digital computers were simply too slow and had too little memory to simulate analog circuits at the scale of real applications.
But digital progress was extremely fast and scaled very well, so in a very short time cheap digital computers appeared that could do 32-bit or even 64-bit precision on 8-bit hardware.
And for analog, 16 bits is still hard.
You could see the race for precision in analog TVs and radios: a DAC (or a mechanical multi-turn resistor) was used for frequency tuning, and it limited commodity equipment to about 12-14 bits, even though professional (scientific) equipment with 20-22 bits or more existed, at many times the cost.
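To see why tuning-DAC width matters, here is the back-of-envelope step size for a DAC spanning a tuning band; the FM band edges are my illustrative numbers, and this ignores varactor nonlinearity.

```python
def tuning_step_hz(band_start_hz, band_end_hz, dac_bits):
    """Smallest frequency step when an N-bit DAC spans a tuning
    band (idealized linear tuning; real varactor curves bend)."""
    return (band_end_hz - band_start_hz) / (2 ** dac_bits)

# FM broadcast band, 87.5-108 MHz:
step_12 = tuning_step_hz(87.5e6, 108e6, 12)   # ~5 kHz steps
step_20 = tuning_step_hz(87.5e6, 108e6, 20)   # ~20 Hz steps
```

A ~5 kHz step is fine for commodity radios; lab gear chasing ~20 Hz steps needed those extra bits, and paid for them.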
A less obvious example: in CCD sensors the ADC limit is about 32 bits, and even though a market for better solutions exists, they don't appear, simply because they'd be very expensive.
For displays this is not important, because most people accept even 24-bit color, and because more depth is achievable via emulation (yes, you may have heard about PWM in monitors).
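The "more depth via emulation" trick is temporal dithering (FRC), the family of tricks that monitor PWM remarks usually gesture at: alternate the two nearest panel codes over several frames so the average lands between them. The values below are illustrative.

```python
def temporal_dither(level_10bit, frames):
    """Emulate a 10-bit level on an 8-bit panel: split off the
    2-bit remainder, then show the next-higher 8-bit code for
    that fraction of every 4 frames, so the temporal average
    sits between the two codes."""
    base, frac = divmod(level_10bit, 4)   # 10-bit -> 8-bit code + remainder
    shown = [base + (1 if i % 4 < frac else 0) for i in range(frames)]
    return sum(shown) / len(shown)

# Target level 513/4 = 128.25, between 8-bit codes 128 and 129.
avg = temporal_dither(513, 400)
```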
As far as I know, Analog Devices used laser trimming on every chip to achieve good analog precision, but that's impossible for really large production runs like commodity CPUs or DRAM/flash.
- CPUs/DRAM are mostly just tested to see if they work; parts that don't work or don't meet tolerances are recycled (in some cases the worst part of the die can be disabled).
The biggest problem is that analog is nothing like digital -- my PhD thesis back in the 1980s was about this, incidentally. You can make it work at low frequencies that are essentially "DC". Higher frequencies simply can't work: too many parasitics, plus the lumped-element approximation fails.
[1] https://en.wikipedia.org/wiki/Cypress_PSoC