Once again I'm amazed at what ALSO has a processor in it. I wonder if at some point I'll just see it as normal that nearly every device, no matter how static it may seem, has more processing power than my first computer.
Almost everything has at least an 8-bit MCU now. 32-bit MCUs are extremely common too, and even simple ones like the ARM Cortex-M0 are competitive with a 286/386 (albeit with less memory and no MMU).
A rather surprising number of devices run very powerful application processors. An amusing example is Apple's Lightning to HDMI adapter, which has an ARM SoC with 256MB of RAM and boots a Darwin kernel in order to decode an H.264-compressed video stream. Depending on what exactly they put into it (I wouldn't be surprised if they borrowed the Apple TV chip for a relatively low-volume product like this), it may be more powerful than some fairly recent computers.
I've worked with TI BLE chips that were 8051 variants. It took me back to my original CS classes: it's a very simple core with limited registers, but it can still be used to perform all the Bluetooth functions.
IIRC, the BLE stack on those was a separate (undocumented?) core; the 8051 was for user software that gave high-level commands to the BLE block through magic registers. Still fun to program, for sure.
Some of the more, ah, cost-reduced Bluetooth chips have 8051s or all kinds of weird, proprietary processors, presumably because the per-chip licensing fee for ARM is too much at that end of the market or something.
No, I don't have a blog. These are just my private projects that I make for my own use. I am working as a software developer and learning electronics as a hobby.
The backlight is going to be a box supplied directly from AC, driving up to 6 strips of individually addressable WS2812 RGB LEDs and providing up to 10A at 5V (50mA per LED == 200 LEDs at full power). It will connect to the PC over galvanically isolated Full-Speed USB.
For now I will have some pre-programmed sequences, but I plan to write a piece of software that matches the LEDs to the colors along the borders of the image on the screen, though I have no idea how to do that at the moment.
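For reference, the common "ambilight"-style approach is: grab a frame, average the pixels in a box along each screen edge, and send one color per LED. A minimal sketch of the averaging step in C, assuming a 32-bit XRGB framebuffer already captured by some platform-specific API (capture itself not shown; all names here are made up for illustration):

    #include <stddef.h>
    #include <stdint.h>

    typedef struct { uint8_t r, g, b; } Rgb;

    /* Average an x0,y0,w,h box of a 32-bit XRGB framebuffer. */
    static Rgb average_region(const uint32_t *fb, int fb_w,
                              int x0, int y0, int w, int h)
    {
        uint64_t r = 0, g = 0, b = 0;
        for (int y = y0; y < y0 + h; y++)
            for (int x = x0; x < x0 + w; x++) {
                uint32_t px = fb[(size_t)y * fb_w + x];
                r += (px >> 16) & 0xFF;
                g += (px >> 8) & 0xFF;
                b += px & 0xFF;
            }
        uint64_t n = (uint64_t)w * h;
        return (Rgb){ (uint8_t)(r / n), (uint8_t)(g / n), (uint8_t)(b / n) };
    }

    /* Top edge: one sampling box per LED, 'depth' pixels tall. */
    void map_top_edge(const uint32_t *fb, int fb_w, int depth,
                      Rgb *leds, int n_leds)
    {
        int box_w = fb_w / n_leds;
        for (int i = 0; i < n_leds; i++)
            leds[i] = average_region(fb, fb_w, i * box_w, 0, box_w, depth);
    }

The other three edges are the same with the boxes rotated, and a little smoothing between frames keeps the LEDs from flickering.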
I've also been playing with WS2812 LEDs as ambient/halo lighting around my monitor. I'm using a cheap ESP-32 dev board to allow control from my phone via WiFi even when my computer is asleep. I went much smaller though - only 16 pixels, powered by an old phone charger.
This works for now because my primary use-case is ambient lighting when the room would otherwise be dark. I'm planning to build some larger-scale higher-density light panels to provide more illumination for those dark winter days.
I have a bunch of STM32 Nucleo and Discovery boards; which one I use depends on the circumstances.
For prototyping I use both breadboards and perfboards. I use breadboards for small, fast prototypes, and perfboards when I know I am going to develop something over a longer time or when I have special requirements that rule out a breadboard or make it harder to use (like AC power on the board, or a component that has 2.54mm pitch but isn't breadboard-friendly).
For breadboards I default to the STM32L432 Nucleo-32, which is breadboard-compatible and doesn't take much space.
For perfboard I default to either the STM32L452 Nucleo-64 or the STM32F303 Discovery. I don't solder them to the board; instead I just insert them and then add a couple of pieces of plastic from a 2.54mm-pitch header with the metal pins removed. This holds the board securely in place without any soldering. I use DuPont jumper wires to connect it to the rest of the board, where I typically solder the remaining components (unless I don't want to solder those in either for some reason).
I typically solder in things that are disposable to me and that I don't want flapping around.
This fascinates me. Is there a website anywhere that collects these sorts of use cases? It seems truly mind-blowing that something as simple as an electric toothbrush would have a processor in it.
Nothing mind-blowing. A processor is cheaper than building analog circuitry.
Think about your toothbrush. All the important timing parameters are configured digitally and you can easily change them. You can technically do the same with resonators but it would take much more board space, be less precise, require inductors which you want to avoid in the circuit, etc.
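To make that concrete, here is a sketch of the classic 2-minute brush timer with 30-second quadrant cues in C; millis(), motor_pulse(), and motor_off() are hypothetical stand-ins for real MCU peripherals. Changing the behavior is a one-constant firmware edit, where the analog equivalent would mean re-tuning RC networks on the board:

    #include <stdint.h>

    /* Hypothetical HAL hooks -- placeholders for real MCU peripherals. */
    uint32_t millis(void);       /* milliseconds since power-on   */
    void motor_pulse(void);      /* brief stutter = "move on" cue */
    void motor_off(void);

    #define QUADRANT_MS  30000u  /* cue every 30 s                */
    #define SESSION_MS  120000u  /* dentist-approved 2 minutes    */

    void brush_timer_task(void)
    {
        uint32_t start = millis();
        uint32_t next_cue = QUADRANT_MS;
        for (;;) {
            uint32_t elapsed = millis() - start;
            if (elapsed >= SESSION_MS) { motor_off(); return; }
            if (elapsed >= next_cue) {
                motor_pulse();
                next_cue += QUADRANT_MS;
            }
        }
    }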
And cheaper/more flexible than custom digital circuitry too. Even if something is large-scale enough to justify a custom chip, that'll often be some components around a mask-programmed 8051 clone. Straightforward, well understood, and relatively easy to make variations of by just changing the program.
No, not really. The STM32s I use (I mainly work with Cortex-M4) have internal resonators that are enough for just about anything unless you need precise timing. Certainly good enough for a toothbrush.
How would you explain this part of your argument then:
> You can technically do the same with resonators but it would take much more board space, be less precise, require inductors which you want to avoid in the circuit, etc.
Many microcontrollers (PIC, STM32, ATmega8, etc.) include an internal RC oscillator, which is literally inside the chip itself. Zero external components required.
Not only do you save the costs of using a crystal, you also save two pins - which was useful in the days of 8-pin microcontrollers like the ATtiny85.
As internal RC oscillator drift can be as much as 10% (and varies with temperature), they're not precise enough to run a serial connection, let alone a USB connection. That's why products like the Arduino tend to go straight to a proper crystal (which gives you 0.01% drift for a few pennies).
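The arithmetic behind that: a UART receiver resynchronizes only on the start edge, so clock error accumulates across the whole frame. A quick back-of-envelope check in C:

    #include <stdio.h>

    int main(void)
    {
        double clock_error = 0.10;  /* 10% internal RC drift            */
        double last_sample = 9.5;   /* mid-stop-bit, in bit times after
                                       the start edge (8N1 framing)     */
        double drift_bits  = last_sample * clock_error;
        printf("drift at the stop bit: %.2f bit times\n", drift_bits);
        /* 0.95 bit times -- the receiver is sampling a whole bit off.
           Staying under ~0.5 is the hard limit, which is why the usual
           total UART error budget is around 2%. */
        return 0;
    }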
I know next to nothing about microcontrollers, so this might be a dumb question, but how do they do the initial flashing if they can't run a serial port? Are they hooked up to something external before going on the final PCB? Do they not need a precise clock to read the onboard program from whatever's storing it?
Even if you've got a really inaccurate clock, you can still accept synchronous protocols like SPI and I2C where the bus master provides a clock signal.
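You can see why in a bit-banged SPI master: the master generates every clock edge itself, so neither side cares how inaccurate the underlying oscillator is, only that the lines are stable when sampled. A sketch in C, with the GPIO helpers as hypothetical placeholders for real register writes:

    #include <stdint.h>

    /* Hypothetical GPIO layer -- on a real MCU these are register
       accesses; the names are placeholders for this sketch. */
    void gpio_write(int pin, int level);
    int  gpio_read(int pin);
    void delay_half_period(void);

    enum { PIN_SCK, PIN_MOSI, PIN_MISO };

    /* Mode-0 SPI transfer, MSB first. */
    uint8_t spi_transfer(uint8_t out)
    {
        uint8_t in = 0;
        for (int bit = 7; bit >= 0; bit--) {
            gpio_write(PIN_MOSI, (out >> bit) & 1);
            delay_half_period();
            gpio_write(PIN_SCK, 1);   /* slave samples MOSI on this edge */
            in = (uint8_t)((in << 1) | (gpio_read(PIN_MISO) & 1));
            delay_half_period();
            gpio_write(PIN_SCK, 0);
        }
        return in;
    }

If delay_half_period() runs 10% slow, the whole bus just runs 10% slower; nothing desynchronizes.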
Every chip brand has its own protocol and provides its own programming hardware that can speak it.
Yeah, almost all microcontrollers in that price/power bracket have on-board oscillators, because it saves power, money, and space when you don't need a precise reference.
The Parallax Propeller has an on-chip RC oscillator. For obvious reasons it can only be used in low-clock-speed scenarios, but many simple human-interface applications are fine with that. The Propeller is not a very common MCU, but it is fun to work with and has many technical merits.
Actually, I switched to an electric toothbrush years ago because it cleans better. Now, when I'm deprived of my brush and have to rely on a normal one, I don't feel like I've done a good job cleaning my teeth. Part of this may be subjective, but various tests show that an electric toothbrush cleans better than a regular one in most cases.
No, I don't need any special functionality other than cleaning my teeth, but if you were designing a toothbrush, you would most likely be asked to implement those features.
I doubt it - pretty much anything that has electronic circuitry has got a microprocessor. It's an off-the-shelf component, well understood, and much easier to change, modify, and test than custom-built analog circuitry. For example, what's the option for saving end-user settings in an analog device - knobs and potentiometers? Compared to non-volatile memory like NAND, the differences in cost (and space and weight) are orders of magnitude.
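To illustrate what "saving user settings" looks like on the digital side, here's the pattern a lot of firmware uses - a plain struct checksummed into EEPROM or flash. A sketch in C; eeprom_read/eeprom_write are hypothetical stand-ins for whatever byte-level NVM API the part provides:

    #include <stdbool.h>
    #include <stddef.h>
    #include <stdint.h>

    /* Hypothetical byte-level NVM access. */
    void eeprom_read(uint16_t addr, void *buf, uint16_t len);
    void eeprom_write(uint16_t addr, const void *buf, uint16_t len);

    typedef struct {
        uint8_t  brightness;  /* 0..255                           */
        uint8_t  mode;        /* index of a user-selected program */
        uint16_t checksum;    /* simple integrity check           */
    } Settings;

    static uint16_t settings_checksum(const Settings *s)
    {
        const uint8_t *p = (const uint8_t *)s;
        uint16_t sum = 0;
        for (size_t i = 0; i < offsetof(Settings, checksum); i++)
            sum += p[i];
        return sum ^ 0xA5A5;  /* so erased (all-0x00/0xFF) NVM won't validate */
    }

    void settings_save(Settings *s)
    {
        s->checksum = settings_checksum(s);
        eeprom_write(0, s, sizeof *s);
    }

    bool settings_load(Settings *s)
    {
        eeprom_read(0, s, sizeof *s);
        return s->checksum == settings_checksum(s);  /* false -> use defaults */
    }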
Another thing to consider: more and more single-use items also have processors and electronics in them. At what point will disposable electronics have more processing power than, say, the Apollo flight computer?
A couple of examples that I've noticed:
There's a sporting goods store near here where they attach some form of RFID tag to each item (including individual protein bars) to automate checkout scanning. That means these are RFID tags intended to be scanned exactly once.
There's also the case of "digital" pregnancy tests, which consist of a regular paper pregnancy test, a processor to read out the result, and an e-ink display to show it. All of this is included in the single-use, disposable predictor stick.
Those RFIDs make it possible not just to scan items when they're bought, but also to take inventory of the whole store very easily, by just walking through it with a scanner.
Just as useful, although currently still more likely to be done with barcode scanners, is identifying products along the logistics chain. Right now you need scanning ports with at least 4 cameras/scanners, or humans manually scanning each barcode; with RFIDs the port could be simplified further.
Still, I agree that it's amazing that our society can make functional structures with feature sizes in the nm/µm range so cheaply that we can afford to throw them away.
Virtually every power tool has a microprocessor inside. LED dimmers have microprocessors. Christmas LEDs (the ones that can blink) have microprocessors in the control block. Compared to an Apple II they might have more compute power, but usually not its 48KB of memory.
It's been said before about ROM chips for sure (though I don't know if history necessarily agrees regarding other kinds of chips) that there's a point at which the opportunity cost of "just enough" silicon is too high, so you end up with a choice between way more than you need for simple tasks, or doing without (e.g., embedding that responsibility in some other chip).
I think a lot of projects will target ARM CPUs in the next era of computing, and I hope one day we'll see entire processes moving off the CPU and onto peripherals.
Give me a RAID controller that can run Postgres directly on it, or an SSD that can run SQLite. Give me a network card that runs eBPF, or even nginx.
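The eBPF half of that wish already partly exists: some NICs can offload XDP programs. A minimal sketch of such a program using standard libbpf conventions (whether your particular NIC will offload it is another matter):

    #include <linux/bpf.h>
    #include <bpf/bpf_helpers.h>

    /* Per-CPU packet counter, readable from userspace via libbpf/bpftool. */
    struct {
        __uint(type, BPF_MAP_TYPE_PERCPU_ARRAY);
        __uint(max_entries, 1);
        __type(key, __u32);
        __type(value, __u64);
    } pkt_count SEC(".maps");

    SEC("xdp")
    int count_and_pass(struct xdp_md *ctx)
    {
        __u32 key = 0;
        __u64 *cnt = bpf_map_lookup_elem(&pkt_count, &key);
        if (cnt)
            (*cnt)++;        /* per-CPU map, so no atomics needed       */
        return XDP_PASS;     /* hand the packet on to the kernel stack  */
    }

    char LICENSE[] SEC("license") = "GPL";

Build with clang -O2 -target bpf and attach with ip link set dev eth0 xdpoffload obj prog.o sec xdp on hardware that supports offload (or xdpdrv/xdpgeneric otherwise).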
You're basically surrounded by 8051 processors at all times. Tiny controllers, many probably OTP or mask-ROM, doing all sorts of basic management things.
USB-C cables that support high current (> 3A) have a chip inside that communicates with the power source to let it know it can carry high current. Only then does the source advertise the higher power profiles to the consumer.
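For the curious: the source discovers this by sending a Discover Identity request to the cable's e-marker chip and decoding the cable VDO it returns. A rough sketch in C of decoding the current-handling field, with the bit positions based on my reading of the USB PD spec's Passive Cable VDO layout (treat them as an assumption to verify, not gospel):

    #include <stdint.h>

    /* VBUS current-handling capability of a Passive Cable VDO.
       Bits 6:5 per my reading of the USB PD 3.0 spec -- verify
       against the spec before relying on this. */
    typedef enum { CABLE_CURRENT_UNKNOWN, CABLE_3A, CABLE_5A } CableCurrent;

    CableCurrent cable_current_from_vdo(uint32_t vdo)
    {
        switch ((vdo >> 5) & 0x3) {
        case 0x1: return CABLE_3A;
        case 0x2: return CABLE_5A;
        default:  return CABLE_CURRENT_UNKNOWN;
        }
    }

If no e-marker answers, the source assumes a 3A cable and never offers the 5A profiles, which is why a cheap cable can silently cap your charging speed.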
The real problem I see is when IoT crap starts getting 4G/5G modems with their own network plans so it can spy on you even when it's not connected to WiFi.
I was looking at a sensor package (light, temp, humidity, particulates) earlier and it comes with a cellular modem and a free SIM with a modest data allowance for 2 years to send readings to the cloud.
I wish Z-Wave/Zigbee were more practical to use. I'm not too worried about someone local pulling readings out of the air, but giving all of this crap access to my WiFi, relying on cloud services that could go down at any time, and even building in a straight bridge out (your SIM comment) is just ridiculous. For things like cameras or other high-bandwidth applications WiFi makes sense, but for the usual sensor/switch stuff, there's no need for a full IP network.
Plus, with all of the stuff controlled locally, the latency is so much lower. It's awesome being able to hit a switch in the Home Assistant app and have the corresponding plug or light turn on instantly. It's like flipping a physical switch.
Can you recommend specific parts? I'm hoping to build out a smart home lighting and sensor solution soon using local protocols. Part of it will be an IoT wifi network & VLAN without internet access but I'd like to experiment with Zigbee/Zwave as well.
Not OP, but I’ve been happy with everything made by Aeotec. My whole house is outfitted with their products on a Raspberry Pi + Home Assistant setup. The entire process was really simple and easy, no need to “experiment”. You’ll be done in half an afternoon.
The FCC probably has that data, though probably not in a form you can filter by "IoT", and it's conceivable that for cheap devices that didn't undergo their own regulatory testing, only an internal radio module is registered.
I don't think you even have to register anything if you use a pre-approved module, so none of the IoT stuff itself would show up - only the module that implements the 4G radio would.
Every single SD or MicroSD card has an ARM CPU onboard to track flash wear and handle logical-to-physical block mapping. Everything has a CPU these days.
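The job that little core does is a flash translation layer. A toy sketch of the idea in C (a real FTL adds garbage collection, bad-block handling, and power-loss safety):

    #include <stdint.h>

    #define NUM_BLOCKS 1024

    /* Toy flash translation layer: logical block -> physical block,
       plus erase counters used to steer writes toward less-worn blocks. */
    static uint16_t logical_to_physical[NUM_BLOCKS];
    static uint32_t erase_count[NUM_BLOCKS];

    /* Hypothetical helper tracking which physical blocks are unallocated. */
    extern int is_free(uint16_t phys);

    /* Wear leveling: pick the least-worn free block for the next write. */
    uint16_t pick_write_target(void)
    {
        uint16_t best = 0;
        uint32_t best_wear = UINT32_MAX;
        for (uint16_t p = 0; p < NUM_BLOCKS; p++)
            if (is_free(p) && erase_count[p] < best_wear) {
                best = p;
                best_wear = erase_count[p];
            }
        return best;
    }

    /* A rewrite goes to a fresh block and the logical address is
       remapped -- this indirection is what spreads wear across the
       whole flash instead of hammering one spot. */
    void remap_logical_block(uint16_t logical, uint16_t new_phys)
    {
        logical_to_physical[logical] = new_phys;
    }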