Technical Preview of Mindgrove Tech "Secure IoT" SoC
A 700 MHz, 64-bit microcontroller, but what else?
I had covered various aspects of Mindgrove Technologies’ “Secure IoT” SoC a couple of months ago, at the time of the product launch (read the article here). That coverage was based on information pieced together from various sources. Fortunately, some time ago Mindgrove released a “Prototype Datasheet” (link). This allows me to take a closer look at the product. That said, it is far from being a full-blown datasheet - at just nineteen pages, it is clearly a work in progress. Thus, some guesswork is still required. The rest of this article abbreviates the official name “MG Secure IoT” to just SIoT, for readability.
SIoT is arguably1 the world’s first 64-bit microcontroller (MCU), clocked at an impressive 700 MHz! While MCUs from established vendors such as NXP and ST run at similar or higher clocks, they don’t have a 64-bit offering. That said, the performance of this MCU isn’t documented. Benchmarks being murky territory, it doesn’t pay to guess. Thus, we need to wait for both first-party and independent benchmarks.
Unfortunately, for the majority of potential volume customers of this chip, that’s where the story of SIoT would end. A careful comparison with competitive products from mainstream vendors shows that SIoT lags behind in various ways. MCUs typically integrate an assortment of controllers that make each variant suitable for specific applications - capacitive touch sensing, keyboard scan, communication interfaces (USB, Ethernet), specialized interfaces (e.g. CAN bus), and multimedia (I2S/PCM audio, LCD/camera, graphics accelerator). Oddly, SIoT has a very fast processor, but not the interfaces required to collect data from the real world and implement rich processing. The claimed 30% cost advantage2 on just the MCU cannot cover these gaps.
The CPU in the MCU is based on the open source SHAKTI RISC-V “C class” CPU core. It is not clear if the exact CPU implementation is available publicly, but it would certainly be nice to have this documented. A chip is more than just the CPU core, so this doesn’t make the entire chip “open source”. That said, it would be useful for customers looking to deploy this chip in strategic applications, especially when the integrity of the supply chain becomes critical. The wide operating temperature range of -40 to 125 °C would be useful in many such applications. However, the MCU is not qualified to rigorous standards such as AEC-Q100, required for operating in harsh environments such as automotive.
Overall, SIoT looks more like a lighthouse technology demonstrator (“look, we made a fast processor”) than a ready to use mass market chip with the expected feature set. In the words of Prof. Kamakoti (Director of IIT Madras; see this video, around 17:55), “this processor I hope the big Indian industries like Tatas, Reliance, Mahindra and Mahindra, the L&Ts who are all working on automobiles, working on smart cities and communication related projects will seriously consider evaluating this chip and make it part of their product line”. We’ll just have to wait and watch to see how that pans out. As is, IMHO, it is more likely to fit into niche strategic applications, which are much lower volume (but not cost sensitive), than mainstream industrial and consumer grade applications (cost sensitive, but high volume).
To understand why, let us look at how electronic product designers will evaluate this chip. From an Indian semiconductor perspective, my take is that not-so-distant history is repeating itself, a deja vu of sorts. While this chip is better than CDAC’s THEJAS32 (my article about THEJAS32), it makes many of the same mistakes - which is surprising.
SIoT supports secure boot and cryptographic acceleration, and features 4 kB of on-chip OTP memory. That said, it can’t directly execute encrypted code from flash - a common security requirement. JTAG is supported, compliant with the RISC-V Debug Spec v0.13, but there is a potential catch: the datasheet does not specify any JTAG security features. Without the ability to lock out JTAG, code security will be in question, making it hard to protect intellectual property in a provable manner.
The “IoT” in the product name is somewhat misleading, as this chip features no network connectivity at all - neither Ethernet nor any form of wireless communications (WiFi, Bluetooth, etc). Most IoT products require low power sleep modes for battery conservation. SIoT runs off a single 20 MHz crystal clock source, boosted to 700 MHz using an oscillator+PLL3. Typical microcontrollers include some form of low accuracy internal RC oscillator, as well as support for a 32.768 kHz crystal to drive an RTC (Real Time Clock). Both options are absent in Secure IoT. Surprisingly, the datasheet offers no information on sleep modes! Makes me wonder - does SIoT even have a sleep mode? Not having a sleep mode is a major IoT #fail, and not something any vendor would omit from a datasheet. Given that there’s a single clock source, I can only guess that SIoT runs at a fixed 700 MHz all the time. Clock scaling is commonly used on modern CPUs to trade power against performance at runtime, and its absence does feel weird.
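To put the missing sleep modes in perspective, here is a back-of-the-envelope battery life estimate for a duty-cycled sensor node. All the current figures below are hypothetical placeholders (SIoT’s actual consumption isn’t published); the point is the orders-of-magnitude gap between an always-on 700 MHz core and a design that can sleep between measurements.

```c
#include <stdio.h>

/* Hypothetical figures - SIoT's actual consumption is not published. */
#define BATTERY_MAH   1000.0   /* small battery capacity, mAh            */
#define ACTIVE_MA      150.0   /* guessed active current of a fast core  */
#define SLEEP_UA        10.0   /* typical MCU deep-sleep current, uA     */
#define DUTY_CYCLE       0.01  /* active 1% of the time, asleep otherwise*/

int main(void)
{
    /* Always-on: the core never sleeps. */
    double always_on_hours = BATTERY_MAH / ACTIVE_MA;

    /* Duty-cycled: active 1% of the time, deep sleep the rest. */
    double avg_ma = ACTIVE_MA * DUTY_CYCLE +
                    (SLEEP_UA / 1000.0) * (1.0 - DUTY_CYCLE);
    double duty_cycled_hours = BATTERY_MAH / avg_ma;

    printf("Always on  : %.1f hours (~%.1f days)\n",
           always_on_hours, always_on_hours / 24.0);
    printf("Duty-cycled: %.1f hours (~%.1f days)\n",
           duty_cycled_hours, duty_cycled_hours / 24.0);
    return 0;
}
```

Even with generous assumptions, an always-on core drains a small battery in hours, while a design that spends most of its time in deep sleep lasts for weeks - which is exactly why sleep modes and an RTC to wake from are table stakes for IoT silicon.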
SIoT supports various standard interfacing options such as:
2 x QSPI, maximum clock 70 MHz, with a FIFO on both the transmit and receive paths
4 x SPI, maximum clock 35 MHz. SPI master and slave modes are supported.
2 x I2C, master only, maximum clock 1 MHz, with 7-bit addressing only.
3 x UART, with no support for flow control.
A single SAR ADC with 8 channels behind an 8:1 mux, and a 5 MSPS conversion rate. Thus you’ll get 5 MSPS on a single channel, and 625 kSPS per channel when all eight channels are enabled. Resolution is selectable between 12, 10, 8 and 6 bits. Note that the ADC reference voltage of 0.9V is half the IO voltage of 1.8V - effectively halving the usable ADC range (see the arithmetic sketch after this list).
4 general purpose timers and watchdog support
32 x GPIO, 8 x PWM. The datasheet mentions that GPIOs are “designed to operate at 1 MHz”. That does seem low, considering the 700 MHz clock speed.
Note that the PWM pins are multiplexed on the GPIO, but everything else is mapped to dedicated pins.
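A quick arithmetic sketch of what the ADC numbers above imply - the per-channel sample rate when the 8:1 mux is cycling through all inputs, and the LSB step size once the 0.9V reference (half the 1.8V IO rail) is taken into account. The figures come straight from the list above; the code just does the division.

```c
#include <stdio.h>

#define ADC_TOTAL_SPS  5000000.0   /* 5 MSPS aggregate conversion rate */
#define ADC_CHANNELS   8           /* channels behind an 8:1 mux       */
#define ADC_VREF       0.9         /* ADC reference voltage, volts     */
#define IO_VOLTAGE     1.8         /* IO rail, volts                   */

int main(void)
{
    /* Round-robin across all 8 channels: 5 MSPS / 8 = 625 kSPS each. */
    double per_channel_sps = ADC_TOTAL_SPS / ADC_CHANNELS;
    printf("Per-channel rate with all channels enabled: %.0f SPS\n",
           per_channel_sps);

    /* The 0.9 V reference covers only half of the 1.8 V IO rail,
     * so a rail-to-rail signal must be divided down before the ADC. */
    printf("Usable input range: %.1f V of the %.1f V IO rail\n",
           ADC_VREF, IO_VOLTAGE);

    /* LSB step size at each selectable resolution, referenced to 0.9 V. */
    int resolutions[] = { 12, 10, 8, 6 };
    for (int i = 0; i < 4; i++) {
        int bits = resolutions[i];
        printf("%2d-bit resolution: 1 LSB = %.3f mV\n",
               bits, ADC_VREF / (1 << bits) * 1000.0);
    }
    return 0;
}
```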
The datasheet doesn’t mention any FIFO for the UART, I2C, SPI or ADC. SIoT has the dubious distinction of featuring an ultra-fast CPU without a DMA controller - a major design and architectural flaw. What were the designers thinking? The lack of a DMA controller will make error-free programming tricky. It will also make it hard to extract the maximum juice out of the 700 MHz processor, especially in use cases where all the peripheral buses are loaded. With something like the SPI slave functionality, the resulting rain of interrupts could cause issues at high data rates, as the back-of-the-envelope numbers below suggest.
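To make that concrete, here is a rough estimate of the interrupt load a FIFO-less, DMA-less SPI slave would generate at the datasheet’s 35 MHz SPI clock, and how much of the 700 MHz budget servicing it would eat. The per-interrupt cycle cost is an assumed placeholder, not a measured figure.

```c
#include <stdio.h>

#define SPI_CLOCK_HZ   35000000.0   /* 35 MHz max SPI clock (from datasheet) */
#define BITS_PER_BYTE  8
#define CPU_CLOCK_HZ  700000000.0   /* 700 MHz core clock                    */
#define CYCLES_PER_IRQ 100.0        /* assumed entry + handler + exit cost   */

int main(void)
{
    /* With no FIFO and no DMA, one interrupt per received byte. */
    double bytes_per_sec = SPI_CLOCK_HZ / BITS_PER_BYTE;   /* ~4.4 MB/s   */
    double irq_per_sec   = bytes_per_sec;                  /* ~4.4M IRQ/s */

    /* Fraction of the CPU spent just servicing those interrupts. */
    double cpu_fraction = irq_per_sec * CYCLES_PER_IRQ / CPU_CLOCK_HZ;

    printf("SPI slave at wire speed: %.2f Mbyte/s -> %.2f M interrupts/s\n",
           bytes_per_sec / 1e6, irq_per_sec / 1e6);
    printf("CPU time consumed by interrupt handling alone: %.0f%%\n",
           cpu_fraction * 100.0);
    return 0;
}
```

Halve or double the assumed overhead and the conclusion barely changes: a byte-at-a-time interrupt model burns a large slice of the CPU just moving data, which is precisely the job a DMA controller (or at least deep FIFOs) exists to offload.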
SIoT also lacks a USB controller. Those needing USB support could potentially use an external USB-to-UART bridge chip, but this approach does not allow the high speed applications you’d expect with a 700 MHz CPU. Not to mention, external chips add cost and make power management harder.
SIoT lacks audio interfaces such as I2S/SPDIF, another inexplicable design choice. The general lack of multimedia-specific features (how do we get a simple camera stream in?) limits the applications of SIoT. A powerful processor needs ways of getting high speed data in and out in a time-sensitive, efficient manner - that needs dedicated interfaces and DMA. Makes me wonder if someone somewhere forgot the current buzzwords of edge computing and AI…
The IO pins of SIoT operate at 1.8V. Any interfacing to 3.3V logic devices will need a level translator. Unfortunately, 3.3V is a very common logic level in MCU circuits. Level translation not only adds cost and board space, but also makes it harder to reduce power consumption to an absolute minimum.
Cost sensitive applications use IC packages that allow 2 or 4 layer PCB implementations. The 0.5 mm pitch, 144-pin BGA package used in Secure IoT will likely need tight manufacturing tolerances to achieve even a 4 layer PCB implementation. Overall, the cost of the chip alone isn’t the important thing - it’s the entire Bill of Materials (BOM) and PCBA manufacturing cost that gets customers excited (or to exit!).
The designers of SIoT clearly have a lot of work to do to turn this into a chip that the industry will actively consider and want to use. SIoT’s target applications listed on the Mindgrove website are: Smart Lock, Smart Watch, Smart Meter, Thermal Printer, Smart Fan, and Biometric Module. The product specifications of SIoT aren’t well matched to these. Notably, all of these applications require some form of connectivity - wired (USB) or wireless (WiFi/BT, etc) - and most of them don’t need a 700 MHz MCU. That level of performance is neither required, nor will customers pay for it. Similarly, the wide operating temperature range is useful for very specific applications, and not something consumer applications need.
What do customers actually use and want in an MCU? It certainly helps to learn from teardowns of various devices that are actually in production to understand the level of performance and integration that is adequate for specific application categories. Some examples:
Atomberg Renesa smart fan (see teardown) uses an 8 bit, 16 MHz microcontroller with integrated flash.
boAt Ultima Chronos Smartwatch (see teardown) uses a 100 MHz, 32-bit MCU with a lot more SRAM, and integrates flash, Bluetooth, useful audio interfaces and codecs, and a parallel display interface for LCDs. And, no surprise - it includes an in-built RTC as well.
AmiciSmart WiFi Single phase Energy meter (see teardown) again uses MCUs with modest specs and the right levels of integration, specific to this application.
Bottom line: customers adopt ICs that directly meet requirements, resulting in optimized development and product costs. Integration is the raison d'être of SoCs. For real world applications, the CPU isn’t the hero; the integration of the right feature set at the right cost is.
In summary, Secure IoT is a chip that flatters to deceive. The value proposition of a potentially fast processor is eroded by weak application focus and low integration. That makes it a technology demonstrator rather than a serious chip that can be directly adopted for its target applications. That is unfortunate, as the eye-catching CPU spec immediately brings up thoughts of “what if…”!
Hopefully the next iteration of the chip will focus on specific application areas and create a compelling offering. Mindgrove Technologies certainly has a head start and a powerful CPU - an advantage they can convert with appropriate course corrections!
1. Correct me if I am wrong here!
2. The cost of this SoC isn’t publicly available; estimates place it around $3 a piece.
3. The PLL isn’t documented in the datasheet.