posted by Bryon Moyer
Lens-free technology has poked its head up in a few places, but one of the more visible examples is an application that Imec appears particularly fond of: a cell sorter.
The whole idea behind the contraption is to isolate abnormal blood cells from a sample. So they built a microfluidic device that delivers a flow of blood cells. Each cell passes over an aperture where a lens-free camera analyzes the interference patterns that the cell creates, producing a signature that differentiates normal cells from abnormal ones.
That signature is processed quickly enough that, by the time the cell has traveled downstream to a microfluidic crossroads, normal cells can be steered down one channel and abnormal cells down another.
How do you “steer” a cell? Well, the default flow goes one way, and when a faulty cell is detected, at the moment it reaches the junction, a small heater creates an instantaneous bubble that pushes the cell into the other channel. (You could also actively steer the normal cells with a counter-bubble.)
In case that seems like a lot of work, well, it is. They say that they process 20 million images a second.
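The detect-classify-steer loop described above can be sketched in a few lines of Python. This is purely illustrative: the feature extraction, threshold, and function names are all hypothetical stand-ins for whatever Imec's real-time processing actually does.

```python
import statistics

def interference_feature(pattern):
    """Reduce a diffraction pattern (a list of intensities) to one scalar.
    Here we use the spread of the pattern; the real discriminator is unknown."""
    return statistics.pstdev(pattern)

def classify(pattern, threshold=5.0):
    """Flag a cell as abnormal if its pattern varies more than a normal cell's."""
    return "abnormal" if interference_feature(pattern) > threshold else "normal"

def sort_cell(pattern, fire_bubble):
    """Steer the cell: default channel unless a heater bubble pushes it aside."""
    if classify(pattern) == "abnormal":
        fire_bubble()           # heater creates a bubble at the junction
        return "channel_B"      # abnormal cells diverted here
    return "channel_A"          # default flow path for normal cells
```

At 20 million images per second, of course, the real system does this in dedicated hardware rather than anything resembling the loop above.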
As I mentioned, they appear particularly proud of this: it’s been presented at numerous venues, and they’ve invested in marvelous animation to illustrate what’s going on. So if you find yourself at an Imec function, you may get to see it. But, to be sure, it’s more than animation. When visiting their facility, this was one of the places they took us, where they stood by like proud papas as we took a look at the real deal.
One of the challenges with building something like this is finding an adhesive that is compatible with use in a microfluidic channel, especially when there may be heaters and such in the device. Such an adhesive would be used to secure a glass cap.
Imec and JSR announced such a material last month. The adhesive can be patterned using normal photolithography, allowing this step to be performed on entire wafers. The picture below shows a cell sorter wafer with glass covers glued to the intact microfluidic dice, which contain those micro-heaters for steering the cells. With glass covers in place, the wafer can be diced up into individual cell sorters.
You can read more about this material in their announcement.
posted by Bryon Moyer
At the recent MEMS Executive Congress, Bosch Automotive announced a new 6-axis automotive IMU. It’s not for use as part of the automotive control systems, but rather for the “infotainment” infrastructure – the so-called center stack – and other non-safety-critical applications.
It may not be obvious that Bosch has two different sensor groups. There’s an automotive group which focuses strictly on – you guessed it – cars; then there’s Bosch Sensortec, which handles other consumer and industrial sensors. (There’s also Bosch Akustica for microphones.) So… does this mean that Bosch Automotive is off in its private silo inventing its own sensors independently of Bosch Sensortec?
No; according to them, the automotive market actually isn’t large enough compared to phones and other consumer goods to justify that. The IMU that Bosch Automotive has announced actually came from Bosch Sensortec. Does this mean that it’s just a rebranding of an old part?
No; the automotive units have more stringent operating requirements than a consumer unit has. Just going from 85 °C to 125 °C isn’t trivial. There are also corrosion concerns and the fact that the IMUs must interface with automotive diagnostic systems.
Corrosion can presumably be handled with packaging. Calibration and linearization over a higher temperature range involve changes to the ASIC. The diagnostic interface also resides in the ASIC. So, in reality, we have a Bosch Sensortec IMU with a new ASIC and package.
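Calibration over temperature typically means characterizing the sensor’s drift at several temperatures and having the ASIC apply a correction at run time. A minimal sketch of the idea, with an entirely made-up calibration table (a real part would use factory-trimmed coefficients, and likely a polynomial rather than interpolation):

```python
from bisect import bisect_left

# Hypothetical zero-g offset calibration points: (temperature °C, offset in mg).
CAL_TABLE = [(-40, 12.0), (25, 0.0), (85, -8.0), (125, -20.0)]

def offset_at(temp_c):
    """Linearly interpolate the expected offset at a given temperature."""
    temps = [t for t, _ in CAL_TABLE]
    if temp_c <= temps[0]:
        return CAL_TABLE[0][1]
    if temp_c >= temps[-1]:
        return CAL_TABLE[-1][1]
    i = bisect_left(temps, temp_c)
    (t0, o0), (t1, o1) = CAL_TABLE[i - 1], CAL_TABLE[i]
    return o0 + (o1 - o0) * (temp_c - t0) / (t1 - t0)

def compensate(raw_mg, temp_c):
    """Remove the temperature-dependent offset from a raw accelerometer reading."""
    return raw_mg - offset_at(temp_c)
```

Extending the table past 85 °C to 125 °C is exactly the part that requires new characterization data and, hence, ASIC changes.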
You can find out more in their release.
posted by Bryon Moyer
Imagination Technologies has announced a new image signal processing architecture that they’re calling “Raptor.” The overarching concept is that the image signal processor (ISP) should no longer be a separate chip: it should be integrated into the main system SoC, along with the other related accelerators, CPU, and GPU. Raptor is IP that allows such integration. It’s targeted at next-generation image processing applications like feature identification, scalable for both low- and high-end applications.
The benefits they tout come both from this integration and the fact that they provide all of the pieces required between the raw camera sensor(s) and final RGB or YUV output or an encoded image or stream. Within the ISP itself, they are able to leverage the fact that all of the technology comes from the same place – with similar compression and a unified architecture. They say that this keeps latency low and supports their “Zero-memory” approach to delivering the image to encoders and various effects accelerators.
Of course, having all of this on the SoC reduces the chip-to-chip overhead of an external ISP. The ISP also gets the process advantages of the advanced nodes typically used for an SoC.
The architecture is intended to support multiple sensors, maintaining up to four concurrent contexts. These could be front- and back-side cameras on a phone, for example, or they could be multiple cameras for multi-camera arrays, stereoscopic imaging, or “integral photography,” where multiple images are stitched together to form what can be an almost 3D image with holographic tendencies. They support up to 16-bit pixel depth, scalable to the needs of the application.
Custom processing can also be implemented by tapping the image data at various points in the pipeline and then feeding the processed data back in. Image statistics are gathered as the image is processed; those statistics are available to the encoders, eliminating an encoding pass.
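The single-pass statistics idea can be sketched simply: each pipeline stage transforms the frame while statistics accumulate alongside, so a downstream encoder can read them instead of re-analyzing the image. The stage names and the particular statistics here are hypothetical, not Raptor’s actual pipeline:

```python
def run_pipeline(pixels, stages):
    """Apply each named stage to the frame while accumulating statistics
    in the same pass, so no separate analysis pass is needed."""
    stats = {"min": min(pixels), "max": max(pixels)}
    for name, fn in stages:
        pixels = [fn(p) for p in pixels]
        stats[name + "_mean"] = sum(pixels) / len(pixels)
    return pixels, stats

# Example: a toy two-stage pipeline (black-level subtraction, then gain).
stages = [
    ("black_level", lambda p: max(p - 16, 0)),
    ("gain", lambda p: p * 2),
]
out, stats = run_pipeline([16, 32, 48], stages)
```

An encoder handed `stats` along with `out` already knows the frame’s range and per-stage averages without touching the pixels again, which is the latency win the single-pass approach is after.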
Availability is targeted for the first quarter of 2014. You can find more information in their announcement.