Thursday, April 26, 2018

Review of 3D Cameras for AR Glasses

Daqri smart glasses rely on an array of cameras, including a 3D Intel RealSense camera:

The company's Chief Scientist Daniel Wagner publishes a nice overview of 3D camera technologies, together with AR glasses' requirements for a depth camera:

"First, sensors need to be very small in order to integrate into headsets of comparatively restricted size. For AR headsets, small can be defined as “mobile phone class sensors” in size, e.g., a camera module no more than 5mm thick.

Second, the depth camera should use as little power as possible, ideally something noticeably lower than 500 mW, since the overall heat dissipation capability of the average headset is just a few watts.

Third, in order to further save power, the depth camera should not require intensive processing of the sensor output since that would result in further power consumption.

For environmental scanning, the depth camera needs to see as far as possible; in practice, a range of around 60 cm to 5 meters.

In contrast, user input needs to work at only arm’s length, hence a range of around 20 to 100 cm.

Lastly, there is the matter of calibration. As automatic built-in self-calibration is not yet available, depth cameras rely on their factory calibration remaining valid over their lifetime, which can be a problem.
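The requirements above can be captured in a small screening sketch. The field names, thresholds, and the candidate sensor below are illustrative assumptions drawn from the numbers in the article, not anything Daqri publishes:

```python
# Depth camera requirements for AR headsets, per the article (illustrative sketch).
REQUIREMENTS = {
    "max_thickness_mm": 5.0,      # "mobile phone class" module size
    "max_power_mw": 500.0,        # well under the headset's few-watt thermal budget
    "scan_range_m": (0.6, 5.0),   # environmental scanning
    "input_range_m": (0.2, 1.0),  # arm's-length user input
}

def meets_scanning_requirements(sensor):
    """Check a candidate sensor spec against the environmental-scanning case."""
    near, far = REQUIREMENTS["scan_range_m"]
    return (sensor["thickness_mm"] <= REQUIREMENTS["max_thickness_mm"]
            and sensor["power_mw"] <= REQUIREMENTS["max_power_mw"]
            and sensor["min_range_m"] <= near      # must work from the near limit...
            and sensor["max_range_m"] >= far)      # ...out to the far limit

# Hypothetical candidate sensor
candidate = {"thickness_mm": 4.0, "power_mw": 350.0,
             "min_range_m": 0.5, "max_range_m": 6.0}
print(meets_scanning_requirements(candidate))  # True
```

Note the user-input case would use the shorter range pair in the same way; the processing-load requirement is harder to reduce to a single number.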

Are All Sony Sensors Created Equal?

Basler publishes a nice white paper "Sensor Comparison: Are all IMXs equal?" comparing various Sony sensors in different families:

"The EMVA1288 standard offers the measured value of the “absolute threshold value for sensitivity”. It states the average number of photons required so that the signal to noise ratio is exactly 1."
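For context, under the EMVA1288 linear camera model this SNR = 1 point has a closed-form solution. The sketch below is my own illustration, not from the Basler paper: it solves η·μp = sqrt(σ² + η·μp) for the mean photon count, where σ² is the input-referred dark and quantization noise variance:

```python
import math

def absolute_sensitivity_threshold(eta, sigma_d, sigma_q=0.0, K=1.0):
    """Mean photons per pixel at which SNR == 1 (EMVA1288 linear model).

    eta     -- total quantum efficiency (electrons per photon)
    sigma_d -- dark noise in electrons (rms)
    sigma_q -- quantization noise in DN (rms)
    K       -- overall system gain in DN per electron
    """
    # Input-referred noise variance in electrons^2
    var = sigma_d**2 + (sigma_q / K)**2
    # SNR = 1  =>  x = sqrt(var + x) with x = eta * mu_p,
    # i.e. x^2 - x - var = 0, positive root:
    x = 0.5 * (1.0 + math.sqrt(1.0 + 4.0 * var))
    return x / eta

# Example: QE = 70%, 2 e- rms dark noise -> a few photons
print(absolute_sensitivity_threshold(eta=0.7, sigma_d=2.0))
```

For large dark noise this reduces to the familiar approximation μp,min ≈ (σd + 0.5) / η.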

4 Generations of Camera Module Testing

Pamtek publishes a video showing the four generations of its automated camera module testing machines:

Wednesday, April 25, 2018

High Speed SERDES Technology Enables High Frame Rates, Potentially

EETimes: With the emergence of 112 Gbps per lane SERDES technology and the wide adoption of 56 Gbps per lane links, 12.8 Tbps single-chip switches from different companies have reached the market. This enables a data infrastructure for high frame rate and high resolution imaging systems. For instance, an 8K video stream with 16b per pixel can be transferred at more than 24,000 fps through this data pipe. Now, once the data transfer technology is ready and the wafer stacking technology is mature, we could design image sensors supporting this speed and find an application for them. Or, maybe, find the application first.
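The quoted frame rate follows from simple arithmetic, assuming an 8K UHD frame of 7680 × 4320 pixels:

```python
# 8K UHD frame at 16 bits per pixel
bits_per_frame = 7680 * 4320 * 16       # ~5.3e8 bits per frame
switch_bandwidth = 12.8e12              # 12.8 Tbps single-chip switch

fps = switch_bandwidth / bits_per_frame
print(f"{fps:.0f} fps")                 # just over 24,000 fps
```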

Technavio Forecasts Automotive CIS Cost Reductions

BusinessWire: Technavio's global automotive image sensors market report discusses a number of recent trends:

"The global automotive image sensors market has witnessed a reduction in the cost of image sensors. The adoption of image sensors in consumer electronics and smartphones has allowed image sensor manufacturers to experience economies of scale, which further resulted in price reduction. The automotive industry benefited not only from the reduction in cost but also from improved performance and picture quality.

One of the key trends impacting the growth of the market is the development of high-sensitivity CMOS image sensors with LED flicker mitigation.

ST Reports Weak Sales of Imaging Products for Smartphones

SeekingAlpha publishes the ST Q1 2018 earnings call. The company sounds unhappy about its imaging business in smartphones:

"On a sequential basis, AMS [Analog, MEMS and Sensors Group] revenues decreased by 27.4%, principally reflecting the negative impact of smartphone applications to our Imaging business...

As we already anticipated, and now this is well-known by the industry, the second quarter is another quarter of weak sales in smartphones, particularly for our Imaging business.

Tuesday, April 24, 2018

Sony Stacked Vision Chip Paper

MDPI Special Issue on the 2017 International Image Sensor Workshop keeps publishing papers presented at the workshop. Sony paper "Design and Performance of a 1 ms High-Speed Vision Chip with 3D-Stacked 140 GOPS Column-Parallel PEs" by Atsushi Nose, Tomohiro Yamazaki, Hironobu Katayama, Shuji Uehara, Masatsugu Kobayashi, Sayaka Shida, Masaki Odahara, Kenichi Takamiya, Shizunori Matsumoto, Leo Miyashita, Yoshihiro Watanabe, Takashi Izawa, Yoshinori Muramatsu, Yoshikazu Nitta, and Masatoshi Ishikawa presents:

"We have developed a high-speed vision chip using 3D stacking technology to address the increasing demand for high-speed vision chips in diverse applications. The chip comprises a 1/3.2-inch, 1.27 Mpixel, 500 fps (0.31 Mpixel, 1000 fps, 2 × 2 binning) vision chip with 3D-stacked column-parallel Analog-to-Digital Converters (ADCs) and 140 Giga Operation per Second (GOPS) programmable Single Instruction Multiple Data (SIMD) column-parallel PEs for new sensing applications. The 3D-stacked structure and column parallel processing architecture achieve high sensitivity, high resolution, and high-accuracy object positioning."

Nondestructive Photon Detection

APS Physics publishes the Washington University article "Viewpoint: Single Microwave Photons Spotted on the Rebound" by Kater W. Murch.

"Single optical photon detectors typically absorb an incoming photon and use that energy to generate an electrical signal, or “click,” that indicates the arrival of a single quantum of light. Such a high-precision measurement—at the quantum limit of detection—is a remarkable achievement, but the price of that click is in some cases too high, as the measurement completely destroys the photon. If the photon could be saved, then it could be measured by other detectors or entangled with other photons. Fortunately, there is a way to detect single photons without destroying them.

This quantum nondemolition photon detection was recently demonstrated in the optical domain, and now the feat has been repeated for microwaves. Two research groups—one based at the Swiss Federal Institute of Technology (ETH) in Zurich and the other at the University of Tokyo in Japan—have utilized a cavity-qubit combination to detect a single microwave photon through its reflection off the cavity.

The non-destructive optical photon detection paper was published in 2013 and described in Photonics magazine:

"Andreas Reiserer and colleagues at the Max Planck Institute of Quantum Optics have developed a device that leaves the photon untouched upon detection.

In their experiment, Reiserer, Dr. Stephan Ritter and professor Gerhard Rempe developed a cavity consisting of two highly reflecting mirrors closely facing each other. When a photon is put inside the cavity, it travels back and forth thousands of times before it is transmitted or lost, leading to strong interaction between the light particle and a rubidium atom trapped in the cavity. By reflecting the photon away from the device, the team was able to detect the photon by changing its phase rather than its energy.

The phase shift of the atomic state is detected using a well-known technique.

I'm not sure what the practical use of this for image sensing is. In theory, it opens a way to an invisible image sensor that detects and releases all incoming photons without absorbing them.

Prophesee Event-Driven Reference Design

EETimes: Prophesee (formerly Chronocam) comes up with an event-driven sensor reference design for potential customers. The Onboard reference system contains a VGA event-driven camera integrated with Prophesee’s ASIC, a Qualcomm quad-core Snapdragon processor running at 1.5 GHz, a 6-axis Inertial Measurement Unit, and interfaces including USB 3.0, Ethernet, micro-HDMI, WiFi (802.11ac), and MIPI CSI-2:

Monday, April 23, 2018

Image Sensor Market is Greater than Lamps

IC Insights' Optoelectronic, Sensor, and Discrete (O-S-D) report gives a nice comparison of the image sensor business with others. It turns out that the world spends more on image sensing than on scene illumination: