Tokyo: One of Sony Corp.’s most promising bets on the future is hiding in plain sight.
Inside the electronics maker’s Atsugi Technology Centre, a research campus located an hour outside Tokyo, engineers and researchers are developing sensors that can detect people and objects by calculating how long it takes for light to reflect off surfaces.
With plans to go into mass production next year, Sony’s new class of sensors is designed for smartphones and augmented-reality (AR) devices. Eventually, the chips could find their way into drones, self-driving automobiles, gaming consoles, industrial equipment, factory and warehouse robots, and many other machines that interact with their environments. All told, the market for these 3D sensors will expand threefold to $4.5 billion by 2022, according to researcher Yole Developpement, approaching Sony’s current revenue from image sensors.
“This has the scale to become the next pillar of our business,” said Satoshi Yoshihara, the general manager in charge of Sony’s sensors division.
Sony thinks its manufacturing expertise in camera chips, which are found in the latest iPhones, gives it a distinct edge. The Tokyo-based company dominates the image-sensor market, with a 49% share. While roughly a tenth of Sony’s revenue came from semiconductors in the latest quarter, almost a third of its operating profit came from the division.
The new 3D detectors are in a category called time-of-flight (TOF) sensors, which scatter infrared light pulses and measure the time it takes for them to bounce back. The basic technology has been around for a while and forms the basis for the Xbox’s motion-based Kinect, as well as laser-based rangefinders on autonomous vehicles and in military planes. Sony’s big innovation over existing TOF sensors is that its chips are smaller and calculate depth at greater distances. Used with regular image sensors, they effectively give machines the ability to see the way humans do.
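The round-trip principle behind these sensors comes down to simple arithmetic, sketched below. This is purely illustrative: a real TOF chip measures phase shifts or pulse timing across thousands of pixels in hardware, not in software like this.

```python
# Illustrative sketch of the time-of-flight principle: depth is
# inferred from how long an infrared pulse takes to reflect back.
# A real sensor does this per pixel in dedicated silicon; this is
# only the underlying arithmetic.

SPEED_OF_LIGHT = 299_792_458.0  # metres per second

def distance_from_round_trip(round_trip_seconds: float) -> float:
    """Depth to a surface, given a light pulse's round-trip time."""
    # The pulse travels out to the surface and back again,
    # so the one-way distance is half the total path.
    return SPEED_OF_LIGHT * round_trip_seconds / 2.0

# A reflection arriving after about 6.67 nanoseconds implies a
# surface roughly one metre away.
print(round(distance_from_round_trip(6.67e-9), 2))
```

The nanosecond scale of these timings is why shrinking the sensor while extending its range, as Sony claims to have done, is a hard engineering problem.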
“Instead of making images for the eyes of human beings, we’re creating them for the eyes of machines,” Yoshihara said. “Whether it’s AR in smartphones or sensors in self-driving cars, computers will have a way of understanding their environment.”
The most immediate impact from TOF sensors, which will be fabricated at Sony’s factories in Kyushu, will probably be seen in augmented-reality gadgets. Apple Inc. is betting big on mixing real and virtual environments, making it a key feature of the iPhone X, which ships on 3 November. While the brand-new smartphone relies on older time-of-flight technology (along with software and digital cameras), Apple is likely to adopt Sony’s TOF sensors for future devices, according to Yusuke Toyoda, a sensors analyst at Fuji Chimera Research Inc.
“Other smartphone makers will then copy Apple and adopt TOF sensors,” Toyoda said. Representatives for Apple didn’t respond to requests for comment. Apple is struggling to produce enough iPhone X units because of supply problems with its 3D sensors, Bloomberg News reported Thursday.
Sony shares are up 29% this year, about double the 15% gain in the broader market. The stock fell 0.9% in Tokyo trading Wednesday.
Sony faces competition, notably from STMicroelectronics NV, the current top supplier of TOF sensors. The Geneva-based company accounts for almost the entire market with its FlightSense sensors, used by Apple and in more than 80 smartphone models. The chips are deployed mainly to measure distances so cameras can focus accurately.
Alexis Breton, a spokesman for STMicro, declined to comment beyond pointing to recent data showing that the company has shipped more than 300 million TOF chips. STMicro’s revenue from the division that mostly comprises the sensors was $295 million last year.
Much of the technology that has made Sony a success in imaging chips is also used in 3D sensors. Its back-illuminated technology is considered state of the art for converting images into electrons, which smartphone processors can then store and manipulate. That will become even more critical as phones evolve from sensing dozens of depth points to thousands.
“Sony has everything technology-wise to address the market,” said Pierre Cambou, an imaging analyst at Yole. “They shouldn’t have a problem gaining a large share in 3D.”
When Sony decided to gamble on time-of-flight sensors three years ago, it faced a choice: build the technology or buy it. In 2015, Sony acquired Softkinetic Systems, a small developer of TOF sensors. The Brussels-based company of 77 employees had already successfully deployed the technology in BMW’s 7 Series sedans, giving drivers the ability to control many of the car’s functions using hand gestures.
“When our engineers and their engineers talked about collaborating, we realized we could come up with an amazing sensor,” Yoshihara said of the merger. “In terms of both (chip) performance and size, we can achieve another breakthrough.”
Yoshihara joined Sony in 1991, just as it was pushing into digital imaging. He was involved in the development of charge-coupled device (CCD) sensors, central to products like the Handycam and Cyber-shot, and the transition to complementary metal-oxide semiconductor (CMOS) chips, used in virtually every smartphone. Now the industry standard, CMOS sensors are a $12 billion market, according to Yole.
As more AR-enabled hardware and software reach consumers, people will start to expect more devices to see the world as it is, in three dimensions. IKEA recently introduced an AR app that lets shoppers virtually place furniture in living rooms and bedrooms. “It acts like it’s there, it looks like it’s there, and you interact like it’s there,” said Michael Valdsgaard of IKEA, who created the app.
Self-driving cars will probably use both laser- and semiconductor-based TOF sensors, offering redundancy and greater accuracy, while drones will benefit from using smaller TOF chips to sense their surroundings, according to Fuji Chimera’s Toyoda. Artificial intelligence software will also be able to take advantage of the ability to detect people and objects, according to Dave Ranyard, who made AR games at Sony before running his own studio, Dream Reality Interactive.
“The way we interact with computers now and with each other online is through a 2D interface like a webpage,” Ranyard said. “But fundamentally, we’re moving to a 3D world.” Bloomberg