Apple said to step up plans for Amazon Echo-style smart-home device
San Francisco: Apple Inc. is pressing ahead with the development of an Echo-like smart-home device based on the Siri voice assistant, according to people familiar with the matter.
Started more than two years ago, the project has exited the research and development lab and is now in prototype testing, said the people, who asked not to be identified discussing unannounced Apple projects. Like Amazon.com Inc.’s Echo, the device is designed to control appliances, locks, lights and curtains via voice activation, the people said. Apple hasn’t finalized plans for the device and could still scrap the project.
If a product reaches the market, it would be Apple’s most significant piece of new hardware since the company announced the Apple Watch in 2014. Echo has been a surprise hit, even to Apple engineers working on their competing project, and is already being baked into smart-home systems made by a range of companies. Meanwhile, Alphabet Inc. is working on its own device, Google Home. Besides taking on the competition, Apple is looking for a new hot seller to augment the iPhone.
The company is attempting to differentiate itself from Echo and Google Home with more advanced microphone and speaker technology, two people said. Some of the prototypes in testing include facial recognition sensors, another person said. Apple has acquired the facial recognition startups Faceshift and Emotient over the past two years, which may help the device act based on who is in a room or a person’s emotional state.
Besides serving as a controller for other smart-home devices, the speaker would theoretically be able to process many of the Siri commands available on the iPhone. For example, users may be able to ask the device to read e-mails, send text messages and tweets, and stream content from Apple Music. Apple has also considered integrating mapping information into the speaker, another person said, potentially allowing the device to notify a user when it’s time to leave the house for an appointment.
An Apple spokeswoman declined to comment.
Before setting its sights on a standalone speaker, Apple attempted to integrate the functionality of an Amazon Echo-like device into the Apple TV, three people said. This would have allowed users to shout commands from the couch to the TV box. Those efforts were abandoned in favour of putting the voice-command features into a remote control when the latest set-top box shipped in October 2015.
Apple began showing its interest in the smart-home field with the launch of HomeKit in 2014, which allows third-party smart-home accessory makers to integrate with Siri. That same year, Apple began testing early versions of Siri-driven speakers with proprietary surround sound technology. The company worked on two versions, a larger and a smaller model similar to Amazon’s current line-up, and even set up a small home theater to test prototypes. But those early efforts may not translate into a final product.
The work on the speaker is a collaboration between the Cupertino, California-based company’s hardware division and its Siri team, which last year was re-organized by the division’s vice-president, Bill Stasior. The division now consists of four main development groups: web search, proactive assistance, speech recognition, and the Siri application itself, two people said.
The web search team focuses on Apple’s efforts to circumvent Google search by pushing query suggestions via Apple’s own servers, while the speech recognition group powers how Siri understands its users. The proactive assistance team develops functionality such as the iPhone feature that alerts users to appointments, while the application team engineers the voice assistant.
Now that the device has entered the prototype stage, Apple engineers have begun secretly testing it in their homes, one of the people said. While not an indicator of the speaker’s launch timeline, Apple chief executive officer Tim Cook has said he tested the iPad at home for roughly six months before its introduction. By contrast, Apple employees began testing the latest Apple TV with Siri about a year before it went on sale, one of the people said.
Following successive year-over-year quarterly drops in sales, Cook is looking for new ways to generate revenue for a company that some analysts have said has reached its peak. Apple relies heavily on the iPhone, which accounted for about 66% of its revenue last year, and an iPhone-connected speaker would be another way for the company to boost its smartphone sales via its ecosystem. The Information earlier reported that Apple is working on an Amazon Echo-like speaker, which CNET said would include cameras.
While Apple’s ability to make high-quality hardware isn’t in question, the Siri voice system has failed to meet expectations. Since launching in 2011 with the iPhone 4S, Siri has frequently stumbled with long load times and by misinterpreting information. If the speaker is to thrive in a marketplace dominated by Amazon’s advanced Alexa service, Apple needs to improve Siri.
Beyond the home device, Apple is researching new ways to improve Siri on iPhones and iPads, two people said. With an initiative code-named “Invisible Hand,” Apple hopes to give users the ability to fully control their devices through a Siri command system within three years, one of the people added. Currently, the voice assistant is able to respond to commands within its application, but Apple’s goal is for Siri to be able to control the entire system without having to open an app or reactivate Siri.
For example, a user would be able to ask their iPhone to open a web page and then share it with a friend without the need to ever launch the Siri interface. Other examples from Apple’s current research include being able to print a PDF by speaking “print” while reading it or saying “help” in order for the system to help the user navigate a particular task or application. Apple has also been researching opening this ability to third-party apps, the person said. Bloomberg