With data generation by individuals estimated to cross 1GB daily by 2020, and more than a billion Internet of Things-connected devices in use, our reliance on the cloud for processing continues to grow.
Innovations in deep learning (DL) and artificial intelligence (AI) in the cloud have helped analyse the bulk of this data and build intelligent systems. However, many real-time use cases require intelligence to be built at the edge itself. Jeff Bezos reportedly gave the Alexa design team a one-second latency target (the time taken to respond to a query).
While a second may be acceptable for a consumer product, it can be the difference between life and death in, say, an autonomous vehicle. Beyond the need for low latency, spotty connectivity and privacy and security concerns mean that cloud computing is neither always accessible nor optimal.
At YourNest Venture Capital, we see several product ideas that could emerge in this space.
AI in a box: Imagine building DL algorithms and processing all AI workloads on-premises, using data signals generated at the edge. An AI device deployed on a high-speed local edge network, with access to unfiltered, high-fidelity data, can improve both decision-making and the training of algorithms.
AI offload chips: A large opportunity exists to build AI-first chips, co-processors and accelerators for DL. These would be designed to sit in customer premises equipment alongside mainstream processors, with training and inference workloads offloaded to them.
Moving AI and DL to the edge is a complex problem, and also an opportunity for deep-tech startups. We recognise that building such startups takes a lot of skill, effort and resilience, and our team at YourNest aims to support passionate entrepreneurs in their mission.