
Friday, July 28, 2023
techtalk
By Leslie D'Monte

Five AI trends to keep an eye on

1. Microsoft may gain even as Meta’s Llama rocks OpenAI’s boat

Is ChatGPT becoming dumber? A recent research paper by Stanford and UC Berkeley researchers that went viral demonstrated that the performance of the artificial intelligence (AI)-powered large language model (LLM)-based chatbot, ChatGPT, had deteriorated, especially at maths (‘Only 2.4% in math: Is ChatGPT turning dumb?’). ChatGPT, to be fair, was not particularly known for its maths skills to begin with.

Some experts, such as Jim Fan, senior scientist at Nvidia, corroborated this, suggesting that in a bid to make GPT-4 “safer”, OpenAI could have made it less useful, and may even have reduced the parameter count to cut costs. On the other hand, Princeton computer science professor Arvind Narayanan disagreed on some counts--one being that variance in behaviour does not imply a degradation in capability.

Picture courtesy of Livemint


Regardless of the study’s impact on ChatGPT’s credibility, the research was published just as Meta’s so-called open-source Llama 2 began vying for user attention and Apple was reported to be testing its own AppleGPT.

Meta released the second version of Llama (Llama 2) for research and commercial use, providing an alternative to pricey proprietary LLM services such as OpenAI’s ChatGPT Plus and Google’s Bard. When Meta released the first version of Llama in February, it insisted that the LLM requires “far less computing power and resources to test new approaches, validate others’ work, and explore new use cases”. Meta made LLaMA available in several sizes (7B, 13B, 33B, and 65B parameters; B stands for billion) and also shared a LLaMA model card detailing how it built the model, in marked contrast to the lack of transparency at OpenAI.

OpenAI’s GPT-3 (Generative Pre-trained Transformer 3), on the other hand, has 175 billion parameters, while GPT-4 was rumoured to have launched with 100 trillion parameters--a claim dismissed by OpenAI CEO Sam Altman. Foundation models (including LLMs, which focus sharply on language) train on large sets of unlabelled data, which makes them ideal for fine-tuning on a variety of tasks. For instance, ChatGPT, based on GPT-3.5, was trained on 570GB of text data from the internet containing hundreds of billions of words, including text harvested from books, articles, websites and social media.

Source: Meta

However, according to Meta, smaller models trained on more tokens (computer programs understand numbers, not words, so words are converted into numeric tokens) are easier to re-train and fine-tune for specific product use cases. Meta says it trained LLaMA 65B and LLaMA 33B on 1.4 trillion tokens, while its smallest model, LLaMA 7B, was trained on one trillion tokens. Like other LLMs, LLaMA takes a sequence of words as input and predicts the next word, generating text recursively. Meta says it chose text from the 20 languages with the most speakers, focusing on those with Latin and Cyrillic alphabets, to train LLaMA.
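The token-and-next-word loop described above can be sketched in a few lines. This is purely a toy illustration, not Llama’s actual tokenizer or model: the vocabulary, the bigram lookup table and the `generate` function are all made up for the example, and a real LLM predicts the next token with a neural network rather than a lookup table.

```python
# Toy sketch of autoregressive generation: words become integer
# tokens, and the "model" repeatedly predicts the next token from
# the sequence so far, feeding each prediction back in.

# Tiny made-up vocabulary: each word gets an integer id (a "token").
vocab = {"the": 0, "cat": 1, "sat": 2, "on": 3, "mat": 4, "<end>": 5}
inv_vocab = {i: w for w, i in vocab.items()}

# Stand-in "model": a bigram table mapping each token to a successor.
next_token = {0: 1, 1: 2, 2: 3, 3: 4, 4: 5}

def generate(prompt, max_new_tokens=8):
    tokens = [vocab[w] for w in prompt.split()]  # words -> token ids
    for _ in range(max_new_tokens):
        nxt = next_token.get(tokens[-1], vocab["<end>"])
        if nxt == vocab["<end>"]:
            break  # the "model" signals the end of the text
        tokens.append(nxt)  # recursive step: prediction becomes input
    return " ".join(inv_vocab[t] for t in tokens)

print(generate("the cat"))  # the cat sat on mat
```

The same loop drives every LLM mentioned here; only the prediction step (the lookup table above) differs, being a billions-of-parameters network in practice.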

Not strictly open source. The newly released Llama 2, according to Meta, is a collection of pre-trained and fine-tuned LLMs ranging from 7 billion to 70 billion parameters. Meta simultaneously released Llama 2-Chat--a fine-tuned version of Llama 2 optimized for dialogue, in the same parameter range. That said, even as Meta touts Llama as an “open-source” LLM, The Register pointed out that since Llama 2’s license forbids its use to train other language models, and requires a special license from Meta if it is used in an app or service with more than 700 million monthly users, it does not meet the definition of open source set by the Open Source Initiative (OSI). Nevertheless, the fact that Meta has made Llama 2 available free to researchers, and for commercial use with these restrictions, is a welcome step.

Interestingly, Meta has partnered with Microsoft (which has also invested in OpenAI) to distribute Llama 2. Does that mean Microsoft is hedging its bets in the generative AI race?

On mobile devices, too. In May, Google said it had explored the viability of using LLMs to enable diverse language-based interactions with mobile user interfaces (UIs) and had developed an algorithm to do so. It experimented with four pivotal modelling tasks--screen question generation, screen summarization, screen question-answering, and mapping instructions to UI actions. The results show that “our approach achieves competitive performance using only two data examples per task”.

Source: Google

Meanwhile, Qualcomm Technologies, Inc. and Meta are partnering to run Llama 2 models directly on devices (smartphones, PCs, VR/AR headsets, and vehicles), allowing developers to save on cloud costs and giving users private, more reliable, and personalized experiences. ChatGPT, for its part, will soon be available on Android devices too. You can pre-register for the app on Google Play.

2. The bird is no more, but does Musk have the X Factor?

In an earlier newsletter titled ‘Twitter’s Thanos snap: What will Musk do next?’, I pointed out that Musk see-sawed over his $44 billion Twitter deal (was it buyer’s remorse?) until he finally closed it in October 2022. He sacked many executives and made multiple changes after taking charge of the microblogging site.

Even at that time, it was hard to see any direct synergies between the companies owned by Musk. SpaceX, for instance, builds rockets and spacecraft. Tesla designs and manufactures electric vehicles, battery energy storage from home to grid scale, solar panels and solar roof tiles, and related products and services. And Neuralink, a company co-founded by Musk, develops ultra-high-bandwidth brain-machine interfaces to connect humans and computers. Twitter, for its part, is a microblogging platform with tremendous potential to influence people on the platform and even those outside it, since what we tweet does not necessarily stay on Twitter.

And just when the dust seemed to be settling, Musk replaced Twitter’s bird logo with an ‘X’, though the logo in the mobile versions of the apps is yet to be updated. To be sure, Musk owns the domain name X.com (the name of the online bank he co-founded, which later became PayPal), which now redirects to twitter.com. An article in The Verge pointed out last year that Musk was keen on emulating China’s WeChat, which began as a messaging service but later incorporated functions including gaming, social media, shopping, and payment services.

Following the Twitter purchase, Musk said the move was ultimately “an accelerant to building X, the everything app”. In April 2022, Musk had already created three new holding companies, each a variation of X Holdings Inc., to help buy Twitter. A year later, Musk said in a US court filing that “Twitter Inc. has been merged into X Corp. and no longer exists. X Corp. is a privately held corporation, incorporated in Nevada, and with its principal place of business in San Francisco, California”. X Holdings Inc. is now said to reflect Musk’s idea of a super-conglomerate spanning Tesla, SpaceX, Neuralink, The Boring Co., and Twitter (now X.com).

However, according to a Reuters piece, Musk’s ‘X’ could land him in legal trouble, since companies including Meta and Microsoft own intellectual property (IP) rights to the same letter. Further, there are “nearly 900 active U.S. trademark registrations that already cover the letter X in a wide range of industries”.

Even if Musk and his lawyers have devised a strategy to retain the ‘X’ logo, Twitter has already lost between $4 billion and $20 billion in brand value, according to a Bloomberg article. Elon Musk’s tenure as CEO saw Twitter’s brand value drop 32% to $3.9 billion in this year’s Brand Finance Media 50 ranking, according to a press statement. One hopes Musk has accounted for these costs too.

Musk may draw some comfort from the fact that user engagement on Threads (Twitter’s rival from Meta) has plummeted 70%, according to a report in The Wall Street Journal.

According to market intelligence firm Sensor Tower, Threads has seen a steep 70% decline in daily active users since its July 7 peak, falling to 13 million. That said, Meta’s loss is definitely not Musk’s gain. He will have to prove that he’s an astute businessman and not an eccentric billionaire who keeps moving the goalposts. Musk has built some great companies; it would be sad to see them fall by the wayside.

3. Tome: Why GenZ is lapping up this AI app

If you want AI to create presentations for you, try the Tome app. Within just 10 months of its launch, the AI-powered slideshow service has amassed more than 10 million users, co-founder Keith Peiris told Semafor. You can create an account on tome.app and play around with it by uploading pictures, pasting text, or simply writing prompts. For instance, if you want to spruce up your resume for a new job, you can enter details such as your name, address, previous work experience and dates of employment, and click the ‘Generate’ button for a job-ready resume. You could also paste plain text for a presentation output, but it always helps to be specific with prompts, as with ChatGPT. Tome has multiple templates to play around with.

There’s a free version with 500 AI credits (a 1,000-word text may consume about 10 credits), a Pro version for $8, and an enterprise version (you have to write in for pricing). Tome’s AI currently generates narratives in English, Spanish, Arabic, Portuguese and German, and the app plans to introduce more languages soon. Tome, according to Semafor, was founded by Peiris and Henri Liriani, both of whom earlier worked at Meta on products like Instagram and Messenger.

4. AI-funded UBI with Altman’s Worldcoin

Sam Altman, the CEO of ChatGPT-maker OpenAI, has launched Worldcoin--a new cryptocurrency project that uses a special device called an “orb” to scan people’s eyeballs. This scan generates a unique digital identity for each person, known as a “World ID”, which serves as “proof of personhood” in the Worldcoin system--confirmation that its holder is a genuine, unique person in the digital world.
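A deliberately simplified sketch of the “proof of personhood” idea: derive a stable identifier from biometric data and reject anyone who tries to enroll twice. Everything here--the `enroll` function, hashing a raw iris template with SHA-256, the in-memory registry--is hypothetical and for illustration only; Worldcoin’s actual protocol, per its whitepaper, works on iris codes with privacy-preserving cryptography rather than storing biometric hashes like this.

```python
import hashlib

# Hypothetical registry of already-issued IDs (in-memory, illustration only).
registered_ids = set()

def enroll(iris_template: bytes):
    """Issue a World-ID-like identifier, or return None for a duplicate."""
    # The same biometric input always hashes to the same identifier,
    # so one person cannot sign up twice under two IDs.
    candidate_id = hashlib.sha256(iris_template).hexdigest()
    if candidate_id in registered_ids:
        return None  # biometric already seen: not a new person
    registered_ids.add(candidate_id)
    return candidate_id
```

The design point the sketch captures is deduplication: a World ID proves uniqueness, not legal identity, which is what distinguishes “proof of personhood” from conventional KYC.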

Worldcoin is capped at 10 billion tokens, as mentioned in its white paper. On launch day, 143 million Worldcoins were released: 100 million were given to market makers as a loan, and the remaining 43 million were distributed to investors who had verified their identity using the Orb before the official launch. Worldcoin’s World ID and its digital currency (WLD) are “currently complemented by World App--the first frontend to World ID and the Worldcoin Protocol, developed by the contributor team at Tools for Humanity (TFH)”.

“If successful, Worldcoin could considerably increase economic opportunity, scale a reliable solution for distinguishing humans from AI online while preserving privacy, enable global democratic processes, and show a potential path to AI-funded UBI,” reads the Worldcoin whitepaper.

Worldcoin, though, won’t be available for trading in the US due to regulatory restrictions in the country. Further, India’s apex bank, the Reserve Bank of India, has been consistent in expressing its displeasure with cryptocurrencies even as there’s no ban on them. However, gains from cryptocurrencies in India are treated as income, and a 1% tax is deducted at the source. The Cryptocurrency Bill 2021, meanwhile, is yet to make any significant headway.

5. AI @ Work in Charts

Source: AI at Work: What People are Saying. BCG X June 2023

Hope you folks have a great weekend, and your feedback will be much appreciated.

Livemint.com | Privacy Policy | Contact us You received this email because you signed up for HT newsletters or because it is included in your subscription. Copyright © HT Digital Streams. All Rights Reserved