Is Claude down? Hundreds of users report issues in Delhi, Mumbai, Ahmedabad; Anthropic reacts

Claude, Anthropic's AI chatbot, experienced service disruptions on Monday, affecting cities like Delhi and Mumbai. Downdetector reported issues related to Claude.ai and login/logout paths, while Anthropic confirmed that the Claude API is functioning correctly.

Garvit Bhirani
Updated: 2 Mar 2026, 06:57 PM IST
Is Claude down? Hundreds of users report issues on Downdetector. (Photographer: Bloomberg)

Claude, Anthropic's AI chatbot, was facing a service disruption on Monday, according to Downdetector, which monitors such issues through user reports.

Is Claude down, and when will it be back up?

“We have identified that the Claude API is working as intended. The issues we are seeing are related to Claude.ai and with the login/logout paths. We are continuing to investigate this issue,” the message on Claude's website said.

(Image: Claude website)

Several users in cities including Delhi, Ahmedabad, Mumbai, and Hyderabad have been facing issues, as seen on Downdetector.

(Image from Downdetector)

Users react

On Friday, the Trump administration directed all US government agencies to halt use of Anthropic's AI systems and imposed additional penalties, escalating a highly visible dispute between the government and the company over AI safety standards, according to PTI.


President Donald Trump, Defense Secretary Pete Hegseth and other officials criticised Anthropic on social media for not granting the military unrestricted access to its AI tools by the Friday deadline. They accused the company of putting national security at risk after CEO Dario Amodei declined to reverse his stance, citing concerns that the technology could be used in ways that breach its built-in safeguards.

“We don't need it, we don't want it, and will not do business with them again!” Trump wrote on social media.


Hegseth also labelled the company a "supply chain risk," a term usually reserved for foreign adversaries; the designation could jeopardize the firm's key partnerships with other companies.

In a statement released Friday night, Anthropic said it would contest what it described as an unprecedented and legally flawed move “never before publicly applied to an American company.”


Anthropic had previously said it was seeking limited guarantees from the Pentagon that its AI chatbot, Claude, would not be employed for mass surveillance of Americans or in fully autonomous weapons systems. The Pentagon responded that it had no intention of using the technology in those ways and would apply it only within legal bounds, but it maintained that it required unrestricted access.

“No amount of intimidation or punishment from the Department of War will change our position on mass domestic surveillance or fully autonomous weapons. We will challenge any supply chain risk designation in court,” the company said.

The government’s attempt to exert greater control over the company’s internal decision-making is unfolding amid a broader dispute over the role of artificial intelligence in national security, including worries about how increasingly advanced systems might be deployed in high-risk contexts involving lethal force, classified data or state surveillance.

About the Author

Garvit Bhirani is a journalist based in Gurugram. He is a Deputy Chief Content Producer at LiveMint, where he covers national and international news.


