Facebook detailed plans to expand the number of countries covered by a new artificial intelligence (AI) tool designed to help spot suicidal expressions and connect users to help.

In a blog post, Facebook VP of product management Guy Rosen explained the AI system uses pattern recognition to identify suicidal thoughts in posts, comments and live videos, then prioritises those posts for review. For example, the tool scans for comments such as “Are you OK?” or “Can I help?”. Concerning posts are flagged for immediate attention from Facebook’s Community Operations team, which Rosen said means the most worrisome reports are escalated to authorities twice as fast.
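To make the prioritisation idea concrete, the sketch below ranks posts for review by counting concern signals (such as the comments Rosen cites) in each post and its replies. It is a hypothetical, toy illustration only: Facebook’s actual classifier is not public, and the phrase list, scoring and `Post` structure here are invented for the example.

```python
# Toy illustration of keyword-based prioritisation for a review queue.
# Not Facebook's system: the phrases, scoring and data model are hypothetical.

from dataclasses import dataclass, field

# Hypothetical signal phrases, modelled on the concerned comments Rosen describes.
CONCERN_PHRASES = ("are you ok", "can i help", "please don't", "i'm here for you")

@dataclass
class Post:
    post_id: str
    text: str
    comments: list = field(default_factory=list)

def concern_score(post: Post) -> int:
    """Count how many concern phrases appear across the post text and its comments."""
    corpus = [post.text.lower()] + [c.lower() for c in post.comments]
    return sum(phrase in blob for blob in corpus for phrase in CONCERN_PHRASES)

def prioritise_for_review(posts: list) -> list:
    """Order posts so the most concerning ones are reviewed first."""
    return sorted(posts, key=concern_score, reverse=True)

if __name__ == "__main__":
    queue = prioritise_for_review([
        Post("a", "feeling great today"),
        Post("b", "goodbye everyone", ["Are you OK?", "Can I help?"]),
    ])
    print([p.post_id for p in queue])  # -> ['b', 'a']
```

A production system would replace the keyword count with a trained text classifier, but the queue-ordering step works the same way: score each post, then surface the highest-risk items to human reviewers first.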

“This ensures we can get the right resources to people in distress and, where appropriate, we can more quickly alert first responders.”

“In some instances, we have found that the technology has identified videos that may have gone unreported,” he added.

The AI application supplements an existing Facebook system enabling users to independently report concerning posts. Rosen said the tool had prompted more than 100 wellness checks by first responders in the past month alone.

Facebook has been testing the application in the US since March.

Rosen said the company will begin rolling the tool out more broadly, though he did not specify which countries would gain access next. Eventually, the AI system will be deployed worldwide, except in the EU, he said.

Looking ahead, Facebook chief Mark Zuckerberg said the company hopes to expand its use of the AI tool to help identify bullying and “hate”.