NeoArc Studio

Site Settings: SEO and Custom Scripts

Configure robots.txt for search engines, llms.txt for AI crawlers, and inject custom scripts into the header or body of your published site for analytics, fonts, or chat widgets.

Three tabs control SEO and script injection: robots.txt, llms.txt, and Custom Scripts. These settings apply only to hostable and Azure publishing modes, where the generated site includes these files.

robots.txt Tab

The robots.txt tab provides a code editor for configuring search engine crawling directives. The file tells search engine crawlers which pages they are allowed to access.

# Example robots.txt
User-agent: *
Allow: /

Sitemap: https://docs.example.com/sitemap.xml

llms.txt Tab

The llms.txt tab provides a code editor for configuring large language model discovery information. This file helps AI crawlers and assistants understand your site's content and purpose.
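A minimal llms.txt (the site name, summary, and links below are illustrative placeholders) typically uses markdown-style headings and a link list:

# Example llms.txt
# Example Docs Site

> Documentation for an example product, covering setup, configuration, and publishing.

## Docs

- [Getting Started](https://docs.example.com/getting-started): Installation and setup guide
- [Publishing](https://docs.example.com/publishing): How to publish your site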

Custom Scripts Tab

The Custom Scripts tab has two code editors for injecting HTML, JavaScript, or CSS into the published site.

Header Scripts

Content entered here is inserted into the <head> section of every page. Common uses include:

Analytics tracking snippets
Web font stylesheets and preconnect links
Meta tags and site verification codes
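For instance, a header snippet loading a font and an analytics script might look like this (the URLs and site ID are placeholders, not real services):

<!-- Example header scripts: placeholder URLs and IDs -->
<link rel="preconnect" href="https://fonts.example.com">
<link rel="stylesheet" href="https://fonts.example.com/css?family=Inter">
<script async src="https://analytics.example.com/script.js" data-site="YOUR_SITE_ID"></script>

Because header scripts load before page content renders, prefer async or deferred loading so they do not block rendering.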

End of Body Scripts

Content entered here is inserted just before the closing </body> tag. Common uses include:

Chat widgets
Scripts that should run after the page content has loaded
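A typical end-of-body snippet loads a chat widget and initializes it after the page has loaded (the URL, widget object, and ID below are hypothetical placeholders):

<!-- Example end-of-body script: chat widget with placeholder URL and ID -->
<script src="https://chat.example.com/widget.js" defer></script>
<script>
  // Initialize the widget once the page has finished loading (hypothetical API)
  window.addEventListener("load", function () {
    if (window.ExampleChat) {
      window.ExampleChat.init({ widgetId: "YOUR_WIDGET_ID" });
    }
  });
</script>

Placing such scripts at the end of the body keeps them from delaying the initial render.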

Publishing Mode Requirements