# Maison Dalí Dubai - AI-Optimized Robots Configuration
# Last Updated: 2026-01-27
# Strategy: Allow-all with Quantum AI Surface optimization
# Restaurant: Fine Dining - Mediterranean-Japanese Fusion
# Location: Business Bay, The Opus by Zaha Hadid, Dubai

# =================================================================
# PRODUCTION DOMAIN POLICY (maisondalidubai.ae)
# =================================================================
# Default: ALLOW ALL CRAWLERS (safe approach)

# Host directive for canonical domain
# Note: Host is a non-standard directive (historically honored by
# Yandex); most crawlers ignore it and rely on canonical URLs instead.
Host: https://maisondalidubai.ae

# STANDARD SEARCH ENGINE CRAWLERS
# Note: Googlebot ignores Crawl-delay (Google crawl rate is managed via
# Search Console); the directive is kept for crawlers that honor it.
User-agent: Googlebot
Allow: /
Crawl-delay: 1

User-agent: Bingbot
Allow: /
Crawl-delay: 1

User-agent: Slurp
Allow: /

User-agent: DuckDuckBot
Allow: /

# AI SEARCH PLATFORMS & ASSISTANTS
User-agent: GPTBot
Allow: /
Crawl-delay: 1

User-agent: Claude-Web
Allow: /
Crawl-delay: 1

User-agent: PerplexityBot
Allow: /
Crawl-delay: 1

User-agent: YouBot
Allow: /

User-agent: Applebot
Allow: /

User-agent: Amazonbot
Allow: /

# AI TRAINING DATASETS (Strategic Goal: AI Knowledge Graph Inclusion)
User-agent: CCBot
Allow: /
Crawl-delay: 2

User-agent: Google-Extended
Allow: /
Crawl-delay: 2

User-agent: anthropic-ai
Allow: /
Crawl-delay: 2

User-agent: cohere-ai
Allow: /

User-agent: Omgilibot
Allow: /

# SPECIALIZED AI CRAWLERS
User-agent: FacebookBot
Allow: /

User-agent: LinkedInBot
Allow: /

User-agent: Twitterbot
Allow: /

# DEFAULT POLICY: ALLOW ALL OTHER CRAWLERS
# This is the safe approach - explicit allow for unlisted bots
User-agent: *
Allow: /
Crawl-delay: 1

# =================================================================
# STAGING & PREVIEW DEPLOYMENT BLOCKS
# =================================================================
# This section applies only when robots.txt is served from a
# *.pages.dev host (development/preview deployments). On those hosts,
# all crawlers should be blocked:
# User-agent: *
# Disallow: /
# (Uncomment the two lines above when deploying to staging, or serve
# them via dynamic host detection.)

# =================================================================
# SPECIFIC PATH RESTRICTIONS (Optional - currently none)
# =================================================================
# Add any specific path blocks here if needed in the future.
# Example:
# User-agent: *
# Disallow: /admin/
# Disallow: /private/
# Disallow: /test/

# =================================================================
# SITEMAP DECLARATIONS
# =================================================================
Sitemap: https://maisondalidubai.ae/sitemap.xml
Sitemap: https://maisondalidubai.ae/sitemap-index.xml
Sitemap: https://maisondalidubai.ae/ai-sitemap.xml

# =================================================================
# QUANTUM API ENDPOINTS (AI Platform Optimization)
# =================================================================
# Quantum AI Surface Architecture - v1.0.0
# /api/quantum/health       - System health monitoring
# /api/quantum/info         - Deployment information
# /api/quantum/entity       - Restaurant entity data
# /api/quantum/context      - Contextual signals
# /api/quantum/eeat         - E-E-A-T authority markers
# /api/quantum/coherence    - Cross-layer coherence metrics
# /api/quantum/truth-loops  - Brand protection truth loops
# /api/quantum/performance  - Real-time performance analytics

# =================================================================
# CONTACT & DOCUMENTATION
# =================================================================
# Technical Contact: tim@ktsglobal.live
# AI Operations: ai-operations@maisondalidubai.ae
# Documentation: https://maisondalidubai.ae/llms.txt
# AI Context Graph: https://maisondalidubai.ae/ai-context-graph.jsonld
# Quantum API: https://maisondalidubai.ae/api/quantum/info

# Restaurant Type: Fine Dining Restaurant
# Location: Business Bay, The Opus by Zaha Hadid
# Chef: Tristin Farmer (3-star Michelin experience)

# =================================================================
# COMPLIANCE & NOTES
# =================================================================
# - This configuration prioritizes AI discoverability and inclusion
#   in AI training datasets
# - All paths are accessible by default (allow-all policy)
# - Staging domains should serve a separate robots.txt with "Disallow: /"
# - No aggressive blocking that could exclude legitimate AI crawlers
# - Crawl-delay values are set to respect server resources
# - Evidence Locker fully accessible for AI fact verification
# =================================================================