One of the most common weaknesses of chatbots — even AI-powered ones — is that they have no real sense of place. They might know a lot about your content, but they don’t actually know where the user currently is on your site.

That’s not a big deal when you’re dealing with generic Q&A. But it quickly becomes a problem when your chatbot lives inside a context-rich environment like a Drupal shop or product catalog.

Imagine a user is viewing the Austroflamm Ivy fireplace stove product page and asks:

“Is it also available with a soapstone cladding?”

A human would instantly understand that this question refers to the product currently on screen. A typical AI chatbot, however, might completely miss that connection — and start talking about a different product you discussed five minutes ago.

When Prompting Isn’t Enough

My first instinct was to solve this purely at the prompt level. I tried telling the AI something like:

“Always consider the current page context. If the visitor is on a product page, their question probably refers to that product.”

Sounds good in theory. In practice, it didn’t work.

The AI simply has no knowledge of what page the visitor is on. It can’t “see” the route, the node, or any of the structured data behind it.

Extending Drupal AI with Custom Tokens

After digging through the Drupal AI and AI Agents module code, I noticed that only a few token groups are available by default — mainly "user" and "ai_agents".

However, existing groups like "current-page" can also be made available to the AI system via an event subscriber on the BuildSystemPromptEvent.

// Required imports; the BuildSystemPromptEvent namespace may differ
// depending on your version of the Drupal AI / AI Agents modules.
use Drupal\ai\Event\BuildSystemPromptEvent;
use Symfony\Component\EventDispatcher\EventSubscriberInterface;

class AiChatbotSystemPromptSubscriber implements EventSubscriberInterface {

  public static function getSubscribedEvents(): array {
    return [
      BuildSystemPromptEvent::EVENT_NAME => ['onSystemPromptBuild'],
    ];
  }

  public function onSystemPromptBuild(BuildSystemPromptEvent $event): void {
    // “current-page” already exists in Drupal Token.
    // We’re just making it available to the AI prompt system.
    $event->setToken('current-page', []);
  }

}

This doesn’t create a new token group — it simply tells Drupal AI that tokens from the existing current-page group may now be used within prompts or system messages.
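For the subscriber to actually fire, it also has to be registered as a tagged service in the module. A minimal sketch, assuming a custom module named my_module (the module name is a placeholder; the class name matches the subscriber above):

```yaml
# my_module.services.yml
services:
  my_module.ai_chatbot_system_prompt_subscriber:
    class: Drupal\my_module\EventSubscriber\AiChatbotSystemPromptSubscriber
    tags:
      - { name: event_subscriber }
```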

Defining the Token

Next, I added a specific token within that group. In hook_token_info(), I defined an ai-context token, meant to deliver structured context data about the current page:

$data['tokens']['current-page']['ai-context'] = [
  'name' => t('Current Page Context'),
  'description' => t('Provides structured information about the entity currently being viewed for AI-RAG context.'),
];

Then in hook_tokens(), I connected it to a custom helper function that actually resolves what “current page” means in an AI request:

// Needs: use Drupal\Core\Render\Markup;
if ($type === 'current-page') {
  foreach ($tokens as $name => $original) {
    switch ($name) {
      case 'ai-context':
        $replacements[$original] = Markup::create(my_get_current_ai_context());
        $bubbleable_metadata->addCacheContexts(['url.path']);
        break;
    }
  }
}
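With both hooks in place, Drupal’s token service resolves the new token like any other — handy for verifying the wiring before involving the chatbot at all. A quick debugging sketch (the BubbleableMetadata argument is passed so the cache context added in hook_tokens() is collected):

```php
use Drupal\Core\Render\BubbleableMetadata;

$bubbleable_metadata = new BubbleableMetadata();
// Resolves [current-page:ai-context] through the hook_tokens()
// implementation above and returns the structured context string.
$output = \Drupal::token()->replace(
  '[current-page:ai-context]',
  [],
  [],
  $bubbleable_metadata
);
```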

The Critical Part: Detecting the Chatbot Context

Here’s where it gets interesting. When the chatbot sends a request, it doesn’t run in the user’s front-end page context.

Instead, it always goes through the chatbot API route — in this case, /api/deepchat.

That means we can’t simply rely on \Drupal::routeMatch() to tell us what node or entity the user is looking at. We need to reconstruct that context manually from the JSON payload.

// Lives in a .module file and additionally needs:
// use Drupal\Component\Serialization\Json;
// use Drupal\Core\Entity\ContentEntityInterface;
function my_get_current_ai_context(): string {
  $entity = NULL;
  $route_match = \Drupal::routeMatch();

  // Detect whether we’re in a chatbot API request.
  if ($route_match->getRouteName() === 'ai_chatbot.api') {
    $request = \Drupal::request();
    $content = $request->getContent();
    $data = Json::decode($content);

    // Extract the user’s original route from the chatbot payload.
    if (isset($data['contexts']['current_route'])) {
      $current_url = $data['contexts']['current_route'];

      /** @var \Drupal\Core\Routing\Router $router */
      $router = \Drupal::service('router.no_access_checks');

      try {
        $parent_route_match_info = $router->match($current_url);
        foreach (['node', 'taxonomy_term', 'commerce_product'] as $param_name) {
          if (isset($parent_route_match_info[$param_name])) {
            $entity = $parent_route_match_info[$param_name];
            break;
          }
        }
      }
      catch (\Exception $e) {
        // Route could not be matched; fall through without an entity.
      }
    }
  }
  else {
    // Fallback: resolve directly from the current route (non-AI contexts).
    foreach (['node', 'taxonomy_term', 'commerce_product'] as $param_name) {
      if ($route_match->getParameter($param_name)) {
        $entity = $route_match->getParameter($param_name);
        break;
      }
    }
  }

  if ($entity instanceof ContentEntityInterface) {
    $context = [
      '* URL: ' . $entity->toUrl('canonical', ['absolute' => TRUE])->toString(),
      '* Type: ' . $entity->getEntityTypeId(),
      '* Title: ' . $entity->label(),
    ];
    return implode("\n", $context);
  }

  return 'No context available.';
}
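On a product page, the helper therefore returns a small plain-text block along these lines (the values are hypothetical; the format follows the implode() call above):

```
* URL: https://example.com/produkte/austroflamm-ivy
* Type: commerce_product
* Title: Austroflamm Ivy
```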

The key insight: the chatbot doesn’t “know” where it is — it only knows what the frontend tells it via the API.
By decoding the current_route value in the chatbot payload, we can re-match the route server-side and resolve the actual entity.
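For reference, the relevant slice of the chatbot request body looks roughly like this — the exact payload shape depends on your DeepChat frontend configuration, the path is a made-up example, and only contexts.current_route is consumed by the helper above:

```json
{
  "contexts": {
    "current_route": "/produkte/austroflamm-ivy"
  }
}
```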

Results

Once that token was active, I could inject it directly into the system prompt for my chatbot’s AI Agent:

## Current Page Context
**The current page context is critical.**  
If the user asks a question while viewing a detail page, the answer should primarily consider this information.  

The current page context is provided here:
[current-page:ai-context]

If this context is available, prioritize questions related to it over a general search in the index.  
If no context is available, proceed normally without referencing it.

By structuring the prompt this way, the AI explicitly knows to pay attention to the page-specific entity first, rather than giving generic answers or relying on previous conversation history.

Now, when a user on the Austroflamm Ivy fireplace stove page asks:

“Is it also available with a soapstone cladding?”

…the chatbot knows exactly what product the user means — without ever needing to guess.

Final Thoughts

Giving your chatbot real awareness of its environment transforms it from a generic Q&A machine into a true assistant.

It’s one of those cases where a little bit of Drupal plumbing dramatically improves the user experience — and prevents your AI from getting “lost” on your own site.
