Spanish Weather Bot#

The Spanish bot provides real-time weather data, current date and time information, and answers to open-domain questions in Spanish. It uses the Weatherstack API to retrieve weather information for a specified location. The Spanish bot currently supports the following features:

  • Weather forecast

  • Temperature

  • Wind speed

  • Humidity

  • Rainfall

  • Weather conditions, such as sunny or cloudy, at a given location

  • Current date and time

  • Open domain Q&A.

The bot accepts queries in Spanish and responds in Spanish.
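For reference, Weatherstack's current-weather endpoint can be queried directly with an HTTP request such as the one below. This is an illustrative sketch only; the exact parameters the bot's plugin sends to Weatherstack are an assumption.

    # Illustrative only: fetch current weather for a location from Weatherstack
    # (assumes WEATHERSTACK_API_KEY holds a valid key; "Madrid" is an example location)
    curl "http://api.weatherstack.com/current?access_key=${WEATHERSTACK_API_KEY}&query=Madrid"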

Spanish NMT Bot#

To interpret and generate Spanish, this bot uses the Riva translation models. It follows the Chat Engine Server Architecture and Event Architecture.

Docker-based bot deployment

This sample bot uses the Weatherstack API service to provide responses to weather queries.

You can set the Weatherstack API key by adding the WEATHERSTACK_API_KEY environment variable to deploy/docker/.env. This sample bot uses OpenAI gpt-4-turbo as the main model. The sample bot is located in the quickstart directory at ./samples/spanish_bot_nmt/.
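For example, the entry added to deploy/docker/.env can look like the following (replace the placeholder with your own key):

    # deploy/docker/.env
    WEATHERSTACK_API_KEY=<your Weatherstack API key>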

  1. Set the OpenAI API key environment variable.

    export OPENAI_API_KEY=...
    
  2. Prepare the environment for the Docker Compose commands.

    export BOT_PATH=./samples/spanish_bot_nmt/
    source deploy/docker/docker_init.sh
    
  3. Deploy the Riva NMT (Neural Machine Translation) model.

    docker compose -f deploy/docker/docker-compose.yml up model-utils
    
  4. Deploy the ACE Agent microservices: the Chat Engine, Plugin server, and NLP server containers.

    docker compose -f deploy/docker/docker-compose.yml up --build event-bot -d
    
  5. Interact with the bot using the URL http://<workstation IP>:7006/. If the page does not load, see the status-check commands after this procedure.

    Note

    When the Riva NMT model is deployed, the Riva server might create some files with root permissions. Delete the model repository with sudo to avoid permission issues in other sections:

    sudo rm -rf model_repository
    

    If you don’t delete the model repository manually using the above command, you might observe an error message such as:

    "E1116 10:58:41.130105 102 model_repository_manager.cc:996] Poll failed for model directory 'megatronnmt_any_en_500m': failed to open text file for read /data/models/megatronnmt_any_en_500m/config.pbtxt: No such file or directory".
    
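To confirm that the deployed containers are running (for example, if the bot page does not load), you can list the services and inspect their logs. The commands below are standard Docker Compose usage rather than ACE Agent-specific tooling.

    # List the services started from the compose file and their status
    docker compose -f deploy/docker/docker-compose.yml ps

    # Follow the logs while troubleshooting (Ctrl+C to stop)
    docker compose -f deploy/docker/docker-compose.yml logs -f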

Spanish LLM Bot#

To interpret and generate Spanish, this bot does not use the Riva translation models; instead, its Colang flows are written directly in Spanish. This bot follows the Chat Engine Server Architecture and Event Architecture.

Docker-based bot deployment

This sample bot uses the Weatherstack API service to provide responses to weather queries.

You can set the Weatherstack API key by adding the WEATHERSTACK_API_KEY environment variable to deploy/docker/.env. This sample bot uses OpenAI gpt-4-turbo as the main model. The sample bot is located in the quickstart directory at ./samples/spanish_bot/.

  1. Set the OpenAI API key environment variable.

    export OPENAI_API_KEY=...
    
  2. Prepare the environment for the Docker Compose commands.

    export BOT_PATH=./samples/spanish_bot/
    source deploy/docker/docker_init.sh
    
  3. Deploy the ACE Agent microservices: the Chat Engine, Plugin server, and NLP server containers.

    docker compose -f deploy/docker/docker-compose.yml up --build event-bot -d
    
  4. Interact with the bot using the URL http://<workstation IP>:7006/. When you are finished, you can stop the deployment as shown after this procedure.
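To stop the deployment, remove the containers started from the compose file. This is standard Docker Compose usage rather than an ACE Agent-specific step.

    # Stop and remove the containers started for this sample
    docker compose -f deploy/docker/docker-compose.yml down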