
AIdventure

Text adventure game with an AI. No rules, no censorship, the only limit is your imagination, not someone else's. · By Lyaaaaaaaaaaaaaaa

Tech-support Sticky

A topic by Lyaaaaaaaaaaaaaaa created Jan 29, 2022 Views: 2,824 Replies: 69
Developer (2 edits)

If you encounter any problem, you can ask for help here. Support covers both the demo and paid versions :).

Attach logs to your post!

Without logs, I might not be able to help you.

Where to find the game’s logs?

  • On Windows - %APPDATA%/Roaming/aidventure
  • On Linux - ~/.local/share/godot/app_userdata/AIdventure
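
If you're not sure where that resolves to on your machine, here is a small Python sketch (not part of the game, just a helper I'm suggesting) that prints the expected location:

import os
import platform
from pathlib import Path

# On Windows, %APPDATA% already points to ...\AppData\Roaming,
# so the logs live under <APPDATA>\aidventure.
if platform.system() == "Windows":
    logs_dir = Path(os.environ["APPDATA"]) / "aidventure"
else:  # Linux
    logs_dir = Path.home() / ".local/share/godot/app_userdata/AIdventure"

print(logs_dir, "->", "exists" if logs_dir.exists() else "not found")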

ummm

I'm stuck on the "connecting to the server" screen.

How could I fix this?

Developer

Did you run aidventure_installer before running the game? You should have a file named “installation_instructions.txt”. It might also happen if the game doesn’t find the server (if you moved the game after installing it).

Solved. Thanks.

By the way, may I ask where the 1.5 GB of AI models got downloaded to?

Developer

They are saved in the game folder, in “models” and are downloaded from https://huggingface.co/

New Problems

The cmd window appeared and closed shortly after, saying it had finished and could be exited. But Task Manager showed the game wasn't using the network, there was no models folder in the game folder, and it was still stuck on the connecting-to-the-server screen.

Is it possible to install the AI manually?

Developer

Could you give me the logs found in %appdata%/roaming/aidventure/logs ? The download only happens if you reach the page asking you to select an AI. I think you are stuck before it then.

[2022.02.03 15:33:41] | INFO  | [installer.gd] [check_os()] >> Checking the os and its architecture

[2022.02.03 15:33:41] | DEBUG | [installer.gd] [check_os()] >> Architecture: 64 bits

[2022.02.03 15:33:41] | DEBUG | [installer.gd] [check_os()] >> OS: Windows

[2022.02.03 15:33:41] | INFO  | [installer.gd] [move_env_config] >> Coping conda environment config file to user://

[2022.02.03 15:33:41] | INFO  | [installer.gd] [move_env_config] >> Copied res://conda_config_windows.yml to user://conda_config.yml

[2022.02.03 15:33:46] | INFO  | [installer.gd] [download_conda] >> Downloading conda's installer

[2022.02.03 15:34:23] | INFO  | [installer.gd] [Install_conda] >> Installing conda.

[2022.02.03 15:36:09] | INFO  | [installer.gd] [install_conda] >> Successfully started miniconda installer

[2022.02.03 15:37:28] | INFO  | [installer.gd] [create_environment] >> Creating conda environment.

[2022.02.03 15:37:30] | ERROR | [installer.gd] [create_environment] >> Couldn't create conda environment. Error: 1

[2022.02.03 15:37:30] | INFO  | [installer.gd] [_create_config] >> Creating the config file.

[2022.02.03 15:37:30] | INFO  | [installer.gd] [_create_config] >> Successfully created the config file.

[2022.02.03 15:37:35] | INFO  | [installer.gd] [clear_cache] >> Clearing the cache.

Godot Engine v3.4.1.stable.official.aa1b95889 - https://godotengine.org

OpenGL ES 3.0 Renderer: GeForce MX250/PCIe/SSE2

OpenGL ES Batching: ON

[2022.02.03 15:33:41] | INFO  | [installer.gd] [check_os()] >> Checking the os and its architecture

[2022.02.03 15:33:41] | DEBUG | [installer.gd] [check_os()] >> Architecture: 64 bits

[2022.02.03 15:33:41] | DEBUG | [installer.gd] [check_os()] >> OS: Windows

[2022.02.03 15:33:41] | INFO  | [installer.gd] [move_env_config] >> Coping conda environment config file to user://

[2022.02.03 15:33:41] | INFO  | [installer.gd] [move_env_config] >> Copied res://conda_config_windows.yml to user://conda_config.yml

[2022.02.03 15:33:46] | INFO  | [installer.gd] [download_conda] >> Downloading conda's installer

[2022.02.03 15:34:23] | INFO  | [installer.gd] [Install_conda] >> Installing conda.

[2022.02.03 15:36:09] | INFO  | [installer.gd] [install_conda] >> Successfully started miniconda installer

[2022.02.03 15:37:28] | INFO  | [installer.gd] [create_environment] >> Creating conda environment.

ERROR: Error calling method from signal 'popup_hide': 'Control(installer.gd)::_on_CondaDestinationDialog_popup_hide': Method not found..

   at: emit_signal (core/object.cpp:1236) - Error calling method from signal 'popup_hide': 'Control(installer.gd)::_on_CondaDestinationDialog_popup_hide': Method not found..

[2022.02.03 15:37:30] | ERROR | [installer.gd] [create_environment] >> Couldn't create conda environment. Error: 1

[2022.02.03 15:37:30] | INFO  | [installer.gd] [_create_config] >> Creating the config file.

[2022.02.03 15:37:30] | INFO  | [installer.gd] [_create_config] >> Successfully created the config file.

[2022.02.03 15:37:35] | INFO  | [installer.gd] [clear_cache] >> Clearing the cache.

WARNING: ObjectDB instances leaked at exit (run with --verbose for details).

   at: cleanup (core/object.cpp:2064) - ObjectDB instances leaked at exit (run with --verbose for details).

ERROR: Resources still in use at exit (run with --verbose for details).

   at: clear (core/resource.cpp:417) - Resources still in use at exit (run with --verbose for details).

Developer (1 edit)

Thanks. You can add me on Discord if you want faster support: Lyaaaaaaaaaaaaaaa#4607. It looks like the environment isn’t installed. You should check the folder where you installed conda: the folder path\to\miniconda3\envs\aidventure should be ~1.6GB. Are you running the demo or the paid version? It might help me while investigating the error. Btw, make sure you downloaded the latest version of the game, 1.1.2.
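
If you want a quick way to check that size, a small Python sketch like this works (the path is just an example, adjust it to wherever you installed conda):

from pathlib import Path

# Hypothetical install location; point this at your own miniconda3\envs\aidventure folder.
env_dir = Path(r"C:\path\to\miniconda3\envs\aidventure")
total_bytes = sum(f.stat().st_size for f in env_dir.rglob("*") if f.is_file())
print(f"{total_bytes / 1024**3:.2f} GB")  # should be roughly 1.6 GB after a successful install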

Oh god

There are no files or folders under the envs folder that contain the word aidventure

(miniconda3\envs)

There is only one file named .conda_envs_dir_test

And I'm running the demo version.

pls help.

°(°ˊДˋ°) °

Developer (1 edit)

Problem solved by

I found that it might be a problem with PyCharm and the VPN. So I uninstalled all Python-related software and reinstalled miniconda, set the mirror address to a domestic one, turned off the VPN, and successfully created the environment.

Said Ogoesinf2 on Discord.

My game window closes after I click "Validate" after selecting a scenario. No matter which scenario I choose or the stats I give my character, the result seems to be the same. I haven't seen the game attempt to download any AI model yet, so maybe I missed a step in the installation?

(1 edit)

Also, the settings I choose in the "Options" tab are not remembered by the program when the window closes. The game always opens in windowed mode, ignoring my fullscreen selection, even if I close the game normally rather than the window closing automatically when I try to start it. And when I create a custom scenario with the scenario editor, click "save as", and open the scenarios folder, the scenario file does not show up in the folder. Help would be appreciated!

Developer (1 edit)

It seems like the game doesn’t have permission to write in its folder… You could try moving it to another folder (beware: if you do this, you will have to manually edit the server path in settings.cfg) or try launching it with admin permissions.

Otherwise, could you send me the logs of the game? You can find them in %APPDATA%/Roaming/aidventure
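
If you want to quickly confirm whether the folder is writable, a tiny Python test like this should tell you (the path is just an example):

from pathlib import Path

# Hypothetical install location; adjust to wherever you unzipped the game.
game_dir = Path(r"C:\Games\AIdventure")
probe = game_dir / "write_test.tmp"
try:
    probe.write_text("test")   # try to create a file in the game folder
    probe.unlink()             # clean up
    print("Folder is writable")
except OSError as error:
    print("No write permission:", error)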

(1 edit)

I found this repeated in 5 documents in the logs folder.

2022/02/25 03:41:05

Game version: 1.1.5

ERROR: Couldn't find the given section "paths" and key "server", and no default was given.

   at: get_value (core/io/config_file.cpp:84) - Condition "p_default.get_type() == Variant::NIL" is true. Returned: Variant()

ERROR: Couldn't find the given section "paths" and key "conda", and no default was given.

   at: get_value (core/io/config_file.cpp:84) - Condition "p_default.get_type() == Variant::NIL" is true. Returned: Variant()

Running as an administrator and moving the folder allowed me to at least make a save, but the window still closes when I click "Validate".

Fixed! I had unzipped the files into Program Files like the rest of my games, but it seemed to have some conflict with permissions, so I put it in my games folder on the desktop and now it's letting me select the model to download! Hopefully this helps someone else, and thanks for the help Dev :)

Developer (1 edit)

You’re welcome. If you encounter any other error, post again.

“ERROR: Couldn’t find the given section “paths” and key “server”, and no default was given.” can happen when you didn’t run the installer first or when the installer failed.

(1 edit)

Hello,

I am running the paid version of AIdventure and I am having issues with a few things.

  • After selecting an AI (already downloaded), the screen sits at 'Please Wait', consistently at 75%, while loading the AI.
  • Success beyond this is not much better. The game runs very slowly, and after I insert a line of text it hangs for quite some time but ultimately generates no AI response.
  • I have tried uninstalling conda and AIdventure; this did not seem to change anything.
  • I have 16GB of RAM.
  • Running version 1.2.2
  • Also seeing errors in the cmd prompt sometimes which say, "An error happened while using the GPU."

Any thoughts what is going on here?



Godot Engine v3.4.3.stable.official.242c05d12 - https://godotengine.org

OpenGL ES 3.0 Renderer: NVIDIA GeForce GTX 1660 SUPER/PCIe/SSE2

OpenGL ES Batching: ON

2022/03/22 12:15:02

Game version: 1.2.2

D:/AIdventure/models/KoboldAI/GPT-Neo-2.7B-Shinen/pytorch_model.bin is missing.

D:/AIdventure/models/KoboldAI/GPT-Neo-2.7B-Shinen/config.json is missing.

D:/AIdventure/tokenizers/KoboldAI/GPT-Neo-2.7B-Shinen/merges.txt is missing.

D:/AIdventure/tokenizers/KoboldAI/GPT-Neo-2.7B-Shinen/special_tokens_map.json is missing.

D:/AIdventure/tokenizers/KoboldAI/GPT-Neo-2.7B-Shinen/tokenizer.json is missing.

D:/AIdventure/tokenizers/KoboldAI/GPT-Neo-2.7B-Shinen/tokenizer_config.json is missing.

D:/AIdventure/tokenizers/KoboldAI/GPT-Neo-2.7B-Shinen/vocab.json is missing.

Moved user://cache/pytorch_model.bin to D:/AIdventure/models/KoboldAI/GPT-Neo-2.7B-Shinen/pytorch_model.bin

Moved user://cache/config.json to D:/AIdventure/models/KoboldAI/GPT-Neo-2.7B-Shinen/config.json

Moved user://cache/merges.txt to D:/AIdventure/tokenizers/KoboldAI/GPT-Neo-2.7B-Shinen/merges.txt

Moved user://cache/special_tokens_map.json to D:/AIdventure/tokenizers/KoboldAI/GPT-Neo-2.7B-Shinen/special_tokens_map.json

Moved user://cache/tokenizer.json to D:/AIdventure/tokenizers/KoboldAI/GPT-Neo-2.7B-Shinen/tokenizer.json

Moved user://cache/tokenizer_config.json to D:/AIdventure/tokenizers/KoboldAI/GPT-Neo-2.7B-Shinen/tokenizer_config.json

Moved user://cache/vocab.json to D:/AIdventure/tokenizers/KoboldAI/GPT-Neo-2.7B-Shinen/vocab.json

ERROR: Connection lost with the server. clean: False

   at: call (modules/gdscript/gdscript_functions.cpp:774) - Connection lost with the server. clean: False

connection closed

WARNING: WASAPI: Unsupported number of channels: 1

   at: init_render_device (drivers/wasapi/audio_driver_wasapi.cpp:328) - WASAPI: Unsupported number of channels: 1

ERROR: WASAPI: Initialize failed with error 0xffffffff80070015.

   at: audio_device_init (drivers/wasapi/audio_driver_wasapi.cpp:298) - Condition "hr != ((HRESULT)0x00000000)" is true. Returned: ERR_CANT_OPEN

ERROR: WASAPI: GetCurrentPadding error

   at: thread_func (drivers/wasapi/audio_driver_wasapi.cpp:626) - WASAPI: GetCurrentPadding error

[the two lines above repeat dozens of times in the log]

WARNING: WASAPI: Unsupported number of channels: 1

   at: init_render_device (drivers/wasapi/audio_driver_wasapi.cpp:328) - WASAPI: Unsupported number of channels: 1

WARNING: Couldn't shutdown the server.

   at: call (modules/gdscript/gdscript_functions.cpp:788) - Couldn't shutdown the server.

ERROR: Connection lost with the server. clean: False

   at: call (modules/gdscript/gdscript_functions.cpp:774) - Connection lost with the server. clean: False

Developer (2 edits)

Hi, thank you for your report.

I see a few problems here.

In your example you try to load the Shinen model, but it requires 16GB of RAM by itself. That means you need 16GB for the AI + some amount for your OS + ~500MB for the game. You might be running out of RAM.

Same thing for the GPU: that error is triggered when your GPU runs out of memory. I suggest starting with a small AI to tweak your options.
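
If you want to check, here is a rough sketch of how to see the free RAM before loading a model (this uses the third-party psutil package, which is not shipped with the game):

import psutil

available_gb = psutil.virtual_memory().available / 1024**3
print(f"Available RAM: {available_gb:.1f} GB")  # Shinen alone wants around 16 GB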

Here are a few tweaks to improve the generation.

In the options:

  • Increase the max time from 10 (seconds) to 120
  • You might want to reduce the context max length and memory max length too (as they also use the CPU)
  • Make sure "remove unfinished sentences" is not checked for your tests.
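
For reference, these options also end up in the [ai_settings] section of settings.cfg, so you can double-check the values there. Roughly (illustrative values, your file may differ):

[ai_settings]
max_time=120.0
context_length=1000.0
memory_length=400.0
trim_unfinished=false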

As for the weird "ERROR: WASAPI: GetCurrentPadding error" lines, I have no idea; it’s the first time I’ve seen them. They seem to be errors related to audio drivers, but the game doesn’t have any audio yet. I hope this helps!

(+1)

This is definitely helpful thank you!

Developer

I’m glad, have fun!

Can't install. I'm getting an error that says it can't install mamba right after clicking "install game".

Developer

Hello, can you paste the logs of the installer here? It will greatly help for the support. It might also come from the anti-virus.

[2022.04.20 15:20:02] | INFO  | [installer.gd] [check_os()] >> Checking the os and its architecture

[2022.04.20 15:20:02] | DEBUG | [installer.gd] [check_os()] >> Architecture: 64 bits

[2022.04.20 15:20:02] | DEBUG | [installer.gd] [check_os()] >> OS: Windows

[2022.04.20 15:20:03] | DEBUG | [installer.gd] [start_installation] >> Is cuda enabled: True

[2022.04.20 15:20:03] | INFO  | [installer.gd] [move_env_config] >> Coping conda environment config file to user://

[2022.04.20 15:20:03] | INFO  | [installer.gd] [move_env_config] >> Copied res://conda_config_cuda.yml to user://conda_config.yml

[2022.04.20 15:20:03] | INFO  | [installer.gd] [_create_config] >> Creating the config file.

[2022.04.20 15:20:03] | INFO  | [installer.gd] [_create_config] >> Successfully created the config file.

[2022.04.20 15:20:03] | INFO  | [installer.gd] [install_mamba] >> Installing mamba

[2022.04.20 15:20:03] | ERROR | [installer.gd] [install_mamba] >> Failed installing mamba. Error: 1

[2022.04.20 15:20:03] | DEBUG | [installer.gd] [install_mamba] >> []

[2022.04.20 15:20:03] | INFO  | [installer.gd] [create_environment] >> Creating the environment.

[2022.04.20 15:20:03] | ERROR | [installer.gd] [create_environment] >> Failed creating the environment. Error: 1

[2022.04.20 15:20:03] | DEBUG | [installer.gd] [create_environment] >> []

I have no AV so it can't be that.

Developer

Try the following.

  • Make sure you have permissions in the directory where you moved the installer.
  • Make sure you have a micromamba.exe file in the mamba folder of the game.
  • Try running the installer with admin permission
  • Try to give admin permission to the micromamba.exe

No dice again. The installer actually comes up with the window of installing mamba before giving an error. So that's something?

Developer

What window are you talking about? Is another window popping up?

You can also try to have the installer on the C drive instead of another one. And you might have no anti-virus but sometimes Windows Defender is the problem.

In the logs I can see that installing mamba fails immediately. Something might be killing the mamba process.

The additional window that pops up saying "Installing Mamba, please wait."

Tried moving it to C: drive and turned off Windows Defender, still getting the same error.

Any idea as to what could be killing it?

Developer (1 edit)

To be honest I don’t know. But we can try to manually install mamba.

  1. Open Windows’s cmd and drag and drop the micromamba.exe file into it (this will paste the path to the file).
  2. Then add the following after the micromamba.exe path: -r \path\to\aidventure_folder\mamba\ -p \path\to\aidventure_folder\mamba\ install mamba -c conda-forge

It should install mamba manually at the right place, and we will see whether it fails or not. If it doesn’t fail, you can try again using the installer (it will probably fail again at the install_mamba part but might succeed at creating the environment). If the environment is correctly created, the game should work.
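
Put together, the full command should look roughly like this (replace \path\to\aidventure_folder with your actual game folder):

C:\path\to\aidventure_folder\mamba\micromamba.exe -r C:\path\to\aidventure_folder\mamba\ -p C:\path\to\aidventure_folder\mamba\ install mamba -c conda-forge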

Doesn't even run the install. Also when trying to start micromamba.exe before, nothing pops up. Is it maybe an issue with the .exe in the upload?

Developer

Maybe a problem with the file then… You can try to redownload the game and see. Or, you can manually download micromamba at https://micro.mamba.pm/api/micromamba/win-64/latest and replace micromamba.exe with the one you downloaded (make sure to rename the new .exe to micromamba.exe).

got an error when installing.

[2022.04.25 06:37:40] | INFO  | [installer.gd] [check_os()] >> Checking the os and its architecture

[2022.04.25 06:37:40] | DEBUG | [installer.gd] [check_os()] >> Architecture: 64 bits

[2022.04.25 06:37:40] | DEBUG | [installer.gd] [check_os()] >> OS: Windows

[2022.04.25 06:37:42] | DEBUG | [installer.gd] [start_installation] >> Is cuda enabled: True

[2022.04.25 06:37:42] | INFO  | [installer.gd] [move_env_config] >> Coping conda environment config file to user://

[2022.04.25 06:37:42] | INFO  | [installer.gd] [move_env_config] >> Copied res://conda_config_cuda.yml to user://conda_config.yml

[2022.04.25 06:37:42] | INFO  | [installer.gd] [_create_config] >> Creating the config file.

[2022.04.25 06:37:42] | INFO  | [installer.gd] [_create_config] >> Successfully created the config file.

[2022.04.25 06:37:42] | INFO  | [installer.gd] [install_mamba] >> Installing mamba

[2022.04.25 06:39:08] | INFO  | [installer.gd] [install_mamba] >> Successfully installed mamba

[2022.04.25 06:39:08] | DEBUG | [installer.gd] [install_mamba] >> [

                                           __

          __  ______ ___  ____ _____ ___  / /_  ____ _

         / / / / __ `__ \/ __ `/ __ `__ \/ __ \/ __ `/

        / /_/ / / / / / / /_/ / / / / / / /_/ / /_/ /

       / .___/_/ /_/ /_/\__,_/_/ /_/ /_/_.___/\__,_/

      /_/

Transaction

  Prefix: C:\Users\winte\Downloads\aidventure-windows-64\mamba

  Updating specs:

   - mamba

  Package                        Version  Build               Channel                 Size

--------------------------------------------------------------------------------------------

  Install:

--------------------------------------------------------------------------------------------

  + brotlipy                       0.7.0  py310he2412df_1004  conda-forge/win-64     337kB

  + bzip2                          1.0.8  h8ffe710_4          conda-forge/win-64     152kB

  + ca-certificates            2021.10.8  h5b45459_0          conda-forge/win-64     180kB

  + certifi                    2021.10.8  py310h5588dad_2     conda-forge/win-64     149kB

  + cffi                          1.15.0  py310hcbf9ad4_0     conda-forge/win-64     236kB

  + charset-normalizer            2.0.12  pyhd8ed1ab_0        conda-forge/noarch      36kB

  + colorama                       0.4.4  pyh9f0ad1d_0        conda-forge/noarch      18kB

  + conda                         4.12.0  py310h5588dad_0     conda-forge/win-64       1MB

  + conda-package-handling         1.8.1  py310h4f637d6_1     conda-forge/win-64     764kB

  + cryptography                  36.0.2  py310ha857299_1     conda-forge/win-64       1MB

  + idna                             3.3  pyhd8ed1ab_0        conda-forge/noarch      56kB

  + krb5                          1.19.3  h1176d77_0          conda-forge/win-64     867kB

  + libarchive                     3.5.2  hb45042f_1          conda-forge/win-64       2MB

  + libcurl                       7.82.0  h789b8ee_0          conda-forge/win-64     309kB

  + libffi                         3.4.2  h8ffe710_5          conda-forge/win-64      42kB

  + libiconv                        1.16  he774522_0          conda-forge/win-64     697kB

  + libmamba                      0.22.1  h81a967f_1          conda-forge/win-64       3MB

  + libmambapy                    0.22.1  py310hd80b381_1     conda-forge/win-64     230kB

  + libsolv                       0.7.22  h7755175_0          conda-forge/win-64     422kB

  + libssh2                       1.10.0  h680486a_2          conda-forge/win-64     232kB

  + libxml2                       2.9.13  hf5bbc77_0          conda-forge/win-64       4MB

  + libzlib                       1.2.11  h8ffe710_1014       conda-forge/win-64      65kB

  + lz4-c                          1.9.3  h8ffe710_1          conda-forge/win-64     138kB

  + lzo                             2.10  he774522_1000       conda-forge/win-64     170kB

  + mamba                         0.22.1  py310h9376f3e_1     conda-forge/win-64      63kB

  + menuinst                      1.4.18  py310h5588dad_1     conda-forge/win-64      98kB

  + openssl                       1.1.1n  h8ffe710_0          conda-forge/win-64       6MB

  + pip                           22.0.4  pyhd8ed1ab_0        conda-forge/noarch       2MB

  + pybind11-abi                       4  hd8ed1ab_3          conda-forge/noarch      10kB

  + pycosat                        0.6.3  py310he2412df_1010  conda-forge/win-64     101kB

  + pycparser                       2.21  pyhd8ed1ab_0        conda-forge/noarch     103kB

  + pyopenssl                     22.0.0  pyhd8ed1ab_0        conda-forge/noarch      50kB

  + pysocks                        1.7.1  py310h5588dad_5     conda-forge/win-64      29kB

  + python                        3.10.4  h9a09f29_0_cpython  conda-forge/win-64      17MB

  + python_abi                      3.10  2_cp310             conda-forge/win-64       5kB

  + pywin32                          303  py310he2412df_0     conda-forge/win-64       7MB

  + reproc                        14.2.3  h8ffe710_0          conda-forge/win-64      32kB

  + reproc-cpp                    14.2.3  h0e60522_0          conda-forge/win-64      26kB

  + requests                      2.27.1  pyhd8ed1ab_0        conda-forge/noarch      54kB

  + ruamel_yaml                  0.15.80  py310he2412df_1006  conda-forge/win-64     293kB

  + setuptools                    62.1.0  py310h5588dad_0     conda-forge/win-64       1MB

  + six                           1.16.0  pyh6c4a22f_0        conda-forge/noarch      14kB

  + sqlite                        3.38.2  h8ffe710_0          conda-forge/win-64       1MB

  + tk                            8.6.12  h8ffe710_0          conda-forge/win-64       4MB

  + tqdm                          4.64.0  pyhd8ed1ab_0        conda-forge/noarch      83kB

  + tzdata                         2022a  h191b570_0          conda-forge/noarch     124kB

  + ucrt                    10.0.20348.0  h57928b3_0          conda-forge/win-64       1MB

  + urllib3                       1.26.9  pyhd8ed1ab_0        conda-forge/noarch     103kB

  + vc                              14.2  hb210afc_6          conda-forge/win-64      14kB

  + vs2015_runtime           14.29.30037  h902a5da_6          conda-forge/win-64       1MB

  + wheel                         0.37.1  pyhd8ed1ab_0        conda-forge/noarch      32kB

  + win_inet_pton                  1.1.0  py310h5588dad_4     conda-forge/win-64       9kB

  + xz                             5.2.5  h62dcd97_1          conda-forge/win-64     216kB

  + yaml                           0.2.5  h8ffe710_2          conda-forge/win-64      63kB

  + yaml-cpp                       0.6.3  ha925a31_4          conda-forge/win-64     150kB

  + zlib                          1.2.11  h8ffe710_1014       conda-forge/win-64     109kB

  + zstd                           1.5.2  h6255e5f_0          conda-forge/win-64       1MB

  Summary:

  Install: 57 packages

  Total download: 59MB

--------------------------------------------------------------------------------------------

Transaction starting

Linking ucrt-10.0.20348.0-h57928b3_0

Linking ca-certificates-2021.10.8-h5b45459_0

Linking vs2015_runtime-14.29.30037-h902a5da_6

Linking vc-14.2-hb210afc_6

Linking lzo-2.10-he774522_1000

Linking yaml-0.2.5-h8ffe710_2

Linking lz4-c-1.9.3-h8ffe710_1

Linking libiconv-1.16-he774522_0

Linking reproc-14.2.3-h8ffe710_0

Linking yaml-cpp-0.6.3-ha925a31_4

Linking xz-5.2.5-h62dcd97_1

Linking sqlite-3.38.2-h8ffe710_0

Linking libzlib-1.2.11-h8ffe710_1014

Linking libffi-3.4.2-h8ffe710_5

Linking bzip2-1.0.8-h8ffe710_4

Linking tk-8.6.12-h8ffe710_0

Linking openssl-1.1.1n-h8ffe710_0

Linking reproc-cpp-14.2.3-h0e60522_0

Linking zstd-1.5.2-h6255e5f_0

Linking libsolv-0.7.22-h7755175_0

Linking zlib-1.2.11-h8ffe710_1014

Linking krb5-1.19.3-h1176d77_0

Linking libxml2-2.9.13-hf5bbc77_0

Linking libssh2-1.10.0-h680486a_2

Linking libarchive-3.5.2-hb45042f_1

Linking libcurl-7.82.0-h789b8ee_0

Linking libmamba-0.22.1-h81a967f_1

Linking tzdata-2022a-h191b570_0

Linking pybind11-abi-4-hd8ed1ab_3

Linking python-3.10.4-h9a09f29_0_cpython

Linking python_abi-3.10-2_cp310

Linking setuptools-62.1.0-py310h5588dad_0

Linking wheel-0.37.1-pyhd8ed1ab_0

Linking pip-22.0.4-pyhd8ed1ab_0

Linking colorama-0.4.4-pyh9f0ad1d_0

Linking pycparser-2.21-pyhd8ed1ab_0

Linking six-1.16.0-pyh6c4a22f_0

Linking charset-normalizer-2.0.12-pyhd8ed1ab_0

Linking idna-3.3-pyhd8ed1ab_0

Linking tqdm-4.64.0-pyhd8ed1ab_0

Linking win_inet_pton-1.1.0-py310h5588dad_4

Linking pywin32-303-py310he2412df_0

Linking certifi-2021.10.8-py310h5588dad_2

Linking ruamel_yaml-0.15.80-py310he2412df_1006

Linking pycosat-0.6.3-py310he2412df_1010

Linking libmambapy-0.22.1-py310hd80b381_1

Linking cffi-1.15.0-py310hcbf9ad4_0

Linking conda-package-handling-1.8.1-py310h4f637d6_1

Linking pysocks-1.7.1-py310h5588dad_5

Linking menuinst-1.4.18-py310h5588dad_1

Linking brotlipy-0.7.0-py310he2412df_1004

Linking cryptography-36.0.2-py310ha857299_1

Linking pyopenssl-22.0.0-pyhd8ed1ab_0

Linking urllib3-1.26.9-pyhd8ed1ab_0

Linking requests-2.27.1-pyhd8ed1ab_0

Linking conda-4.12.0-py310h5588dad_0

Linking mamba-0.22.1-py310h9376f3e_1

Transaction finished

]

[2022.04.25 06:39:08] | INFO  | [installer.gd] [create_environment] >> Creating the environment.

[2022.04.25 06:39:18] | ERROR | [installer.gd] [create_environment] >> Failed creating the environment. Error: 1

[2022.04.25 06:39:18] | DEBUG | [installer.gd] [create_environment] >> [conda-forge/win-64                                          Using cache

conda-forge/noarch                                          Using cache

Looking for: ['pip', 'python=3.9.7', 'pytorch=1.10.1', 'torchvision=0.11.2', 'cudatoolkit=11.3.1', 'websockets=10.0', 'transformers=4.15']

Encountered problems while solving:

  - package cudatoolkit-11.3.1-h59b6b97_2 has constraint __cuda >=11.3 conflicting with __cuda-9.1-0

]

I honestly am so lost.

Developer

Hello, you can fix the error by replacing aidventure_installer.pck by this file https://cdn.discordapp.com/attachments/931956343244480592/966734585817882665/aidventure_installer.pck then run the installer again from the start.

It’s a known issue that should be fixed in the next patch.

I'm currently stuck and unable to play it (full version).

I keep getting errors:

Godot Engine v3.4.4.stable.official.419e713a2 - https://godotengine.org

OpenGL ES 3.0 Renderer: NVIDIA GeForce GTX 1650 SUPER/PCIe/SSE2

OpenGL ES Batching: ON

2022/07/25 19:33:18

Game version: 1.3.1

OS: Windows

ERROR: Connection lost with the server. clean: False

   at: call (modules/gdscript/gdscript_functions.cpp:774) - Connection lost with the server. clean: False

connection closed

ERROR: Condition "!is_connected_to_host()" is true. Returned: FAILED

   at: put_packet (modules/websocket/wsl_peer.cpp:243) - Condition "!is_connected_to_host()" is true. Returned: FAILED

WARNING: Unable to send data to the server

   at: call (modules/gdscript/gdscript_functions.cpp:788) - Unable to send data to the server

ERROR: Connection lost with the server. clean: True

   at: call (modules/gdscript/gdscript_functions.cpp:774) - Connection lost with the server. clean: True
Developer

Hello, can you also send the logs of the server? You can find them in the game folder; the file is named server_logs.text. Otherwise, BlueStacks is known to create problems with the game. Another solution might be to authorize the server when Windows pops up a window about Python. You can also check whether it comes from your anti-virus; AVG AntiVirus Free is known to cause trouble during the installation, so maybe it’s something related.

(2 edits) (+1)

Using the paid version.
No matter which model I use (I have 32GB of RAM) and no matter how short or long I set the time for the AI to generate text, it just doesn't generate anything.
I type something, click on generate, what I typed appears in the story, then the generate button is greyed out and nothing happens.
I've set it to 10 seconds, 60 seconds and 180 seconds.
Even after 10 minutes there is still nothing.


I have tried gpt-large, gpt-xl and shinen. Getting that issue with all of them.

EDIT: After around 12 minutes it stopped generating and the result was the exact text I put in. It generated basically nothing.

Developer (1 edit)

Hello, that’s very curious. I think it’s the first time someone has reported this. Could you send me the content of your settings.cfg file? Also, can you send me the logs? You can also send me the installer logs and the server logs.

Where to find the log:

  • On Windows - %APPDATA%/Roaming/aidventure
  • On Linux - ~/.local/share/godot/app_userdata/AIdventure

The server logs are directly in the game root folder and named server_logs.

It could also be your CPU’s fault. Can you try with distilgpt (the smallest AI)?

So I tried it with the smallest AI and that worked. I just find it doubtful that my CPU is the problem here, since I have a Ryzen 3700X, and while it's not a high-end CPU it's still pretty decent.

Is there a general incompatibility with AMD CPUs?

I pasted the settings and logs below.


Settings:

[paths]

server="res://server/"

micromamba="res://mamba/micromamba.exe"

condabin="res://mamba/condabin/conda.bat"

mambabin="res://mamba/condabin/mamba.bat"

environment="res://mamba/envs/aidventure/"

[installer]

successful_installation=true

installer_version=2010

[preferences]

display_mode=0

hide_top_bar=false

font=0

font_size=16.0

font_color=Color( 1, 1, 1, 1 )

selection_color=Color( 0.101961, 0.372549, 0.705882, 1 )

line_separation=7.0

point_of_view=0

interface_language=""

[ai_settings]

max_length=40.0

do_sample=true

top_k=50.0

top_p=1.0

max_time=60.0

memory_length=400.0

context_length=1000.0

trim_unfinished=false

temperature=0.7

repetition_penalty=1.1

num_beams=1.0

no_repeat_ngram_size=0.0

use_gpu=true

low_memory=true

[server]

remote_server=false

server_ip="127.0.0.1"

server_port=9999.0

connection_delay=5.0

[auto_translation]

translation_enabled=false

language=""


Logs ?:

Godot Engine v3.5.1.stable.official.6fed1ffa3 - https://godotengine.org

OpenGL ES 3.0 Renderer: NVIDIA GeForce RTX 2070 SUPER/PCIe/SSE2

Async. shader compilation: OFF

1.4.1

2023/01/06 20:35:53

Game version: 1.4.1

OS: Windows

ERROR: Connection lost with the server. clean: False

   at: call (modules/gdscript/gdscript_functions.cpp:775) - Connection lost with the server. clean: False


Server logs:


15:32:24,622 AIdventure_Server DEBUG loading generator

15:32:58,335 AIdventure_Server INFO Model successfully loaded from local file

15:32:58,335 AIdventure_Server INFO Is CUDA available: False

15:32:58,335 AIdventure_Server DEBUG Is GPU enabled for the generator: False

Installer log (was far too long and exceeding character limit here):
https://pastebin.com/ygb7EghB
Developer (1 edit)

Hello, your CPU has the same specs as mine, and I can easily generate a good sentence in less than 10 seconds with GPT2-Large, so it’s weird. My CPU is an Intel, though.

Maybe it could be the model type. It would be interesting to try with a GPT-Neo model to see if you get a better result.

What could be interesting is to see the server logs after a generation (a generation with no text). You can find the server logs in the game’s folder. It is named server_logs. Copy the content of this file after a generation (and before exiting the game).

Hello, this is from the demo version. 

Godot Engine v3.5.1.stable.official.6fed1ffa3 - https://godotengine.org

OpenGL ES 3.0 Renderer: NVIDIA GeForce GTX 1660 Ti/PCIe/SSE2

Async. shader compilation: OFF

AIdventure Demo Build (Godot 3.5.1 stable)

Type help to get more information about usage

2023/07/26 23:27:24

Game version: 1.5.1.DEMO

OS: Windows

Press F1 to close the console.

remote_server : false

ERROR: Connection lost with the server. clean: False

   at: call (modules/gdscript/gdscript_functions.cpp:775) - Connection lost with the server. clean: False

===== Server's logs dump =====

Developer

Hello, make sure you check the FAQ too https://itch.io/t/1820837/faq.

I have a few questions to help me debug this.

  • When did it happen exactly? Did the server start, did it happen after choosing an AI?
  • Did you run the installer once?
  • Can you send me the content of the file named server_logs.text

Thanks for making this game, it's an incredible piece of technology. I had looked at GPT and thought it'd be cool if Commodore 64 era text adventures could be hooked up to it, and was quite excited to come across your game.

I have been having trouble running the stable 1.5.1 release of aidventure with CUDA (both in the paid and demo versions). I can play the stable release with CPU. The experimental release does seem to make CUDA available; however, I run into a separate issue where the game crashes.

I have a NVIDIA GeForce RTX 3070/PCIe/SSE2, and outside of the game, have successfully installed a PyTorch environment where CUDA functions. I'm testing with the distilgpt2 model as per your troubleshooting suggestions elsewhere.

Here are the troubleshooting steps I've tried:

==Start afresh==
For the logs below, I deleted all installations of aidventure, deleted the .local/share/aidventure config files, and rebooted.

==initial issue on the stable version of the game==
Installed aidventure, cowwarts scenario, installed distilgpt2

From installer log: [2023.08.13 20:30:42] | DEBUG | [installer.gd] [start_installation] >> Is cuda enabled: True

OpenGL ES 3.0 Renderer: NVIDIA GeForce RTX 3070/PCIe/SSE2

Game version: 1.5.1

OS: Linux

Press F1 to close the console.

remote_server : false

Server started ws://0.0.0.0:9999

Client 127.0.0.1 connected

Model successfully loaded from local file

Is CUDA available: False

Is GPU enabled for the generator: False


The game works fine with CPU, however.

Opening ./mamba/envs/aidventure/bin/python3 and running "import torch; torch.cuda.is_available()" returns false. "torch.zeros(1).cuda()" returns "AssertionError: Torch not compiled with CUDA enabled"

==is it my computer==
Outside of the game, I installed pytorch, cuda in my local python environment. I did have some driver issues that I resolved. In my local python environment I was able to get torch.cuda.is_available() to return True and was able to compute torch.zeros(1).cuda()

==do later updates in experimental fix it==

Install experimental to a clean folder. Reboot.

From installer log: [2023.08.13 21:08:38] | DEBUG | [installer.gd] [start_installation] >> Is cuda enabled: True

I understand that experimental releases don't come with promises of being stable. It does look like the conda config file with new dependencies resolves my CUDA issue. Opening ./mamba/envs/aidventure/bin/python3 and running "import torch; torch.cuda.is_available()" returns True, and the game recognises CUDA as available.

However, I run into a separate error.

I first opened the game and started a new cowwarts file. This runs into a server error.

Godot Engine v3.5.1.stable.official.6fed1ffa3 - https://godotengine.org

OpenGL ES 3.0 Renderer: NVIDIA GeForce RTX 3070/PCIe/SSE2

Async. shader compilation: OFF

AIdventure (Godot 3.5.1 stable)

Type help to get more information about usage

2023/08/13 21:22:55

Game version: 1.5.2-BETA.1

OS: Linux

Press F1 to close the console.

remote_server : false

ERROR: Connection lost with the server. clean: False

   at: call (modules/gdscript/gdscript_functions.cpp:775)

connection closed

ERROR: File must be opened before use.

   at: get_as_text (core/bind/core_bind.cpp:2092)

ERROR: Couldn't read res://server_logs.text Error: 7

   at: call (modules/gdscript/gdscript_functions.cpp:775)

   

I then exit and try to load the game I just created.

...further errors:

Client 127.0.0.1 connected

This gives the model selection. I try to install distilgpt.

This downloads the files to the cache in .local/share/aidventure/cache . However, the game has trouble moving this into the models folder.

Error renaming user://cache/config.json into /media/user/HDD/aidventure-experimental/models/LyaaaaaGames/distilgpt2/config.json. Error : 1

Couldn't move user://cache/config.json to /media/user/HDD/aidventure-experimental/models/LyaaaaaGames/distilgpt2/config.json

Error renaming user://cache/merges.txt into /media/user/HDD/aidventure-experimental/models/LyaaaaaGames/distilgpt2/merges.txt. Error : 1

Couldn't move user://cache/merges.txt to /media/user/HDD/aidventure-experimental/models/LyaaaaaGames/distilgpt2/merges.txt

Error renaming user://cache/special_tokens_map.json into /media/user/HDD/aidventure-experimental/models/LyaaaaaGames/distilgpt2/special_tokens_map.json. Error : 1

...

Couldn't move user://cache/pytorch_model-00003-of-00003.bin to /media/user/HDD/aidventure-experimental/models/LyaaaaaGames/distilgpt2/pytorch_model-00003-of-00003.bin

Error renaming user://cache/pytorch_model.bin.index.json into /media/user/HDD/aidventure-experimental/models/LyaaaaaGames/distilgpt2/pytorch_model.bin.index.json. Error : 1

Couldn't move user://cache/pytorch_model.bin.index.json to /media/user/HDD/aidventure-experimental/models/LyaaaaaGames/distilgpt2/pytorch_model.bin.index.json

loading generator

Setting up the model.

--------

Server logs for the experimental release are:

21:28:48,484 AIdventure_Server DEBUG loading generator

21:28:48,485 AIdventure_Server INFO Setting up the model.

21:28:48,485 AIdventure_Server DEBUG {'low_memory_mode': True, 'allow_offload': False, 'limit_memory': False, 'max_memory': {'0': '0 MB', 'cpu': '0 MB'}, 'allow_download': False, 'device_map': 'auto', 'torch_dtype': 4, 'offload_dict': True}

21:28:48,485 AIdventure_Server DEBUG Clearing GPU cache

21:28:48,527 AIdventure_Server DEBUG ---------------Memory allocated---------------

21:28:48,527 AIdventure_Server DEBUG 0.0000 B

21:28:48,527 AIdventure_Server DEBUG ---------------Max memory allocated---------------

21:28:48,527 AIdventure_Server DEBUG 0.0000 B

21:28:48,527 AIdventure_Server DEBUG ---------------Memory reserved---------------

21:28:48,527 AIdventure_Server DEBUG 0.0000 B

21:28:48,527 AIdventure_Server DEBUG ---------------Max memory reserved---------------

21:28:48,527 AIdventure_Server DEBUG 0.0000 B

21:28:48,527 AIdventure_Server INFO Token file in 'models/LyaaaaaGames/distilgpt2' not found.

21:28:48,527 AIdventure_Server INFO Couldn't load the model files.

21:28:48,527 AIdventure_Server INFO Downloading the model with the server is disabled.

21:28:48,527 AIdventure_Server INFO Is CUDA available: True

---

The game appears as normal. Typing anything and hitting Generate crashes the game:

Unexpected error shutting down the server

ERROR: Connection lost with the server. clean: False

   at: call (modules/gdscript/gdscript_functions.cpp:775)

connection closed

The experimental server logs now read

21:31:55,674 asyncio ERROR Task exception was never retrieved

future: <Task finished name='Task-4' coro=<WebSocketServerProtocol.handler() done, defined at /media/user/HDD/aidventure-experimental/mamba/envs/aidventure/lib/python3.9/site-packages/websockets/legacy/server.py:153> exception=SystemExit(None)>

Traceback (most recent call last):

  File "/media/user/HDD/aidventure-experimental/server/server.py", line 135, in handler

    data_to_send = handle_request(p_websocket, json_message)

  File "/media/user/HDD/aidventure-experimental/server/server.py", line 169, in handle_request

    generated_text = generator.generate_text(prompt,

  File "/media/user/HDD/aidventure-experimental/server/generator.py", line 52, in generate_text

    model_input    = self._Tokenizer(model_input, return_tensors = "pt")

AttributeError: 'Generator' object has no attribute '_Tokenizer'

During handling of the above exception, another exception occurred:

Traceback (most recent call last):

  File "/media/user/HDD/aidventure-experimental/server/server.py", line 267, in <module>

    asyncio.run(main())

  File "/media/user/HDD/aidventure-experimental/mamba/envs/aidventure/lib/python3.9/asyncio/runners.py", line 44, in run

    return loop.run_until_complete(main)

  File "/media/user/HDD/aidventure-experimental/mamba/envs/aidventure/lib/python3.9/asyncio/base_events.py", line 629, in run_until_complete

    self.run_forever()

  File "/media/user/HDD/aidventure-experimental/mamba/envs/aidventure/lib/python3.9/asyncio/base_events.py", line 596, in run_forever

    self._run_once()

  File "/media/user/HDD/aidventure-experimental/mamba/envs/aidventure/lib/python3.9/asyncio/base_events.py", line 1890, in _run_once

    handle._run()

  File "/media/user/HDD/aidventure-experimental/mamba/envs/aidventure/lib/python3.9/asyncio/events.py", line 80, in _run

    self._context.run(self._callback, *self._args)

  File "/media/user/HDD/aidventure-experimental/mamba/envs/aidventure/lib/python3.9/site-packages/websockets/legacy/server.py", line 236, in handler

    await self.ws_handler(self)

  File "/media/user/HDD/aidventure-experimental/mamba/envs/aidventure/lib/python3.9/site-packages/websockets/legacy/server.py", line 1175, in _ws_handler

    return await cast(

  File "/media/user/HDD/aidventure-experimental/server/server.py", line 150, in handler

    shutdown_server()

  File "/media/user/HDD/aidventure-experimental/server/server.py", line 253, in shutdown_server

    exit()

  File "/media/user/HDD/aidventure-experimental/mamba/envs/aidventure/lib/python3.9/_sitebuiltins.py", line 26, in __call__

    raise SystemExit(code)

SystemExit: None

Developer (2 edits)

Hello, I have two ideas to solve your problem. I will only be focusing on your errors with the experimental version (it’s much better than the stable one and only requires a few small tricks to use).

The game raises errors when you download the AI and then try to move it from the cache folder (a place where rights are permissive, user://cache) to /media/user/HDD/aidventure-experimental/models/LyaaaaaGames/distilgpt2/. It looks like that is on an external drive, so it’s probably just a permission problem. (Starting the game as admin (sudo) might work too, but it’s not the best way.)

You can either give more permissive rights on the AIdventure folder (just for the download part) or just move the game to your main drive.
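
For example, on Linux something like this might be enough to give your user write rights on the game folder (path taken from your logs, adjust as needed):

chmod -R u+w /media/user/HDD/aidventure-experimental/

If the drive is mounted read-only or owned by another user, you would need to remount it or chown the folder instead.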

Otherwise (second option), you can let the server do the download for you (you will have little information about what happens, and I can’t guarantee success). To do that, go into the game’s options, then the server tab, and check ALLOW_MODEL_DOWNLOAD.

By reading the logs of the server, I understand the server doesn’t find the AI’s files.

I dunno if this is new, but whenever I download an AI model it just doesn't work.

I tried the simplest one available, ran the entire download process, made sure that it had actually downloaded something, and nothing worked.

Godot Engine v3.5.1.stable.official.6fed1ffa3 - https://godotengine.org

OpenGL ES 3.0 Renderer: Radeon RX 570 Series

Async. shader compilation: OFF

AIdventure (Godot 3.5.1 stable)

Type help to get more information about usage

2023/10/03 15:17:59

Game version: 1.5.2

OS: Windows

Press F1 to close the console.

remote_server : false

ERROR: Connection lost with the server. clean: False

   at: call (modules/gdscript/gdscript_functions.cpp:775) - Connection lost with the server. clean: False

connection closed

remote_server : false

C:/Users/Lil Binch/Desktop/Games/AIdventure/models/LyaaaaaGames/gpt2/config.json is missing.

C:/Users/Lil Binch/Desktop/Games/AIdventure/models/LyaaaaaGames/gpt2/merges.txt is missing.

C:/Users/Lil Binch/Desktop/Games/AIdventure/models/LyaaaaaGames/gpt2/special_tokens_map.json is missing.

C:/Users/Lil Binch/Desktop/Games/AIdventure/models/LyaaaaaGames/gpt2/tokenizer.json is missing.

C:/Users/Lil Binch/Desktop/Games/AIdventure/models/LyaaaaaGames/gpt2/tokenizer_config.json is missing.

C:/Users/Lil Binch/Desktop/Games/AIdventure/models/LyaaaaaGames/gpt2/vocab.json is missing.

C:/Users/Lil Binch/Desktop/Games/AIdventure/models/LyaaaaaGames/gpt2/generation_config.json is missing.

C:/Users/Lil Binch/Desktop/Games/AIdventure/models/LyaaaaaGames/gpt2/pytorch_model-00001-of-00004.bin is missing.

C:/Users/Lil Binch/Desktop/Games/AIdventure/models/LyaaaaaGames/gpt2/pytorch_model-00002-of-00004.bin is missing.

C:/Users/Lil Binch/Desktop/Games/AIdventure/models/LyaaaaaGames/gpt2/pytorch_model-00003-of-00004.bin is missing.

C:/Users/Lil Binch/Desktop/Games/AIdventure/models/LyaaaaaGames/gpt2/pytorch_model-00004-of-00004.bin is missing.

C:/Users/Lil Binch/Desktop/Games/AIdventure/models/LyaaaaaGames/gpt2/pytorch_model.bin.index.json is missing.

Copied user://cache/config.json into C:/Users/Lil Binch/Desktop/Games/AIdventure/models/LyaaaaaGames/gpt2/config.json

Moved user://cache/config.json to C:/Users/Lil Binch/Desktop/Games/AIdventure/models/LyaaaaaGames/gpt2/config.json

Copied user://cache/merges.txt into C:/Users/Lil Binch/Desktop/Games/AIdventure/models/LyaaaaaGames/gpt2/merges.txt

Moved user://cache/merges.txt to C:/Users/Lil Binch/Desktop/Games/AIdventure/models/LyaaaaaGames/gpt2/merges.txt

Copied user://cache/special_tokens_map.json into C:/Users/Lil Binch/Desktop/Games/AIdventure/models/LyaaaaaGames/gpt2/special_tokens_map.json

Moved user://cache/special_tokens_map.json to C:/Users/Lil Binch/Desktop/Games/AIdventure/models/LyaaaaaGames/gpt2/special_tokens_map.json

Copied user://cache/tokenizer.json into C:/Users/Lil Binch/Desktop/Games/AIdventure/models/LyaaaaaGames/gpt2/tokenizer.json

Moved user://cache/tokenizer.json to C:/Users/Lil Binch/Desktop/Games/AIdventure/models/LyaaaaaGames/gpt2/tokenizer.json

Copied user://cache/tokenizer_config.json into C:/Users/Lil Binch/Desktop/Games/AIdventure/models/LyaaaaaGames/gpt2/tokenizer_config.json

Moved user://cache/tokenizer_config.json to C:/Users/Lil Binch/Desktop/Games/AIdventure/models/LyaaaaaGames/gpt2/tokenizer_config.json

Copied user://cache/vocab.json into C:/Users/Lil Binch/Desktop/Games/AIdventure/models/LyaaaaaGames/gpt2/vocab.json

Moved user://cache/vocab.json to C:/Users/Lil Binch/Desktop/Games/AIdventure/models/LyaaaaaGames/gpt2/vocab.json

Copied user://cache/generation_config.json into C:/Users/Lil Binch/Desktop/Games/AIdventure/models/LyaaaaaGames/gpt2/generation_config.json

Moved user://cache/generation_config.json to C:/Users/Lil Binch/Desktop/Games/AIdventure/models/LyaaaaaGames/gpt2/generation_config.json

Copied user://cache/pytorch_model-00001-of-00004.bin into C:/Users/Lil Binch/Desktop/Games/AIdventure/models/LyaaaaaGames/gpt2/pytorch_model-00001-of-00004.bin

Moved user://cache/pytorch_model-00001-of-00004.bin to C:/Users/Lil Binch/Desktop/Games/AIdventure/models/LyaaaaaGames/gpt2/pytorch_model-00001-of-00004.bin

Copied user://cache/pytorch_model-00002-of-00004.bin into C:/Users/Lil Binch/Desktop/Games/AIdventure/models/LyaaaaaGames/gpt2/pytorch_model-00002-of-00004.bin

Moved user://cache/pytorch_model-00002-of-00004.bin to C:/Users/Lil Binch/Desktop/Games/AIdventure/models/LyaaaaaGames/gpt2/pytorch_model-00002-of-00004.bin

Copied user://cache/pytorch_model-00003-of-00004.bin into C:/Users/Lil Binch/Desktop/Games/AIdventure/models/LyaaaaaGames/gpt2/pytorch_model-00003-of-00004.bin

Moved user://cache/pytorch_model-00003-of-00004.bin to C:/Users/Lil Binch/Desktop/Games/AIdventure/models/LyaaaaaGames/gpt2/pytorch_model-00003-of-00004.bin

Copied user://cache/pytorch_model-00004-of-00004.bin into C:/Users/Lil Binch/Desktop/Games/AIdventure/models/LyaaaaaGames/gpt2/pytorch_model-00004-of-00004.bin

Moved user://cache/pytorch_model-00004-of-00004.bin to C:/Users/Lil Binch/Desktop/Games/AIdventure/models/LyaaaaaGames/gpt2/pytorch_model-00004-of-00004.bin

Copied user://cache/pytorch_model.bin.index.json into C:/Users/Lil Binch/Desktop/Games/AIdventure/models/LyaaaaaGames/gpt2/pytorch_model.bin.index.json

Moved user://cache/pytorch_model.bin.index.json to C:/Users/Lil Binch/Desktop/Games/AIdventure/models/LyaaaaaGames/gpt2/pytorch_model.bin.index.json

ERROR: Connection lost with the server. clean: False

   at: call (modules/gdscript/gdscript_functions.cpp:775) - Connection lost with the server. clean: False

===== Server's logs dump =====

15:21:45,594 AIdventure_Server INFO Server started ws://0.0.0.0:9999

15:21:59,25 AIdventure_Server INFO Client 127.0.0.1 connected

15:24:39,94 AIdventure_Server INFO Setting up the model.

15:24:39,288 AIdventure_Server INFO Tokens successfully loaded from local files

15:24:47,950 AIdventure_Server INFO Model successfully loaded from local files

15:24:47,951 AIdventure_Server INFO Is CUDA available: False

15:25:06,304 AIdventure_Server ERROR "LayerNormKernelImpl" not implemented for 'Half'

15:25:06,304 AIdventure_Server INFO Shutting down the server

15:25:06,374 asyncio ERROR Task exception was never retrieved

future: <Task finished name='Task-5' coro=<WebSocketServerProtocol.handler() done, defined at C:\Users\Lil Binch\Desktop\Games\AIdventure\mamba\envs\aidventure\lib\site-packages\websockets\legacy\server.py:153> exception=SystemExit(0)>

Traceback (most recent call last):

  File "C:\Users\Lil Binch\Desktop\Games\AIdventure\server\server.py", line 137, in handler

    data_to_send = handle_request(p_websocket, json_message)

  File "C:\Users\Lil Binch\Desktop\Games\AIdventure\server\server.py", line 169, in handle_request

    generated_text = generator.generate_text(prompt,

  File "C:\Users\Lil Binch\Desktop\Games\AIdventure\server\generator.py", line 60, in generate_text

    model_output = self._Model.generate(**model_input)

  File "C:\Users\Lil Binch\Desktop\Games\AIdventure\mamba\envs\aidventure\lib\site-packages\torch\utils\_contextlib.py", line 115, in decorate_context

    return func(*args, **kwargs)

  File "C:\Users\Lil Binch\Desktop\Games\AIdventure\mamba\envs\aidventure\lib\site-packages\transformers\generation\utils.py", line 1452, in generate

    return self.sample(

  File "C:\Users\Lil Binch\Desktop\Games\AIdventure\mamba\envs\aidventure\lib\site-packages\transformers\generation\utils.py", line 2468, in sample

    outputs = self(

  File "C:\Users\Lil Binch\Desktop\Games\AIdventure\mamba\envs\aidventure\lib\site-packages\torch\nn\modules\module.py", line 1501, in _call_impl

    return forward_call(*args, **kwargs)

  File "C:\Users\Lil Binch\Desktop\Games\AIdventure\mamba\envs\aidventure\lib\site-packages\transformers\models\gpt2\modeling_gpt2.py", line 1075, in forward

    transformer_outputs = self.transformer(

  File "C:\Users\Lil Binch\Desktop\Games\AIdventure\mamba\envs\aidventure\lib\site-packages\torch\nn\modules\module.py", line 1501, in _call_impl

    return forward_call(*args, **kwargs)

  File "C:\Users\Lil Binch\Desktop\Games\AIdventure\mamba\envs\aidventure\lib\site-packages\transformers\models\gpt2\modeling_gpt2.py", line 899, in forward

    outputs = block(

  File "C:\Users\Lil Binch\Desktop\Games\AIdventure\mamba\envs\aidventure\lib\site-packages\torch\nn\modules\module.py", line 1501, in _call_impl

    return forward_call(*args, **kwargs)

  File "C:\Users\Lil Binch\Desktop\Games\AIdventure\mamba\envs\aidventure\lib\site-packages\transformers\models\gpt2\modeling_gpt2.py", line 388, in forward

    hidden_states = self.ln_1(hidden_states)

  File "C:\Users\Lil Binch\Desktop\Games\AIdventure\mamba\envs\aidventure\lib\site-packages\torch\nn\modules\module.py", line 1501, in _call_impl

    return forward_call(*args, **kwargs)

  File "C:\Users\Lil Binch\Desktop\Games\AIdventure\mamba\envs\aidventure\lib\site-packages\torch\nn\modules\normalization.py", line 190, in forward

    return F.layer_norm(

  File "C:\Users\Lil Binch\Desktop\Games\AIdventure\mamba\envs\aidventure\lib\site-packages\torch\nn\functional.py", line 2515, in layer_norm

    return torch.layer_norm(input, normalized_shape, weight, bias, eps, torch.backends.cudnn.enabled)

RuntimeError: "LayerNormKernelImpl" not implemented for 'Half'

During handling of the above exception, another exception occurred:

Traceback (most recent call last):

  File "C:\Users\Lil Binch\Desktop\Games\AIdventure\server\server.py", line 268, in <module>

    asyncio.run(main())

  File "C:\Users\Lil Binch\Desktop\Games\AIdventure\mamba\envs\aidventure\lib\asyncio\runners.py", line 44, in run

    return loop.run_until_complete(main)

  File "C:\Users\Lil Binch\Desktop\Games\AIdventure\mamba\envs\aidventure\lib\asyncio\base_events.py", line 629, in run_until_complete

    self.run_forever()

  File "C:\Users\Lil Binch\Desktop\Games\AIdventure\mamba\envs\aidventure\lib\asyncio\windows_events.py", line 316, in run_forever

    super().run_forever()

  File "C:\Users\Lil Binch\Desktop\Games\AIdventure\mamba\envs\aidventure\lib\asyncio\base_events.py", line 596, in run_forever

    self._run_once()

  File "C:\Users\Lil Binch\Desktop\Games\AIdventure\mamba\envs\aidventure\lib\asyncio\base_events.py", line 1890, in _run_once

    handle._run()

  File "C:\Users\Lil Binch\Desktop\Games\AIdventure\mamba\envs\aidventure\lib\asyncio\events.py", line 80, in _run

    self._context.run(self._callback, *self._args)

  File "C:\Users\Lil Binch\Desktop\Games\AIdventure\mamba\envs\aidventure\lib\site-packages\websockets\legacy\server.py", line 236, in handler

    await self.ws_handler(self)

  File "C:\Users\Lil Binch\Desktop\Games\AIdventure\mamba\envs\aidventure\lib\site-packages\websockets\legacy\server.py", line 1175, in _ws_handler

    return await cast(

  File "C:\Users\Lil Binch\Desktop\Games\AIdventure\server\server.py", line 150, in handler

    shutdown_server(exit_code)

  File "C:\Users\Lil Binch\Desktop\Games\AIdventure\server\server.py", line 254, in shutdown_server

    exit(p_exit_code)

  File "C:\Users\Lil Binch\Desktop\Games\AIdventure\mamba\envs\aidventure\lib\_sitebuiltins.py", line 26, in __call__

    raise SystemExit(code)

SystemExit: 0

Developer (2 edits)

Hello, indeed the download itself works fine. The problem comes from your settings and your hardware. By default, the game is configured for users with CUDA available (an NVIDIA GPU), and we can see CUDA isn’t available:

15:24:47,951 AIdventure_Server INFO Is CUDA available: False

Then the error happens.

15:25:06,304 AIdventure_Server ERROR "LayerNormKernelImpl" not implemented for 'Half'

If you don’t have an NVIDIA card with CUDA, you need to go into the game’s options and change one setting:

  1. Go to the settings
  2. Go to the AI options
  3. Display the advanced options
  4. Change Torch_Dtype to float32
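For the curious, here is a minimal sketch of what that option changes, assuming the Hugging Face transformers library visible in the traceback above; "gpt2" and the prompt are placeholders, not the game's actual code.

import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_path = "gpt2"  # placeholder; the game loads from its local models folder
tokenizer = AutoTokenizer.from_pretrained(model_path)

# Loading in float16 ("Half") and generating on a CPU-only machine raises:
# RuntimeError: "LayerNormKernelImpl" not implemented for 'Half'
# model = AutoModelForCausalLM.from_pretrained(model_path, torch_dtype=torch.float16)

# What Torch_Dtype = float32 amounts to: full-precision weights, which CPUs support.
model = AutoModelForCausalLM.from_pretrained(model_path, torch_dtype=torch.float32)

inputs = tokenizer("You wake up in a dark cave.", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=20)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))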

I downloaded the demo to see if I would like it before I buy. I installed the game and updated the environment, and I made it to the beginning, but after I type one word it says the client lost connection. Pls help.

Developer (3 edits)

Hello, can you send me the logs please? I can’t help you without them.

Where to find the game’s logs?

  • On Windows - %APPDATA%/Roaming/aidventure
  • On Linux - ~local/share/godot/app_userdata/AIdventure OR ~.local/share/godot/app_userdata/AIdventure

By the way, if you find this line RuntimeError: "LayerNormKernelImpl" not implemented for 'Half' in the logs, you can directly apply the fix I explained in the post above ;)

If you don’t have an NVIDIA card with CUDA, you need to go into the game’s options and change one setting:

Go to the settings
Go to the AI options
Display the advanced options
Change Torch_Dtype to float32
(+1)

Thank you so much for the help, your solution fixed it.

(1 edit)

Every time I try to install, it just says an error occurred during installation.


Godot Engine v4.2.1.stable.official.b09f793f5 - https://godotengine.org

Vulkan API 1.3.260 - Forward+ - Using Vulkan Device #0: NVIDIA - NVIDIA GeForce RTX 2060 SUPER

WARNING: Error getting files at F:/FDownloads/aidventure-windows-64/fonts/. Error code: 31

     at: push_warning (core/variant/variant_utility.cpp:1111)

WARNING: Couldn't open F:/FDownloads/aidventure-windows-64/server_logs.text

 Error: 7

     at: push_warning (core/variant/variant_utility.cpp:1111)

2024-03-15 15:47:42

Game version: 2.0.3

OS: Windows

Developer

Hello, just a few questions:

  • Did you run the installer?
  • When is this error happening? (When you start aidventure.exe?)

Make sure to run the installer and click “install”. If you keep having trouble with the installation, you can try the Steam version (it has no installer and is easier to set up). You get a Steam key by buying the game.

(2 edits)

I will try the Steam key. Though yes, it's when I run the installer exe, and it says it almost instantly upon clicking install.
Okay, now I get this when trying to load the AI and play, and it says failed connection in a pop-up box.

07:39:08,385 AIdventure_Server INFO Server started ws://0.0.0.0:9999

07:39:08,410 AIdventure_Server INFO Client 127.0.0.1 connected

Never mind, it worked on the second try.

Developer (1 edit)

I’m glad to hear you can finally enjoy the game! Feel free to post another message if you encounter problems!

And don’t hesitate to rate the game ;)

(1 edit)

I'm having problems receiving the Steam key because I mistyped my email address, and I can't send a new one to the existing email.

Developer

Hello, it’s not a big problem. Can you send the invoice for your purchase to lyaaaaa@lyaaaaagames.com (with the correct email ;))? I will verify the info and send you a Steam key myself.

Ran into a random error with a story where the generator refuses to continue with one character, but some others still work. I am unsure if this is just due to the length of the story or if there are other factors that contribute to the issue. I have tried rebooting the PC, messing with folder permissions, and generating other stories to replicate the issue. The error references something to do with compiling with a different option for CUDA, but I am uncertain of the specifics, and I am not familiar enough with Python to venture more of a guess. I appreciate your time and what you have created so far!

On Windows 11, log below:

-----

21:30:48,448 AIdventure_Server INFO Server started ws://0.0.0.0:9999

21:30:51,652 AIdventure_Server INFO Client 127.0.0.1 connected

21:30:56,602 AIdventure_Server INFO Initialising the model: LyaaaaaGames/GPT-Neo-2.7B-Horni-LN at C:/Users/Surumon/AppData/Roaming/aidventure/models/generators/LyaaaaaGames/GPT-Neo-2.7B-Horni-LN

21:30:56,602 AIdventure_Server INFO Is CUDA available: True

21:30:56,603 AIdventure_Server INFO Setting up the Generator.

21:30:56,833 AIdventure_Server INFO Tokens successfully loaded from local files

21:31:02,235 AIdventure_Server INFO Model successfully loaded from local files

21:31:14,728 AIdventure_Server INFO Loading inputs to GPU

21:31:18,795 AIdventure_Server ERROR CUDA error: device-side assert triggered

CUDA kernel errors might be asynchronously reported at some other API call, so the stacktrace below might be incorrect.

For debugging consider passing CUDA_LAUNCH_BLOCKING=1.

Compile with `TORCH_USE_CUDA_DSA` to enable device-side assertions.

21:31:18,795 AIdventure_Server INFO Shutting down the server

21:31:18,883 asyncio ERROR Task exception was never retrieved

future: <Task finished name='Task-5' coro=<WebSocketServerProtocol.handler() done, defined at C:\Program Files\AIdventure\mamba\envs\aidventure\lib\site-packages\websockets\legacy\server.py:153> exception=SystemExit(0)>

Traceback (most recent call last):

  File "C:\Program Files\AIdventure\server\server.py", line 153, in handler

    data_to_send = handle_request(p_websocket, json_message)

  File "C:\Program Files\AIdventure\server\server.py", line 183, in handle_request

    generated_text = generator.generate_text(prompt, parameters)

  File "C:\Program Files\AIdventure\server\generator.py", line 67, in generate_text

    model_output = self._Model.generate(**model_input)

  File "C:\Program Files\AIdventure\mamba\envs\aidventure\lib\site-packages\torch\utils\_contextlib.py", line 115, in decorate_context

    return func(*args, **kwargs)

  File "C:\Program Files\AIdventure\mamba\envs\aidventure\lib\site-packages\transformers\generation\utils.py", line 1452, in generate

    return self.sample(

  File "C:\Program Files\AIdventure\mamba\envs\aidventure\lib\site-packages\transformers\generation\utils.py", line 2468, in sample

    outputs = self(

  File "C:\Program Files\AIdventure\mamba\envs\aidventure\lib\site-packages\torch\nn\modules\module.py", line 1532, in _wrapped_call_impl

    return self._call_impl(*args, **kwargs)

  File "C:\Program Files\AIdventure\mamba\envs\aidventure\lib\site-packages\torch\nn\modules\module.py", line 1541, in _call_impl

    return forward_call(*args, **kwargs)

  File "C:\Program Files\AIdventure\mamba\envs\aidventure\lib\site-packages\transformers\models\gpt_neo\modeling_gpt_neo.py", line 741, in forward

    transformer_outputs = self.transformer(

  File "C:\Program Files\AIdventure\mamba\envs\aidventure\lib\site-packages\torch\nn\modules\module.py", line 1532, in _wrapped_call_impl

    return self._call_impl(*args, **kwargs)

  File "C:\Program Files\AIdventure\mamba\envs\aidventure\lib\site-packages\torch\nn\modules\module.py", line 1541, in _call_impl

    return forward_call(*args, **kwargs)

  File "C:\Program Files\AIdventure\mamba\envs\aidventure\lib\site-packages\transformers\models\gpt_neo\modeling_gpt_neo.py", line 621, in forward

    outputs = block(

  File "C:\Program Files\AIdventure\mamba\envs\aidventure\lib\site-packages\torch\nn\modules\module.py", line 1532, in _wrapped_call_impl

    return self._call_impl(*args, **kwargs)

  File "C:\Program Files\AIdventure\mamba\envs\aidventure\lib\site-packages\torch\nn\modules\module.py", line 1541, in _call_impl

    return forward_call(*args, **kwargs)

  File "C:\Program Files\AIdventure\mamba\envs\aidventure\lib\site-packages\transformers\models\gpt_neo\modeling_gpt_neo.py", line 326, in forward

    attn_outputs = self.attn(

  File "C:\Program Files\AIdventure\mamba\envs\aidventure\lib\site-packages\torch\nn\modules\module.py", line 1532, in _wrapped_call_impl

    return self._call_impl(*args, **kwargs)

  File "C:\Program Files\AIdventure\mamba\envs\aidventure\lib\site-packages\torch\nn\modules\module.py", line 1541, in _call_impl

    return forward_call(*args, **kwargs)

  File "C:\Program Files\AIdventure\mamba\envs\aidventure\lib\site-packages\transformers\models\gpt_neo\modeling_gpt_neo.py", line 278, in forward

    return self.attention(

  File "C:\Program Files\AIdventure\mamba\envs\aidventure\lib\site-packages\torch\nn\modules\module.py", line 1532, in _wrapped_call_impl

    return self._call_impl(*args, **kwargs)

  File "C:\Program Files\AIdventure\mamba\envs\aidventure\lib\site-packages\torch\nn\modules\module.py", line 1541, in _call_impl

    return forward_call(*args, **kwargs)

  File "C:\Program Files\AIdventure\mamba\envs\aidventure\lib\site-packages\transformers\models\gpt_neo\modeling_gpt_neo.py", line 241, in forward

    attn_output, attn_weights = self._attn(query, key, value, attention_mask, head_mask)

  File "C:\Program Files\AIdventure\mamba\envs\aidventure\lib\site-packages\transformers\models\gpt_neo\modeling_gpt_neo.py", line 194, in _attn

    mask_value = torch.tensor(mask_value, dtype=attn_weights.dtype).to(attn_weights.device)

RuntimeError: CUDA error: device-side assert triggered

CUDA kernel errors might be asynchronously reported at some other API call, so the stacktrace below might be incorrect.

For debugging consider passing CUDA_LAUNCH_BLOCKING=1.

Compile with `TORCH_USE_CUDA_DSA` to enable device-side assertions.

During handling of the above exception, another exception occurred:

Traceback (most recent call last):

  File "C:\Program Files\AIdventure\server\server.py", line 290, in <module>

    asyncio.run(main())

  File "C:\Program Files\AIdventure\mamba\envs\aidventure\lib\asyncio\runners.py", line 44, in run

    return loop.run_until_complete(main)

  File "C:\Program Files\AIdventure\mamba\envs\aidventure\lib\asyncio\base_events.py", line 629, in run_until_complete

    self.run_forever()

  File "C:\Program Files\AIdventure\mamba\envs\aidventure\lib\asyncio\windows_events.py", line 316, in run_forever

    super().run_forever()

  File "C:\Program Files\AIdventure\mamba\envs\aidventure\lib\asyncio\base_events.py", line 596, in run_forever

    self._run_once()

  File "C:\Program Files\AIdventure\mamba\envs\aidventure\lib\asyncio\base_events.py", line 1890, in _run_once

    handle._run()

  File "C:\Program Files\AIdventure\mamba\envs\aidventure\lib\asyncio\events.py", line 80, in _run

    self._context.run(self._callback, *self._args)

  File "C:\Program Files\AIdventure\mamba\envs\aidventure\lib\site-packages\websockets\legacy\server.py", line 236, in handler

    await self.ws_handler(self)

  File "C:\Program Files\AIdventure\mamba\envs\aidventure\lib\site-packages\websockets\legacy\server.py", line 1175, in _ws_handler

    return await cast(

  File "C:\Program Files\AIdventure\server\server.py", line 166, in handler

    shutdown_server(exit_code)

  File "C:\Program Files\AIdventure\server\server.py", line 276, in shutdown_server

    exit(p_exit_code)

  File "C:\Program Files\AIdventure\mamba\envs\aidventure\lib\_sitebuiltins.py", line 26, in __call__

    raise SystemExit(code)

SystemExit: 0

Developer(+1)

Hello, I have never seen this error before. What character did you send to the AI? You said it happened with one specific character.

I did try different settings and I think I figured it out. I had been stress testing my system and the program. I had set Context's Max Length to 10,000 characters. I lowered this setting by 1,000 and kept reloading the server. Once I hit 7,000 for Context's Max Length, the server stopped crashing.

For context on the save: The story in that save is about 14,500 characters. The story contains standard English letters, punctuation, quotation mark, and no numbers. There are no made up words except two character names and one lore book entry giving a description of one character. The story was started with the "Very Bad Awakening" scenario. The memory has not been altered since beginning the story.

I am not sure if the error I had is just an out-of-memory issue or something else, but changing the Context's Max Length option seems to have fixed it. It also appears that the server does not respect changes made to the AI Settings mid-run, or made while I shut the server down with the save loaded. I have to specifically return to the main menu and choose to shut down the server before adjusting the AI Settings to get them to stick.

I hope this information is useful to you. Thank you for the help!

Developer

It’s very curious. I wouldn’t expect an out-of-memory error to behave like that, so I think it’s something else. I will write your feedback down and check it when I can. Thanks!
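For anyone who wants to dig into this themselves, one possible (unconfirmed) explanation is that the tokenized story exceeds the model's maximum context, which can surface as out-of-range indexing on the GPU. Here is a minimal check, assuming the transformers library from the traceback; the model name and story text below are placeholders.

from transformers import AutoConfig, AutoTokenizer

model_name = "EleutherAI/gpt-neo-2.7B"  # placeholder for the model in use
tokenizer = AutoTokenizer.from_pretrained(model_name)
config = AutoConfig.from_pretrained(model_name)

story = "Some long story text..."  # stand-in for the ~14,500-character save

token_count = len(tokenizer(story)["input_ids"])
print("Prompt tokens:", token_count)
print("Model limit:  ", config.max_position_embeddings)
if token_count > config.max_position_embeddings:
    print("The prompt is longer than the model supports; this could explain out-of-range indexing errors.")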

It isn't letting me install the game. Here are my logs; I'm using the paid version.


[2024.11.14 23:45:44] | INFO  | [installer.gd] [check_os()] >> Checking the os and its architecture

[2024.11.14 23:45:44] | DEBUG | [installer.gd] [check_os()] >> Architecture: 64 bits

[2024.11.14 23:45:44] | DEBUG | [installer.gd] [check_os()] >> OS: Windows

[2024.11.14 23:45:44] | INFO  | [installer.gd] [check_previous_install] >> Checking the previous installation

[2024.11.14 23:45:51] | DEBUG | [installer.gd] [start_installation] >> Is cuda enabled: True

[2024.11.14 23:45:51] | INFO  | [installer.gd] [move_env_config] >> Coping conda environment config file to user://

[2024.11.14 23:45:51] | INFO  | [installer.gd] [move_env_config] >> Copied res://conda_config_cuda.yml to user://conda_config.yml

[2024.11.14 23:45:51] | INFO  | [installer.gd] [_create_config] >> Creating the config file.

[2024.11.14 23:45:51] | INFO  | [installer.gd] [_create_config] >> Successfully created the config file.

[2024.11.14 23:45:51] | INFO  | [installer.gd] [install_mamba] >> Installing mamba

[2024.11.14 23:46:07] | INFO  | [installer.gd] [install_mamba] >> Successfully installed mamba

[2024.11.14 23:46:07] | DEBUG | [installer.gd] [install_mamba] >> [

                                           __

          __  ______ ___  ____ _____ ___  / /_  ____ _

         / / / / __ `__ \/ __ `/ __ `__ \/ __ \/ __ `/

        / /_/ / / / / / / /_/ / / / / / / /_/ / /_/ /

       / .___/_/ /_/ /_/\__,_/_/ /_/ /_/_.___/\__,_/

      /_/

conda-forge/win-64                                          Using cache

conda-forge/noarch                                          Using cache

Transaction

  Prefix: D:\ai thingy\mamba

  All requested packages already installed

Transaction starting

Transaction finished

To activate this environment, use:

    $ [micro]mamba activate <environment>

Or to execute a single command in this environment, use:

    $ [micro]mamba run -n <environment> mycommand

]

[2024.11.14 23:46:07] | INFO  | [installer.gd] [create_environment] >> Creating the environment.

[2024.11.14 23:46:07] | ERROR | [installer.gd] [create_environment] >> Failed creating the environment. Error: 109

[2024.11.14 23:46:07] | DEBUG | [installer.gd] [create_environment] >> []

[2024.11.14 23:46:07] | INFO  | [installer.gd] [_on_Installer_installation_done] >> Writing installation's info in the config file.

[2024.11.14 23:46:07] | INFO  | [installer.gd] [_on_Installer_installation_done] >> Successfully wrote in the config file.

Developer

After checking my FAQ, I found a part about it:

Verify the folder named mamba in your game folder. If you find no directory named condabin inside it, then the installation indeed failed (see the small check below).

  • Try disabling your antivirus and see if the installation works (this happened with a user of AVG AntiVirus Free).
  • Make sure the path to your game has no whitespace! (In your logs, the environment is created under D:\ai thingy\mamba, which contains a space.)
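A tiny, hypothetical check for the condabin point above (it is not part of the game; adjust the path to your own game folder):

from pathlib import Path

game_dir = Path(r"D:\ai thingy")  # placeholder: your game folder
print("condabin exists:", (game_dir / "mamba" / "condabin").is_dir())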
Developer

Otherwise, you can try the Steam version (it doesn’t require an installation). You get a Steam key with your purchase.

I cannot download any other AI. Every time I click download, it says the download was added to the queue, but I waited almost an hour and nothing got downloaded.

Developer

Hey, can you try with a small AI? Try distilgpt2 (it will be WAY faster if your connection is slow). Otherwise, send me the logs after trying to download.

Nothing happened even when I tried to download the smallest AI.

Is it because I have to use a VPN?