Running Local Large Language Model in Nextcloud?

My YunoHost server

Hardware: VPS bought online
YunoHost version:

  yunohost:
    repo: stable
    version: 11.2.11.2
  yunohost-admin:
    repo: stable
    version: 11.2.5
  moulinette:
    repo: stable
    version: 11.2
  ssowat:
    repo: stable
    version: 11.2

I have access to my server: through SSH | through the webadmin
Are you in a special context or did you perform some particular tweaking on your YunoHost instance?: no
If yes, please explain:
If your request is related to an app, specify its name and version: Nextcloud 28.0.4~ynh2

Description of my issue

I’m trying to run Nextcloud’s AI features using the Local Large Language Model app. I’ve downloaded the language model, and its admin page shows no warnings. Every time I give it a prompt, though, it says there was an error and to check the logs. The logs just say:

[llm] Warning: Traceback (most recent call last):
  File "/var/www/nextcloud/apps/llm/src-py/index.py", line 3, in <module>
    from chains.formalize import FormalizeChain
  File "/var/www/nextcloud/apps/llm/src-py/chains/formalize.py", line 2, in <module>
    from langchain.prompts import BasePromptTemplate, PromptTemplate
ModuleNotFoundError: No module named 'langchain'
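The traceback means the Python interpreter that runs the app's script cannot find the langchain package in its environment. As a diagnostic sketch (not a fix), you can check from the same interpreter which Python version is in use and whether langchain is importable; the interpreter path is whatever runs `/var/www/nextcloud/apps/llm/src-py/index.py`, which is an assumption here:

```python
import importlib.util
import sys

# Which interpreter version is this? (The llm app reportedly needs
# Python 3.10; Debian bullseye ships 3.9.)
print("Python", sys.version_info.major, sys.version_info.minor)

# Is 'langchain' importable from this interpreter's environment?
# find_spec() searches without actually importing the package.
spec = importlib.util.find_spec("langchain")
print("langchain found" if spec else "langchain missing")
```

If this prints "langchain missing", the dependency was never installed into the environment the app uses, which matches the ModuleNotFoundError above.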

I have similar issues with the Recognize app, even after downloading the model. Has anyone managed to get the Local LLM running on YunoHost + Nextcloud?

This app requires Python 3.10 and the python3-venv module. The former is not shipped with the current Debian version (Bullseye ships Python 3.9), and the latter is not among the Nextcloud package's requirements.
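A quick way to confirm both points on the server is a minimal sketch like the following (standard commands; the /tmp path is arbitrary and only used as a scratch directory):

```shell
# Print the default Python version (Bullseye ships 3.9, the app needs 3.10)
python3 --version

# Try to create a throwaway virtual environment; this fails with an
# error if the python3-venv module is not installed.
python3 -m venv /tmp/llm_venv_check && echo "venv module available"
rm -rf /tmp/llm_venv_check
```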

Overall the app is not compatible with YNH in its current form, sorry :confused: