How to use
Deploy the server yourself
Clone the repo
git clone https://github.com/DargonXuan/AutoRiddleMaster.git
Ensure pip3 is installed in your local environment:
wget https://bootstrap.pypa.io/get-pip.py
sudo python3 get-pip.py
Install the dependencies with pip3:
pip3 install -r requirements.txt
Optionally, run python3 model.py to check whether the dependencies are installed correctly:
WARNING:tensorflow:From /usr/local/lib/python3.8/dist-packages/tensorflow/python/compat/v2_compat.py:101: disable_resource_variables (from tensorflow.python.ops.variable_scope) is deprecated and will be removed in a future version.
Instructions for updating:
non-resource variables are not supported in the long term
WARNING:tensorflow:From model.py:19: load (from tensorflow.python.saved_model.loader_impl) is deprecated and will be removed in a future version.
Instructions for updating:
This function will only be available through the v1 compatibility library as tf.compat.v1.saved_model.loader.load or tf.compat.v1.saved_model.load. There will be a new function for importing SavedModels in Tensorflow 2.0.
A
The correct output should look like the above. Pay attention to the character A at the end: that is the answer the model returns.
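Before running the model at all, you can do an even quicker sanity check that the key packages import cleanly. This is a sketch, not part of the project; the package names fastapi and uvicorn are assumptions based on how the API server is run and may differ from the actual requirements.txt:

```python
import importlib

def check_deps(pkgs=("tensorflow", "fastapi", "uvicorn")):
    """Return a dict mapping each package name to True if it imports cleanly."""
    status = {}
    for pkg in pkgs:
        try:
            importlib.import_module(pkg)
            status[pkg] = True
        except ImportError:
            status[pkg] = False
    return status

if __name__ == "__main__":
    for pkg, ok in check_deps().items():
        print(f"{pkg}: {'OK' if ok else 'MISSING'}")
```

Any package reported MISSING means pip3 install -r requirements.txt did not complete successfully.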
On Windows Server, you also need to install the Visual C++ Redistributable.
Run the API server with uvicorn or gunicorn. For personal use, simply run api.py directly.
If you expect heavier traffic, serve it through gunicorn:
gunicorn -c run.py api:app
The default port is 80.
The port can be modified on line 53 of api.py or on line 13 of run.py.
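For reference, a gunicorn config file like run.py typically looks something like the following. This is a hypothetical sketch, not the project's actual file; its contents, line numbers, and the choice of worker class are assumptions:

```python
# Hypothetical gunicorn config (run.py). The real file may differ.
bind = "0.0.0.0:80"       # listen address; change the port here, e.g. "0.0.0.0:8080"
workers = 2               # number of worker processes
worker_class = "uvicorn.workers.UvicornWorker"  # assumed: serve an ASGI app via uvicorn workers
timeout = 30              # seconds before an unresponsive worker is killed
```

Gunicorn reads this file via the -c flag, so gunicorn -c run.py api:app picks up whatever port is set in bind.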
Add your server address to the @connect list of user.js, and modify

// @connect      autopony.ltd
// ==/UserScript==

const API_SERVER = 'http://autopony.ltd';

changing autopony.ltd to your own address. Note that there must be no trailing / character.
Open our test site, test.autopony.ltd, and check whether your script is running normally.
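You can also verify from the command line that your deployed server is reachable before testing the userscript. The snippet below is a sketch; it only checks that the server answers HTTP at all, since the project's actual API route and request format are not assumed here:

```python
import urllib.request
import urllib.error

def server_reachable(base_url, timeout=5):
    """Return True if any HTTP response comes back from base_url."""
    try:
        urllib.request.urlopen(base_url, timeout=timeout)
        return True
    except urllib.error.HTTPError:
        return True   # an HTTP error status still proves the server is up
    except (urllib.error.URLError, OSError):
        return False

if __name__ == "__main__":
    # Replace with your own server address (no trailing slash).
    print(server_reachable("http://autopony.ltd"))
```

If this prints False, fix the server's network setup before debugging the userscript.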
Use the server we provide
We cannot guarantee the network quality or stable operation of this server, and it is quite possible that an answer will not be returned within the specified time.