Description
Describe the bug
After installing PyThaiNLP version 1.7 in Google Colab, I called the function pythainlp.sentiment(string, 'ulmfit') and an error occurred, as shown in the example below.
To Reproduce
Steps to reproduce the behaviour:
- Create a new Google Colab notebook, setting the runtime to Python 3 with GPU hardware acceleration
- Run the code below:

```python
!pip install pythainlp
!pip install torchvision
!pip install https://github.com/fastai/fastai/archive/1.0.22.zip

import pythainlp

test_string = 'ประโยคทดสอบ'
print(pythainlp.sentiment(test_string, 'ulmfit'))
```
Note
I have tried installing torchvision and fastai both at the latest version and at v1.0.22 (according to this issue), but the fastai library still causes the error ModuleNotFoundError: No module named 'fastai.lm_rnn'.
So I am raising this issue.
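As a quick way to confirm the missing module before calling sentiment, one can probe for it with importlib; this is just a diagnostic sketch (the helper name has_module is my own, not part of PyThaiNLP or fastai):

```python
import importlib.util

def has_module(dotted_name):
    # Return True if the module (including dotted submodules such as
    # 'fastai.lm_rnn') can be located, without fully importing it.
    try:
        return importlib.util.find_spec(dotted_name) is not None
    except ModuleNotFoundError:
        # Raised when a parent package (e.g. 'fastai') is not installed at all.
        return False

# With fastai 1.x installed, this reports the module whose absence
# triggers the error above:
if not has_module("fastai.lm_rnn"):
    print("fastai.lm_rnn is unavailable; pythainlp.sentiment(..., 'ulmfit') will fail")
```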
Expected behaviour
According to the PyThaiNLP 1.7 documentation on sentiment analysis,
calling sentiment(string, 'ulmfit') should return a prediction result for the string without raising an error, and without requiring any additional code to import the required libraries.
Example
Here is a link to a Google Colab notebook
Desktop
- OS: macOS High Sierra 10.13.4
- Python Version: 3.6 (Google Colab)
- PyThaiNLP Version: 1.7
Suggestion
As this issue may stem from fastai itself, I suggest that this method either automatically install torch and the specific compatible version of fastai, or that the model be made compatible with the latest fastai release.
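The suggested auto-install behaviour could be sketched as below. This is a hypothetical helper, not existing PyThaiNLP code, and the pinned fastai version shown in the comment is only illustrative:

```python
import importlib.util
import subprocess
import sys

def ensure_package(module_name, pip_spec):
    # Install pip_spec with pip only when module_name cannot be found.
    # Sketch of the suggested behaviour; PyThaiNLP does not do this today.
    if importlib.util.find_spec(module_name) is None:
        subprocess.check_call([sys.executable, "-m", "pip", "install", pip_spec])

# Before loading the ulmfit model, the library could call, for example:
# ensure_package("torch", "torch")
# ensure_package("fastai", "fastai==1.0.22")  # pinned version is illustrative
```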