ModuleNotFoundError after calling pythainlp.sentiment(string, 'ulmfit') #172

Closed
@LXZE

Description

Describe the bug
After installing pythainlp version 1.7 inside Google Colab, I called pythainlp.sentiment(string, 'ulmfit') and an error occurred, as shown in the example below.

To Reproduce
Steps to reproduce the behaviour:

  1. Create a new Google Colab notebook, with the runtime set to Python 3 and the hardware accelerator set to GPU
  2. Run the code below:
!pip install pythainlp
!pip install torchvision
!pip install https://github.com/fastai/fastai/archive/1.0.22.zip
import pythainlp
test_string = 'ประโยคทดสอบ'
print(pythainlp.sentiment(test_string, 'ulmfit'))

Note
I have tried installing torchvision and fastai at both the latest version and v1.0.22 (per this issue), but the fastai library still raises ModuleNotFoundError: No module named 'fastai.lm_rnn', so I am raising this issue.
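Independently of pythainlp, the missing module can be confirmed with a small standard-library check (a sketch; `module_available` is a hypothetical helper name, not part of any library):

```python
import importlib.util

def module_available(name):
    """Return True if `name` can be imported in the current environment."""
    try:
        return importlib.util.find_spec(name) is not None
    except ModuleNotFoundError:
        # find_spec raises this when a parent package in a dotted name is missing.
        return False

# On fastai >= 1.0 this prints False, consistent with the error above,
# because the fastai.lm_rnn module was removed in the 1.0 rewrite.
print(module_available("fastai.lm_rnn"))
```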

Expected behaviour
According to the pythainlp 1.7 documentation on sentiment analysis,
this method should return a prediction result for the string, and not raise an error, after calling sentiment(string, 'ulmfit') without any additional code importing the required libraries.

Example
Here is a Google Colab notebook link

Desktop

  • OS: macOS High Sierra 10.13.4
  • Python Version: 3.6 (Google Colab)
  • Version: pythainlp=1.7

Suggestion
Since this issue may stem from fastai, I suggest that this method either automatically install the torch library and the specific compatible version of fastai, or that the model be made compatible with the latest fastai version.
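In packaging terms, the first half of this suggestion could look like an optional-dependency group that pins the fastai release the bundled model expects (a sketch only; the extra name and the pinned version are assumptions, not taken from pythainlp's actual setup.py):

```python
# Hypothetical setup.py fragment: with this, `pip install pythainlp[ulmfit]`
# would pull in a known-compatible stack instead of the latest fastai,
# which dropped the fastai.lm_rnn module that the model imports.
extras_require = {
    "ulmfit": [
        "torch",
        "fastai==1.0.22",  # assumed compatible pin; verify against the model
    ],
}
```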
