Description
lgb.plot_importance() throws an error: ValueError: not enough values to unpack (expected 2, got 0)
Reproducible example
import numpy as np
import lightgbm as lgb
from sklearn.model_selection import train_test_split

X = np.array([[ 4. , -10.1, 10.4],
              [41. , -12.9, 11.3],
              [19. , -12.3, 11. ],
              [11. ,  -7.3, 11.7],
              [36. , -15.3, 11.6],
              [15. , -11.6, 10.8],
              [35. , -10.6, 10.3],
              [38. ,  -8.3, 10.4],
              [27. ,  -8.2,  8.8],
              [37. ,  -9.2,  9.6]])
y = np.array([4.8, 4.8, 4.7, 5.5, 4. , 5.1, 5.2, 5.4, 5.4, 5.4])

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.5)
lgb_train = lgb.Dataset(X_train, y_train)
lgb_val = lgb.Dataset(X_test, y_test, reference=lgb_train)

params = {
    'boosting_type': 'gbdt',
    'objective': 'regression',
    'metric': 'rmse',
    'random_state': 0,
    'verbose': -1,
}
model = lgb.train(params,
                  lgb_train,
                  valid_sets=[lgb_train, lgb_val],
                  num_boost_round=100000,
                  callbacks=[lgb.early_stopping(stopping_rounds=1000,
                                                verbose=True)])
lgb.plot_importance(model, figsize=(8, 4), max_num_features=5, importance_type='gain')  # <-- this throws an error
Environment info
LightGBM version: 4.6.0
Command(s) you used to install LightGBM:
- Windows 11
- VSCode 1.98.1
- Python 3.12.4
Additional Comments
When the data comes from sklearn.datasets.load_diabetes instead, no error is thrown and the feature-importance graph is shown.
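A likely explanation (my assumption, not confirmed against LightGBM internals): with only 5 training rows and the default min_data_in_leaf of 20, the booster learns no splits, so every feature's gain importance is 0. plot_importance drops zero-importance features by default (ignore_zero=True), leaving an empty list of (importance, name) pairs to unpack, which matches the "expected 2, got 0" wording. A minimal plain-Python sketch of that unpacking failure:

```python
# Sketch of the suspected failure mode: plot_importance pairs each feature
# with its importance, filters out zero-importance features, then unpacks
# the remaining pairs into two tuples.
pairs = []  # empty after filtering: no feature gained any importance

try:
    values, labels = zip(*pairs)  # zip(*[]) yields an empty iterator
except ValueError as exc:
    message = str(exc)

print(message)  # not enough values to unpack (expected 2, got 0)
```

If this is the cause, checking model.feature_importance(importance_type='gain').sum() > 0 before calling plot_importance would avoid the crash on degenerate models.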