xgboost has a parameter feature_weights that should influence the probability of a feature being selected for the model — in other words, we can give each feature more or less weight. But the parameter seems to have no effect. Am I doing something wrong?
X <- as.matrix(iris[,-5])
Y <- ifelse(iris$Species=="setosa", 1, 0)
library(xgboost)
dm1 <- xgb.DMatrix(X, label = Y)
# I set different selection weights for each feature
dm2 <- xgb.DMatrix(X, label = Y, feature_weights = c(1, 0, 0, 0.01))
params <- list(objective = "binary:logistic", eval_metric = "logloss")
set.seed(1)
xgb1 <- xgboost(data = dm1, params = params, nrounds = 10, print_every_n = 5)
[1]  train-logloss:0.448305
[6]  train-logloss:0.090220
[10] train-logloss:0.033148
xgb2 <- xgboost(data = dm2, params = params, nrounds = 10, print_every_n = 5)
[1]  train-logloss:0.448305
[6]  train-logloss:0.090220
[10] train-logloss:0.033148
But the two models perform exactly the same, as if feature_weights were simply ignored.
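For context: according to the XGBoost documentation, feature_weights only biases *column sampling*, so it has no effect unless one of the colsample_by* parameters is below 1 (they all default to 1, meaning every feature is always available to every split). A sketch of the same comparison with colsample_bytree enabled — assuming, as in the snippet above, that your xgboost version accepts feature_weights in xgb.DMatrix:

```r
library(xgboost)

X <- as.matrix(iris[, -5])
Y <- ifelse(iris$Species == "setosa", 1, 0)

# feature_weights biases which columns are drawn during column sampling,
# so sampling must actually happen: set colsample_bytree < 1
dm2 <- xgb.DMatrix(X, label = Y, feature_weights = c(1, 0, 0, 0.01))
params <- list(objective = "binary:logistic",
               eval_metric = "logloss",
               colsample_bytree = 0.5)  # sample half the features per tree

set.seed(1)
xgb2 <- xgboost(data = dm2, params = params, nrounds = 10, print_every_n = 5)

# With sampling enabled, the trees should now be built almost entirely
# from the first feature; inspect xgb.importance(model = xgb2) and
# compare against a run without feature_weights
```

With colsample_bytree = 1 the sampler never has to choose between columns, so the weights are never consulted, which would explain the identical logloss traces.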