How to jointly use makeFeatSelWrapper and resample function in mlr

























I'm fitting binary classification models using the mlr package in R. For each model, I perform cross-validation with embedded feature selection using the "selectFeatures" function. As output, I retrieve the mean AUCs over the test sets along with the predictions. To do so, following advice I received earlier (Get predictions on test sets in MLR), I use the "makeFeatSelWrapper" function in combination with the "resample" function. The goal seems to be achieved, but the results are strange. With logistic regression as the classifier, I get an AUC of 0.5, which means no variables were selected. This result is unexpected, as I get an AUC of 0.9824432 with this classifier using the method mentioned in the linked question. With a neural network as the classifier, I get an error message:




Error in sum(x) : invalid 'type' (list) of argument




What is wrong?



Here is the sample code:



# 1. Find a synthetic dataset for supervised learning (two classes)
###################################################################

install.packages("mlbench")
library(mlbench)
data(BreastCancer)

# generate 1000 rows, 21 quantitative candidate predictors and 1 target variable
p <- mlbench.waveform(1000)

# convert list into data frame
dataset <- as.data.frame(p)

# drop third class to get 2 classes
dataset2 = subset(dataset, classes != 3)

# 2. Perform cross-validation with embedded feature selection using logistic regression
#######################################################################################

library(BBmisc)
library(nnet)
library(mlr)

# Choice of data
mCT <- makeClassifTask(data = dataset2, target = "classes")

# Choice of algorithm, i.e. logistic regression
mL <- makeLearner("classif.logreg", predict.type = "prob")

# Choice of cross-validation for the outer folds
outer = makeResampleDesc("CV", iters = 10, stratify = TRUE)

# Choice of feature selection method
ctrl = makeFeatSelControlSequential(method = "sffs", maxit = NA, alpha = 0.001)

# Choice of hold-out sampling between training and test within the fold
inner = makeResampleDesc("Holdout", stratify = TRUE)

lrn = makeFeatSelWrapper(mL, resampling = inner, control = ctrl)
r = resample(lrn, mCT, outer, extract = getFeatSelResult,
             measures = list(mlr::auc, mlr::acc, mlr::brier), models = TRUE)

# 3. Perform cross-validation with embedded feature selection using neural network
##################################################################################

# Choice of data
mCT <- makeClassifTask(data = dataset2, target = "classes")

# Choice of algorithm, i.e. neural network
mL <- makeLearner("classif.nnet", predict.type = "prob")

# Choice of cross-validation for the outer folds
outer = makeResampleDesc("CV", iters = 10, stratify = TRUE)

# Choice of feature selection method
ctrl = makeFeatSelControlSequential(method = "sffs", maxit = NA, alpha = 0.001)

# Choice of hold-out sampling between training and test within the fold
inner = makeResampleDesc("Holdout", stratify = TRUE)

lrn = makeFeatSelWrapper(mL, resampling = inner, control = ctrl)
r = resample(lrn, mCT, outer, extract = getFeatSelResult,
             measures = list(mlr::auc, mlr::acc, mlr::brier), models = TRUE)
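For reference, the quantities mentioned above (mean AUC over the test sets, per-fold selected features, and pooled predictions) can be read off the `ResampleResult` object returned by `resample`; a minimal sketch, assuming the `r` object produced by the code above:

```r
# Aggregated performance over the 10 outer folds (mean AUC, accuracy, Brier score)
r$aggr

# Per-fold feature selection results (available because extract = getFeatSelResult);
# each element is a FeatSelResult whose $x slot holds the selected feature names
lapply(r$extract, function(res) res$x)

# Pooled predictions on the outer test sets
head(as.data.frame(r$pred))
```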









      cross-validation feature-selection mlr






      edited Nov 12 '18 at 13:50







      Chris

















      asked Nov 12 '18 at 13:30









Chris

      215


























          1 Answer
































If you run the logistic regression part of your code a couple of times, you should also get the Error in sum(x) : invalid 'type' (list) of argument error. Strangely, fixing a particular seed (e.g., set.seed(1)) before resampling does not determine whether or not the error appears.



The error occurs in the internal mlr code that prints the output of feature selection to the console. A very simple workaround is to avoid printing that output by passing show.info = FALSE to makeFeatSelWrapper (see the code below). While this removes the error, whatever caused it may have other consequences, although I believe the error only affects the printing code.



When running your code, I only get AUCs above 0.90. Please find below your code for logistic regression, slightly reorganized and with the workaround applied. I have added a droplevels() call on dataset2 to remove the empty level 3 from the factor, though this is not related to the workaround.



library(mlbench)
library(mlr)
data(BreastCancer)

p <- mlbench.waveform(1000)
dataset <- as.data.frame(p)
dataset2 = subset(dataset, classes != 3)
dataset2 <- droplevels(dataset2)

mCT <- makeClassifTask(data = dataset2, target = "classes")
ctrl = makeFeatSelControlSequential(method = "sffs", maxit = NA, alpha = 0.001)
mL <- makeLearner("classif.logreg", predict.type = "prob")
inner = makeResampleDesc("Holdout", stratify = TRUE)
lrn = makeFeatSelWrapper(mL, resampling = inner, control = ctrl, show.info = FALSE)
# uncomment this for the error to appear again; you might need to run the code
# a couple of times to see the error
# lrn = makeFeatSelWrapper(mL, resampling = inner, control = ctrl)
outer = makeResampleDesc("CV", iters = 10, stratify = TRUE)
r = resample(lrn, mCT, outer, extract = getFeatSelResult,
             measures = list(mlr::auc, mlr::acc, mlr::brier), models = TRUE)


          Edit: I've reported an issue and created a pull request with a fix.
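As a side note, mlr also offers a global switch for its console output via configureMlr; a sketch of silencing it session-wide instead of per wrapper (note this suppresses all of mlr's progress messages, not only the feature-selection printing):

```r
library(mlr)

# Silence all mlr progress messages for the current session;
# individual learners/wrappers then no longer need show.info = FALSE
configureMlr(show.info = FALSE)
```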































• Thank you. After further tests, it seems that it is linked to the use of the sffs method.

            – Chris
            Nov 14 '18 at 8:54






















          edited Nov 14 '18 at 16:12

























          answered Nov 13 '18 at 14:31









bojan

          1335
































