
I have heard of ensemble methods, such as XGBoost, for binary or categorical machine learning models. However, do ensemble methods exist for regression? If so, how are the weights for each model's contribution to the prediction determined?

I am looking to do this manually, as I am planning to train two different models using separate frameworks (YOLOv3, a.k.a. Darknet, and TensorFlow) for bounding box regression. Is there a way I can assign a weight to each model in the overall prediction for these boxes?

Or is this a bad idea?
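For concreteness, this is roughly the kind of weighted combination I have in mind (just a sketch; the box format, the helper name, and the fixed weights are made up, and I assume the weights would need to be tuned on a validation set):

```python
import numpy as np

def combine_boxes(box_a, box_b, w_a=0.6, w_b=0.4):
    """Weighted average of two boxes given as [x1, y1, x2, y2].

    The weights here are hypothetical; they would need to be tuned,
    e.g. by minimizing coordinate MSE or maximizing IoU on held-out data.
    """
    assert abs(w_a + w_b - 1.0) < 1e-9, "weights should sum to 1"
    return w_a * np.asarray(box_a, dtype=float) + w_b * np.asarray(box_b, dtype=float)

# Hypothetical predictions for the same object from the two models
yolo_box = [48, 30, 210, 165]   # YOLOv3 / Darknet prediction
tf_box   = [52, 28, 205, 170]   # TensorFlow model prediction
print(combine_boxes(yolo_box, tf_box))  # -> [ 49.6  29.2 208.  167. ]
```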


2 Answers

1

After seeing your question, I found a survey paper on ensemble methods for regression. The table shown here is extracted from that paper, and it indicates that there were no promising versions of this kind of algorithm for regression until 2012. Reading the paper should help you a lot.

This is the most recent paper published on object detection using an ensemble approach.

0

XGBoost has similar boosting classes for regression. You can use its built-in classes for your problem rather than implementing the ensemble from scratch; you can read more about this on the XGBoost website. You can also take a look at CatBoost, which implements a different approach.
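For example, here is a minimal sketch using XGBoost's scikit-learn-style regression estimator (the synthetic data and hyperparameters are placeholders, not a recommendation for your bounding box problem):

```python
import numpy as np
from xgboost import XGBRegressor

# Synthetic regression data purely for illustration
rng = np.random.default_rng(0)
X = rng.normal(size=(500, 8))
y = X @ rng.normal(size=8) + 0.1 * rng.normal(size=500)

# Gradient-boosted tree ensemble for a regression target
model = XGBRegressor(
    n_estimators=200,              # number of boosted trees
    max_depth=4,
    learning_rate=0.1,
    objective="reg:squarederror",  # standard regression objective
)
model.fit(X[:400], y[:400])
print(model.predict(X[400:])[:5])
```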