Distributionally robust stochastic programs with side information based on trimmings
This is a research paper by Adrián Esteban-Pérez and Juan M. Morales.
Abstract:
- We consider stochastic programs conditional on some covariate information, where the only knowledge of the possible relationship between the uncertain parameters and the covariates is a limited data sample of their joint distribution. We build a data-driven Distributionally Robust Optimization (DRO) framework that hedges the decision against the inherent error in inferring conditional information from limited joint data, by leveraging the close relationship between the notion of trimmings of a probability measure and the partial mass transportation problem.
- We show that our approach is computationally as tractable as the standard (no side information) Wasserstein-metric-based DRO and enjoys performance guarantees. Furthermore, our DRO framework can be easily applied to data-driven decision-making problems involving corrupted samples. Finally, the theoretical results are illustrated on a single-item newsvendor problem and a portfolio allocation problem with side information.
Conclusions:
- In this work, we used the relationship between trimmings of probability measures and partial mass transportation to give a straightforward, yet powerful and novel approach to extend the standard Wasserstein-metric-based DRO to the setting of conditional stochastic programming. Our approach produces decisions that are distributionally robust against the uncertainty in inferring the conditional probability measure of the random parameters from a limited sample drawn from the true joint data-generating distribution. In a series of numerical experiments based on the single-item newsvendor problem and a portfolio allocation problem, we showed that our approach attains notably better out-of-sample performance than several existing alternatives. We supported these empirical findings with theoretical analysis, showing that our approach enjoys attractive performance guarantees.
Alright, so if you have a negative number, say -5, and you put the absolute value symbols around it, like |-5|, then it becomes positive. Whatever is inside the bars comes out positive; that's why I call it the positive box. Absolute value describes the distance of a number from 0 on the number line, without considering which direction from zero the number lies. It is never negative.
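The "positive box" behavior matches Python's built-in abs() function; a quick sketch of the examples above:

```python
# Absolute value drops any negative sign: it is the distance from 0 on the number line.
print(abs(-5))  # -> 5
print(abs(5))   # -> 5
print(abs(0))   # -> 0 (zero is distance 0 from itself, so it stays 0)
```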
The answer is A, because it starts at 0, then goes up to 23, and then up to 46.
Answer: 45.5
Step-by-step explanation: Hey! Basically, we arrange the data in ascending order, and the median is the middle value. If the number of values is even, the median is the average of the two middle values.
Hope this helps!
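The procedure above can be sketched in Python. (The sample list below is hypothetical, since the original data set is not shown; it is chosen only so the even-count case gives the 45.5 answer.)

```python
def median(values):
    """Median: the middle value of the sorted data, or the average
    of the two middle values when the count is even."""
    ordered = sorted(values)
    n = len(ordered)
    mid = n // 2
    if n % 2 == 1:
        return ordered[mid]
    return (ordered[mid - 1] + ordered[mid]) / 2

# Hypothetical data with an even count: the two middle values, 42 and 49, average to 45.5.
print(median([30, 42, 49, 60]))  # -> 45.5
```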
<h2>Answer:</h2>
The value of y is 8.4375
<h2>Step-by-step explanation:</h2><h3>Known:</h3>
- The value of y is 7.5 when x is 2
<h3>Asked:</h3>
- The value of y when x is 2.25
<h3>Solution:</h3>
2/7.5 = 2.25/y
Multiply the left fraction by 10/10 to clear the decimal, then cross-multiply:
=> 20/75 = 2.25/y
=> 75 × 2.25 = 20y
=> 168.75 = 20y
Divide both sides by 20 to find y:
=> y = 168.75 ÷ 20
=> y = 8.4375
<h3>Conclusion:</h3>
y = 8.4375
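The cross-multiplication step generalizes to any proportion a/b = c/y. A minimal sketch in Python (the function name solve_proportion is my own, not from the original answer):

```python
def solve_proportion(a, b, c):
    """Solve a/b = c/y for y.
    Cross-multiplying gives a*y = b*c, so y = b*c/a."""
    return b * c / a

# The problem's proportion: 2/7.5 = 2.25/y
y = solve_proportion(2, 7.5, 2.25)
print(y)  # -> 8.4375
```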