In index-based hedging design, the scatter plot of hedging-contract losses against the losses to be hedged is commonly used to visualize and quantify basis risk. Studying this scatter plot, which does not cluster along the diagonal as desired, reveals a "bundled loss" phenomenon. In a setting where both the hedging and the hedged contracts have 100,000 years of simulated losses, hedging a loss in one year of the hedged contract may force payment for unnecessary and unwanted losses in other years of the hedging contract. The cause is that the index used for hedging may take identical loss values in different years, while the hedged contract does not. This finding guides the formulation of the risk measures and solution frameworks. To solve the resulting optimization problem, a hybrid multi-parent and orthogonal crossover genetic algorithm, GA-MPC-OX, is used, and pertinent adjustments are studied. Experimental results suggest that a problem with hundreds of dimensions is best served by eleven parents, while a problem with tens of dimensions prefers nine; the best orthogonal-crossover strategies likewise depend on the dimension. To combat stagnation of the algorithm, perturbation by the Lévy stable distribution is studied, revealing effective parameter choices and forms. Numerical comparison with other algorithms confirms the method's competence on the hedging problem.
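The "bundled loss" phenomenon can be illustrated with a toy simulation. All distributions and the coarse index below are hypothetical, not taken from the paper: a continuous hedged-loss distribution essentially never repeats a value across simulated years, while a discretized index does, so a trigger tied to one index value pays out in every year sharing that value.

```python
import numpy as np

rng = np.random.default_rng(0)
n_years = 100_000

# Hypothetical simulated annual losses for the contract to be hedged
# (continuous, so exact ties across years are essentially impossible).
hedged_losses = rng.lognormal(mean=0.0, sigma=1.0, size=n_years)

# Hypothetical index: the same losses rounded to a coarse grid, so one
# index value can recur in many different years ("bundled" losses).
index_losses = np.round(hedged_losses, 1)

# Count the years whose index value is shared with at least one other year:
# hedging any one of these years also triggers payouts in its companions.
values, counts = np.unique(index_losses, return_counts=True)
bundled_years = int(counts[counts > 1].sum())
print(f"{bundled_years} of {n_years} years share an index loss value")
```

Under this toy setup nearly every year is bundled with others, which is the degenerate clustering the scatter plot exposes.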
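A minimal sketch of the Lévy stable perturbation idea, assuming SciPy's `levy_stable` distribution; the parameter values (`alpha`, `beta`, `scale`, `rate`) and the function shape are illustrative, not the paper's tuned settings. Heavy tails (`alpha < 2`) produce occasional large jumps that can kick a stagnating GA population out of a local optimum.

```python
import numpy as np
from scipy.stats import levy_stable

rng = np.random.default_rng(1)

def levy_perturb(population, alpha=1.5, beta=0.0, scale=0.01, rate=0.1):
    """Perturb a random fraction of GA individuals with Lévy stable noise.

    alpha/beta control tail heaviness and skew, scale the typical step
    size, and rate the fraction of individuals perturbed each call.
    These are hypothetical defaults for illustration only.
    """
    pop = population.copy()
    mask = rng.random(pop.shape[0]) < rate          # individuals to perturb
    noise = levy_stable.rvs(alpha, beta, scale=scale,
                            size=pop[mask].shape, random_state=2)
    pop[mask] += noise
    return pop

population = rng.random((50, 10))                   # 50 individuals, 10 dims
perturbed = levy_perturb(population)
```

In a full GA loop, such a perturbation would typically be applied only when the best fitness has not improved for some number of generations.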

## Citation

Validus Research Inc., October 2014