The other week, I posted a simple algorithm to compute Aumann-Serrano riskiness. The algorithm is slow and not very inventive, so I have spent the week brainstorming ways to improve it.
Since we know exactly the value we are trying to reach and the bounds on the output, I figured we could converge on the solution from both sides and arrive at it much more quickly.
Thus, I redesigned the algorithm to bounce back and forth between max and min bounds, halving the interval on each iteration (a bisection search). Here is the source code for my redesigned version of asRisk(). As always, feed it a vector of possible returns.
```r
# This function bounces back and forth to find asRisk much more quickly
# Feed me a vector of returns
asRisk <- function(x){
  # Riskiness is undefined if expected value is negative
  # or if there is no chance of a loss; return 0 in those cases
  if (mean(x) < 0 | min(x) >= 0){
    return(0)
  } else {
    # first let's use e as our first guess
    asNumber <- exp(1)
    total <- sum((1/length(x)) * exp(-x/asNumber))
    # total < 1 means our guess is too high; shrink it until we cross 1
    if (total < 1){
      max <- asNumber
      while (total < 1){
        asNumber <- asNumber / 10
        total <- sum((1/length(x)) * exp(-x/asNumber))
      }
      # we found the max, so set the min
      min <- asNumber
    } else if (total > 1){
      # total > 1 means our guess is too low; grow it until we cross 1
      min <- asNumber
      while (total > 1){
        asNumber <- asNumber * 10
        total <- sum((1/length(x)) * exp(-x/asNumber))
      }
      # we found the min, so set the max
      max <- asNumber
    }
    # Precision is adjustable: bisect until the
    # difference between max and min is within tolerance
    while ((max - min) > 0.000000001){
      asNumber <- (max + min) / 2
      total <- sum((1/length(x)) * exp(-x/asNumber))
      if (total > 1){
        min <- asNumber
      } else if (total < 1){
        max <- asNumber
      }
    }
    # return the riskiness formatted to six decimal places (as a string)
    return(sprintf("%.6f", asNumber))
  }
}
```
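As a sanity check (my sketch, not part of the original post), R's built-in uniroot() can solve the same defining equation, mean(exp(-x/R)) = 1, directly; the returns vector and the bracketing interval below are illustrative assumptions:

```r
# Cross-check: solve mean(exp(-x/R)) = 1 with R's built-in root finder.
# The returns vector and the bracketing interval are illustrative assumptions.
x <- c(-0.05, 0.02, 0.03, 0.10, -0.01)  # positive mean, some chance of loss

f <- function(R) mean(exp(-x / R)) - 1  # zero at the Aumann-Serrano riskiness

# f is positive for tiny R (losses dominate) and negative for huge R,
# so c(0.01, 100) brackets the root for this vector
asR <- uniroot(f, interval = c(0.01, 100), tol = 1e-9)$root

mean(exp(-x / asR))  # should be very close to 1
```

If the two methods agree to the chosen tolerance, the bisection is doing its job.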
You can fiddle with the code to adjust the precision, but I am happy with the current speed/precision tradeoff. The new function quickly calculates Aumann-Serrano riskiness to high precision, as verified against my older function. I have yet to include warnings, though: if the input is out of bounds, I simply return 0 rather than throwing an error.
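One way to make the precision adjustable without editing constants is to expose the tolerance as an argument. This is a hypothetical variant I sketched, not code from the post; asRiskTol and tol are my names:

```r
# Hypothetical variant of the post's asRisk() with the bisection
# tolerance exposed as a parameter (asRiskTol and tol are my names)
asRiskTol <- function(x, tol = 1e-9){
  # riskiness is undefined without positive mean and a chance of loss
  if (mean(x) < 0 | min(x) >= 0) return(0)
  total_at <- function(R) mean(exp(-x / R))
  # bracket the root by scaling by 10, as in the original
  lo <- hi <- exp(1)
  while (total_at(lo) < 1) lo <- lo / 10
  while (total_at(hi) > 1) hi <- hi * 10
  # bisect until the bracket is narrower than tol
  while ((hi - lo) > tol){
    mid <- (lo + hi) / 2
    if (total_at(mid) > 1) lo <- mid else hi <- mid
  }
  (lo + hi) / 2  # return a numeric, not a formatted string
}

asRiskTol(c(-0.05, 0.02, 0.03, 0.10, -0.01), tol = 1e-6)
```

Returning a numeric rather than a sprintf() string also makes the result easier to reuse in later calculations.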
Bonus: The code to produce the graph pictured above
```r
library(quantmod)
library(ggplot2)

getSymbols("AAPL")
x <- weeklyReturn(AAPL)
count <- 1
minV <- 1:100
maxV <- 1:100

if (mean(x) < 0 | min(x) >= 0){
  stop("riskiness is undefined for this vector of returns")
} else {
  asNumber <- exp(1)
  total <- sum((1/length(x)) * exp(-x/asNumber))
  count <- count + 1
  if (total < 1){
    max <- asNumber
    maxV[count] <- asNumber
    while (total < 1){
      asNumber <- asNumber / 10
      total <- sum((1/length(x)) * exp(-x/asNumber))
      count <- count + 1
    }
    # we found the max, so set the min
    min <- asNumber
    minV[count] <- asNumber
  } else if (total > 1){
    min <- asNumber
    minV[count] <- asNumber
    while (total > 1){
      asNumber <- asNumber * 10
      total <- sum((1/length(x)) * exp(-x/asNumber))
      count <- count + 1
    }
    # we found the min, so set the max
    max <- asNumber
    maxV[count] <- asNumber
  }
  # Precision is adjustable: bisect until the
  # difference between max and min is within tolerance
  while ((max - min) > 0.000000001){
    asNumber <- (max + min) / 2
    total <- sum((1/length(x)) * exp(-x/asNumber))
    count <- count + 1
    if (total > 1){
      min <- asNumber
      minV[count] <- asNumber
    } else if (total < 1){
      max <- asNumber
      maxV[count] <- asNumber
    }
  }
}

# trim output and put it into the data frame `output`
# (unfilled placeholders are whole numbers; recorded bounds are not)
minV <- subset(minV, (minV %% 1) != 0)
maxV <- subset(maxV, (maxV %% 1) != 0)
if (length(minV) < length(maxV)){
  count <- length(minV)
} else {
  count <- length(maxV)
}
output <- data.frame(count = c(1:count), min = minV[1:count], max = maxV[1:count])

# let's graph
g <- ggplot(output, aes(count, min))
g <- g + geom_point() + geom_point(aes(count, max))
g <- g + geom_ribbon(aes(x = count, ymin = min, ymax = max), alpha = .4)
g <- g + ggtitle('Convergence') + scale_x_continuous('Repetitions') + scale_y_continuous('Max and Min')
g
ggsave("file.png")
```
I don't have R, but I've used Matlab to calculate AS riskiness via Newton's algorithm, and Mathematica as well. I have a faster method when you use historical data. Keep me posted if you are still on the subject.
I will try to emulate your R algorithm in Matlab.
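For comparison with the Newton approach the commenter mentions, here is my own R sketch of a Newton iteration on the same equation (not the commenter's Matlab code). Working in alpha = 1/R makes g convex, and starting from a point where g is positive keeps the update well-behaved; the returns vector is an illustrative assumption:

```r
# Newton's method sketch for the Aumann-Serrano equation (my illustration,
# not the commenter's Matlab code). In alpha = 1/R the function
# g(alpha) = mean(exp(-alpha * x)) - 1 is convex with a unique root alpha > 0.
x <- c(-0.05, 0.02, 0.03, 0.10, -0.01)  # illustrative returns

g  <- function(a) mean(exp(-a * x)) - 1   # zero at alpha = 1/riskiness
dg <- function(a) mean(-x * exp(-a * x))  # derivative of g

# start to the right of the root, where g > 0, by doubling;
# from there, Newton on a convex function converges monotonically
a <- 1
while (g(a) <= 0) a <- a * 2
for (i in 1:100){
  step <- g(a) / dg(a)
  a <- a - step
  if (abs(step) < 1e-12) break
}
R <- 1 / a  # the riskiness
mean(exp(-x / R))  # should be ~1
```

Newton converges quadratically near the root, so it typically needs far fewer function evaluations than bisection, at the cost of needing the derivative and a safe starting point.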