# Tecdat: Portfolio Optimization Solvers in R: Constrained Optimization and Nonlinear Programming

Date: 2021-07-12

Link to the original text: http://tecdat.cn/?p=22853

This article introduces the different solvers available in R that can be used for portfolio optimization.

# General-purpose solvers

A general-purpose solver can handle any nonlinear optimization problem, but the price may be slow convergence.

## Default package

The package stats (a base R package installed by default) provides several general-purpose optimizers.

- `optimize()`: one-dimensional unconstrained function optimization over an interval (for one-dimensional root finding, use `uniroot()`).

```r
f <- function(x) exp(-0.5*x) * sin(10*pi*x)
f(0.5)

result <- optimize(f, interval = c(0, 1), tol = 0.0001)
result

# Draw the function
curve(f, 0, 1, n = 200)
```

- `optim()`: general-purpose optimization offering six different methods:
  - `Nelder-Mead`: a relatively robust, derivative-free method (the default).
  - `CG`: low-memory conjugate-gradient optimization for high-dimensional unconstrained problems.
  - `BFGS`: a simple unconstrained quasi-Newton method.
  - `L-BFGS-B`: optimization for box-constrained problems.
  - `SANN`: simulated annealing.
  - `Brent`: for one-dimensional problems (internally calls `optimize()`).

This example performs a least-squares fit: minimize the sum of squared residuals of a linear model.

```r
# Data points to fit (illustrative values; the original data were lost)
dat <- data.frame(x = 1:10,
                  y = c(1.2, 1.9, 3.2, 3.9, 5.1, 6.0, 6.8, 8.1, 9.0, 9.9))

# Squared L2-norm error of the linear fit  y ~ par[1] + par[2]*x
f <- function(par, data) sum((data$y - par[1] - par[2] * data$x)^2)

# Call the solver (initial value c(0, 1); the default method is "Nelder-Mead")
optim(par = c(0, 1), f, data = dat)

# Compare with R's built-in linear regression
lm(y ~ x, data = dat)
```

The next example illustrates the use of a gradient with the famous Rosenbrock banana function, an unconstrained minimization problem:

```r
# Rosenbrock banana function and its gradient
f_banana <- function(x)
  100 * (x[2] - x[1] * x[1])^2 + (1 - x[1])^2
gr_banana <- function(x)
  c(-400 * x[1] * (x[2] - x[1] * x[1]) - 2 * (1 - x[1]),
     200 * (x[2] - x[1] * x[1]))

optim(c(-1.2, 1), f_banana)
optim(c(-1.2, 1), f_banana, gr_banana, method = "BFGS")
```

The following example uses bound constraints.

Minimize a 25-dimensional generalized Rosenbrock-type function subject to the box constraints 2 <= x_i <= 4:

```r
f <- function(x) {
  p <- length(x)
  sum(c(1, rep(4, p - 1)) * (x - c(1, x[-p])^2)^2)
}

# 25-dimensional problem with box constraints
optim(rep(3, 25), f, method = "L-BFGS-B",
      lower = rep(2, 25), upper = rep(4, 25))
```

The next example uses simulated annealing (for global optimization).

```r
# The "wild" function; the global minimum is at about x = -15.81515
fw <- function(x)
  10 * sin(0.3 * x) * sin(1.3 * x^2) + 0.00001 * x^4 + 0.2 * x + 80
res <- optim(50, fw, method = "SANN",
             control = list(maxit = 20000, temp = 20, parscale = 20))
res$par

# Now refine locally (usually only a small improvement)
optim(res$par, fw, method = "BFGS")
```

- `constrOptim()`: minimizes a function under linear inequality constraints using an adaptive barrier algorithm (it calls `optim()`).

```r
# Inequality constraints (ui %*% theta >= ci):  x <= 0.9,  y - x >= 0.1
# Objective: the Rosenbrock banana function defined above
constrOptim(c(0.5, 0.7), f_banana, grad = NULL,
            ui = rbind(c(-1, 0), c(-1, 1)), ci = c(-0.9, 0.1))
```

- `nlm()`: minimizes the objective function using a Newton-type algorithm.

```r
nlm(f_banana, c(10, 10))
```

- `nlminb()`: unconstrained and box-constrained optimization.

```r
nlminb(c(-1.2, 1), f_banana)
nlminb(c(-1.2, 1), f_banana, gradient = gr_banana)
```

## optimx

The optimx package wraps the basic function `optim()` together with many other solvers, so that they can be used and compared easily.

```r
library(optimx)
# opm() can apply several methods at the same time
opm(c(-1.2, 1), f_banana, method = c("Nelder-Mead", "BFGS"))
```

## Global optimization

Global optimization is conceptually quite different from local optimization. Global solvers (usually stochastic solvers) try to avoid getting trapped in local optima.
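As a concrete sketch of a stochastic strategy, a simple random multi-start can be done with base R alone: run a local solver from many random starting points and keep the best result. The Rastrigin test function below is illustrative and not from the original article.

```r
# Random multi-start: a simple stochastic approach to global optimization.
# The Rastrigin function has many local minima; its global minimum is 0 at the origin.
rastrigin <- function(x) 10 * length(x) + sum(x^2 - 10 * cos(2 * pi * x))

set.seed(42)
starts <- matrix(runif(50 * 2, -5, 5), ncol = 2)   # 50 random 2-D starting points

# Run a local solver (BFGS) from each starting point
fits <- lapply(seq_len(nrow(starts)),
               function(i) optim(starts[i, ], rastrigin, method = "BFGS"))

# Keep the best local optimum found across all restarts
vals <- vapply(fits, `[[`, numeric(1), "value")
best <- fits[[which.min(vals)]]
best$par    # with enough restarts this is often near the global minimum (0, 0)
best$value
```

Dedicated stochastic global solvers (e.g. genetic algorithms or differential evolution, available in contributed packages) automate this exploration more intelligently.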

# Solvers for specific classes of problems

If the problem to be solved belongs to a well-known class, such as LS, LP, MILP, QP, SOCP, or SDP, it is better to use a solver specialized for that class.

## Least squares (LS)

The linear least-squares (LS) problem minimizes the squared residual norm ||A x - b||_2^2, possibly subject to bounds or linear constraints.
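As a minimal base-R sketch (the data here are illustrative, not from the original article), an unconstrained LS problem can be solved directly via a QR factorization:

```r
# Solve min ||A x - b||_2^2 with base R (illustrative random data)
set.seed(1)
A <- matrix(rnorm(30), nrow = 10)   # 10 observations, 3 unknowns
b <- rnorm(10)

x_qr <- qr.solve(A, b)                   # least-squares solution via QR
x_ne <- solve(t(A) %*% A, t(A) %*% b)    # normal equations (less numerically stable)
max(abs(x_qr - x_ne))                    # the two solutions agree
```

For bounded or linearly constrained LS, specialized packages (discussed below, e.g. quadprog via a QP reformulation) are the better tool.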

## Linear programming (LP)

The function `solveLP()` (from the linprog package) easily solves LPs of the form: maximize c^T x subject to A x <= b, x >= 0.

```r
library(linprog)

cvec <- c(1800, 600, 600)   # gross margins
bvec <- c(40, 90, 2500)     # resource limits
# Constraint matrix (the original values were lost; illustrative only)
Amat <- rbind(c(1, 0, 0),
              c(0, 1, 0),
              c(20, 12, 30))

# Run the solver
solveLP(cvec, bvec, Amat, maximum = TRUE)
```

## Mixed integer linear programming (MILP)

lpSolve (much faster than linprog because it is coded in C) can solve mixed-integer linear problems (LPs in which some variables are constrained to be integer).

```r
library(lpSolve)

# Set up the problem:
#   maximize      x1 + 9 x2 +   x3
#   subject to    x1 + 2 x2 + 3 x3 <= 9
#               3 x1 + 2 x2 + 2 x3 <= 15
f.obj <- c(1, 9, 1)
f.con <- matrix(c(1, 2, 3,
                  3, 2, 2), nrow = 2, byrow = TRUE)
f.dir <- c("<=", "<=")
f.rhs <- c(9, 15)

# Run the solver
res <- lp("max", f.obj, f.con, f.dir, f.rhs)
res$solution

# Run again, this time requiring all three variables to be integer
lp("max", f.obj, f.con, f.dir, f.rhs, int.vec = 1:3)$solution
```

## Quadratic programming (QP)

The quadprog package easily solves QPs of the form: minimize -d^T x + 1/2 x^T D x subject to A^T x >= b.

```r
library(quadprog)

# Set up the problem:
#   minimize    -(0 5 0) %*% x + 1/2 x^T x
#   subject to  A^T x >= b
#   with b = (-8, 2, 0)^T
#       (-4  2  0)
#   A = (-3  1 -2)
#       ( 0  0  1)
Dmat <- diag(3)
dvec <- c(0, 5, 0)
Amat <- matrix(c(-4, -3, 0, 2, 1, 0, 0, -2, 1), 3, 3)
bvec <- c(-8, 2, 0)

# Run the solver
solve.QP(Dmat, dvec, Amat, bvec)
```

Quadratic programs with absolute values in the constraints or in the objective function can also be solved.

## Second order cone programming (SOCP)

There are several packages:

- The ECOSolveR package provides an interface to the Embedded Conic Solver (ECOS), a well-known, efficient, and robust C library for convex problems.
- The clsocp package provides an implementation of a one-step smoothing Newton method for solving SOCPs.

# Optimization infrastructure

We have already seen packages that act as wrappers for many other solvers; the frameworks below take this idea further.

## For convex, MIP and nonconvex problems

The ROI package provides a framework for handling optimization problems in R. It uses an object-oriented approach to define and solve a wide variety of optimization tasks from different problem classes (e.g., linear, quadratic, and nonlinear programming).

LP – Consider the LP: maximize a linear objective c^T x subject to linear constraints A x <= b (the constraint coefficients below follow ROI's standard introductory example; the original values were lost).

```r
library(ROI)
#> ROI: R Optimization Infrastructure
#> Registered solver plugins: nlminb, ecos, lpsolve, scs.
#> Default solver: auto.

prob <- OP(objective = L_objective(c(3, 7, -12)),
           constraints = L_constraint(
             L   = rbind(c(5, 7, 2), c(3, 2, -9), c(1, 3, 1)),
             dir = c("<=", "<=", "<="),
             rhs = c(61, 35, 31)),
           maximum = TRUE)
prob
#> ROI Optimization Problem:

# Let's look at the available solvers
ROI_applicable_solvers(prob)

# Solve it
res <- ROI_solve(prob)
res
```

MILP – Take the previous LP and make it a MILP by adding the integrality constraint x2, x3 ∈ Z.

```r
# Just modify the previous problem
types(prob) <- c("C", "I", "I")
prob
ROI_solve(prob)
```

BLP – Consider a binary linear program (BLP):

Minimize a linear objective over binary variables:

```r
# The objective and constraint coefficients were lost from the original
prob <- OP(objective = L_objective(...),
           constraints = ...,
           types = rep("B", 5))
ROI_solve(prob)
#> Optimal solution found.
#> The objective value is: -1.01e+02
```

SOCP – Consider an SOCP: maximize a linear objective subject to a second-order cone constraint. Note that the SOC constraint ||A x + b||_2 <= c^T x + d can be expressed as a conic constraint in ROI:

```r
# The objective and cone data were lost from the original
prob <- OP(objective = L_objective(...),
           constraints = C_constraint(...),
           maximum = TRUE)
ROI_solve(prob)
```

SDP – Consider an SDP:

Minimize a linear objective subject to a semidefinite constraint. Note that the SDP constraint can be written in vectorized form: the dimension is 3 because the matrix in our problem is 2 × 2, and `vech()` extracts its three distinct entries (the matrix being symmetric).
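To illustrate the half-vectorization (the helper below mimics `vech()`; it is not the article's code), a symmetric 2 × 2 matrix yields exactly three distinct values:

```r
# vech-style half-vectorization: keep the lower triangle, including the diagonal
vech <- function(M) M[lower.tri(M, diag = TRUE)]

S <- matrix(c(1, 2,
              2, 5), nrow = 2)   # a symmetric 2 x 2 matrix
vech(S)   # the three distinct entries: 1 2 5
```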

```r
# The objective and constraint data were lost from the original
prob <- OP(objective = L_objective(...),
           constraints = C_constraint(..., rhs = ...))
ROI_solve(prob)
```

NLP – Consider a nonlinear program (NLP):

Maximize a nonlinear objective subject to bound constraints:

```r
# The objective function and bounds were lost from the original
prob <- OP(objective = F_objective(...),
           bounds = ...,
           maximum = TRUE)
ROI_solve(prob)
```

## Convex optimization

The CVXR package provides an object-oriented modeling language for convex optimization in R. It lets users formulate convex optimization problems in natural mathematical syntax rather than the restrictive standard forms required by most solvers. Objectives and constraints are specified by combining constants, variables, and parameters, using a library of functions with known mathematical properties. Let's look at a few examples.

Least squares – Let's start with a simple LS example: minimize ||y - X beta||_2^2. Of course, we can also use R's basic linear-model fitting function `lm()`.

```r
# Generate data (design matrix and noise are illustrative assumptions)
m <- 100
n <- 10
beta_true <- c(-4:5)
X <- matrix(rnorm(m * n), nrow = m)
y <- X %*% beta_true + rnorm(m)

# Fit with lm()
res <- lm(y ~ 0 + X)   # 0 means there is no intercept in our model
```

Now let's do it with CVXR:

```r
library(CVXR)
beta <- Variable(n)
obj  <- sum((y - X %*% beta)^2)
prob <- Problem(Minimize(obj))
result <- solve(prob)
str(result)
```

We can now easily add a constraint to solve the non-negative LS problem.

```r
prob <- Problem(Minimize(obj), constraints = list(beta >= 0))
solve(prob)
```

Robust Huber regression – Let's consider a simple example of robust regression:

Minimize the Huber loss of the residuals:

```r
# M is the Huber loss threshold (must be set beforehand, e.g. M <- 1)
obj  <- sum(huber(y - X %*% beta, M))
prob <- Problem(Minimize(obj))
solve(prob)
```

Elastic-net regularization – The problem we now solve is: minimize the squared residuals plus an elastic-net penalty.

```r
# Define the regularization term
elastic_reg <- function(beta, lambda = 0, alpha = 0) {
  ridge <- (1 - alpha) * sum(beta^2)
  lasso <- alpha * p_norm(beta, 1)
  lambda * (lasso + ridge)
}

# Define the problem and solve it (lambda and alpha must be set beforehand)
obj  <- sum((y - X %*% beta)^2) + elastic_reg(beta, lambda, alpha)
prob <- Problem(Minimize(obj))
solve(prob)
```

Sparse inverse covariance matrix – Consider the matrix-valued convex problem: maximize log det(X) - tr(X S) subject to a bound on the sum of absolute entries.

```r
obj    <- log_det(X) - matrix_trace(X %*% S)
constr <- list(sum(abs(X)) <= alpha)
```

Covariance estimation – a matrix-valued convex problem with entry-wise constraints on the covariance matrix:

```r
constr <- list(Sigma[1,1] == 0.2, Sigma[1,2] >= 0, Sigma[1,3] >= 0,
               Sigma[2,2] == 0.1, Sigma[2,3] <= 0, Sigma[2,4] <= 0,
               Sigma[3,3] == 0.3, Sigma[3,4] >= 0, Sigma[4,4] == 0.1)
```

Portfolio optimization – Consider the Markowitz portfolio design: maximize the expected return under risk and budget constraints.

```r
# obj is the portfolio objective and constr the constraint list (defined above)
prob <- Problem(Maximize(obj), constr)
solve(prob)
```

# Conclusion

There are many solvers available in R. The following steps are recommended:

- If the problem is convex, start with an initial prototype in CVXR.
- If that is not fast enough, use ROI.
- If still more speed is needed and the problem belongs to one of the well-defined classes, use a class-specific solver (e.g., lpSolve is recommended for LP and quadprog for QP).
- If the problem does not belong to any of these classes, a general nonlinear solver must be used. If a local solution is enough, many solver packages are available; if a global solution is needed, the gloptim package, which wraps many global solvers, is a good choice.
