A list of puns related to "Convex function"
I want to know if it's possible to maximize the sum of cumulative distribution functions for independent normal distributions in Excel using Solver (or OpenSolver).
https://preview.redd.it/rlpog89qmb471.png?width=276&format=png&auto=webp&s=9a23913091e70bbfe2f268a6187c5b733037d938
where Φ(·) is the standard normal CDF (or NORM.DIST in Excel). Additionally, p_i and q_i are ≥ 0.
Since p_i and q_i are ≥ 0, the arguments to Φ are ≥ 0, and Φ is concave on that region. Therefore I'd be maximizing a concave function (equivalently, minimizing a convex function). So it should be possible, but I'm not sure how to model it in Excel.
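As a sanity check outside Excel, here is a minimal sketch in Julia (my own toy version, not the OP's model: I assume a single budget constraint p + q = B with p, q ≥ 0, so that substituting q = B - p reduces the problem to maximizing a concave one-variable function, which a golden-section search handles):

# A minimal sketch, assuming a toy model: maximize Φ(p) + Φ(q)
# subject to p + q = B and p, q >= 0. Substituting q = B - p turns
# this into 1-D maximization of a concave function on [0, B].
using SpecialFunctions: erfc

normcdf(x) = erfc(-x / sqrt(2)) / 2      # standard normal CDF Φ

function maximize_pair(B; tol = 1e-10)
    f(p) = normcdf(p) + normcdf(B - p)   # concave in p on [0, B]
    γ = (sqrt(5) - 1) / 2                # golden-section search
    lo, hi = 0.0, B
    while hi - lo > tol
        a = hi - γ * (hi - lo)
        b = lo + γ * (hi - lo)
        f(a) < f(b) ? (lo = a) : (hi = b)
    end
    p = (lo + hi) / 2
    p, B - p, f(p)
end

maximize_pair(2.0)   # by symmetry and concavity the optimum is p = q = 1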
In convex optimization, we usually try to solve the problem of minimizing a convex function (or maximizing a concave one) over a convex set/domain. Intuitively, I think it makes sense, but I'm not able to pinpoint the exact reason why we want the set to be convex. Most of the analysis that I've seen so far uses the convexity of the objective function rather than of the set. So my question is: why is convexity of the constraint set important, and what happens if we optimize a convex function over a non-convex set? Any insight is appreciated!
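A one-line counterexample (my illustration, not from the post) shows what breaks without a convex feasible set:

$$\min_x \; (x-2)^2 \quad \text{subject to } |x| \ge 1$$

The objective is convex, but the feasible set (-∞, -1] ∪ [1, ∞) is not. The global minimum is x = 2 (value 0), yet x = -1 is a local minimum (value 9): a descent method started at x = -3 converges to it and gets stuck. Convexity of the set is exactly what powers the standard theorem that, for a convex function over a convex set, every local minimum is global.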
Update 1: I am interested in nonparametric regression where the underlying regression function has a convexity property, rather than a convexity constraint.
--
OP: What are some nonparametric regression methods for convex functions that have consistency guarantees? For clarification, I'm looking for methods that don't necessarily produce convex functions, just consistent estimates of convex functions. With that said, more general approaches also qualify, but I'm surveying for methods that are more efficient on convex functions.
I am trying to wrap my head around the differences between the following types of functions: convex functions, non-convex functions and concave functions.
Based on this post over here (https://stats.stackexchange.com/questions/324561/difference-between-convex-and-concave-functions), it would seem that:
Convex Functions: a function that has exactly one minimum (relatively easy to determine this minimum through optimization procedures)
Non-Convex Functions: a function that could have several local minima, but one global minimum (more difficult to determine the global minimum through optimization procedures)
Concave Functions: this is basically the "negative" of a convex function.
Is this a correct understanding of these 3 types of functions?
Thanks
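As a quick numerical sanity check on the three categories above (my own illustration in Julia, with toy functions that are not from the post): a convex function satisfies f((a+b)/2) <= (f(a)+f(b))/2 for every pair of points, so random sampling can catch non-convexity, though it can never prove convexity.

# Hedged sketch: test the midpoint inequality on random point pairs.
# Returning false disproves convexity; returning true merely suggests it.
function convexish(f; trials = 10_000)
    for _ in 1:trials
        a, b = 2 * randn(), 2 * randn()
        if f((a + b) / 2) > (f(a) + f(b)) / 2 + 1e-12
            return false   # found a midpoint-convexity violation
        end
    end
    true
end

convexish(x -> x^2)         # true:  convex, one minimum
convexish(x -> -x^2)        # false: concave, the "negative" of convex
convexish(x -> x^4 - x^2)   # false: non-convex, two local minima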
How to find out whether a function of two variables f(x, y) is concave or convex?
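A standard test, added here for reference: for twice-differentiable f, convexity is equivalent to the Hessian being positive semidefinite everywhere on the (convex) domain. In two variables that reduces to sign checks:

$$H = \begin{pmatrix} f_{xx} & f_{xy} \\ f_{xy} & f_{yy} \end{pmatrix}, \qquad f \text{ convex} \iff f_{xx} \ge 0,\; f_{yy} \ge 0,\; f_{xx}f_{yy} - f_{xy}^2 \ge 0 \text{ at every point.}$$

For concavity, flip the two diagonal inequalities (f_xx ≤ 0, f_yy ≤ 0) and keep the determinant condition. If the determinant is negative anywhere, f is neither convex nor concave there.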
Say I want to prove
a/sqrt(a^2 + 8bc) + b/sqrt(b^2 + 8ac) + c/sqrt(c^2 + 8ab) ≥ 1 for positive real a, b, c.
Then, since f(x) := x^(-1/2) is convex on (0, ∞), Jensen's inequality with weights a, b, c gives
[af(a^2 + 8bc) + bf(b^2 + 8ac) + cf(c^2 + 8ab)]/[a+b+c] ≥ f([a(a^2 + 8bc) + b(b^2 + 8ac) + c(c^2 + 8ab)]/[a+b+c]).
So rearranging, we get that
a/sqrt(a^2 + 8bc) + b/sqrt(b^2 + 8ac) + c/sqrt(c^2 + 8ab) ≥ (a+b+c)^(3/2) / sqrt(a^3 + b^3 + c^3 + 24abc),
so it suffices to show that (a + b + c)^3 ≥ a^3 + b^3 + c^3 + 24abc. This follows immediately: expanding gives (a + b + c)^3 = a^3 + b^3 + c^3 + 3(a^2 b + a^2 c + ab^2 + b^2 c + ac^2 + bc^2) + 6abc, and the symmetric sum is ≥ 6abc by AM-GM.
However, how do we know which function to pick, and how do we know that function is indeed convex?
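For the second part of the question, the convexity check itself is mechanical:

$$f(x) = x^{-1/2}, \qquad f''(x) = \tfrac{3}{4}\, x^{-5/2} > 0 \ \text{for } x > 0,$$

so f is convex on (0, ∞), the only region where it is evaluated here. Choosing f is the creative step; one way to see this particular choice is that each term of the left-hand side is exactly a·f(a^2 + 8bc) for this f, which puts the sum in precisely the weighted-average form that Jensen's inequality needs.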
https://preview.redd.it/u0kyv48n5lz51.png?width=627&format=png&auto=webp&s=8d7aa34901e79a9394de878355e35e7ccf97fa96
https://preview.redd.it/pgab793q5lz51.png?width=640&format=png&auto=webp&s=c2ef0ed054b34c9fce142727bed79c3bf31fa095
At the end, why do they have to mention that the same argument also works for c < x? I thought that in order to combine the two inequalities we only need the case x < c, so that we can always find some delta with |c - x| < delta and |f(c) - f(x)| < epsilon.
I am trying to solve a simple problem using the "Convex" package and need some review to help me understand what I am doing wrong here.
Problem: I have to arrange a set of 1-dimensional vectors (y1..ym) such that none overlaps with another and all vectors put together occupy minimum space on the number line. Also, every vector has to follow the interval sequence within the vector.
Example: vector (y1) [1,7,9,10] has the following interval sequence (d): [0.0, 6.0, 2.0, 1.0]. In this case I expect the function to return [0.0, 6.0, 2.0, 1.0], since that is the minimum value given 0 <= x <= some max b, but it returns [6.0; 6.0; 6.0; 6.0].
using Convex, SCS   # imports the post assumes; SCS supplies a solver

function get_convex(distance, b)
    # NOTE: this helper is never called anywhere below; also v = x[i] - x[i+1]
    # points the inequality the wrong way if positions increase along the line.
    function add_distance_constraints(d, x)
        for i in 1:length(d)-1
            v = x[i] - x[i+1]
            p.constraints += [v >= d[i+1]]
        end
    end
    s = length(distance)
    λ = Variable(s)
    x = Variable(s)
    # or c' * x
    p = minimize(dot(distance, x))
    cobjects = []
    for i in 1:s-1
        cobject = λ[i] * (distance[i+1] - distance[i])
        # BUG: this constrains EVERY entry of x to be >= distance[i]
        # (after the i = 2 pass, x .>= 6), not just the i-th entry.
        p.constraints += [x >= distance[i]]
        push!(cobjects, cobject)
    end
    # BUG: convex_combo is a scalar, so convex_combo == x forces all entries
    # of x to be equal; together with x .>= 6 this yields [6.0; 6.0; 6.0; 6.0].
    convex_combo = sum(cobjects)
    p.constraints += [x >= 0; x <= b; λ >= 0; λ <= 1; sum(λ) == 1; convex_combo == x]
    solver = SCS.Optimizer   # `solver` was undefined in the original snippet
    solve!(p, solver)
    println(p.optval)
    print(distance)
    print(round.(evaluate(x); digits = 3))
end

get_convex([0.0, 6.0, 2.0, 1.0], 20)
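For what it's worth, here is a minimal reformulation of how I read the intended model (hedged: the function name, the equality-gap reading, and the choice of objective are mine, and the Convex.jl ≥ 0.14 / SCS API is assumed). If the gap sequence must be kept exactly, constrain consecutive differences to equal d and push the whole vector left; the positions for the example then come out as [0, 6, 8, 9] rather than the raw gaps:

using Convex, SCS

function pack_positions(d, b)
    n = length(d)
    x = Variable(n)
    gaps = [x[i+1] - x[i] == d[i+1] for i in 1:n-1]   # keep the interval sequence
    p = minimize(x[n], vcat([x >= 0, x <= b], gaps))  # pushing x[n] left pins x[1] at 0
    solve!(p, SCS.Optimizer)
    round.(evaluate(x); digits = 3)
end

pack_positions([0.0, 6.0, 2.0, 1.0], 20.0)   # expected: [0.0, 6.0, 8.0, 9.0]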
I'm new to vector calculus and I'm trying to solve this sum. How do I approach this?
Thanks in advance for any answers.
Suggested approach: compute the Hessian H_p(v). The function p(v) is convex around a point v if the eigenvalues of H_p are both positive at v. On a coarse [v1, v2] grid, compute the minimum eigenvalue. Look for regions where the minimum eigenvalue is 0; if you notice such a region, produce a contour plot of the minimum eigenvalue there. Carefully label your axes.
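A sketch of that recipe in Julia (hedged: the exercise's actual p(v) is not shown here, so a stand-in function is used, the Hessian is approximated by central finite differences, and Plots.contour is just one plotting option):

using LinearAlgebra

p(v) = v[1]^4 + v[2]^4 - 3 * v[1] * v[2]   # stand-in for the exercise's p(v)

# 2x2 Hessian of f at v via central finite differences.
function hessian(f, v; h = 1e-4)
    H = zeros(2, 2)
    for i in 1:2, j in 1:2
        ei = [i == 1, i == 2] .* h
        ej = [j == 1, j == 2] .* h
        H[i, j] = (f(v + ei + ej) - f(v + ei - ej) -
                   f(v - ei + ej) + f(v - ei - ej)) / (4 * h^2)
    end
    H
end

vs = range(-1, 1; length = 41)   # the coarse (v1, v2) grid
λmin = [eigmin(Symmetric(hessian(p, [v1, v2]))) for v2 in vs, v1 in vs]
# p is locally convex where λmin >= 0; contour-plot λmin over the grid,
# e.g. Plots.contour(vs, vs, λmin), and label the axes v1 and v2.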
A convex function looks sort of like the letter V, or like a U with zero derivative at the bottom. Concave functions can be remembered as "the other type of function", or by visualizing tipping the letter C over a bit so that it opens downward.
Hello members, how do you solve the maximization of a convex function subject to linear constraints and bounds on the variables using Lagrangian relaxation? Thanks!
For LeNet trained on MNIST with the lowest possible loss (global minima),
For the following functions, determine whether they are convex or concave (perhaps some are both; which ones?). Lastly, what is the sign of the derivative of each?
a. y = 4 - 4x + x^2
b. y = 6x^1/2, 0 < x < infinity
c. y = 18 - 12x - 6x^2 + x^3
I don't quite understand the question.
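For part (a), for instance (a worked illustration of what the question is asking for; parts (b) and (c) follow the same recipe of computing y'' and checking its sign on the stated domain):

$$y = 4 - 4x + x^2 = (x - 2)^2, \qquad y' = 2x - 4, \qquad y'' = 2 > 0,$$

so the function is convex everywhere, and its derivative is negative for x < 2 and positive for x > 2.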
Hi! I'm currently struggling with a part of a problem. I need to evaluate a function and then prove that the function is strictly convex. I've been shown this way of proving convexity, but I've never seen an actual example with a concrete function, so I have no idea how I'm supposed to apply it: f(λx1 + (1 - λ)x2) < λf(x1) + (1 - λ)f(x2) for all x1 ≠ x2 and λ in (0, 1).
My function is (δx1^(-r-1))/((1 - δ)x2^(-r-1)).
I guess I have no choice but to prove it the way with the lambdas I've written above, since my Hessian matrix would be way too messy to easily read the convexity from. I need help knowing just how I'm supposed to proceed in order to prove it this way. Thanks in advance for your time!
This is in reference to Sutton's Reinforcement Learning: An Introduction (second edition), chapter "Off-policy methods with approximation", section "Linear value-function geometry".
Linear value function space is a smaller subspace of the true value function space.
The value-error projection (squared distance) seems to be convex. This is well known, I think.
Sutton suggests that the minimum projected Bellman error is not achievable via an iterative process. To me that means the geometry of this projected Bellman error is not convex, having many local minima.
Note: the Bellman error is actually the expected TD error, the direction you should take to get closer to the true value function.
The projected Bellman error is presented because, in the case of function approximation, you cannot optimize exactly toward the Bellman error: the approximating function lives in a subspace. Even though the direction is right, you will not actually get there; you only get as close to there as your subspace allows.
It seems questionable that such a thing would happen in the linear function approximation case. Or is it the case that the projected Bellman error is indeed non-convex under linear function approximation?
I ask here for confirmation that this is the case.
Concave generally means curved inwards ("like a cave"), while convex is the opposite. However, a "concave function" curves outwards ("convex shape") while a convex function curves inwards ("concave shape"). What is the reason for the functions' nomenclature being the opposite of the common usage of these terms for shapes?
TIA!
EDIT (for visual clarity of what I mean): A concave shape: "U". Yet a "concave function" graphed is shaped like "∩". A convex shape: "∩". A "convex function" on a graph: "U". For example, a "concave abdomen" in medicine means an abdomen which is compressed inwards, etc. I'm looking for the difference between the medical nomenclature (or general layman's nomenclature) and what appears to be the mathematical terminology usage.
For classification problems, we use logistic regression. But if the cost function of logistic regression is always convex, and we are guaranteed one local minimum which is also the global minimum, then what's the purpose of other classification algorithms?
Why not just use logistic regression on ALL classification problems?
Am I right to understand that logistic regression will always reach the global minimum?
Besides performance issues, since logistic regression always finds the global minimum, would using other methods (artificial neural networks, k-nearest neighbours, support vector machines) not provide a better model?
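One way to see why the answer is no (an illustration I'm adding, not from the post): convexity guarantees you find the best parameters of a linear decision boundary, not the best possible classifier. A tiny sketch on XOR-style data, where the globally optimal logistic regression is still useless:

# Hedged sketch: gradient descent on the convex cross-entropy loss finds
# the global optimum, but for XOR data that optimum predicts 0.5 everywhere.
σ(z) = 1 / (1 + exp(-z))

X = [0 0; 0 1; 1 0; 1 1.0]   # XOR inputs
y = [0.0, 1.0, 1.0, 0.0]     # XOR labels: not linearly separable

function fit_logistic(X, y; lr = 0.1, iters = 10_000)
    w, b = zeros(size(X, 2)), 0.0
    for _ in 1:iters
        g = σ.(X * w .+ b) .- y    # gradient of cross-entropy wrt logits
        w .-= lr .* (X' * g)
        b -= lr * sum(g)
    end
    σ.(X * w .+ b)
end

fit_logistic(X, y)   # ≈ [0.5, 0.5, 0.5, 0.5]: the global minimum, a useless model

A neural network or a kernel SVM can represent the non-linear boundary that XOR needs, which is exactly what logistic regression's convexity cannot buy you.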
I know it's an arbitrary naming thing, and that you can say that the set of points above a convex function (its epigraph) is convex, but this has always been unintuitive and gnawed at me.
There is f(x), which is convex. There is g(x), which is also convex. I want to minimize f with g as a constraint.
Is this problem equivalent to minimizing ***f(x) + g(x)***?
If yes, can you provide me a proof? The way I know of solving constrained minimization is that we introduce Lagrange multipliers, set the gradients of f and g proportional to each other, and solve. But why is it solved this way?
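For reference, this is standard material (added here, not from the post): write the constraint as g(x) ≤ 0 and form the Lagrangian

$$L(x, \lambda) = f(x) + \lambda g(x), \qquad \lambda \ge 0.$$

At an optimum where the constraint is active, stationarity gives ∇f(x*) = -λ∇g(x*): the gradients are parallel because any direction that decreases f further must leave the feasible set. Note that this is not the same as minimizing f(x) + g(x), which amounts to fixing λ = 1; in general the multiplier must be solved for so that the constraint holds exactly, so the two problems are not equivalent.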
I have done all the work deriving the input requirement solutions for both the linear case and the non-convex-to-the-origin case.
Linear case: we have either one of two corner solutions or a whole set of solutions, depending on whether the MRTS is <, =, or > the price ratio.
Non-convex case: we have one of two corner solutions, and both can be a solution, depending on whether the MRTS is <, =, or > the price ratio.
My problem is that when the cost function is articulated in a shortened form, they are both given like this.
Can someone explain to me the meaning of this form? Because:
the solution could be anywhere on the line.
the solution could be at either corner.
Descriptively they are different, but they are given in the same format.
Let's say I have a convex function f: K -> R, where K is a convex subset of R^2. Furthermore, f blows up near the boundary of K, and the rate depends on exactly where you are on the boundary.
Are there any "nice" methods/tricks for numerically approximating such functions? It is quite important that the singularity be preserved. Ideally I should be able to make the approximation quite smooth, at least C^4, with easy-to-compute derivatives. In a perfect world the approximation would also be convex, but as long as it isn't too crazy, this shouldn't be a problem.
I had thought of approximating it by something of the form P(log(Q(x))), where P and Q are polynomials to be determined by some kind of least-squares argument (I know that the singularity is roughly logarithmic), but this seems totally infeasible to any degree of accuracy.
For what it's worth, I can compute the function and its derivative relatively easily at a single point.
Thanks in advance for any and all help! This is one of those questions that I'm sure has a well-researched answer if you know the right keywords.
[x^(-1/4)+y^(-1/4)+z^(-1/4)]^(-1/2)
x^(1/4)y^(1/2)z^(1/8) + ln(y^(2)z) - e^(x^(2)+y^(2)+z^(2))
Let us say I have f(x), which is convex, and I have to minimize it over a convex set G. We define G(x) to be an indicator function where
G(x) = 0, if x is in G,
G(x) = Inf, if x is not in G.
Thus G(x) is convex.
Can I simplify the problem to minimising f(x)+G(x)?
If yes, can you point me to its proof?
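This is a standard convex-analysis fact (for reference; the indicator is more often written ι_G, to avoid overloading the set's name):

$$\min_x \; f(x) + \iota_G(x) = \min_{x \in G} f(x), \qquad \iota_G(x) = \begin{cases} 0, & x \in G, \\ +\infty, & x \notin G. \end{cases}$$

Proof sketch: assuming G is nonempty, any x not in G gives the left-hand objective the value +∞, so it can never beat a feasible point, while for x in G the two objectives coincide; hence the two problems have the same optimal value and the same minimizers. And ι_G is convex precisely when G is convex, so the reformulated problem is still convex.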