Answer
We show that
$c$ is the minimum value of $f\left( {{x_1},...,{x_n}} \right) = {x_1}^2{\sigma _1}^2 + \cdot\cdot\cdot + {x_n}^2{\sigma _n}^2$ subject to ${x_1} + \cdot\cdot\cdot + {x_n} = 1$, where $c = {\left( {\mathop \sum \limits_{j = 1}^n {\sigma _j}^{ - 2}} \right)^{ - 1}}$.
Work Step by Step
We have $f\left( {{x_1},...,{x_n}} \right) = {x_1}^2{\sigma _1}^2 + \cdot\cdot\cdot + {x_n}^2{\sigma _n}^2$, where ${\sigma _1},...,{\sigma _n}$ are nonzero numbers, and the constraint $g\left( {{x_1},...,{x_n}} \right) = {x_1} + \cdot\cdot\cdot + {x_n} - 1 = 0$. Our task is to find the minimum value of $f$ subject to this constraint.
Step 1. Write out the Lagrange equations
Using Theorem 1, the Lagrange condition $\nabla f = \lambda \nabla g$ yields
$\left( {2{x_1}{\sigma _1}^2,2{x_2}{\sigma _2}^2,...,2{x_n}{\sigma _n}^2} \right) = \lambda \left( {1,1,...,1} \right)$
(1) ${\ \ \ }$ $2{x_1}{\sigma _1}^2 = \lambda $, ${\ \ }$ $2{x_2}{\sigma _2}^2 = \lambda $, ${\ \ }$ ..., ${\ \ }$ $2{x_n}{\sigma _n}^2 = \lambda $
Step 2. Solve for $\lambda$ in terms of ${x_1},...,{x_n}$
If some ${x_j} = 0$, then equation (1) gives $\lambda = 0$, which forces ${x_1} = \cdot\cdot\cdot = {x_n} = 0$; but $\left( {0,...,0} \right)$ does not satisfy the constraint. So we may assume that ${x_1},{x_2},...,{x_n} \ne 0$.
So, $\lambda = 2{x_1}{\sigma _1}^2 = 2{x_2}{\sigma _2}^2 = \cdot\cdot\cdot = 2{x_n}{\sigma _n}^2$.
Step 3. Solve for ${x_1},...,{x_n}$ using the constraint
From Step 2, we obtain
${x_1}{\sigma _1}^2 = {x_2}{\sigma _2}^2 = \cdot\cdot\cdot = {x_n}{\sigma _n}^2$
${x_1}{\sigma _1}^2 = {x_2}{\sigma _2}^2$, ${\ \ \ }$ ${x_1}{\sigma _1}^2 = {x_3}{\sigma _3}^2$, ${\ \ \ }$ ..., ${\ \ \ }$ ${x_1}{\sigma _1}^2 = {x_n}{\sigma _n}^2$
So,
${x_2} = {x_1}\frac{{{\sigma _1}^2}}{{{\sigma _2}^2}}$, ${\ \ \ }$ ${x_3} = {x_1}\frac{{{\sigma _1}^2}}{{{\sigma _3}^2}}$, ${\ \ \ }$ ..., ${\ \ \ }$ ${x_n} = {x_1}\frac{{{\sigma _1}^2}}{{{\sigma _n}^2}}$
Substituting ${x_2},{x_3},...,{x_n}$ in the constraint $g\left( {{x_1},...,{x_n}} \right) = {x_1} + \cdot\cdot\cdot + {x_n} - 1 = 0$ gives
${x_1} + {x_1}\frac{{{\sigma _1}^2}}{{{\sigma _2}^2}} + {x_1}\frac{{{\sigma _1}^2}}{{{\sigma _3}^2}} + \cdot\cdot\cdot + {x_1}\frac{{{\sigma _1}^2}}{{{\sigma _n}^2}} - 1 = 0$
${x_1}\left( {1 + \frac{{{\sigma _1}^2}}{{{\sigma _2}^2}} + \frac{{{\sigma _1}^2}}{{{\sigma _3}^2}} + \cdot\cdot\cdot + \frac{{{\sigma _1}^2}}{{{\sigma _n}^2}}} \right) = 1$
${x_1}\left( {\frac{{{\sigma _1}^2}}{{{\sigma _1}^2}} + \frac{{{\sigma _1}^2}}{{{\sigma _2}^2}} + \frac{{{\sigma _1}^2}}{{{\sigma _3}^2}} + \cdot\cdot\cdot + \frac{{{\sigma _1}^2}}{{{\sigma _n}^2}}} \right) = 1$
${x_1}{\sigma _1}^2\left( {\frac{1}{{{\sigma _1}^2}} + \frac{1}{{{\sigma _2}^2}} + \frac{1}{{{\sigma _3}^2}} + \cdot\cdot\cdot + \frac{1}{{{\sigma _n}^2}}} \right) = 1$
${x_1} = \frac{1}{{{\sigma _1}^2}}{\left( {\frac{1}{{{\sigma _1}^2}} + \frac{1}{{{\sigma _2}^2}} + \frac{1}{{{\sigma _3}^2}} + \cdot\cdot\cdot + \frac{1}{{{\sigma _n}^2}}} \right)^{ - 1}}$
Write $c = {\left( {\mathop \sum \limits_{j = 1}^n {\sigma _j}^{ - 2}} \right)^{ - 1}}$. So, ${x_1} = \frac{c}{{{\sigma _1}^2}}$.
Substituting ${x_1} = \frac{c}{{{\sigma _1}^2}}$ in ${x_2} = {x_1}\frac{{{\sigma _1}^2}}{{{\sigma _2}^2}}$, ${x_3} = {x_1}\frac{{{\sigma _1}^2}}{{{\sigma _3}^2}}$, ..., ${x_n} = {x_1}\frac{{{\sigma _1}^2}}{{{\sigma _n}^2}}$ gives
${x_2} = \frac{c}{{{\sigma _2}^2}}$, ${\ \ \ }$ ${x_3} = \frac{c}{{{\sigma _3}^2}}$, ${\ \ \ }$ ..., ${\ \ \ }$ ${x_n} = \frac{c}{{{\sigma _n}^2}}$
So, the critical point is $\left( {\frac{c}{{{\sigma _1}^2}},\frac{c}{{{\sigma _2}^2}},...,\frac{c}{{{\sigma _n}^2}}} \right)$.
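As a quick numerical sanity check (illustrative only, with arbitrarily chosen sample values of ${\sigma _j}$, not part of the derivation), we can verify that the candidate point ${x_j} = c/{\sigma _j}^2$ does satisfy the constraint ${x_1} + \cdot\cdot\cdot + {x_n} = 1$:

```python
# Illustrative check: for sample nonzero sigma_j (an assumption for the
# demo), the point x_j = c / sigma_j^2 satisfies x_1 + ... + x_n = 1.
sigmas = [0.5, 1.0, 2.0, 3.0]           # sample nonzero sigma_j
c = 1.0 / sum(s ** -2 for s in sigmas)  # c = (sum_j sigma_j^{-2})^{-1}
x = [c / s ** 2 for s in sigmas]        # candidate critical point
print(sum(x))                           # ≈ 1, so the constraint holds
```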
Step 4. Calculate the critical values
We evaluate $f$ at the critical point:
$f\left( {\frac{c}{{{\sigma _1}^2}},\frac{c}{{{\sigma _2}^2}},...,\frac{c}{{{\sigma _n}^2}}} \right) = {\left( {\frac{c}{{{\sigma _1}^2}}} \right)^2}{\sigma _1}^2 + \cdot\cdot\cdot + {\left( {\frac{c}{{{\sigma _n}^2}}} \right)^2}{\sigma _n}^2$
$ = \frac{{{c^2}}}{{{\sigma _1}^2}} + \cdot\cdot\cdot + \frac{{{c^2}}}{{{\sigma _n}^2}}$
$ = {c^2}\left( {\frac{1}{{{\sigma _1}^2}} + \frac{1}{{{\sigma _2}^2}} + \frac{1}{{{\sigma _3}^2}} + \cdot\cdot\cdot + \frac{1}{{{\sigma _n}^2}}} \right)$
Since $c = {\left( {\mathop \sum \limits_{j = 1}^n {\sigma _j}^{ - 2}} \right)^{ - 1}}$, we have $\frac{1}{c} = \mathop \sum \limits_{j = 1}^n {\sigma _j}^{ - 2}$, and therefore $f\left( {\frac{c}{{{\sigma _1}^2}},\frac{c}{{{\sigma _2}^2}},...,\frac{c}{{{\sigma _n}^2}}} \right) = {c^2} \cdot \frac{1}{c} = c$.
Since $f\left( {{x_1},...,{x_n}} \right) = {x_1}^2{\sigma _1}^2 + \cdot\cdot\cdot + {x_n}^2{\sigma _n}^2$ is a sum of squares with positive coefficients, $f \to \infty $ as $\left\| {\left( {{x_1},...,{x_n}} \right)} \right\| \to \infty $ on the constraint set, so $f$ attains a minimum there, and the minimum occurs at the unique critical point. We conclude that $c$ is the minimum value of $f$ subject to ${x_1} + \cdot\cdot\cdot + {x_n} = 1$, where $c = {\left( {\mathop \sum \limits_{j = 1}^n {\sigma _j}^{ - 2}} \right)^{ - 1}}$.
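To corroborate that $c$ is the minimum value and not merely a critical value, a short numerical experiment (a sketch with arbitrarily chosen sample ${\sigma _j}$, not part of the proof) can compare $f$ at the critical point against $f$ at randomly sampled feasible points:

```python
import random

def f(x, sigmas):
    # f(x_1,...,x_n) = x_1^2 sigma_1^2 + ... + x_n^2 sigma_n^2
    return sum(xj ** 2 * s ** 2 for xj, s in zip(x, sigmas))

sigmas = [0.5, 1.0, 2.0, 3.0]              # sample nonzero sigma_j (assumption)
c = 1.0 / sum(s ** -2 for s in sigmas)     # c = (sum_j sigma_j^{-2})^{-1}
x_star = [c / s ** 2 for s in sigmas]      # the critical point found above
f_star = f(x_star, sigmas)                 # should equal c

# Sample random feasible points by rescaling so the coordinates sum to 1;
# f should never dip below f_star = c at any of them.
random.seed(0)
for _ in range(1000):
    y = [random.uniform(-2.0, 2.0) for _ in sigmas]
    t = sum(y)
    if abs(t) < 1e-6:
        continue                           # skip points we cannot rescale
    y = [yj / t for yj in y]               # now y_1 + ... + y_n = 1
    assert f(y, sigmas) >= f_star - 1e-9
```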