Mason will always run the optimization block before the statistical block.

Generally, statistical goals should be relaxed from the optimization requirements.

```xml
<statistics>
  <random iterations="100"></random>
  <goal calculation="20*log10(abs(S_1_1))"
        min="-200"
        max="-16"
        weight="1"></goal>
</statistics>
```
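As a hypothetical numeric illustration of "relaxed": if the optimization block drove |S11| below a tighter limit, the statistical goal would only verify a looser one. The -18 dB optimization limit below is invented for this example; only the -16 dB statistical goal appears in the block above.

```xml
<!-- optimization goal: drive |S11| below -18 dB (hypothetical limit) -->
<goal calculation="20*log10(abs(S_1_1))" min="-200" max="-18" weight="1"></goal>

<!-- statistical goal: only check the relaxed -16 dB requirement -->
<goal calculation="20*log10(abs(S_1_1))" min="-200" max="-16" weight="1"></goal>
```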


Look at the device section to see how to embed optimization limits for particular parameters.


### TinyCAD

The statistical block looks as follows in the GUI: essentially everything in the TinyCAD component is pasted into the Mason file, following a fairly logical progression.

When you click on the statistical component, the following window pops up. Initially, only one Optimizer and one Argument are defined. Below, a second optimizer has been added using the Add button in the bottom left. Note that when optimizing, it makes sense to run a random search followed by a simplex to search the space efficiently. When doing statistical work, it makes sense to run a simplex to improve a (hopefully already optimized) design, and then a random (Monte Carlo) run to check the optimization.
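The simplex-then-random ordering for statistical work can be sketched as a statistics block. The `<simplex>` element name and its attribute spellings are assumed here from the routine inputs listed under Details; check a generated Mason file for the exact syntax.

```xml
<statistics>
  <!-- hypothetical: refine the (already optimized) design first,
       scoring each simplex step by the Monte Carlo Fail Rate -->
  <simplex iterations="3" monte_samples="100"></simplex>

  <!-- then a larger Monte Carlo run to check the result -->
  <random iterations="500"></random>
</statistics>
```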

### Details

The following are valid stat routines:

- random
  - Description: perform a Monte Carlo run
  - Inputs
    - iterations: sampling size of the Monte Carlo
- simplex (see validation16.txt)
  - Description: simplex is a non-linear gradient search
    - Performs a simplex using the Fail Rate from Monte Carlo runs as the figure of merit
    - Able to optimize over both linear and discrete values
    - Approximates the deterministic solution if sufficient Monte Carlo iterations are used
  - Inputs
    - iterations: maximum number of simplex runs to use. Because the Monte Carlo approach makes the Fail Rate noisy, the simplex can find an improvement without reaching a local minimum; unlike the optimization version of simplex, where multiple iterations add little, multiple iterations do help here.
    - monte_samples: number of iterations in the Monte Carlo used to generate the figure of merit (not the number of simplex iterations)
    - tolerance (default 1e-6): stop the optimizer when the change in the figure of merit between runs drops below this value. The value is compared against the current gradient step divided by the allowable spread: delta/(opt_max-opt_min)
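Putting the simplex inputs together, a statistics block might look like the following sketch. The element name and attribute spellings are assumed from the input names above, and the numbers are illustrative only.

```xml
<statistics>
  <!-- hypothetical: up to 5 simplex passes, each scored by a 200-sample
       Monte Carlo Fail Rate, stopping early once the normalized step
       delta/(opt_max-opt_min) drops below 1e-4 -->
  <simplex iterations="5" monte_samples="200" tolerance="1e-4"></simplex>
</statistics>
```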