I will make the USL methodology clear...
From my previous proof I can say that the USL methodology, which uses
nonlinear regression, is like playing a probability game: with a much
greater probability the fit will hit and gives us the possibility of
forecasting up to 10X the maximum number of cores and threads of the
performance data measurements, so it is a better approximation; and
with a much smaller probability the fit will hit and gives us the
possibility of forecasting only up to 5X the maximum number of cores
and threads of the performance data measurements.
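To make the methodology concrete, here is a minimal sketch (not the author's actual USL programs) that fits Gunther's Universal Scalability Law, C(N) = N / (1 + sigma*(N-1) + kappa*N*(N-1)), to measured throughput data. A simple grid search stands in for true nonlinear regression, and the measurement values and search ranges are hypothetical:

```python
def usl(n, sigma, kappa):
    # Relative capacity at concurrency n under Gunther's USL model.
    return n / (1.0 + sigma * (n - 1) + kappa * n * (n - 1))

# Hypothetical measurements: (cores, normalized throughput).
data = [(1, 1.0), (2, 1.9), (4, 3.5), (8, 5.8), (16, 8.0)]

def fit_usl(data, steps=200):
    # Least-squares grid search over sigma in [0, 0.2) and
    # kappa in [0, 0.01) -- a crude stand-in for nonlinear regression.
    best = (float("inf"), 0.0, 0.0)
    for i in range(steps):
        sigma = 0.2 * i / steps
        for j in range(steps):
            kappa = 0.01 * j / steps
            err = sum((usl(n, sigma, kappa) - c) ** 2 for n, c in data)
            if err < best[0]:
                best = (err, sigma, kappa)
    return best[1], best[2]

sigma, kappa = fit_usl(data)
n_max = max(n for n, _ in data)

# Forecast only out to 10X the largest measured core count --
# the limit of the methodology argued above.
forecast = [(n, usl(n, sigma, kappa)) for n in (2 * n_max, 5 * n_max, 10 * n_max)]
```

With real data you would replace the grid search by a proper nonlinear regression solver, but the forecasting step is the same: evaluate the fitted curve no farther than 10X the largest measured concurrency.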
So forecasting up to 10X the maximum number of cores and threads
of the performance data measurements is the limit with the USL
methodology. If you want to optimize the criterion of cost, forecast
up to 10X the maximum number of cores and threads of your measurements
and look at the tendency. If it says that you can scale more and more
on, for example, a NUMA architecture, then when you buy bigger NUMA
systems, make sure you buy them with the right configuration that
permits adding more processors and more memory. Then go step by step,
buying more processors and memory, and at each step you will be able
to test the NUMA system that you have bought empirically with my USL
programs, to forecast the scalability again farther out and to
optimize the criterion of cost even more. So as you have noticed,
my USL programs are great and important tools!
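One way to read the "tendency" from a fitted model is to compute the concurrency at which the USL curve peaks, N* = sqrt((1 - sigma) / kappa): if the peak lies well beyond your current configuration but still inside the 10X forecast window, the model suggests adding more processors is worthwhile; if it lies outside that window, re-measure on the bigger system and refit before buying more. A hedged sketch, assuming hypothetical fitted coefficients:

```python
import math

# Hypothetical fitted USL coefficients
# (sigma = contention, kappa = coherency delay).
sigma, kappa = 0.05, 0.001

def usl(n, sigma, kappa):
    # Relative capacity at concurrency n under the USL model.
    return n / (1.0 + sigma * (n - 1) + kappa * n * (n - 1))

# The USL throughput curve peaks at N* = sqrt((1 - sigma) / kappa).
n_star = math.sqrt((1.0 - sigma) / kappa)

n_measured_max = 16            # largest core count actually measured (example)
forecast_limit = 10 * n_measured_max

# Only trust the predicted peak if it falls inside the 10X forecast
# window; otherwise it is an extrapolation beyond the methodology's limit.
peak_is_trustworthy = n_star <= forecast_limit
```

In this example the curve peaks near 31 cores, which is inside the 160-core forecast window, so the fitted model's advice about scaling further can be taken seriously; with other coefficients the peak could land outside the window, which is the signal to buy incrementally and refit at each step, as described above.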
I have included the 32-bit and 64-bit Windows executables of my
programs inside the zip file to make the job easier for you.
You can download my USL programs version 3.0 with the source code from:
Amine Moulay Ramdane.
--- SoupGate-Win32 v1.05
* Origin: fsxNet Usenet Gateway (21:1/5)