• More of my philosophy about the German model and about quality and more

    From Amine Moulay Ramdane@21:1/5 to All on Mon Sep 12 13:05:51 2022
    Hello,




    More of my philosophy about the German model and about quality and more of my thoughts..

I am a white Arab, and I think I am smart, since I have also invented many scalable algorithms and other algorithms..


I think I am highly smart, since I have passed two certified IQ tests and scored above 115 IQ, so I will ask the following philosophical question:


Why is Germany so successful in spite of its comparatively short working hours?


So I think the most important factors are the following:


Of course, the first factor is that Germany has good schools and vocational training - for everyone. This makes the average worker much more productive in terms of value added per hour.

    And the second "really" important factor is the following:

It is in the culture of Germany to focus on quality and on being effective (all the way back to Martin Luther and his Protestant work ethic). Higher quality at every step of the chain leads to a massive reduction in defects and rework, and this increases everyone's productivity. But notice that in my thoughts below I also speak about other ways to increase productivity, such as specialization, and that the German model's way of focusing on quality and effectiveness at every step of the chain, which leads to a massive reduction in defects and rework, is also implemented through methodologies such as quality control and Six Sigma, so read my following thoughts about them:

    More of my philosophy about quality control and more of my thoughts..

I have just looked at and quickly understood the following paper about SPC (Statistical Process Control):

    https://owic.oregonstate.edu/sites/default/files/pubs/EM8733.pdf


I think I am highly smart, but I think that the above paper doesn't speak about the fact that you can apply the central limit theorem as follows:

The central limit theorem states that the sampling distribution of the mean of independent, identically distributed random variables will be normal or nearly normal if the sample size is large enough.

    Also the above paper doesn't speak about the following very important things:

And I have quickly understood quality control with SPC (Statistical Process Control), and I have just discovered a smart pattern with my fluid intelligence: with SPC we can debug the process, as in software programming, by looking at its variability. If the variability does not follow a normal distribution, it means that there are defects in the process, and we say that special causes are producing those defects; and if the variability follows a normal distribution, we say that the process is stable and has only common causes, and it means that we can control it much more easily with control charts, which permit us to debug and control the variability, for example by changing the machines or robots and then measuring again with the control charts.
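
And here is a minimal sketch in FreePascal of that idea of the control chart, assuming that we already have the means of a few subgroups and using the usual 3-sigma limits around the center line; the data and the names are only illustrative, and a real X-bar chart, like in the above paper, would estimate sigma from the within-subgroup ranges using the tabulated control-chart constants:

// Minimal sketch (FreePascal): simplified 3-sigma limits for a chart of
// subgroup means. The data is invented for illustration; a full X-bar chart
// would estimate sigma from within-subgroup ranges instead.
program XBarLimits;
{$mode objfpc}
uses Math;
const
  SubgroupMeans: array[0..9] of Double =
    (10.1, 9.8, 10.3, 10.0, 9.9, 10.2, 10.1, 9.7, 10.0, 10.4);
var
  CenterLine, Sigma, UCL, LCL: Double;
begin
  CenterLine := Mean(SubgroupMeans);   // grand mean of the subgroup means
  Sigma := StdDev(SubgroupMeans);      // spread of the subgroup means
  UCL := CenterLine + 3.0 * Sigma;     // upper control limit
  LCL := CenterLine - 3.0 * Sigma;     // lower control limit
  WriteLn('Center line : ', CenterLine:0:3);
  WriteLn('UCL         : ', UCL:0:3);
  WriteLn('LCL         : ', LCL:0:3);
end.

So a subgroup mean that falls outside UCL or LCL is treated as a signal of a special cause that has to be investigated.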

    More of my philosophy about the Post Graduate Program on lean Six Sigma and more..

    More of my philosophy about Six Sigma and more..

I think I am smart, and now I will talk more about Six Sigma, since I have just talked about SPC (Statistical Process Control), so you have to know that Six Sigma needs to fulfill the following steps (the DMAIC phases):

1- Define the project goals and customer (external and internal)
deliverables.

2- Measure the process to determine current performance and
quantify the problem.

3- Analyze and determine the root cause(s) of the defects.

4- Improve the process by eliminating the defects.

5- Control future performance so the improved process doesn't degrade.


And you have to know that those steps are also important steps toward attaining ISO 9000 certification, and notice that you can use SPC (Statistical Process Control) and the control charts in the Analyze and Improve steps above.
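
And to make the Measure step more concrete, here is a small FreePascal sketch of the defect metrics that are usually computed in Six Sigma, that are DPMO (defects per million opportunities) and the yield; the input numbers are invented purely for illustration:

// Minimal sketch (FreePascal): DPMO and yield, the usual Measure-step metrics.
// The input numbers are invented for illustration.
program SixSigmaMetrics;
{$mode objfpc}
var
  Defects, Units, Opportunities: Integer;
  DPMO, YieldPct: Double;
begin
  Defects := 38;           // defects observed
  Units := 5000;           // units inspected
  Opportunities := 4;      // defect opportunities per unit
  DPMO := Defects / (Units * Opportunities) * 1e6;
  YieldPct := (1.0 - Defects / (Units * Opportunities)) * 100.0;
  WriteLn('DPMO  : ', DPMO:0:1);
  WriteLn('Yield : ', YieldPct:0:3, ' %');
end.

And notice that a Six Sigma process corresponds to about 3.4 DPMO.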

Other than that, I have just read the following interesting and important paper about SPC (Statistical Process Control) that explains the whole SPC process, so I invite you to read it
carefully:

    https://owic.oregonstate.edu/sites/default/files/pubs/EM8733.pdf

So as you notice in the above paper, the central limit theorem in mathematics is very important, but notice carefully that the necessary and important condition for the central limit theorem to work is that you use independent random variables. And notice in the above paper that you have to do two things: you have to reduce or eliminate the defects, and you have to control the "variability" of the process, and this is why the paper talks about how to construct a control chart. Other than that, the central limit theorem is not only related to SPC (Statistical Process Control); it is also related to PERT and to my PERT++ software project below. And notice that in my software project below that is called PERT++, I have provided you with two ways of estimating the critical path: first, by way of CPM (Critical Path Method), which shows all the arcs of the estimated critical path, and second, by way of the central limit theorem, using the inverse normal distribution function. And you have to provide my software project that is called PERT++ with three types of estimates, which are the following:

    Optimistic time - generally the shortest time in which the activity
    can be completed. It is common practice to specify optimistic times
    to be three standard deviations from the mean so that there is
    approximately a 1% chance that the activity will be completed within
    the optimistic time.

    Most likely time - the completion time having the highest
    probability. Note that this time is different from the expected time.

    Pessimistic time - the longest time that an activity might require. Three standard deviations from the mean is commonly used for the pessimistic time.
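
And here is a minimal FreePascal sketch of the classical three-point formulas that follow from those estimates, that is te = (o + 4m + p)/6 for the expected time and sigma = (p - o)/6 for the standard deviation (the latter because the optimistic and pessimistic times are taken three standard deviations from the mean); it is only a generic illustration with invented numbers, not code taken from my PERT++:

// Minimal sketch (FreePascal): classical PERT three-point estimate for one
// activity. Generic illustration only, not code taken from PERT++.
program ThreePointEstimate;
{$mode objfpc}
var
  Optimistic, MostLikely, Pessimistic: Double;
  ExpectedTime, Sigma: Double;
begin
  Optimistic := 4.0;    // days
  MostLikely := 6.0;
  Pessimistic := 11.0;
  // Beta-distribution approximation used by PERT.
  ExpectedTime := (Optimistic + 4.0 * MostLikely + Pessimistic) / 6.0;
  // With optimistic/pessimistic at +/- 3 standard deviations from the mean:
  Sigma := (Pessimistic - Optimistic) / 6.0;
  WriteLn('Expected time      : ', ExpectedTime:0:2, ' days');
  WriteLn('Standard deviation : ', Sigma:0:2, ' days');
end.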

And you can download my PERT++ by reading my following thoughts below:

    More of my philosophy about the central limit theorem and about my PERT++ and more..

The central limit theorem states that the sampling distribution of the mean of independent, identically distributed random variables will be normal or nearly normal if the sample size is large enough.

    How large is "large enough"?

In practice, some statisticians say that a sample size of 30 is large enough when the population distribution is roughly bell-shaped. Others recommend a sample size of at least 40. But if the original population is distinctly not normal (e.g., is badly skewed, has multiple peaks, and/or has outliers), researchers like the sample size to be even larger. So I invite you to read my following thoughts about my software project that is called PERT++, and notice that PERT networks are referred to by some researchers as "probabilistic activity networks" (PAN), because the durations of some or all of the arcs are independent random variables with known probability distribution functions and finite ranges. So PERT uses the central limit theorem (CLT) to find the expected project duration.
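
And here is a minimal FreePascal sketch that shows this "large enough" idea by simulation, assuming a badly skewed exponential population with mean 1 and samples of size 40; if the central limit theorem applies, roughly 95% of the sample means should fall within two standard errors of the population mean:

// Minimal sketch (FreePascal): sampling distribution of the mean for a
// skewed (exponential) population. With samples of size 40 the means are
// close to normal: roughly 95% of them fall within +/- 2 standard errors.
program CLTDemo;
{$mode objfpc}
const
  SampleSize = 40;     // size recommended when the population is skewed
  NumSamples = 20000;  // number of sample means to draw
var
  i, j, Within: Integer;
  Sum, SampleMean, StdError: Double;
begin
  Randomize;
  StdError := 1.0 / Sqrt(SampleSize); // exponential(1): mean 1, sigma 1
  Within := 0;
  for i := 1 to NumSamples do
  begin
    Sum := 0.0;
    for j := 1 to SampleSize do
      Sum := Sum - Ln(1.0 - Random); // exponential(1) variate
    SampleMean := Sum / SampleSize;
    if Abs(SampleMean - 1.0) <= 2.0 * StdError then
      Inc(Within);
  end;
  WriteLn('Fraction of sample means within +/- 2 SE: ',
          (Within / NumSamples):0:3, '  (about 0.95 if nearly normal)');
end.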

And as you are noticing, this central limit theorem is also very important for quality control; read the following to notice it (I have also understood Statistical Process Control (SPC)):

    An Introduction to Statistical Process Control (SPC)

    https://www.engineering.com/AdvancedManufacturing/ArticleID/19494/An-Introduction-to-Statistical-Process-Control-SPC.aspx


So, I have designed and implemented my PERT++, which is important for quality; please read about it and download it from my website here:

    https://sites.google.com/site/scalable68/pert-an-enhanced-edition-of-the-program-or-project-evaluation-and-review-technique-that-includes-statistical-pert-in-delphi-and-freepascal

    ---


    So I have provided you in my PERT++ with the following functions:


    function NormalDistA (const Mean, StdDev, AVal, BVal: Extended): Single;

    function NormalDistP (const Mean, StdDev, AVal: Extended): Single;

    function InvNormalDist(const Mean, StdDev, PVal: Extended; const Less: Boolean): Extended;

For NormalDistA() or NormalDistP(), you pass the best estimate of completion time to Mean and the critical path standard deviation to StdDev, and you will get the probability of the value AVal or the probability between the values AVal and BVal.

For InvNormalDist(), you pass the best estimate of completion time to Mean and the critical path standard deviation to StdDev, and you will get the length of the critical path for the probability PVal; and when Less is TRUE, you will obtain a cumulative distribution.
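
And those three functions above come from my PERT++ itself; so as a self-contained illustration of the same central limit theorem idea, here is a minimal FreePascal sketch that sums the expected durations and the variances of the activities along a given critical path and then uses a standard normal CDF approximation (Abramowitz-Stegun) to get the probability of finishing by a deadline, and the names, the activity numbers and the deadline are only illustrative assumptions, they are not the code of PERT++:

// Minimal sketch (FreePascal): probability of completing a project by a
// deadline, using the central limit theorem along the critical path.
// Names and numbers are illustrative; this is not code from PERT++.
program PathProbability;
{$mode objfpc}

const
  // (expected time, standard deviation) of each critical-path activity,
  // e.g. obtained from the three-point formulas te=(o+4m+p)/6, s=(p-o)/6.
  ActMean: array[0..3] of Double = (6.2, 4.0, 8.5, 3.3);
  ActSigma: array[0..3] of Double = (0.8, 0.5, 1.2, 0.4);

// Standard normal CDF via the Abramowitz-Stegun erf approximation 7.1.26.
function NormalCDF(Z: Double): Double;
const
  P = 0.3275911;
  A1 = 0.254829592; A2 = -0.284496736; A3 = 1.421413741;
  A4 = -1.453152027; A5 = 1.061405429;
var
  X, T, Erf: Double;
begin
  X := Abs(Z) / Sqrt(2.0);
  T := 1.0 / (1.0 + P * X);
  Erf := 1.0 - (((((A5 * T + A4) * T + A3) * T + A2) * T + A1) * T) * Exp(-X * X);
  if Z >= 0.0 then
    Result := 0.5 * (1.0 + Erf)
  else
    Result := 0.5 * (1.0 - Erf);
end;

var
  i: Integer;
  PathMean, PathVar, Deadline, Z: Double;
begin
  PathMean := 0.0;
  PathVar := 0.0;
  // CLT: the path duration is approximately normal with these parameters.
  for i := Low(ActMean) to High(ActMean) do
  begin
    PathMean := PathMean + ActMean[i];
    PathVar := PathVar + Sqr(ActSigma[i]);
  end;
  Deadline := 24.0; // days
  Z := (Deadline - PathMean) / Sqrt(PathVar);
  WriteLn('Expected duration : ', PathMean:0:2, ' days');
  WriteLn('Path sigma        : ', Sqrt(PathVar):0:2, ' days');
  WriteLn('P(finish by ', Deadline:0:1, ' days) = ', NormalCDF(Z):0:3);
end.

With those illustrative numbers, the expected duration is 22.0 days, the path sigma is about 1.58 days, and the probability of finishing within 24 days comes out near 0.90.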


So as you are noticing from my above thoughts, since PERT networks are referred to by some researchers as "probabilistic activity networks" (PAN), because the durations of some or all of the arcs are independent random variables with known probability distribution functions and finite ranges, PERT uses the central limit theorem (CLT) to find the expected project duration. So then you have to use my above functions, which are the normal distribution and inverse normal distribution functions; please look at my demo inside my zip file to understand better how I am doing it.

You can download and read about my PERT++ from my website here:

https://sites.google.com/site/scalable68/pert-an-enhanced-edition-of-the-program-or-project-evaluation-and-review-technique-that-includes-statistical-pert-in-delphi-and-freepascal

  • From "I don't know@21:1/5 to Amine Moulay Ramdane on Sat Oct 15 11:20:26 2022
Everything is as developed as it can be for the moment. If someone has a lot of free time, then he can think and develop things to a fantastic or very great quality.




    On Monday, September 12, 2022 at 11:05:54 PM UTC+3, Amine Moulay Ramdane wrote:

I think I am smart, and I invite you to read carefully the following webpage of Alan Robinson, Professor of Operations Management at the University of Massachusetts, a full-time professor at the Isenberg School of Management of UMass and a consultant and book author specializing in managing ideas (idea generation and idea-driven organizations) and in building high-performance organizations, creativity, innovation, quality, and lean management:

https://www.simplilearn.com/pgp-lean-six-sigma-certification-training-course?utm_source=google&utm_medium=cpc&utm_term=&utm_content=11174393172-108220153863-506962883161&utm_device=c&utm_campaign=Display-MQL-DigitalOperationsCluster-PG-QM-CLSS-UMass-YTVideoInstreamCustomIntent-US-Main-AllDevice-adgroup-QM-Desktop-CI&gclid=Cj0KCQiA3rKQBhCNARIsACUEW_ZGLHcUP2htLdQo46zP6Eo2-vX0MQYvc-o6GQP55638Up4tex85RBEaArn9EALw_wcB


And notice in the above webpage of the professor that he is giving a Post Graduate Program in Lean Six Sigma and on agile methodology, and I think that this Post Graduate Program is easy for me, since I am really smart and I can easily understand Lean Six Sigma or Six Sigma, and I can easily understand agile methodology. And notice that in my thoughts below I am also explaining much more smartly what agile methodology is, and I think that the more difficult part of Six Sigma or Lean Six Sigma is to understand the central limit theorem and to understand what SPC (Statistical Process Control) is and how to use the control charts to control the variability of the process, and notice that I am talking about it in my thoughts below, but I think that the rest of Lean Six Sigma and Six Sigma is easy for me.


    More of my philosophy about IQ tests and more of my thoughts..


I think I am highly smart, and I have passed two certified IQ tests and scored above 115 IQ, but I have just passed more and more IQ tests, and I have noticed that the manner in which we test with IQ tests is not correct, since in an IQ test you can be more specialized and above average in one subtest of intelligence than in another subtest inside the IQ test, because IQ tests test for many kinds of intelligence, such as spatial, speed, calculation, and logical intelligence. So I think that you can be really above average in logical intelligence, as I am really above average in logical intelligence, but at the same time be below average in calculation and/or spatial intelligence. And since an IQ test doesn't test for this kind of specialization of intelligence, I think it is not good, because testing for this kind of specialization of intelligence is really important so that we can be efficient by knowing the strong advantages of this or that person in every type of intelligence. And about the importance of specialization, read carefully my following thought about it:


    More of my philosophy about specialization and about efficiency and productivity..

Larry Culp, the CEO of General Electric, is the architect of a strategy that represents a new turning point in world corporate strategies: Larry Culp's strategy was to divide the company according to its activities. Something like: we are better off alone, separately, each focused on its own activity, than together in a large conglomerate. And it is a move from integration to specialization. You see, it is thought that a company always gains economies of scale as it grows, but this is not necessarily the case, since as the company gains in size - especially if it engages in many activities - it also generates its own bureaucracy, with all that entails in terms of cost and efficiency. And not only that, it is also often the case that by bringing together very different activities, strategic focus is lost and decision-making is diluted, so that in the end no one ends up taking responsibility. It doesn't always happen, but these reasons are basically what is driving this increasing specialization. So I invite you to look at the following video so that you understand more about it:

    The decline of industrial icon of the US - VisualPolitik EN

    https://www.youtube.com/watch?v=-hqwYxFCY-k


And here are my previous thoughts about specialization and productivity so that you understand much more:

    More about the Japanese Ikigai and about productivity and more of my thoughts..

    Read the following interesting article about Japanese Ikigai:

    The More People With Purpose, the Better the World Will Be

    https://singularityhub.com/2018/06/15/the-more-people-with-purpose-the-better-the-world-will-be/

I think I am highly smart, so I say that the Japanese Ikigai is like a Japanese philosophy that is the right combination or "balance" of passion, vocation, and mission, and Ikigai and MTP, as concepts, urge us to align our passions with a mission to better the world. But I think that Japanese Ikigai is also smart since it gets the "passion" from the "mission", since the mission is also the engine, so you have to align the passion with the mission of the country or of the global world so as to be efficient. And Japanese Ikigai is also smart since, in order to raise productivity and be efficient, you have to "specialize" in doing a job, but in order to raise productivity even more and be even more efficient, you can also specialize in what you do "better", and that is what Japanese Ikigai does, since I think that in Japanese Ikigai the passion is what makes you specialize, in a job, in what you do better. And here is what I have just smartly said about productivity:

I think I am highly smart, and I have passed two certified IQ tests and scored above 115 IQ, and I will now talk about another important idea of Adam Smith, the father of economic liberalism, and it is about "specialization" in an economic system. I say that in an economic system we have to be specialized in doing a job so as to be efficient and productive, but not only that: we have to specialize in doing a job in what we do better so as to be even more efficient and productive, and we have to minimize as much as possible the idle time or wasted time while doing a job, since I can also say that this average idle time or wasted time of the workers working in parallel can be viewed as contention, like in parallel programming, so you have to minimize it as much as possible; and you also have to minimize the coherency traffic, like in parallel programming, so as to scale much better. And of course all this can create an economy of scale, and I also invite you to read my following smart and interesting thoughts about the scalability of productivity:

I will talk about the following thoughts from the following PhD computer scientist:

    https://lemire.me/blog/about-me/

    Read more carefully here his thoughts about productivity:

    https://lemire.me/blog/2012/10/15/you-cannot-scale-creativity/

And I think he is making a mistake in his above webpage about productivity:

Since we have that Productivity = Output/Input

But better human training and/or better tools and/or better human smartness and/or better human capacity can make the parallel productivity part much bigger than the serial productivity part, so it can scale much more (it is like Gustafson's Law).

    And it looks like the following:

    About parallelism and about Gustafson’s Law..

    Gustafson’s Law:

    • If you increase the amount of work done by each parallel
    task then the serial component will not dominate
    • Increase the problem size to maintain scaling
    • Can do this by adding extra complexity or increasing the overall
    problem size

    Scaling is important, as the more a code scales the larger a machine it
    can take advantage of:

• can consider weak and strong scaling
• in practice, overheads limit the scalability of real parallel programs
• Amdahl’s law models these in terms of serial and parallel fractions
• larger problems generally scale better: Gustafson’s law


    Load balance is also a crucial factor.
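
And here is a minimal FreePascal sketch, with purely illustrative numbers, that contrasts Amdahl's fixed-size speedup 1/(s + p/N) with Gustafson's scaled speedup s + p*N, where s is the serial fraction, p = 1 - s is the parallel fraction, and N is the number of processors:

// Minimal sketch (FreePascal): Amdahl's fixed-size speedup versus
// Gustafson's scaled speedup for a serial fraction s and N processors.
program SpeedupLaws;
{$mode objfpc}
var
  S, P, Amdahl, Gustafson: Double;
  N: Integer;
begin
  S := 0.05;        // serial fraction
  P := 1.0 - S;     // parallelizable fraction
  N := 64;
  Amdahl := 1.0 / (S + P / N);        // problem size fixed
  Gustafson := S + P * N;             // problem size grows with N
  WriteLn('Amdahl    speedup on ', N, ' processors: ', Amdahl:0:2);
  WriteLn('Gustafson speedup on ', N, ' processors: ', Gustafson:0:2);
end.

So with a serial fraction of 5%, Amdahl's law caps the fixed-size speedup near 15 on 64 processors, while Gustafson's scaled speedup is about 61, and this is why growing the problem size scales so much better.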


And read my following thoughts about the Evolutionary Design methodology, which is so important to understand:

And I invite you to look at step 4 of my thoughts below on the software Evolutionary Design methodology with agile; here it is:

    4- When in agile a team breaks a project into phases, it’s called incremental development. An incremental process is one in which
    software is built and delivered in pieces. Each piece, or increment, represents a complete subset of functionality. The increment may be
    either small or large, perhaps ranging from just a system’s login
    screen on the small end to a highly flexible set of data management
screens. Each increment is fully coded, using Sprints, Planning, and Retrospectives.

And you will notice that it has to be done by "prioritizing" the pieces of the software to be delivered to the customers, and here again in agile you notice that we are also delivering prototypes of the software, since we often associate prototypes with nearly completed or just-before-launch versions of products. However, designers create prototypes at all phases of the design process, at various resolutions. In engineering, students are taught to think deeply before setting out to build, and practitioners do so. However, as the product or system becomes increasingly complex, it becomes increasingly difficult to consider all factors while designing. Facing this reality, designers are no longer just "thinking to build" but also "building to think". By getting hands-on and trying to create prototypes, unforeseen issues are highlighted early, saving costs related to late-stage design changes. This rapid iterative cycle of thinking and building is what allows designers to learn rapidly from doing. Creating interfaces often benefits from the "build to think" approach. For example, in trying to lay out an automotive cockpit, one can simply list all the features, buttons, and knobs that must be incorporated; however, only by prototyping the cabin does one really start to think about how the layout should appear to the driver in order to avoid confusion while maximizing comfort. This then allows the designer to iterate on the initial concept and develop something that is more intuitive and refined. Also, prototypes and their demonstrations are designed to get potential customers interested and excited.

    More of my philosophy about the Evolutionary Design methodology and more..

    Here are some important steps of software Evolutionary Design methodology:

1- By taking a little extra time during the project to write solid code and
fix problems today, the team creates a codebase that's easy to maintain tomorrow.

    2- And the most destructive thing you can do to your project is to build
    new code, and then build more code that depends on it, and then still
    more code that depends on that, leading to that painfully familiar
    domino effect of cascading changes...and eventually leaving you with
    an unmaintainable mess of spaghetti code. So when teams write code,
    they can keep their software designs simple by creating software
    designs based on small, self-contained units (like classes, modules, services, etc.) that do only one thing; this helps avoid the domino
    effect.

    3- Instead of creating one big design at the beginning of the project
    that covers all of the requirements, agile architects use incremental design, which involves techniques that allow them to design a system
    that is not just complete, but also easy for the team to modify as
    the project changes.

    4- When in agile a team breaks a project into phases, it’s called incremental development. An incremental process is one in which
    software is built and delivered in pieces. Each piece, or increment, represents a complete subset of functionality. The increment may be
    either small or large, perhaps ranging from just a system’s login
    screen on the small end to a highly flexible set of data management
screens. Each increment is fully coded, using Sprints, Planning, and Retrospectives.

    5- And an iterative process in agile is one that makes progress through successive refinement. A development team takes a first cut
    at a system, knowing it is incomplete or weak in some (perhaps many)
    areas. They then iteratively refine those areas until the product is satisfactory. With each iteration the software is improved through
    the addition of greater detail.

More of my philosophy about Democracy and the Evolutionary Design methodology..

I will make a logical analogy between software projects and Democracy. First I will say that, because of the big complexity of today's software projects, the "requirements" of those complex software projects are not clear and a lot can change in them, so this is why we are using an Evolutionary Design methodology with different tools such as Unit Testing, Test Driven Development, Design Patterns, Continuous Integration, and Domain Driven Design. But we have to notice carefully that an important thing in the Evolutionary Design methodology is that when those complex software projects grow, we first have to normalize their growth by ensuring that the complex software projects grow "nicely" and "balanced" by using standards; second, we have to optimize the growth of the complex software projects by balancing between the criterion of how easy it is to change the complex software projects and the performance of the complex software projects; and third, you have to maximize the growth of the complex software projects by making the most out of each optimization. And I think that by logical analogy we can notice that in Democracy we also have to normalize the growth by not allowing "extremism" or extremist ideologies that hurt Democracy, and we also have to optimize Democracy by, for example, well balancing between the "performance" of the society in the Democracy and the "reliability" of helping others, like the weakest members of the society among the people who, of course, respect the laws.


More of my philosophy about the importance of randomness in
the genetic algorithm and in the evolutionary algorithms and more
of my thoughts..

    More of my philosophy about the genetic algorithm and about artificial intelligence and more of my thoughts..

I think I am highly smart, so I will ask the following philosophical question about the genetic algorithm:

Is the genetic algorithm a brute-force search, and if it is not, how is it different from a brute-force search?

So I have just quickly taken a look at an example of a minimization problem with a genetic algorithm, and I think that the genetic algorithm is not a brute-force search, since I think that when, in a minimization problem with a genetic algorithm, you do a crossover, also called recombination, which is a genetic operator used to combine the genetic information of two parents to generate new offspring, the genetic algorithm has a tendency to also explore locally, and we call it exploitation; and when the genetic algorithm does genetic mutations with some level of probability, the genetic algorithm has a tendency to explore globally, and we call it exploration. So I think a good genetic algorithm is one that efficiently balances exploration and exploitation so as to avoid premature convergence, and notice that when you explore locally and globally you can do it with a bigger population, which makes the search faster, so this is why I think the genetic algorithm has the kind of patterns that make it a much better search than a brute-force search. And so as to know more about this kind of artificial intelligence, I invite you to read my following thoughts at the following web link about evolutionary algorithms and artificial intelligence so that you understand more:

    https://groups.google.com/g/alt.culture.morocco/c/joLVchvaCf0
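
And to make this balance between exploitation and exploration concrete, here is a compact FreePascal sketch of a real-valued genetic algorithm that minimizes f(x, y) = x^2 + y^2 on [-10, 10]^2: the arithmetic crossover recombines two parents and so it exploits locally, while the occasional random mutation explores globally, and the fitness function, the rates and the population size are only illustrative choices, they are not a specific algorithm from the above link:

// Minimal sketch (FreePascal): a real-valued genetic algorithm minimizing
// f(x,y) = x^2 + y^2 on [-10,10]^2. Crossover exploits (recombines parents),
// mutation with a small probability explores. All parameters are illustrative.
program TinyGA;
{$mode objfpc}

const
  PopSize = 60;
  Dims = 2;
  Generations = 200;
  MutationRate = 0.1;    // probability of mutating each gene
  MutationStep = 1.0;    // size of a random mutation
  Lo = -10.0;
  Hi = 10.0;

type
  TIndividual = array[0..Dims - 1] of Double;
  TPopulation = array[0..PopSize - 1] of TIndividual;

function Fitness(const Ind: TIndividual): Double;
var
  d: Integer;
begin
  Result := 0.0;
  for d := 0 to Dims - 1 do
    Result := Result + Sqr(Ind[d]);   // smaller is better
end;

// Pick the better of two random individuals (tournament selection).
function SelectParent(const Pop: TPopulation): TIndividual;
var
  a, b: Integer;
begin
  a := Random(PopSize);
  b := Random(PopSize);
  if Fitness(Pop[a]) < Fitness(Pop[b]) then
    Result := Pop[a]
  else
    Result := Pop[b];
end;

var
  Pop, NextPop: TPopulation;
  Parent1, Parent2, Child: TIndividual;
  i, d, g, BestIdx: Integer;
  Alpha: Double;
begin
  Randomize;
  // Random initial population.
  for i := 0 to PopSize - 1 do
    for d := 0 to Dims - 1 do
      Pop[i][d] := Lo + Random * (Hi - Lo);

  for g := 1 to Generations do
  begin
    for i := 0 to PopSize - 1 do
    begin
      Parent1 := SelectParent(Pop);
      Parent2 := SelectParent(Pop);
      // Arithmetic crossover: blend the two parents (exploitation).
      Alpha := Random;
      for d := 0 to Dims - 1 do
        Child[d] := Alpha * Parent1[d] + (1.0 - Alpha) * Parent2[d];
      // Mutation: occasional random jump (exploration).
      for d := 0 to Dims - 1 do
        if Random < MutationRate then
          Child[d] := Child[d] + (Random - 0.5) * 2.0 * MutationStep;
      NextPop[i] := Child;
    end;
    Pop := NextPop;
  end;

  // Report the best individual of the final population.
  BestIdx := 0;
  for i := 1 to PopSize - 1 do
    if Fitness(Pop[i]) < Fitness(Pop[BestIdx]) then
      BestIdx := i;
  WriteLn('Best fitness after ', Generations, ' generations: ',
          Fitness(Pop[BestIdx]):0:6);
end.

And rerunning it several times, or raising the mutation rate to give more time to the exploration, is how you lower the risk of a premature convergence to a local minimum.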

    More of my philosophy about the other conditions of the genetic algorithm and about artificial intelligence and more of my thoughts..
I think I am highly smart, and I think that the genetic algorithm is interesting too, but I have to speak about one other very important thing about the genetic algorithm, so I will ask a philosophical question about it:

Since, as I just said previously (read it below), a good genetic algorithm has to efficiently balance between global (exploration) and local (exploitation) search, how can you be sure that you have found a global optimum?

I think I am smart, and I will say that it also depends on the kind of problem, so if for example we have a minimization problem, you can rerun the genetic algorithm a number of times so as to select the best minimum among all the results, and you can also give more time to the exploration so as to find a better result. Also, you have to know that the genetic algorithm can be made more elitist in the crossover steps, but I think that this kind of elitism can have the tendency to not efficiently raise the average fitness of the members of the population, so it depends on which problem you want to use the genetic algorithm for. Also, I think that the genetic algorithm is a model that explains where humans come from, since I also think that the genetic mutations of humans, which happen with a probability, have not only come from inside the body, from the chromosomes and genes, but were also the result of solar storms that, as NASA has said, may have been key to life on Earth; read here so that you notice it:

    https://www.nasa.gov/feature/goddard/2016/nasa-solar-storms-may-have-been-key-to-life-on-earth

I think I am highly smart, and I will invite you to read my following smart thoughts about evolutionary algorithms and artificial intelligence so that you notice how I am talking about the very important thing that we call "randomness":

    https://groups.google.com/g/alt.culture.morocco/c/joLVchvaCf0


So I think I am highly smart, and notice that I am saying in the above web link the following about evolutionary algorithms:

    "I think that Modern trends in solving tough optimization problems tend
    to use evolutionary algorithms and nature-inspired metaheuristic
    algorithms, especially those based on swarm intelligence (SI), two major characteristics of modern metaheuristic methods are nature-inspired, and
    a balance between randomness and regularity."

So I think that in the genetic algorithm there is a part that is hard-coded, like selecting the best genes, and I think that it is what we call regularity, since it is hard-coded like that, but there is a very important thing in the genetic algorithm that we call randomness, and I think that it is the genetic mutations that happen with a probability and that give a kind of diversity, so I think that these genetic mutations are really important. Since I can for example say that if the best genes are the ones that use "reason", then reason too can make the people that have the tendency to use reason do a thing that is against their survival, like going to war when we feel that there is too much risk, but this going to war can make the members or people that use reason so as to attack the other enemy become extinct when they lose a war, and this is the basis of randomness in a genetic algorithm, since even when there is a war between, for example, two ant colonies, there are some members that do not make war and that can survive if the others become extinct by making war, and I say that this also comes from the randomness of genetics.

    More of my philosophy about student performance and about artificial intelligence and more of my thoughts..


I have just read the following interesting article from McKinsey, and
I invite you to read it carefully:

    Drivers of student performance: Asia insights

    https://www.mckinsey.com/industries/education/our-insights/drivers-of-student-performance-asia-insights

And I think I am smart, and I think that the following factors in the above article that influence student performance are not so difficult to implement:

    1- Students who receive a blend of inquiry-based and teacher-directed instruction have the best outcomes

    2- School-based technology yields the best results when placed in the
    hands of teachers

    3- Early childhood education has a positive impact on student scores,
    but the quality and type of care is important

But I think that the factor that is tricky to implement (since it needs good smartness) is good motivation calibration, which permits students to score 8 to 14 percent higher on the science test than poorly calibrated students, along with high self-identified motivation, which permits them to score 6 to 8 percent higher.

More of my philosophy about what Americans think of the Metaverse
and more of my thoughts..

    Surprise! 78 percent of Americans think Metaverse is just hype


    [continued in next message]

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)