
    From Amine Moulay Ramdane@21:1/5 to All on Thu Dec 15 16:50:02 2022
    Hello,


    More of my philosophy about natural selection and about the average IQ of Arabs and more of my thoughts..

    I am a white Arab from Morocco, and I think I am smart, since I have also invented many scalable algorithms..



    I think that the average IQ of Arabs will increase by around 6 points per decade, so it will take around two decades for the average IQ of Arabs to equal the average IQ of western white Europeans, and read my following thoughts so as to understand why:

    I think I am highly smart, since I have passed two certified IQ tests and I have scored "above" 115 IQ, and I mean that it is "above", so I think that my approximation of the genetic smartness of Arabs is correct. But you will, for example, ask why I have not taken into account the process of natural selection, so I think that, so as to make a good approximation of the genetic smartness of Arabs, you have to look at some other variables that are not natural selection, and here they are:

    So the GDP per capita of the USA in the year 1965 was 3828 US dollars; notice that this is around today's GDP per capita of Morocco and of many Arab countries, so read more here in the following web page so as to notice it:

    https://www.macrotrends.net/countries/USA/united-states/gdp-per-capita


    Other than that, the average IQ of the USA has increased by 15 IQ points from the year 1965 to today due to the growth of GDP per capita, since today's GDP per capita of the USA is 69288 US dollars. So I think this tendency of the average IQ to rise by 15 IQ points shows that the average IQ of the Arabs of Morocco and of other Arab countries will also rise by around 15 points due to growth in GDP per capita. So I think that the genetic average IQ of Arabs is not problematic, since I think that the genetic smartness of Arabs is the same as the genetic smartness of western white Europeans.

    Why IQs Rise When Nations Experience Rapid Economic Development

    The latest data support these observations by showing that IQs have been rising steadily in countries experiencing the most rapid economic development during the past few decades. As a measure of the interaction between intelligence and modern cognitive
    stimuli that strengthen capacities for rational classification, quantitative reasoning, etc., a population’s average IQ is therefore an indicator of economic modernization and development, not their cause.

    Read more here:

    https://evonomics.com/does-your-iq-predict-how-rich-you-will-be/


    More of my philosophy about how the entrepreneurial spirit is alive and well in the United States and more of my thoughts..

    I invite you to read the following interesting article
    that shows what economists call 'creative destruction,' wherein new innovation springs up because of the failure of particular industries or businesses, as in the time of Covid-19, and it shows that despite the risks of opening a business during a global pandemic, new data from the U.S. Census Bureau reveals that the entrepreneurial spirit is alive and well in the United States.

    Visualizing America’s Entrepreneurial Spirit During COVID-19

    Read more here:

    https://www.visualcapitalist.com/visualizing-americas-entrepreneurial-spirit-during-covid-19/

    More of my philosophy about IDC's revenue figures and forecasts for x86 servers and non-x86 servers and more of my thoughts..



    IDC provided revenue figures and forecasts for x86 servers and non-x86 servers, and notice also in the following article what the share of server sales of the different companies looks like:

    The world is still hungry for servers - for now

    https://www.nextplatform.com/2022/12/14/the-world-is-still-hungry-for-servers-for-now/


    More of my philosophy about computer science pioneer Frederick Brooks, Jr. and about technology and more of my thoughts..

    Paying Tribute to Computer Science Pioneer Frederick Brooks, Jr.
    He helped develop the IBM System/360 and its operating system

    Read more here:

    https://spectrum.ieee.org/frederick-brooks-jr-obit


    More of my philosophy about how AlphaCode can solve complex problems and create code using AI and about technology and more of my thoughts..

    I invite you to read the following interesting article:

    AlphaCode can solve complex problems and create code using AI

    https://interestingengineering.com/innovation/alphacode-can-solve-complex-problems-and-create-code-using-ai


    So notice carefully how AlphaCode ranked in the top 54.3%, and notice that AlphaCode uses Transformers. I think that AlphaCode and other AI models that use Transformers can still improve their results by increasing their size (a Transformer is a deep learning model that adopts the mechanism of self-attention), and I also think that you can improve them much more if they are trained on a substantially larger amount of data, considering an article that DeepMind published just a few days ago demonstrating that the performance of these models can be drastically improved by scaling data more aggressively than parameters (read it here: https://arxiv.org/pdf/2203.15556.pdf ).


    More of my philosophy about non-linear regression and about logic and about technology and more of my thoughts..

    I think that R-squared is invalid for non-linear regression, but I think that something that looks like R-squared for non-linear regression is the Relative Standard Error, that is, the standard error of the mean of the sample divided by the estimate, which is the mean of the sample. If you calculate just the standard error of the estimate (Mean Square Error), it is not sufficient, since you have to know how big the standard error of the estimate is relative to the curve and its axes, so read my following thoughts so as to understand more:


    So R-squared is invalid for non-linear regression, so you have to use the standard error of the estimate (Mean Square Error), and of course you have to calculate the Relative Standard Error, that is, the standard deviation of the mean of the sample divided by the estimate, which is the mean of the sample, and I think that the Relative Standard Error is an important thing that brings more quality to the statistical calculations. I will now talk to you more about my interesting software project for mathematics: my new software project uses artificial intelligence to implement a generalized way of solving non-linear "multiple" regression, and it is much more powerful than the Levenberg–Marquardt algorithm, since I am implementing a smart algorithm using artificial intelligence that permits avoiding premature convergence, which is also one of the most important things, and it will also be much more scalable using multicores so as to search for the global optimum much faster with artificial intelligence. I am doing it this way so as to be professional, and I will give you a tutorial that explains my algorithms that use artificial intelligence so that you can learn from them, and of course it will automatically calculate the above standard error of the estimate and the Relative Standard Error.
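    The two quantities above can be sketched in a few lines of Python. This is a minimal illustration of the definitions as stated in the text (standard error of the estimate adjusted for fitted parameters, and relative standard error as the standard error of the sample mean divided by the mean), not code from my library:

```python
import math

def standard_error_of_estimate(y, y_pred, n_params):
    # Root of the summed squared residuals, divided by the degrees of
    # freedom (number of points minus number of fitted parameters).
    residuals = [yi - yp for yi, yp in zip(y, y_pred)]
    sse = sum(r * r for r in residuals)
    return math.sqrt(sse / (len(y) - n_params))

def relative_standard_error(sample):
    # Standard error of the sample mean divided by the mean, as
    # described above; often reported as a percentage.
    n = len(sample)
    mean = sum(sample) / n
    variance = sum((x - mean) ** 2 for x in sample) / (n - 1)
    sem = math.sqrt(variance / n)
    return sem / mean
```

    Because the relative standard error is dimensionless, it tells you how big the error is relative to the size of the quantity being estimated, which is the point made above about judging the error against the curve and its axes.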

    More of my philosophy about non-linear regression and more..

    I think I am really smart, and I have also just quickly finished the software implementation of the Levenberg–Marquardt algorithm and of the Simplex algorithm to solve non-linear least squares problems, and I will soon implement a generalized way with artificial intelligence in the software that permits solving non-linear "multiple" regression. I have also noticed that in mathematics you have to take care of the variability of the y values in non-linear least squares problems when approximating. The Levenberg–Marquardt algorithm (LMA or just LM) that I have just implemented, also known as the damped least-squares (DLS) method, is used to solve non-linear least squares problems. These minimization problems arise especially in least squares curve fitting. The Levenberg–Marquardt algorithm is used in many software applications for solving generic curve-fitting problems. It was found to be an efficient, fast and robust method which also has a good global convergence property. For these reasons, it has been incorporated into many good commercial packages performing non-linear regression. But my way of implementing non-linear "multiple" regression in the software will be much more powerful than the Levenberg–Marquardt algorithm, and of course I will share with you many parts of my software project, so stay tuned!
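    To show what the damped least-squares idea amounts to, here is a minimal Python sketch of Levenberg–Marquardt for one concrete model, y = a*exp(b*x). It is an illustration of the algorithm's structure (damped normal equations, accept/reject with an adaptive damping factor), not my Delphi/Freepascal implementation:

```python
import math

def lm_fit_exponential(xs, ys, a=1.0, b=0.0, lam=1e-3, iters=200):
    # Fit y = a*exp(b*x) by Levenberg-Marquardt (damped least squares).
    def sse(a, b):
        return sum((y - a * math.exp(b * x)) ** 2 for x, y in zip(xs, ys))
    for _ in range(iters):
        # Residuals and Jacobian columns of the model at (a, b).
        r = [y - a * math.exp(b * x) for x, y in zip(xs, ys)]
        Ja = [math.exp(b * x) for x in xs]           # d(model)/da
        Jb = [a * x * math.exp(b * x) for x in xs]   # d(model)/db
        # Damped normal equations (J^T J + lam*I) delta = J^T r,
        # solved directly for the 2x2 case.
        g11 = sum(j * j for j in Ja) + lam
        g22 = sum(j * j for j in Jb) + lam
        g12 = sum(p * q for p, q in zip(Ja, Jb))
        c1 = sum(j * ri for j, ri in zip(Ja, r))
        c2 = sum(j * ri for j, ri in zip(Jb, r))
        det = g11 * g22 - g12 * g12
        da = (c1 * g22 - c2 * g12) / det
        db = (g11 * c2 - g12 * c1) / det
        if sse(a + da, b + db) < sse(a, b):
            a, b, lam = a + da, b + db, lam * 0.5  # accept: trust the model more
        else:
            lam *= 2.0                             # reject: damp more strongly
    return a, b
```

    The damping factor lam interpolates between Gauss–Newton (small lam, fast near the solution) and gradient descent (large lam, robust far from it), which is exactly why the method is considered fast and robust.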


    More of my philosophy about the truth table of the logical implication and about automation and about artificial intelligence and more of my thoughts..


    And now I will ask a philosophical question:

    What is a logical implication in mathematics?

    So I think I have to discover patterns with my fluid intelligence
    in the following truth table of the logical implication:

    p q p -> q
    0 0 1
    0 1 1
    1 0 0
    1 1 1

    Note that p and q are logical variables and the symbol -> is the logical implication.

    And here are the patterns that I am discovering with my fluid intelligence that permit understanding the logical implication in mathematics:

    So notice in the above truth table of the logical implication that p equal to 0 can imply both q equal to 0 and q equal to 1, so for example it can model the following cases in reality:

    If it doesn't rain, it can be that you take your umbrella or that you do not, and the pattern is that you can still take your umbrella, since another logical variable can say that it can rain in the future, so you have to take your umbrella; so, as you notice, it permits modeling cases of reality. And it is the same for the case in the above truth table where p equals 1 and q equals 0: the implication is then false, since implication is not causation, but if p equal to 1 means for example that it rains in the present, then even if there is another logical variable that says that it will not rain in the future, you still have to take your umbrella, and it is why in the above truth table the row with p equal to 1 and q equal to 0 is false. So then of course I say that the truth table of the implication also permits modeling the case of causation, and it is why it is working.

    More of my philosophy about objective truth and subjective truth and more of my thoughts..

    Today I will use my fluid intelligence so as to explain more the way of logic, and I will discover patterns with my fluid intelligence so as to explain the way of logic, so I will start by asking the following philosophical question:

    What is objective truth and what is subjective truth?

    So for example when we look at the following equality: a + a = 2*a, it is an objective truth, since it can be made an acceptable general truth. So then I can say that an objective truth is a truth that can be made an acceptable general truth, and a subjective truth is a truth that cannot be made an acceptable general truth, like saying that Jeff Bezos is the best human among humans. So I can say that in mathematics we are also using the rules of logic so as to logically prove whether a theorem or the like is true or not, so notice the following truth table of the logical implication:

    p q p -> q
    0 0 1
    0 1 1
    1 0 0
    1 1 1

    Note that p and q are logical variables and the symbol -> is the logical implication.

    The above truth table of the logical implication permits us
    to logically infer a rule in mathematics that is so important in logic, and it is the following:

    (p implies q) is equivalent to ((not p) or q)


    And of course we are using this rule in logical proofs, since we are modeling with the whole truth table of the logical implication, and this includes the case of causation in it, so it is why it is working.

    And I think that the above rule is the most important rule that permits proving in mathematics logical proofs like the following:

    (p -> q) is equivalent to (not(q) -> not(p))

    Note: the symbol -> means implies, and p and q are logical
    variables.

    or

    (not(p) -> 0) is equivalent to p
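    The equivalences above can be checked exhaustively, since there are only four truth assignments for p and q. Here is a minimal Python sketch that enumerates them all:

```python
from itertools import product

def implies(p, q):
    # Material implication from the truth table above: it is false
    # only when p is true and q is false.
    return (not p) or q

# Check the equivalences from the text over all truth assignments.
for p, q in product([False, True], repeat=2):
    assert implies(p, q) == ((not p) or q)          # the defining rule
    assert implies(p, q) == implies(not q, not p)   # contrapositive
    assert implies(not p, False) == p               # (not(p) -> 0) <=> p
print("all equivalences hold")
```

    This brute-force enumeration is exactly how one verifies a propositional equivalence from its truth table.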


    And for fuzzy logic, here are the generalized forms of the three operators AND, OR and NOT (forms that include fuzzy logic as well as the classical two-valued case):

    x AND y is equivalent to min(x,y)
    x OR y is equivalent to max(x,y)
    NOT(x) is equivalent to (1 - x)

    So now you understand that media like CNN have to be objective by seeking to attain the objective truth so that democracy works correctly.

    More of my philosophy about artificial intelligence and about automation and about how to boost productivity with artificial intelligence and more..



    You can boost productivity with artificial intelligence by:

    1- More accurate demand forecasting using AI and machine learning
    2- Predictive maintenance
    3- Hyper-personalized manufacturing
    4- Optimizing manufacturing processes
    5- Automated material procurement

    Read more here carefully about those 5 ways artificial intelligence can boost productivity:

    https://www.industryweek.com/technology-and-iiot/article/22025683/5-ways-artificial-intelligence-can-boost-productivity


    And more of my philosophy about understanding K-means Clustering in Machine Learning and more..


    I have just read about the K-means clustering algorithm, and I think it is for grouping similar data points together and discovering underlying patterns, which is why it is used in machine learning. I have just quickly understood it, so I invite you to read about it in the following interesting article:

    Understanding K-means Clustering in Machine Learning

    https://towardsdatascience.com/understanding-k-means-clustering-in-machine-learning-6a6e67336aa1
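    The algorithm itself is short: alternate between assigning each point to its nearest centroid and moving each centroid to the mean of its assigned points. Here is a minimal self-contained Python sketch (an illustration of the standard algorithm, not production code):

```python
import math
import random

def kmeans(points, k, iters=100, seed=0):
    # Plain k-means: repeatedly assign each point to its nearest
    # centroid, then move each centroid to the mean of its cluster.
    rng = random.Random(seed)
    centroids = rng.sample(points, k)  # initialize from the data
    clusters = [[] for _ in range(k)]
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for p in points:
            nearest = min(range(k), key=lambda i: math.dist(p, centroids[i]))
            clusters[nearest].append(p)
        new_centroids = [
            tuple(sum(c) / len(cl) for c in zip(*cl)) if cl else centroids[i]
            for i, cl in enumerate(clusters)
        ]
        if new_centroids == centroids:  # assignments stopped changing
            break
        centroids = new_centroids
    return centroids, clusters
```

    Note that k-means converges to a local optimum that depends on the initial centroids, which is why practical implementations run it several times with different seeds.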


    And to be smarter, I invite you to look in the following at how the K-means clustering algorithm is used smartly in a delivery store optimization that optimizes the process of goods delivery using truck drones, by using a combination of k-means to find the optimal number of launch locations and a genetic algorithm to solve the truck route as a traveling salesman problem. And here is a paper from the Journal of Industrial Engineering and Management on the subject; you have to read it carefully, since I have read it and understood it, and I think that I will implement it soon in Delphi and Freepascal:

    Optimization of a Truck-drone in Tandem Delivery Network
    Using K-means and Genetic Algorithm

    https://upcommons.upc.edu/bitstream/handle/2117/88986/1929-8707-1-pb.pdf?sequence=1&isallowed=y
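    To give a feel for the genetic-algorithm half of that approach (ordering the k-means launch locations as a traveling salesman tour), here is a tiny Python sketch with tournament selection, order crossover, swap mutation and elitism. It is a rough illustration on hypothetical data, not the paper's actual model:

```python
import math
import random

def tour_length(tour, pts):
    # Total length of the closed loop visiting pts in the given order.
    return sum(math.dist(pts[tour[i]], pts[tour[(i + 1) % len(tour)]])
               for i in range(len(tour)))

def ga_tsp(pts, pop_size=60, generations=200, seed=1):
    # Tiny genetic algorithm for the TSP.
    rng = random.Random(seed)
    n = len(pts)
    pop = [rng.sample(range(n), n) for _ in range(pop_size)]
    best = min(pop, key=lambda t: tour_length(t, pts))
    for _ in range(generations):
        nxt = [best[:]]  # elitism: always keep the best tour found so far
        while len(nxt) < pop_size:
            # Tournament selection of two parents.
            p1 = min(rng.sample(pop, 3), key=lambda t: tour_length(t, pts))
            p2 = min(rng.sample(pop, 3), key=lambda t: tour_length(t, pts))
            # Order crossover: copy a slice of p1, fill the rest from p2.
            a, b = sorted(rng.sample(range(n), 2))
            child = [None] * n
            child[a:b] = p1[a:b]
            fill = [c for c in p2 if c not in child]
            for i in range(n):
                if child[i] is None:
                    child[i] = fill.pop(0)
            # Swap mutation keeps the population exploring.
            if rng.random() < 0.2:
                i, j = rng.sample(range(n), 2)
                child[i], child[j] = child[j], child[i]
            nxt.append(child)
        pop = nxt
        best = min(pop + [best], key=lambda t: tour_length(t, pts))
    return best
```

    In the paper's setting the "cities" would be the launch locations produced by k-means, and the fitness would be the truck's route cost.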

    More of my philosophy about automation and about intelligent automation
    and more of my thoughts..

    "In recent decades, companies have used robotic process automation (RPA) as a way to streamline operations, reduce errors, and save money by automating routine business tasks, but now organizations are turning to intelligent automation to automate key
    business processes to boost revenues, operate more efficiently, and deliver exceptional customer experiences. Intelligent automation is a smarter version of RPA that makes use of machine learning, artificial intelligence (AI) and cognitive technologies
    such as natural language processing to handle more complex processes, guide better business decisions, and shed light on new opportunities."

    Read more here:

    https://www.computerworld.com/article/3680230/how-intelligent-automation-will-change-the-way-we-work.html


    And look in the following interesting article about how AI will create millions more jobs than it will destroy:

    https://singularityhub.com/2019/01/01/ai-will-create-millions-more-jobs-than-it-will-destroy-heres-how/

    And following are some of the advantages of automation, read them carefully:

    1. Automation is the key to the shorter workweek. Automation will allow
    the average number of working hours per week to continue to decline,
    thereby allowing greater leisure hours and a higher quality life.

    2. Automation brings safer working conditions for the worker. Since
    there is less direct physical participation by the worker in the
    production process, there is less chance of personal injury to the worker.

    3. Automated production results in lower prices and better products. It
    has been estimated that the cost to machine one unit of product by
    conventional general-purpose machine tools requiring human operators may
    be 100 times the cost of manufacturing the same unit using automated mass-production techniques. The electronics industry offers many
    examples of improvements in manufacturing technology that have
    significantly reduced costs while increasing product value (e.g., colour
    TV sets, stereo equipment, calculators, and computers).

    4. The growth of the automation industry will itself provide employment opportunities. This has been especially true in the computer industry; as the companies in this industry have grown (IBM, Digital Equipment Corp., Honeywell, etc.), new jobs have been created. These new jobs include not only workers directly employed by these companies, but also computer programmers, systems engineers, and others needed to use and operate the computers.

    5. Automation is the only means of increasing the standard of living. Only through productivity increases brought about by new automated methods of production is it possible to advance the standard of living. Granting wage increases without a commensurate increase in productivity will result in inflation. To afford a better society, it is a must to increase productivity.

    And McKinsey estimates that AI (artificial intelligence) may deliver additional economic output of around US$13 trillion by 2030, increasing global GDP by about 1.2% annually. This will mainly come from the substitution of labour by automation and from increased innovation in products and services.

    Read more here:

    https://www.europarl.europa.eu/RegData/etudes/BRIE/2019/637967/EPRS_BRI(2019)637967_EN.pdf


    And read the following so as to know how people have to adapt
    in digital and AI literacy so as to be competitive:

    "Digital and AI literacy is of utmost importance to help Canadian businesses scale and compete internationally. Investing in widespread digital and AI literacy for the entire population will increase domestic demand for technology and technology jobs. A
    technologically literate population will create more data, which fuels AI and thus the data-driven economy as a whole. It is also necessary for workers to be able to upskill and re-skill in order to remain productive and competitive in an automated
    workforce. Canadian businesses that adopt AI technology will save from lower production costs, have increased output, and be able to invest more. Increased revenue from this domestic demand, as well as Canada’s global reputation for responsible AI,
    will help Canadian businesses scale globally and compete on the international level. Canada has a promising future in the data-driven economy, and strategic choices by policymakers are necessary to ensure that Canadians can benefit from an ethical and
    thriving AI ecosystem."

    Read more here:

    Canada's Economic Future with Artificial Intelligence

    https://www.kroegerpolicyreview.com/post/canada-s-economic-future-with-artificial-intelligence

    And I invite you to read about the next revolution in the software industry, called machine programming, in the following article:

    https://venturebeat.com/2021/06/18/ai-weekly-the-promise-and-limitations-of-machine-programming-tools/


    More of my philosophy about how bad CXL memory latency is and about technology and more of my thoughts..



    HOW BAD IS THE CXL MEMORY LATENCY REALLY?

    "If the folks at Astera are to be believed, the latency isn’t as bad as you might think. The company’s Leo CXL memory controllers are designed to accept standard DDR5 memory DIMMs up to 5600 MT/sec. They claim customers can expect latencies roughly on par with accessing memory on a second CPU, one NUMA hop away. This puts it in the neighborhood of 170 nanoseconds to 250 nanoseconds. In fact, as far as the system is concerned, that’s exactly how these memory modules show up to the operating system."

    Read more here:

    https://www.nextplatform.com/2022/12/05/just-how-bad-is-cxl-memory-latency/


    CXL memory pools: Just how big can they be?


    Read more here:

    https://blocksandfiles.com/2022/07/07/cxl-memory-pools-size/



    More of my philosophy about technology and about Intel technology and more of my thoughts..


    Intel says it will squeeze 1 trillion transistors onto a chip package by 2030

    "Intel Corp. researchers this weekend revealed a number of technological innovations and concepts, including packaging improvements that could result in computer chips that are 10 times as powerful as today’s most advanced silicon."


    Read more here:

    https://siliconangle.com/2022/12/04/intel-says-will-squeeze-1-trillion-transistors-onto-chip-package-2030/


    More of my philosophy about the 12 memory channels of
    the new AMD Epyc Genoa CPU and more of my thoughts..



    So as I am saying below, I think that, so that the new AMD Genoa CPU can use its 12 memory channels in parallel, the GMI-Wide mode must connect each CCD with more GMI links, and I think that is what AMD is doing in its new 4-CCD configuration, even with the cost-optimized Epyc Genoa 9124, a 16-core part with 64 MB of L3 cache and 4 Core Complex Dies (CCDs) that costs around $1000 (look at it here: https://www.tomshardware.com/reviews/amd-4th-gen-epyc-genoa-9654-9554-and-9374f-review-96-cores-zen-4-and-5nm-disrupt-the-data-center ). As I explain more below, the Core Complex Dies (CCDs) connect to memory, I/O, and each other through the I/O Die (IOD); each CCD connects to the IOD via a dedicated high-speed Global Memory Interconnect (GMI) link, the IOD also contains the memory channels, PCIe Gen5 lanes, and Infinity Fabric links, and all dies, or chiplets, interconnect with each other via AMD’s Infinity Fabric technology. Of course this will permit my new software project of a Parallel C++ Conjugate Gradient Linear System Solver Library that scales very well to scale on the 12 memory channels, so read my following thoughts so as to understand more about it:

    More of my philosophy about the new Zen 4 AMD Ryzen™ 9 7950X and more of my thoughts..


    So I have just looked at the new Zen 4 AMD Ryzen™ 9 7950X CPU, and I invite you to look at it here:

    https://www.amd.com/en/products/cpu/amd-ryzen-9-7950x

    But notice carefully that the problem is the number of supported memory channels, since it supports just two memory channels, which is not good: for example, my following open source software project of a Parallel C++ Conjugate Gradient Linear System Solver Library that scales very well is scaling around 8X on my 16-core Intel Xeon with 2 NUMA nodes and 8 memory channels, but it will not scale correctly on the new Zen 4 AMD Ryzen™ 9 7950X CPU with just 2 memory channels, since it is also memory-bound. And here is my powerful open source software project of a Parallel C++ Conjugate Gradient Linear System Solver Library that scales very well, and I invite you to take a careful look at it:

    https://sites.google.com/site/scalable68/scalable-parallel-c-conjugate-gradient-linear-system-solver-library

    So I advise you to buy an AMD Epyc CPU or an Intel Xeon CPU that supports 8 memory channels.
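    To see why such a solver is memory-bound, here is a minimal serial Python sketch of the conjugate gradient method itself (an illustration of the textbook algorithm, not my library's C++ code): each iteration is dominated by one matrix-vector product, which streams the whole matrix through memory, so the available memory bandwidth, i.e. the number of memory channels, limits the scaling:

```python
def conjugate_gradient(A, b, tol=1e-10, max_iter=1000):
    # Solve A x = b for a symmetric positive-definite matrix A,
    # given as a dense list of rows.
    n = len(b)
    def matvec(M, v):
        # The memory-bound kernel: touches every matrix entry once.
        return [sum(M[i][j] * v[j] for j in range(n)) for i in range(n)]
    def dot(u, v):
        return sum(ui * vi for ui, vi in zip(u, v))
    x = [0.0] * n
    r = [bi - Axi for bi, Axi in zip(b, matvec(A, x))]  # residual
    p = r[:]                                            # search direction
    rs = dot(r, r)
    for _ in range(max_iter):
        Ap = matvec(A, p)
        alpha = rs / dot(p, Ap)
        x = [xi + alpha * pi for xi, pi in zip(x, p)]
        r = [ri - alpha * api for ri, api in zip(r, Ap)]
        rs_new = dot(r, r)
        if rs_new < tol:
            break
        p = [ri + (rs_new / rs) * pi for ri, pi in zip(r, p)]
        rs = rs_new
    return x
```

    Since the arithmetic per matrix entry is tiny, adding cores without adding memory channels quickly stops helping, which is the point made above about the 2-channel Ryzen versus the 8- or 12-channel server parts.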

    ---


    And of course you can use the next twelve DDR5 memory channels of the Zen 4 AMD EPYC CPUs so as to scale my above algorithm more, and read about it here:

    https://www.tomshardware.com/news/amd-confirms-12-ddr5-memory-channels-on-genoa


    And here is the simulation program that uses the probabilistic mechanism that I have talked about and that proves to you that the algorithm of my Parallel C++ Conjugate Gradient Linear System Solver Library is scalable:

    If you look at my scalable parallel algorithm, it divides each array of the matrix into parts of 250 elements, and if you look carefully, I am using two functions that consume the greater part of all the CPU time, atsub() and asub(), and inside those functions I am using a probabilistic mechanism so as to render my algorithm scalable on NUMA architectures, and it also makes it scale on the memory channels. What I am doing is scrambling the array parts using a probabilistic function, and what I have noticed is that this probabilistic mechanism i
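    A rough sketch of the scrambling idea described above (hypothetical, since atsub() and asub() themselves are not shown here): randomly permuting the order in which the 250-element parts are visited spreads memory accesses across NUMA nodes and memory channels instead of having all workers hammer the same region:

```python
import random

PART = 250  # part size used by the algorithm described above

def scrambled_part_order(n_elems, seed=42):
    # Deterministically shuffle the order in which the 250-element
    # parts of an array are processed, so concurrent workers do not
    # all start on the same memory region (hypothetical sketch of
    # the probabilistic mechanism described in the text).
    n_parts = (n_elems + PART - 1) // PART
    order = list(range(n_parts))
    random.Random(seed).shuffle(order)
    return order

def process(array):
    # Visit every element exactly once, but part by part in the
    # scrambled order rather than sequentially.
    total = 0.0
    for part in scrambled_part_order(len(array)):
        lo = part * PART
        for x in array[lo:lo + PART]:
            total += x  # stand-in for the real per-element work
    return total
```

    The result is the same as a sequential sweep; only the access pattern changes, which is what matters for NUMA and memory-channel scaling.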