• More of my philosophy about humanity and about technology and more of my thoughts..

    From Amine Moulay Ramdane@21:1/5 to All on Wed Nov 16 09:30:56 2022
    Hello,


    More of my philosophy about humanity and about technology and more of my thoughts..

    I am a white Arab, and I think I am smart since I have also
    invented many scalable algorithms and algorithms..



    The global average ecological footprint is 2.84 gha per person, while the average biocapacity is 1.68 gha per person, so it takes about 1.69 Earths to cover the consumption of humanity; and this global average ecological footprint brings problems, since it keeps growing more and more, but economic growth can be decoupled from unsustainable resource consumption and harmful pollution, and that is what our humanity has to do, by also using science and technology. So I think that the problem of the global ecological footprint will be solved by technology and science; the quick ratio just below shows where the 1.69 Earths figure comes from, and then read the following quoted article to notice it:
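    The 1.69 Earths figure above is simply the ratio of the average footprint to the average biocapacity:

    \[
      \frac{2.84\ \text{gha/person}}{1.68\ \text{gha/person}} \approx 1.69\ \text{Earths}
    \]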

    The Limits of the Earth, Part 1: Problems

    "In my own new book, The Infinite Resource: The Power of Ideas on a Finite Planet, I challenge this view. The problem isn’t economic growth, per se. Nor is the problem that our natural resources are too small. While finite, the natural resources the
    planet supplies are vast and far larger than humanity needs in order to continue to thrive and grow prosperity for centuries to come. The problem, rather, is the types of resources we access, and the manner and efficiency with which we use them.

    And the ultimate solution to those problems is innovation – innovation in the science and technology that we use to tap into physical resources, and innovation in the economic system that steers our consumption.

    The situation we’re in isn’t a looming wall that we’re doomed to crash into. It’s a race – a race between depletion and pollution of natural resources on one side, and our pace of innovation on the other."

    Read more here:

    https://blogs.scientificamerican.com/guest-blog/the-limits-of-the-earth-part-1-problems/

    And I have just read the following article from United Nations:

    Growing at a slower pace, world population is expected to reach 9.7
    billion in 2050 and could peak at nearly 11 billion around 2100

    Read more here:

    https://www.un.org/development/desa/en/news/population/world-population-prospects-2019.html

    So notice that it says the following:

    "Falling proportion of working-age population is putting pressure on
    social protection systems

    The potential support ratio, which compares numbers of persons at
    working ages to those over age 65, is falling around the world. In Japan
    this ratio is 1.8, the lowest in the world. An additional 29 countries,
    mostly in Europe and the Caribbean, already have potential support
    ratios below three. By 2050, 48 countries, mostly in Europe, Northern
    America, and Eastern and South-Eastern Asia, are expected to have
    potential support ratios below two. These low values underscore the
    potential impact of population ageing on the labour market and economic performance, as well as the fiscal pressures that many countries will
    face in the coming decades as they seek to build and maintain public
    systems of health care, pensions and social protection for older persons."

    So this is why you have to read the following to understand more:

    And I have just looked at this video of the French politician
    Jean-Marie Le Pen, and he is saying in the video that, with those flows of immigrants into Europe, "La 3ème Guerre mondiale est commencée"
    (that is, "The Third World War has begun"); look at the following video to notice it:

    https://www.youtube.com/watch?v=Ene0hp7EAus

    But I think that Jean-Marie Le Pen is "not" thinking correctly, because
    if Western Europe wants to keep its social benefits, the countries of
    the E.U. are going to need more workers, and nowhere in the world is
    the population older and less inclined to have babies than in Europe;
    read more here on Forbes to notice it:

    Here's Why Europe Really Needs More Immigrants

    https://www.forbes.com/sites/kenrapoza/2017/08/15/heres-why-europe-really-needs-more-immigrants/#7319e2e24917

    I have just read the following interesting article,
    and I invite you to read it carefully:

    Does Our Survival Depend on Relentless Exponential Growth?

    https://singularityhub.com/2017/10/11/do-we-need-relentless-exponential-growth-to-survive/

    And notice also that the article above says the following:

    "There have concurrently been developments in agriculture and medicine
    and, in the 20th century, the Green Revolution, in which Norman Borlaug
    ensured that countries adopted high-yield varieties of crops—the first precursors to modern ideas of genetically engineering food to produce
    better crops and more growth. The world was able to produce an
    astonishing amount of food—enough, in the modern era, for ten billion people."

    So I think that the world will be able to produce enough food for the world population in the year 2100, since around 2100 the world population will
    peak at nearly 11 billion, as the United Nations article cited above notes.


    Also you can read my new writing about new interesting medical treatments and drugs and about antibiotic resistance here:

    https://groups.google.com/g/alt.culture.morocco/c/vChmXT_pXUI


    More of my philosophy about the 12 memory channels of
    the new AMD Epyc Genoa CPU and more of my thoughts..

    So as I am saying below, I think that in order to use in parallel the 12 memory
    channels that the new AMD Genoa CPU supports, the GMI-Wide mode has to be widened
    so that each CCD is connected with more GMI links, and I think that this is what
    AMD is doing in its new 4-CCD configuration, even in the cost-optimized Epyc Genoa
    9124 with 16 cores, 64 MB of L3 cache and 4 Core Complex Dies (CCDs), which costs
    around $1000 (look at it here: https://www.tomshardware.com/reviews/amd-4th-gen-epyc-genoa-9654-9554-and-9374f-review-96-cores-zen-4-and-5nm-disrupt-the-data-center ).
    And as I am explaining more below, the Core Complex Dies (CCDs) connect to memory,
    I/O and each other through the I/O Die (IOD); each CCD connects to the IOD via a
    dedicated high-speed Global Memory Interconnect (GMI) link; and the IOD also contains
    the memory channels, the PCIe Gen5 lanes and the Infinity Fabric links, and all the
    dies, or chiplets, interconnect with each other via AMD's Infinity Fabric Technology.
    And of course this will permit my new software project of a Parallel C++ Conjugate
    Gradient Linear System Solver Library that scales very well to scale on the 12 memory
    channels, as the small sketch below also illustrates; read my following thoughts to
    understand more about it:
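    Here is a minimal sketch (my own illustration only, not AMD's code and not the code of my library) of a STREAM-triad-style probe: the loop does almost no arithmetic per byte, so its throughput saturates at the bandwidth that the populated memory channels can deliver, no matter how many more cores you add; the array size and the thread counts are arbitrary choices:

    // bandwidth_probe.cpp -- compile with: g++ -O2 -std=c++17 -pthread bandwidth_probe.cpp
    // Measures the throughput of a memory-bound triad loop for several thread counts.
    #include <chrono>
    #include <cstdio>
    #include <thread>
    #include <vector>

    int main() {
        const std::size_t n = 1u << 25;            // 32M doubles per array (~256 MB each)
        std::vector<double> a(n, 1.0), b(n, 2.0), c(n, 3.0);

        for (unsigned t : {1u, 2u, 4u, 8u, 12u}) {
            auto start = std::chrono::steady_clock::now();
            std::vector<std::thread> workers;
            const std::size_t chunk = n / t;
            for (unsigned i = 0; i < t; ++i) {
                const std::size_t lo = i * chunk;
                const std::size_t hi = (i + 1 == t) ? n : lo + chunk;
                workers.emplace_back([&, lo, hi] {
                    for (std::size_t j = lo; j < hi; ++j)
                        a[j] = b[j] + 3.0 * c[j];  // 2 reads + 1 write = 24 bytes per iteration
                });
            }
            for (auto& w : workers) w.join();
            const double secs = std::chrono::duration<double>(
                                    std::chrono::steady_clock::now() - start).count();
            const double gbytes = 3.0 * n * sizeof(double) / 1e9;
            std::printf("%2u threads: %6.1f GB/s\n", t, gbytes / secs);
        }
        return 0;
    }

    On a CPU with only 2 memory channels the GB/s numbers typically stop growing after a few threads, while on an 8-channel or 12-channel Epyc or Xeon they keep climbing much further, and that is exactly the behaviour that matters for a memory-bound solver.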

    More of my philosophy about the new Zen 4 AMD Ryzen™ 9 7950X and more of my thoughts..


    So I have just looked at the new Zen 4 AMD Ryzen™ 9 7950X CPU, and I invite you to look at it here:

    https://www.amd.com/en/products/cpu/amd-ryzen-9-7950x

    But notice carefully that the problem is the number of supported memory channels, since it supports just two memory channels, so it is not good, since for example my following open source software project of a Parallel C++ Conjugate Gradient Linear
    System Solver Library that scales very well is scaling around 8X on my 16-core Intel Xeon with 2 NUMA nodes and 8 memory channels, but it will not scale correctly on the
    new Zen 4 AMD Ryzen™ 9 7950X CPU with just 2 memory channels, since it is also memory-bound; and here is my powerful open source software project of the Parallel C++ Conjugate Gradient Linear System Solver Library that scales very well, and I invite you to
    take a careful look at it:

    https://sites.google.com/site/scalable68/scalable-parallel-c-conjugate-gradient-linear-system-solver-library

    So I advise you to buy an AMD Epyc CPU or an Intel Xeon CPU that supports 8 memory channels.
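    As a rough back-of-the-envelope bound (assuming that a single core can roughly saturate about one memory channel's worth of bandwidth on its own), the speedup of a memory-bound kernel like the conjugate gradient solver is capped by the ratio of aggregate to single-core memory bandwidth rather than by the core count:

    \[
      S_{\text{mem}} \;\approx\; \min\!\left(p,\ \frac{B_{\text{aggregate}}}{B_{\text{single core}}}\right)
    \]

    With 8 memory channels that cap sits around 8X, which matches the scaling I observe on my 16-core Xeon, while with only 2 memory channels the same solver tops out at roughly 2X no matter how many of the 16 cores of the Ryzen 9 7950X you throw at it.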

    ---


    And of course you can use the coming twelve DDR5 memory channels of the Zen 4 AMD EPYC CPUs to scale my above algorithm even more, and you can read about it here:

    https://www.tomshardware.com/news/amd-confirms-12-ddr5-memory-channels-on-genoa
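    To give an idea of what those 12 channels mean (assuming DDR5-4800 DIMMs, which is what Genoa officially supports, at 8 bytes per transfer), the theoretical peak memory bandwidth per socket is:

    \[
      12 \times 4800\,\text{MT/s} \times 8\,\text{B} \;=\; 460.8\ \text{GB/s}
    \]

    which is several times the roughly 83 GB/s that a dual-channel desktop part like the Ryzen 9 7950X reaches at DDR5-5200, and that is the headroom that a memory-bound solver needs in order to keep scaling.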


    And here is the simulation program that uses the probabilistic mechanism that I have talked about and that proves to you that the algorithm of my Parallel C++ Conjugate Gradient Linear System Solver Library is scalable:

    If you look at my scalable parallel algorithm, it is dividing each array of the matrix into parts of 250 elements, and if you look carefully, I am using two functions that consume the greater part of all the CPU time, which are atsub() and asub(), and inside those
    functions I am using a probabilistic mechanism to render my algorithm scalable on NUMA architectures, and it also makes it scale on the memory channels; what I am doing is scrambling the array parts using a probabilistic function, and what I have
    noticed is that this probabilistic mechanism is really what makes my algorithm scale well across the NUMA nodes and the memory channels.
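    Here is a minimal sketch of the idea described above (my illustration only, not the actual code of atsub() and asub() from the library): the array is split into parts of 250 elements and the parts are visited in a scrambled, pseudo-random order; it is single-threaded to keep it short, and the kernel, the shuffle and the sizes are arbitrary choices for the illustration:

    // scrambled_blocks.cpp -- compile with: g++ -O2 -std=c++17 scrambled_blocks.cpp
    // Illustration of visiting 250-element parts of an array in a scrambled order.
    #include <algorithm>
    #include <cstdio>
    #include <numeric>
    #include <random>
    #include <vector>

    int main() {
        const std::size_t n = 1000000;
        const std::size_t part = 250;              // part size from the description above
        std::vector<double> x(n, 1.0), y(n, 0.0);

        // Build the list of part indices and scramble it with a pseudo-random shuffle.
        const std::size_t nparts = (n + part - 1) / part;
        std::vector<std::size_t> order(nparts);
        std::iota(order.begin(), order.end(), 0);
        std::mt19937_64 rng(12345);
        std::shuffle(order.begin(), order.end(), rng);

        // Process the parts in the scrambled order (a trivial y = 2*x kernel stands in
        // for the real work that atsub()/asub() would do on each part).
        for (const std::size_t p : order) {
            const std::size_t lo = p * part;
            const std::size_t hi = std::min(lo + part, n);
            for (std::size_t i = lo; i < hi; ++i)
                y[i] = 2.0 * x[i];
        }
        std::printf("processed %zu parts of %zu elements each\n", nparts, part);
        return 0;
    }

    The idea is that, when several threads each follow such a scrambled order, they do not march through the matrix in lockstep, so at any instant their memory accesses are spread over different regions instead of contending for the same NUMA node or the same memory channel.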