• Revelation Forth Processor and retro Projects.

    From S@21:1/5 to All on Thu Oct 20 07:46:48 2022
    I've come to a revelation on how it might be possible to do a low-energy silicon transistor scheme, much lower than Chuck's (not that Chuck reveals details, but it seems it could be).

    It's possible to implement low energy and high performance circuits side by side, and even dynamically shift between low energy and high performance modes.

    It's even possible to have two instruction sets, high performance and low energy, within 32 instructions.

    ----

    On the retro project, I wanted to present a post on the latest thinking on it, and it goes like this: we would have been better off having 10- and 20-bit words in 1970's processors.

    A 10-bit word holds two 5-bit instructions; 1,024 such words add up to 2,048 instructions, a decent amount for a low-end embedded microcontroller, versus 8 or 4 bits.

    20 bits addresses 2^20 words, or 2.5 MB of 20-bit words, which is decent for a microcontroller or computer of the time into the late 1980's.
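    To make the word and address arithmetic concrete, here is a small sketch (Python, purely illustrative; the slot order is an assumption, not a published design):

```python
# Pack two 5-bit opcodes (0..31) into one 10-bit word, high slot first.
def pack10(op_hi, op_lo):
    assert 0 <= op_hi < 32 and 0 <= op_lo < 32
    return (op_hi << 5) | op_lo

def unpack10(word):
    return (word >> 5) & 0x1F, word & 0x1F

# A 20-bit address reaches 2**20 words; at 20 bits (2.5 bytes) per word,
# that is 2.5 MB of storage, matching the figure in the post.
WORDS = 2 ** 20
MIB = WORDS * 20 / 8 / 2 ** 20   # 2.5

w = pack10(3, 17)
print(w, unpack10(w), WORDS, MIB)
```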

    A multiple of 10 or 20 bits makes more sense than 32 or 64 bits. 40 bits maxes out many computers, and 80 bits is a good alternative to 128 bits for many things the public is interested in.

    4-bit, 16-colour pixels are too low. 5-bit, 32-colour pixels are more ideal. 8 colours at four levels equals 8 levels of white to black, or 32 levels of monochrome, which is close to primitive 1980's digital video. Add in multiple 32-colour palettes, and
    that's a good mix. You also don't have to do bit planes, like the Amiga, to get 32 colours.

    8-bit, 256-colour pixels are a bit limited. 10-bit, 1,024-colour pixels are more ideal. In between: 3 bits per primary plus two levels, or ideally 4 bits for green, down to 1 bit for each primary with 128 levels (certain game styles take advantage of that).
    Again, multiple preset palettes by switching between palette modes (5 bits: 2 bits green and red and 1 bit blue; or 2 bits green, 1 bit red and blue, plus 1 bit for two levels; etc. as well). 1,024 makes some extra-bright HDR monochrome possible. As a
    1,024-entry colour palette, it's ideal.

    20-bit pixels make reasonable video (8 bits green and 6 bits each for red and blue; or 8 bits green, 7 bits red, and 5 bits blue).
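    That 20-bit split is easy to sketch (Python; the bit ordering chosen here is an illustrative assumption):

```python
# Pack a 20-bit pixel as green:8, red:6, blue:6 (one of the splits
# suggested above; the field order is an illustrative assumption).
def pack_g8r6b6(g, r, b):
    assert 0 <= g < 256 and 0 <= r < 64 and 0 <= b < 64
    return (g << 12) | (r << 6) | b

def unpack_g8r6b6(p):
    return (p >> 12) & 0xFF, (p >> 6) & 0x3F, p & 0x3F

p = pack_g8r6b6(200, 31, 7)
print(p, unpack_g8r6b6(p))
```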

    10-20-bit palette entries are ideal.

    30-bit colour makes decent video, low-level HDR, and palette entries.

    40 bits makes decent HDR video and palette entries. 50-90 bits makes ideal HDR.

    10 bits is ideal for a basic tile graphics number, and for the number of character patterns.

    A combination of character number and other basic effects in 20 bits, or field size and basic effects, is ideal.

    10-bit sound is better than 8-bit, but 20/30-bit sound is more ideal. 20-bit+ frequencies are more ideal scientifically as sample frequencies (various plants operate in this range).

    20 bits at 2-4 MHz, single- or dual-bus memory, certainly works out to some high resolutions.
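    A rough back-of-envelope for that claim (Python; assumes the whole 20-bit bus at 4 MHz feeds the display at 60 frames per second, ignoring blanking and contention):

```python
# Rough video-bandwidth check: how many pixels per frame a 20-bit bus
# at 4 MHz could feed (illustrative; no contention or blanking assumed).
BUS_BITS = 20
BUS_HZ = 4_000_000
FPS = 60
BITS_PER_PIXEL = 5            # 5-bit, 32-colour pixels, as above

bits_per_frame = BUS_BITS * BUS_HZ / FPS
pixels_per_frame = bits_per_frame / BITS_PER_PIXEL
print(int(pixels_per_frame))  # ~266,666 pixels, enough for roughly 640x400
```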

    So, it's possible to make a dual-speed, dual-energy circuit of this type, with better graphics for the 1970's/1980's high-end market and the 1980's consumer market.

    I'm also expanding on an old idea for an alternative to the ROM cartridge and disk. If I can ever get to do it.

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From S 1@21:1/5 to All on Fri Oct 21 15:45:42 2022
    On Friday, 21 October 2022 at 12:46:50 am UTC+10, S wrote:
    ...

    A processor with integrated multimedia and a serial bus, maybe smaller than the 8088 in transistor count, running at up to 16 MIPS in the 1976 to early 1980's period. A possible Sinclair Quantum Chroma, Commodore +, or Atari 9600.

  • From Brad Eckert@21:1/5 to All on Fri Oct 21 17:29:23 2022
    On Thursday, October 20, 2022 at 7:46:50 AM UTC-7, S wrote:
    ...

    I think DRAM back then was in the 200ns to 120ns range, so maybe 5 to 8 MHz. A 20-bit word could hold four 5-bit instructions. Kind of like a mini ShBoom.

    It took forever for the industry to come up with the HyperBus interface, which removes superfluous address pins. Why weren't we sending address and data over the same pins all along? Suppose they sold 20-bit DRAM chips using a shared address/data
    interface. 20 bits of data, 1-bit chip select, 1-bit cmd/data, and two power pins would fit in a 24-pin DIP package.
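    Both figures are easy to sanity-check (Python sketch; the slot order within the 20-bit word is an illustrative assumption, not the ShBoom's actual layout):

```python
# Four 5-bit instruction slots in one 20-bit word (ShBoom-style packing;
# high slot executes first in this illustrative ordering).
def slots(word20):
    return [(word20 >> shift) & 0x1F for shift in (15, 10, 5, 0)]

# Pin budget for the hypothetical shared address/data DRAM in a 24-pin DIP:
PINS = 20 + 1 + 1 + 2   # data/address, chip select, cmd/data, two power
print(PINS)

w = (1 << 15) | (2 << 10) | (3 << 5) | 4
print(slots(w))
```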

    I think the game really changed with on-chip cache. That enabled multi-core processors. If the number of cores on a processor had kept up with Moore's Law, we would have thousands of them on a chip by now. Instead, most modern computers have a few cores
    running very fast. To make them run so fast, they run data through a lot of silicon which generates heat, so we have air-conditioned data centers. This reminds me of a place I worked where they opted to not install a $20K battery-backed air conditioner
    in their UPS-backed server room. There was a power outage but the servers kept running, making the room an oven where they baked to death. Heat is not your friend. Pointy-haired bosses aren't much better.

  • From Lorem Ipsum@21:1/5 to Brad Eckert on Fri Oct 21 18:34:27 2022
    On Friday, October 21, 2022 at 8:29:24 PM UTC-4, Brad Eckert wrote:

    I think DRAM back then was in the 200ns to 120ns range, so maybe 5 to 8 MHz. A 20-bit word could hold four 5-bit instructions. Kind of like a mini ShBoom.

    It took forever for the industry to come up with the HyperBus interface, which removes superfluous address pins. Why weren't we sending address and data over the same pins all along? Suppose they sold 20-bit DRAM chips using a shared address/data
    interface. 20 bits of data, 1-bit chip select, 1-bit cmd/data, and two power pins would fit in a 24-pin DIP package.

    SDRAM has separate address and data buses because the two are completely different sizes. Early on, SDRAM chips were typically 1 data bit wide and used a multiplexed address bus to send control data and a wide address. HyperBus is a RAM module
    interface. Do I have this wrong?

    They have sold 18-bit and 36-bit SRAM devices, since they were often used in ways that required only one or two chips. SDRAM was also sold in 4-bit widths for smaller modules with fewer chips (saving cost). They were also made in 16-bit widths, but that
    was mostly to support video memory, I believe, and those often had two ports: one for the CPU and one for the graphics chip.


    I think the game really changed with on-chip cache. That enabled multi-core processors.

    I'm pretty sure they had on-chip cache long before they had multiple CPUs on a chip. I believe it was the '486 that first used on-chip cache. Multiple CPUs didn't appear until 10 years later, in the Intel Core processors.


    If the number of cores on a processor had kept up with Moore's Law, we would have thousands of them on a chip by now. Instead, most modern computers have a few cores running very fast.

    Moore's Law was never about the number of processors; it was about the number of transistors. Running multiple processors requires higher memory throughput to maintain processor speed without cache stalls. As long as they could find uses for the
    transistors in a single processor, they were better off with that. It was only when it became harder and harder to find useful features for the added transistors that they went with multiple CPUs. Two CPUs are still pretty efficient. Quad
    processors start to show the slowdown. 8 CPUs have significant slowdown relative to their separate performance, so a second memory interface is often added. 16 CPUs on a chip still find some added utility, but the inefficiency is starting to become a
    significant liability. At 32 processors, only special applications can properly utilize them.
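    The falloff described here is roughly what Amdahl's law predicts; a quick sketch (Amdahl's law is an illustrative model, not something the post cites, and the 5% serial fraction is an assumed figure):

```python
# Amdahl's law: speedup on n cores when a fraction s of the work is serial.
def speedup(n, s):
    return 1.0 / (s + (1.0 - s) / n)

# Even a modest 5% serial fraction caps the gain well below linear scaling:
# 32 cores deliver about 12.5x, not 32x.
for n in (2, 4, 8, 16, 32):
    print(n, round(speedup(n, 0.05), 2))
```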

    --

    Rick C.

    - Get 1,000 miles of free Supercharging
    - Tesla referral code - https://ts.la/richard11209

  • From S 1@21:1/5 to Brad Eckert on Sat Oct 22 00:16:50 2022
    On Saturday, 22 October 2022 at 10:29:24 am UTC+10, Brad Eckert wrote:
    ...
    I think DRAM back then was in the 200ns to 120ns range, so maybe 5 to 8 MHz. A 20-bit word could hold four 5-bit instructions. Kind of like a mini ShBoom.

    It took forever for the industry to come up with the HyperBus interface, which removes superfluous address pins. Why weren't we sending address and data over the same pins all along? Suppose they sold 20-bit DRAM chips using a shared address/data
    interface. 20 bits of data, 1-bit chip select, 1-bit cmd/data, and two power pins would fit in a 24-pin DIP package.

    I think the game really changed with on-chip cache. That enabled multi-core processors. If the number of cores on a processor had kept up with Moore's Law, we would have thousands of them on a chip by now. Instead, most modern computers have a few
    cores running very fast. To make them run so fast, they run data through a lot of silicon which generates heat, so we have air-conditioned data centers. This reminds me of a place I worked where they opted to not install a $20K battery-backed air
    conditioner in their UPS-backed server room. There was a power outage but the servers kept running, making the room an oven where they baked to death. Heat is not your friend. Pointy-haired bosses aren't much better.

    Love the Dilbert reference.

    Yeah, without any attempt to over-speed, it looks like DRAM is suitable for the upper market segment in the mid 1970's. I haven't checked the pricing on options in the early to mid 1980's. I probably should have said RAM, to include SRAM, but wasn't
    thinking. It's really down to MISC's intrinsic, simplistic design's ability to go faster at the same energy. 20 MHz based on SRAM may have been possible. But you can go past 2K resolution for desktop publishing at 4 MHz and 20 bits, which sounds insane.
    Whether they would have had 20-bit RAM at the time or not, going SRAM enabled the memory to be done in house. If it was me (in modest hindsight, of course), I would have done a segmented scheme in the memory architecture, allowing bad addresses to
    be skipped and transferred to another segment, increasing the yield and reducing cost, and used multiple dies per package to keep rejects down compared to everything on one chip. We had a fab in my state that closed in the 1980's. As Jack Tramiel
    found out, if you own the fab you can pay down the cost and keep producing cheap, undercutting parts in house without upgrading the fab process. In China, there are fabs which operate like that with old processes. Because you can then produce more die
    space per unit of cost, you can better afford SRAM memory, as you cut out the middle man. If you got the high-end system timing at 4 MHz (maybe even up to 20 MHz), it doesn't matter. Unfortunately, most lines were not being used like that, and were
    not on a newer process. 1 micron might have been more ideal, as it is easier to get a low defect rate. I'm looking into an unconventional 1 micron technology, not requiring a conventional fab, for the retro chip.

    Turns out somebody else is on the same scheme I wanted to do many years ago. They are setting up to manufacture chips atom by atom, with probe technology. Well, there you go; people who don't get on board miss out on the big bucks, so to speak. I've
    gone a bit past that scheme now. But I think they can manufacture single units. I keep forgetting that link.

    I'll have to dust off those single-atom circuit technology designs, but there's no memory to keep up, unless you design it with the print process.

    I'll have to see what mechanism they use to feed this, if it is like mine. Feeding these things at high speed and an ultra-low defect rate is the trick.

  • From Brad Eckert@21:1/5 to All on Sat Oct 22 17:55:51 2022
    On Saturday, October 22, 2022 at 12:16:52 AM UTC-7, S 1 wrote:
    ...

    Reliving the 1980s probably isn't the path to riches, but it is fun to muse. Could Forth have helped the uptake of multi-core? Maybe. Transputer Forth was a thing. Forth's extreme factoring helps avoid cache misses. Nowadays one can build systems in
    simulation that run fast enough for proof of concept.

    I had been contemplating a 20-bit cell size, and you have provided some interesting rationale. I can add some more, based on the theory that the Universe is a simulation: the Universe is made of data, and it is structured as a dodecahedron (the 12-sided
    Platonic solid). A dodecahedron has 20 vertices, so 20 bits. Numerologically, 20 is the best number for promoting cooperation.

    I like the idea of 10-bit bytes. But, we have to live with data structures the computer industry has accumulated over the last 50 years. These are generally made of 8-bit, 16-bit, and 32-bit chunks of data. So for general-purpose computing in this
    timeline, where C escaped from a lab and infested the industry, there are compelling reasons to favor those data widths.

  • From dxforth@21:1/5 to Brad Eckert on Sun Oct 23 13:49:18 2022
    On 23/10/2022 11:55 am, Brad Eckert wrote:
    ...
    I like the idea of 10-bit bytes. But, we have to live with data structures the computer industry has accumulated over the last 50 years. These are generally made of 8-bit, 16-bit, and 32-bit chunks of data. So for general-purpose computing in this
    timeline, where C escaped from a lab and infested the industry, there are compelling reasons to favor those data widths.

    I would have thought it was 1971 and the 4004 that consolidated power-of-two and changed the world forever :)

    https://en.wikipedia.org/wiki/Word_(computer_architecture)#Size_families

  • From Anton Ertl@21:1/5 to dxforth on Sun Oct 23 07:45:44 2022
    dxforth <dxforth@gmail.com> writes:
    I would have thought it was 1971 and the 4004 that consolidated power-of-two and changed the world forever :)

    Intel was not that influential at the time, and the 4004 was not
    influential even within Intel. IBM introduced power-of-two with the
    Stretch (IBM7030), and then in the S/360, which replaced all previous
    IBM lines. There was also a force in minicomputers in the late 1960s
    that made people go for 16-bit minis (PDP-11, Nova, ...), and I think
    that is what made power-of-two words and 8-bit bytes win over other
    sizes; my guess is that this force was 4-bit-wide ALU ICs like the
    74181.

    And where mainframes and minis went, micros followed. Not just Intel,
    but also Motorola and others with their 8-bit CPUs.

    https://en.wikipedia.org/wiki/Word_(computer_architecture)#Size_families

    A very nice listing that shows how diverse word size was in early
    years before it converged on power-of-two.

    - anton
    --
    M. Anton Ertl http://www.complang.tuwien.ac.at/anton/home.html
    comp.lang.forth FAQs: http://www.complang.tuwien.ac.at/forth/faq/toc.html
    New standard: https://forth-standard.org/
    EuroForth 2022: https://euro.theforth.net

  • From Marcel Hendrix@21:1/5 to Anton Ertl on Sun Oct 23 08:17:43 2022
    On Sunday, October 23, 2022 at 9:54:49 AM UTC+2, Anton Ertl wrote:
    dxforth <dxf...@gmail.com> writes:
    [..]
    A very nice listing that shows how diverse word size was in early
    years before it converged on power-of-two.

    I thought the original idea was to use the ECC bits of a standard
    DRAM module to get 20 bit memory. A typical Forthian idea :--)

    -marcel

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Brad Eckert@21:1/5 to dxforth on Sun Oct 23 09:10:02 2022
    On Saturday, October 22, 2022 at 7:49:21 PM UTC-7, dxforth wrote:
    On 23/10/2022 11:55 am, Brad Eckert wrote:
    ...
    I like the idea of 10-bit bytes. But, we have to live with data structures the computer industry has accumulated over the last 50 years. These are generally made of 8-bit, 16-bit, and 32-bit chunks of data. So for general-purpose computing in this
    timeline, where C escaped from a lab and infested the industry, there are compelling reasons to favor those data widths.
    I would have thought it was 1971 and the 4004 that consolidated power-of-two and changed the world forever :)

    https://en.wikipedia.org/wiki/Word_(computer_architecture)#Size_families
    As an aside, the world's first microprocessor was a 20-bit machine. The MP944 pre-dated the 4004 by a year or two. You didn't hear about it because the US Navy kept it classified for 30 years. It controlled the F-14 fighter jet.

    https://en.wikipedia.org/wiki/F-14_CADC

    It's like the Ferrari being in production but the Yugo becoming the mass-market car. In fact, nobody knows what a Ferrari is. So, Intel got to toot its (high-pitched) horn about having invented the microprocessor.

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Anton Ertl@21:1/5 to Marcel Hendrix on Sun Oct 23 16:27:36 2022
    Marcel Hendrix <mhx@iae.nl> writes:
    On Sunday, October 23, 2022 at 9:54:49 AM UTC+2, Anton Ertl wrote:
    dxforth <dxf...@gmail.com> writes:
    [..]
    A very nice listing that shows how diverse word size was in early
    years before it converged on power-of-two.

    I thought the original idea was to use the ECC bits of a standard
    DRAM module to get 20 bit memory. A typical Forthian idea :--)

    There have been widely available 30-pin-SIMMs with 8 data bits and
    72-pin-SIMMs with 32/36 data bits, and later DIMMs with 64/72 data
    bits. Very recently there are DDR5 DIMMs with 2x32 or 2x40 data bits.
    But I never heard about modules with 20 data bits.

    - anton
    --
    M. Anton Ertl http://www.complang.tuwien.ac.at/anton/home.html
    comp.lang.forth FAQs: http://www.complang.tuwien.ac.at/forth/faq/toc.html
    New standard: https://forth-standard.org/
    EuroForth 2022: https://euro.theforth.net

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From S@21:1/5 to Brad Eckert on Sun Oct 23 09:47:54 2022
    On Sunday, 23 October 2022 at 10:55:52 am UTC+10, Brad Eckert wrote:
    On Saturday, October 22, 2022 at 12:16:52 AM UTC-7, S 1 wrote:
    On Saturday, 22 October 2022 at 10:29:24 am UTC+10, Brad Eckert wrote:
    On Thursday, October 20, 2022 at 7:46:50 AM UTC-7, S wrote:
    I've come to a revelation on how one might do a low
    energy silicon transistor scheme with much lower energy than Chuck's (not that Chuck reveals details, but it seems it could be).

    It's possible to implement low energy and high performance circuits side by side, and even dynamically shift between low energy and high performance modes.

    It's even possible to have two instruction sets, high and low performance, within 32 instructions.

    ----

    On the retro project, I wanted to present a post on the latest thinking, and it goes like this. We would have been better off with 10 and 20 bit words in the 1970's processors.

    A 10 bit word holds two 5 bit instructions, and a 10 bit address space adds up to 2,048 instructions; that's a decent amount for a low end embedded microcontroller, versus 8 or 4 bits.
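    The two-instructions-per-word arithmetic above can be sketched as follows; this is a hypothetical Python illustration (slot order and bit layout are my assumptions, not from the post):

```python
# Hypothetical packing of two 5-bit instructions into a 10-bit word.
OP_BITS = 5           # 32 possible opcodes per slot
WORD_BITS = 10        # one word = two instruction slots

def pack(op1, op2):
    """Pack two 5-bit opcodes into one 10-bit word (op1 in the high slot)."""
    assert 0 <= op1 < 32 and 0 <= op2 < 32
    return (op1 << OP_BITS) | op2

def unpack(word):
    """Split a 10-bit word back into its two 5-bit opcodes."""
    return (word >> OP_BITS) & 0x1F, word & 0x1F

# A 10-bit address reaches 2**10 = 1024 words, i.e. 2048 packed instructions.
ADDRESS_SPACE_WORDS = 1 << WORD_BITS
print(ADDRESS_SPACE_WORDS * 2)   # 2048
```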

    20 bits addresses 1M words (2.5 MB), which is decent for a microcontroller or computer of the time into the late 1980's.

    A multiple of 10 or 20 bits makes more sense than 32 or 64 bits. 40 bits maxes out many computers, and 80 bits is a good alternative to 128 bits for many things the public is interested in.

    4 bit 16 colour pixels are too low. 5 bit 32 colour pixels are more ideal. 8 colours at four levels equals 8 levels of white to black, or 32 levels of monochrome, which is close to primitive 1980's digital video. Add in multiple 32 colour palettes, and that's a good mix. You also don't have to do bit-plane graphics, like the Amiga, to get 32 colours.

    8 bit 256 colour pixels are a bit limited. 10 bit 1024 colour pixels are more ideal. Options range from 3 bits per primary plus two levels (ideally 4 bits for green), down to 1 bit per primary with 128 levels (certain game styles take advantage of that). Again, you get multiple preset palettes by switching between palette modes (with 5 bits: 2 bits green and red and 1 bit blue; or 2 bits green, 1 bit red and blue, plus 1 bit for two levels, etc. as well). 1024 also makes some extra bright HDR monochrome possible. As a 1024 sized colour palette, it's ideal.

    20 bit pixels make reasonable video (8 bits green with 6 bits each for red and blue, or 7 bits red and 5 bits blue).
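    As an illustration of the 20-bit pixel split above, a hypothetical unpacker (placing green in the high bits is my assumption; the post only gives the channel widths):

```python
# Hypothetical 8-6-6 (green, red, blue) unpacking of a 20-bit pixel.
def unpack_866(pixel):
    """Split a 20-bit pixel into (green, red, blue) with 8, 6 and 6 bits."""
    assert 0 <= pixel < 1 << 20
    green = (pixel >> 12) & 0xFF   # 8 bits -> 256 levels
    red   = (pixel >> 6)  & 0x3F   # 6 bits -> 64 levels
    blue  = pixel         & 0x3F   # 6 bits -> 64 levels
    return green, red, blue

# 256 * 64 * 64 = 1,048,576 colours, the full 2**20 space.
print(unpack_866((1 << 20) - 1))   # (255, 63, 63)
```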

    10-20 bit palette entries are ideal.

    30 bit colour makes decent video, low level HDR, and palette entries.

    40 bits makes decent HDR video and palette entries. 50-90 bits makes ideal HDR.

    10 bits is ideal for a basic tile graphics number, and for the number of character patterns.

    A combination of character number and other basic effects in 20 bits, or field size and basic effects, is ideal.

    10 bit sound is better than 8 bit, but 20/30 bit sound is more ideal. 20 bit+ sample frequencies are more ideal scientifically (various plants operate in this range).

    20 bits at 2-4 MHz single or dual bus memory certainly works out to some high resolutions.
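    A back-of-envelope check of that bandwidth claim; the refresh rate and bits per pixel below are my assumptions, not the post's:

```python
# Rough bandwidth check: how many pixels per frame a single bus can feed.
def pixels_per_frame(bus_bits, bus_mhz, bits_per_pixel, fps):
    """Pixels deliverable per frame, ignoring blanking and refresh overhead."""
    bits_per_second = bus_bits * bus_mhz * 1_000_000
    return bits_per_second // (bits_per_pixel * fps)

# A 20-bit bus at 4 MHz feeding 5-bit (32-colour) pixels at 50 Hz:
print(pixels_per_frame(20, 4, 5, 50))   # 320000, e.g. roughly 640x500
```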

    So it's possible to make a dual speed, dual energy circuit of this type, with better graphics for the 1970's/1980's high-end market and the 1980's consumer market.

    I'm also expanding on an old idea for an alternative to the Rom cartridge and disk. If I can ever get to do it.
    I think DRAM back then was in the 200ns to 120ns range, so maybe 5 to 8 MHz. A 20-bit word could hold four 5-bit instructions. Kind of like a mini ShBoom.
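    Brad's four-slots-per-20-bit-word idea can be sketched as a toy fetch/execute loop; the opcodes and their semantics here are invented purely for illustration, as is executing the high slot first:

```python
# Toy fetch/execute loop: four 5-bit instruction slots per 20-bit word,
# executed high slot first (an assumption). Opcodes are invented.
NOP, LIT1, ADD, DUP = 0, 1, 2, 3   # hypothetical 5-bit opcodes

def run(words):
    """Execute a list of packed 20-bit words on a tiny data stack."""
    stack = []
    for word in words:                 # one 20-bit fetch per word
        for shift in (15, 10, 5, 0):   # four 5-bit slots
            op = (word >> shift) & 0x1F
            if op == LIT1:
                stack.append(1)
            elif op == ADD:
                b, a = stack.pop(), stack.pop()
                stack.append(a + b)
            elif op == DUP:
                stack.append(stack[-1])
    return stack

# LIT1 DUP ADD NOP in one word computes 1 + 1:
word = (LIT1 << 15) | (DUP << 10) | (ADD << 5) | NOP
print(run([word]))   # [2]
```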

    It took forever for the industry to come up with the HyperBus interface, which removes superfluous address pins. Why weren't we sending address and data over the same pins all along? Suppose they sold 20-bit DRAM chips using a shared address/data
    interface. 20 bits of data, 1-bit chip select, 1-bit cmd/data, and two power pins would fit in a 24-pin DIP package.

    I think the game really changed with on-chip cache. That enabled multi-core processors. If the number of cores on a processor had kept up with Moore's Law, we would have thousands of them on a chip by now. Instead, most modern computers have a few
    cores running very fast. To make them run so fast, they run data through a lot of silicon which generates heat, so we have air-conditioned data centers. This reminds me of a place I worked where they opted to not install a $20K battery-backed air
    conditioner in their UPS-backed server room. There was a power outage but the servers kept running, making the room an oven where they baked to death. Heat is not your friend. Pointy-haired bosses aren't much better.
    Love the Dilbert reference.

    Yeah, without any attempt to overspeed, it looks like DRAM is suitable for the upper market segment in the mid 1970's. I haven't checked the pricing on options in the early to mid 1980's. I probably should have said RAM to include SRAM, but wasn't thinking. It really comes down to MISC's intrinsic design simplicity allowing it to go faster at the same energy. 20 MHz based on SRAM may have been possible. But you can go past 2K resolution for desktop publishing at 4 MHz and 20 bits, which sounds insane. Whether they would have had 20 bit RAM at the time or not, going SRAM meant the memory could be done in house. If it were me (in modest hindsight, of course), I would have done a segmented memory scheme in the memory architecture, allowing bad addresses to be skipped and transferred to another segment, increasing die yield and reducing cost, and used multiple dies per package to keep rejects down compared to putting everything on one chip. We had a fab in my state that closed in the 1980's. As Jack Tramiel found out, if you own the fab you can pay down the cost and keep producing cheap, undercutting parts in house without upgrading the fab process. In China there are fabs which operate like that with old processes. Because you can then produce more die area per unit of cost, you can better afford SRAM memory, as you cut out the middleman. If you got the high end system timing at 4 MHz (maybe even up to 20 MHz), it doesn't matter. Unfortunately, most lines were not being used like that, and were not a newer process. 1 micron might have been more ideal, as it is easier to get a low defect rate. I'm looking into an unconventional 1 micron technology not requiring a conventional fab, for the retro chip.

    Turns out somebody else is on the same scheme I wanted to do many years ago. They are setting up to manufacture chips atom by atom, with probe technology. Well, there you go; people who don't get on board miss out on the big bucks, so to speak. I've gone a bit past that scheme now. But I think they can manufacture single units. I keep forgetting that link.

    I'll have to dust off those single atom circuit technology designs, but there's no memory to keep up, unless you design it with the print process.

    I'll have to see what mechanism they use to feed this, and whether it is like mine. Feeding these things at high speed with ultra low defects is the trick.
    Reliving the 1980s probably isn't the path to riches, but it is fun to muse. Could Forth have helped the uptake of multi-core? Maybe. Transputer Forth was a thing. Forth's extreme factoring helps avoid cache misses. Nowadays one can build systems in simulation that run fast enough for proof of concept.

    I had been contemplating a 20-bit cell size and you have provided some interesting rationale. I can add some more based on the theory that the Universe is a simulation. The Universe is made of data and it is structured as a dodecahedron (12-sided
    platonic solid). A dodecahedron has 20 vertices, so 20 bits. Numerologically, 20 is the best number for promoting cooperation.

    I like the idea of 10-bit bytes. But, we have to live with data structures the computer industry has accumulated over the last 50 years. These are generally made of 8-bit, 16-bit, and 32-bit chunks of data. So for general-purpose computing in this
    timeline, where C escaped from a lab and infested the industry, there are compelling reasons to favour those data widths.

    Well, I explored up to 10,000+ value words (10/100/1000 etc. values per digit) using phase timing processing (10,000 would sit between four 10-value digits and two 100-value digits). What could go wrong? Knowing circuits, a lot. Every phase transfer requires a timed measurement or a quantity measurement. If I hadn't gotten so sick, I would have realised the odds of such measurements stuffing up are too high. I'm not saying it can't be done. Unfortunately, the brain disease got too much to finish the electronic mechanism and the set of operator circuit designs. I got the list of operators for binary and went to work trying to convert them over to the phase technology. It was also around this time I had to throw in the towel on my other great ambition, a professional 3D video camera the size of a film frame (but thicker), eventually designed to be transparent. Way out there. I would guess maybe $100 million to get that technology developed and out there.
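    The digit arithmetic at the start of that paragraph can be checked directly (a trivial sketch):

```python
# How many base-b "digits" are needed to cover at least `values` values.
def digits_needed(values, base):
    """Smallest n with base**n >= values."""
    n = 1
    while base ** n < values:
        n += 1
    return n

print(digits_needed(10_000, 10))    # 4: four 10-value digits
print(digits_needed(10_000, 100))   # 2: two 100-value digits
```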

    Anyway, this is a what-if experiment. But the retro chip itself is a useful product, and is planned to be a competitive MCU (on a modern fab process, obviously).

    The early home computer industry would have allowed a shift, plus you can build in 8 bit data modes. In the pro desktop sense, a similar opportunity existed in the 1970's.

    Brad, do you watch the TOE channel? Some great interviews on various topics, including the simulation thing. To me, I don't regard reality as a simulation, though some things are interesting. However, my own proposals are a structured universe model, which others put forward as a computational universe in their own ideas. This is a misnaming: a structured universe leads to mechanisms of self assembly, i.e. the computation, as a recent TOE interview also pointed out.

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Brad Eckert@21:1/5 to All on Sun Oct 23 11:37:49 2022
    On Sunday, October 23, 2022 at 9:47:56 AM UTC-7, S wrote:
    On Sunday, 23 October 2022 at 10:55:52 am UTC+10, Brad Eckert wrote:
    On Saturday, October 22, 2022 at 12:16:52 AM UTC-7, S 1 wrote:
    On Saturday, 22 October 2022 at 10:29:24 am UTC+10, Brad Eckert wrote:
    On Thursday, October 20, 2022 at 7:46:50 AM UTC-7, S wrote:
    [..]
    What kind of brain disease? You seem to be handling it rather well.

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From dxforth@21:1/5 to Brad Eckert on Mon Oct 24 13:02:41 2022
    On 24/10/2022 3:10 am, Brad Eckert wrote:
    On Saturday, October 22, 2022 at 7:49:21 PM UTC-7, dxforth wrote:
    On 23/10/2022 11:55 am, Brad Eckert wrote:
    ...
    I like the idea of 10-bit bytes. But, we have to live with data structures the computer industry has accumulated over the last 50 years. These are generally made of 8-bit, 16-bit, and 32-bit chunks of data. So for general-purpose computing in this
    timeline, where C escaped from a lab and infested the industry, there are compelling reasons to favor those data widths.
    I would have thought it was 1971 and the 4004 that consolidated power-of-two and changed the world forever :)

    https://en.wikipedia.org/wiki/Word_(computer_architecture)#Size_families
    As an aside, the world's first microprocessor was a 20-bit machine. The MP944 pre-dated the 4004 by a year or two. You didn't hear about it because the US Navy kept it classified for 30 years. It controlled the F-14 fighter jet.

    https://en.wikipedia.org/wiki/F-14_CADC

    It's like the Ferrari being in production but the Yugo becoming the mass-market car. In fact, nobody knows what a Ferrari is. So, Intel got to toot its (high-pitched) horn about having invented the microprocessor.

    30 years too late to do the world (or its creators) any good. Forth could have gone the same way - bagged, tagged and forgotten by Moore's employers.

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From S@21:1/5 to Brad Eckert on Sun Oct 23 20:18:08 2022
    On Monday, October 24, 2022 at 4:37:50 AM UTC+10, Brad Eckert wrote:
    On Sunday, October 23, 2022 at 9:47:56 AM UTC-7, S wrote:
    On Sunday, 23 October 2022 at 10:55:52 am UTC+10, Brad Eckert wrote:
    On Saturday, October 22, 2022 at 12:16:52 AM UTC-7, S 1 wrote:
    On Saturday, 22 October 2022 at 10:29:24 am UTC+10, Brad Eckert wrote:
    On Thursday, October 20, 2022 at 7:46:50 AM UTC-7, S wrote:
    I've come to a revelation on how to possibly do low
    energy silicon transistor scheme much lower than Chucks (not that Chuck reveals details, but it's seems it could be).

    It's possible to implement low energy and high performance circuits side by side, and even dynamically shift between low energy and high performance modes.

    It's possible to even have two sets of instructions, high and low in 32 instructions.

    ----

    On the retro project, I wanted to present a post on the latest thinking of it, but it goes like this. We would have been better off having 10 and 20 bit words in the 1970's processors.

    10 bits word of two 5 bit instructions, adds up to 2048 instructions, that's a descent amount for an low end embedded microcontroller, versus 8 or 4 bits.

    20 bits is 2.5MB which is decent for a microcontroller or computer of the time into the late 1980's.

    A multiple of 10 or 20 bits, makes more sense than 32 or 64 bits. 40 bits maxes out many computers, and 80 bits is a good alternative to 128 bits for many things the public are interested in.

    4 bit 16 colour pixels are too low. 5 bit 32 colour pixels are more ideal. 8 colours at four levels equals 8 levels of white too black, or 32 levels of monochrome which is close to primitive 1980's digital video. Add in multiple 32 colour
    pallets, then that's a good mix. You also don't have to do bit planing, like the Amiga, to get 32 colours.

    8 bit 256 colour pixels are a bit limited. 10 bit 1024 colour pixels are more ideal. In between 3 bits per primary plus two levels, or 4 bits for green ideally, to 1 bit for each primary with 128 levels (certain game styles take advantage of
    that). Again multiple set pallets by switching between pallet modes (5 bits: 2 bits green and red, and 1 bit blue, 2 bits green, 1 bit red and blue, plus 1 bit for two levels etc aswell). 1024 makes some extra bright HDR monochrome possible. As a 1024
    sized colour pallet, it's ideal.

    20 bit pixels makes reasonable video (8 bit green, 6 bits red and blue, or 7 bits red, and 5 bits blue).

    10-20 bit pallet entries is ideal.

    30 bit colour makes descent video and low level HDR, and pallet entries.

    40 bits makes descent HDR video and pallet entries. 50-90 bits makes ideal HDR.

    10 bits is ideal for basic tile graphics number, and number of character patterns.

    A combination of character number and other basic effects in 20 bits, or feild size and basic effects, is ideal.

    10 bit sound is better than 8 bit, but 20/30 bit sound, is more ideal. 20 bit+ frequencies are more ideal scientifically, to sample frequencies (various plants operate in this range).

    20 bits at 2-4mhz single or dual bus memory, certainly works out some high resolutions.

    So, it's possible to make a dual speed, dual energy circuit of this type, with better graphics for the 1970's/1980's, high-end market, and 1980's consumer market.

    I'm also expanding on an old idea for an alternative to the Rom cartridge and disk. If I can ever get to do it.
    I think DRAM back then was in the 200ns to 120ns range, so maybe 5 to 8 MHz. A 20-bit word could hold four 5-bit instructions. Kind of like a mini ShBoom.

    It took forever for the industry to come up with the HyperBus interface, which removes superfluous address pins. Why weren't we sending address and data over the same pins all along? Suppose they sold 20-bit DRAM chips using a shared address/
    data interface. 20 bits of data, 1-bit chip select, 1-bit cmd/data, and two power pins would fit in a 24-pin DIP package.

    I think the game really changed with on-chip cache. That enabled multi-core processors. If the number of cores on a processor had kept up with Moore's Law, we would have thousands of them on a chip by now. Instead, most modern computers have a
    few cores running very fast. To make them run so fast, they run data through a lot of silicon which generates heat, so we have air-conditioned data centers. This reminds me of a place I worked where they opted to not install a $20K battery-backed air
    conditioner in their UPS-backed server room. There was a power outage but the servers kept running, making the room an oven where they baked to death. Heat is not your friend. Pointy-haired bosses aren't much better.
    Love the Dilbert reference.

    Yeah, without any attempt to over speed, it looks like dram is suitable for the upper market segment in mid 1970's. I haven't checked the pricing on options in the early to mid 1980's. I probably should have said Ram to include sram, but wasn't
    thinking.It's really down to misc intrinsic simplistic design ability to go faster at the same energy. 20mhz based on SRAM may have been possible. But, you can go post 2k resolution for desktop publishing at 4mhz 20 bits, which sounds insane. Wherever
    they would have 20 bit ram at the time or not, going SRAM, enabled the memory to be fine in house. If it was me (in modest, hindsight of course), I would have done a segmented memory scheme in the memory architecture, allowing bad addresses to be skipped
    and transfered to another segment, increasing the stamp yield and reducing cost band used multiple die on packaging, to keep rejects down compared to everything on one chip. We had a fab in my state that closed in the 1980's. As Jack Tramiel found out,
    if you own the fab you can pay down the cost and keep producing cheap undercutting parts in house without upgrading the fab process. In China, there are fabs which operate like that with old processes. Because you then can produce more die space per unit
    of costs, you can better afford sram memory as you cut out the middle man. If you got the high end system timing at 4 MHz (maybe even up to 20mhz), it doesn't matter. Unfortunately, the most lines were not being used like that, and were not a newer
    process. 1 micron might have be more ideal, as they are easier to get a low defect rating. I'm looking into an unconventional 1 micron technology not requiring a conventional fab, for the retro chip using.

    Turns out somebody else is on the same scheme I wanted to do many years ago. They are setting up to manufacture chips atom by atom, with probe technology. Well, there you go, people who don't get on board miss out of the big bucks, so to speak. I'
    ve gone a bit pass that scheme now. But, I think they can manufacture single units. I keep forgetting that link.

    I'll have to dust off those single atom circuit technogy designs, but no memory to keep up, unless you design it with the print process.

    I'll have to see what mechanism they use to feed this, if it is like mine. Feeding these things at high speed and ultra low defect, is the trick.
    Reliving the 1980s probably isn't the path to riches, but is it fun to muse. Could Forth have helped the uptake of multi-core? Maybe. Transputer Forth was a thing. Forth's extreme factoring helps avoid cache misses. Nowadays one can build systems
    in simulation that run fast enough for proof of concept.

    I had been contemplating a 20-bit cell size and you have provided some interesting rationale. I can add some more based on the theory that the Universe is a simulation. The Universe is made of data and it is structured as a dodecahedron (a 12-sided
    Platonic solid). A dodecahedron has 20 vertices, so 20 bits. Numerologically, 20 is the best number for promoting cooperation.

    I like the idea of 10-bit bytes. But, we have to live with data structures the computer industry has accumulated over the last 50 years. These are generally made of 8-bit, 16-bit, and 32-bit chunks of data. So for general-purpose computing in this
    timeline, where C escaped from a lab and infested the industry, there are compelling reasons to favour those data widths.

    Well, I explored up to 10,000+ value words, 10/100/1000 etc. value bits using phase timing processing (10,000 would be between four 10-value bits and two 100-value bits). What could go wrong? Knowing circuits, a lot. Every phase transfer requires a timed
    measurement or a quantity measurement. If I hadn't gotten so sick, I would have realised the odds of such measurements stuffing up are too high. I'm not saying it can't be done. Unfortunately, the brain disease got too much to finish the electronic
    mechanism and set of operator circuit designs. I got the list of operators for binary and went to work trying to convert them over to the phase technology. It was also around this time I had to throw in the towel on my other great ambition, a
    professional 3D video camera the size of a film frame (but thicker). That eventually was designed to be transparent. Way out there. I would guess maybe $100 million to get that technology developed and out there.
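    As a plain-binary sketch of the multi-valued "bits" described above (an illustration of the radix arithmetic only; the phase-timing circuitry itself is not modelled, and Python stands in for whatever the real encoding would be): a 10,000-value word is four base-10 digits or two base-100 digits, and a 10-bit byte's 1024 codes carry one base-1000 digit with only 24 codes wasted.

```python
# Radix sketch: split an integer into digits of a chosen base, least
# significant digit first. A "byte" of 10, 100 or 1000 values is just
# a digit in base 10, 100 or 1000.

def to_digits(n: int, base: int) -> list:
    """Split a non-negative integer into base-`base` digits, LSD first."""
    assert n >= 0 and base >= 2
    if n == 0:
        return [0]
    digits = []
    while n:
        n, d = divmod(n, base)
        digits.append(d)
    return digits

def from_digits(digits, base) -> int:
    """Reassemble the integer from its LSD-first digit list."""
    return sum(d * base ** i for i, d in enumerate(digits))

# A 10,000-value word sits between four 10-value digits and two
# 100-value digits, as described above:
assert len(to_digits(9999, 10)) == 4
assert len(to_digits(9999, 100)) == 2
```

    One base-1000 digit fits a 10-bit byte (1000 <= 1024), which is the usual arithmetic behind favouring 10-bit multiples for decimal-leaning machines.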

    Anyway, this is a what-if experiment. But the retro chip itself is a useful product, and planned to be a competitive MCU (but on a modern fab process, obviously).

    The early home computer industry would have allowed a shift, plus you can build in 8 bit data modes. In the pro desktop sense, a similar opportunity existed in the 1970's.

    Brad, do you watch the TOE channel? Some great interviews on various topics, including the simulation thing. To me, I don't regard reality as a simulation, though some things are interesting. However, my own proposals are a structured universe model,
    which others put forward as a computational universe in their own ideas. That is a misnaming: the structured universe leads to mechanisms of self assembly, the computation, as a recent TOE interview also pointed out.
    What kind of brain disease? You seem to be handling it rather well.

    A white matter disease. I've actually lost a lot of ability. I'm slowly regaining a degree of ability, but there are gaps. I used to be a lot better. When I was doing my OS design documentation, earlier on, I could design every part in reference to every
    other relevant part. Even then, I had to go and look up the text each time for the syntax, as I only remembered the structure of the concepts. There were a few physical brain problems I had to overcome from various health challenges before this too.
    So, my optimum was not reached. This is just low grade work; I can still do design architecture to some degree. Syntactic memory rules, so you don't have to do anything that requires too much intelligence, which all sort of works with remembering and
    manipulating mechanism concepts etc. But, the two are required together to be a superman. None here, maybe Jecel is, I don't know. Good syntax memory gives you a faux appearance of super intelligence, but if such a person is truly intelligent with
    conceptual ability, they will be a true (super) genius. Syntax, and good myelination of the brain's neural network, only accelerates and enhances ability there. People don't know about this, and think they are intelligent. I'm only intelligent as in
    only part way to the ideal.

    I've just been knocking down toxoplasmosis, but it's a real struggle to control. I have very loud high frequency noise ringing in my ears a lot since the toxo really took off. It was getting to the point that even a high speed fan would not completely
    knock it out. I should have gone out and compared it to passing traffic but didn't. It's probably less now, but I have to get near some loud sound sources to compare the level. Such nerve noise is supposed to be a bad sign, of neurones dying, but when
    swollen up and off it is less to hardly anything (likely due to brain inflammation blocking the signal). Creepy stuff. But it tends to get worse when the infection is picked up, so it's confusing.

    Thanks Brad.

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Brad Eckert@21:1/5 to Marcel Hendrix on Thu Oct 27 16:47:43 2022
    On Sunday, October 23, 2022 at 8:17:44 AM UTC-7, Marcel Hendrix wrote:
    On Sunday, October 23, 2022 at 9:54:49 AM UTC+2, Anton Ertl wrote:
    dxforth <dxf...@gmail.com> writes:
    [..]
    A very nice listing that shows how diverse word size was in early
    years before it converged on power-of-two.
    I thought the original idea was to use the ECC bits of a standard
    DRAM module to get 20 bit memory. A typical Forthian idea :--)

    -marcel
    Hamming(15,11) is compatible with 16-bit and 32-bit memory chips, so a 22-bit Forth cell would make sense. So how about 11-bit bytes?

    Every process shrink makes ECC more desirable.
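    Brad's Hamming(15,11) point can be made concrete: extend the code with one overall parity bit and 11 data bits fill a 16-bit memory word with single-error correction and double-error detection (SECDED). Below is a minimal Python model of the textbook code, an editor's illustration only, unrelated to the Verilog generator or any particular chip mentioned in the thread.

```python
# Hamming(15,11) + overall parity = (16,11) SECDED.
# Bit positions 1..15 hold the Hamming codeword; the data bits sit at
# the non-power-of-two positions, parity bits at 1, 2, 4, 8, and
# position 0 holds the overall parity bit.

DATA_POSITIONS = [p for p in range(1, 16) if p & (p - 1)]  # 11 positions
PARITY_POSITIONS = (1, 2, 4, 8)

def encode(data11: int) -> int:
    """Pack 11 data bits into a 16-bit SECDED codeword."""
    assert 0 <= data11 < (1 << 11)
    word = [0] * 16
    for i, p in enumerate(DATA_POSITIONS):
        word[p] = (data11 >> i) & 1
    for pp in PARITY_POSITIONS:     # parity pp covers positions with bit pp set
        word[pp] = sum(word[p] for p in range(1, 16) if p & pp) & 1
    word[0] = sum(word) & 1         # overall parity over the 15 Hamming bits
    return sum(b << p for p, b in enumerate(word))

def decode(code16: int):
    """Return (data11, status); status is 'ok', 'corrected' or 'double-error'."""
    word = [(code16 >> p) & 1 for p in range(16)]
    syndrome = 0
    for pp in PARITY_POSITIONS:
        if sum(word[p] for p in range(1, 16) if p & pp) & 1:
            syndrome |= pp
    overall = sum(word) & 1         # 0 when the total parity still holds
    if syndrome == 0 and overall == 0:
        status = 'ok'
    elif syndrome == 0:
        status = 'corrected'        # only the overall parity bit flipped
    elif overall:
        word[syndrome] ^= 1         # single error located at position `syndrome`
        status = 'corrected'
    else:
        status = 'double-error'     # syndrome set but total parity even
    data11 = sum(word[p] << i for i, p in enumerate(DATA_POSITIONS))
    return data11, status
```

    With 11-bit bytes, two such words hold a 22-bit cell, each half independently corrected.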

  • From Brad Eckert@21:1/5 to All on Thu Oct 27 16:55:51 2022
    On Sunday, October 23, 2022 at 8:18:09 PM UTC-7, S wrote:

    I've just been knocking down toxo plasmosis, but it's a real struggle to control. I have very large high frequency noise ringing In my ears a lot since the toxo really took off. It was getting to the point that even a high speed fan would not
    completely knock it out. I should have gone out and compared it to passing traffic but didn't. It's probably less now, but have to get near some loud sound sources to compare the level. Such nerve noise is supposed to a bad sign, of neurones dieing, but
    when swollen up and off it is less to hardly anything (likely due to brain inflammation blocking the signal). Creepy stuff. But, Intend to get it more when picked up, so confusing.

    If you would like to discuss what I would do about it, which is a bit of a rabbit hole, you can PM by looking me up on Github.

  • From S@21:1/5 to Brad Eckert on Thu Oct 27 17:18:34 2022
    On Friday, October 28, 2022 at 9:47:44 AM UTC+10, Brad Eckert wrote:
    On Sunday, October 23, 2022 at 8:17:44 AM UTC-7, Marcel Hendrix wrote:
    On Sunday, October 23, 2022 at 9:54:49 AM UTC+2, Anton Ertl wrote:
    dxforth <dxf...@gmail.com> writes:
    [..]
    A very nice listing that shows how diverse word size was in early
    years before it converged on power-of-two.
    I thought the original idea was to use the ECC bits of a standard
    DRAM module to get 20 bit memory. A typical Forthian idea :--)

    -marcel
    Hamming(15,11) is compatible with 16-bit and 32-bit memory chips, so a 22-bit Forth cell would make sense. So how about 11-bit bytes?

    Every process shrink makes ECC more desirable.

    I still favour 10 bit multiples, as 1024 is close to the 1000 values of the metric system. My original aim decades back was for a 20 bit system, ironically. But that was naive, as the world of personal computers had standardised on 8 bits by then, and you
    didn't have everything on one chip to avoid incompatibilities or avoid having to make custom chips. I suppose that 20 bits is really just 5 4-bit RAM chips instead of 4, though. In my phase circuit design, each byte was to be 10, 100 or maybe 1000 values,
    to further enhance decimal usage. I probably mentioned this already, that it's naive too; due to sickness I missed the estimate in my head about the error rate. It needs a perfectly zero error rate, but things happen, and any error can gum up processing.

    Concerning the other post. I'm taking a combination of CDS and DMSO and mimosa pudica, I think it's called. It seems to be having an effect. I'm in bed sick at the moment, so can't stay, unfortunately.

  • From Brad Eckert@21:1/5 to All on Fri Oct 28 17:17:17 2022
    On Thursday, October 27, 2022 at 5:18:36 PM UTC-7, S wrote:
    On Friday, October 28, 2022 at 9:47:44 AM UTC+10, Brad Eckert wrote:
    On Sunday, October 23, 2022 at 8:17:44 AM UTC-7, Marcel Hendrix wrote:
    On Sunday, October 23, 2022 at 9:54:49 AM UTC+2, Anton Ertl wrote:
    dxforth <dxf...@gmail.com> writes:
    [..]
    A very nice listing that shows how diverse word size was in early years before it converged on power-of-two.
    I thought the original idea was to use the ECC bits of a standard
    DRAM module to get 20 bit memory. A typical Forthian idea :--)

    -marcel
    Hamming(15,11) is compatible with 16-bit and 32-bit memory chips, so a 22-bit Forth cell would make sense. So how about 11-bit bytes?

    Every process shrink makes ECC more desirable.
    I still favour 10 bit multiples, as it's close to 1000 values of the metric system. My original aim decades back, was for a 20 bit system, ironically. But, that was naive, as the world of personal computers had standardised on 8 bits by then, and you
    didn't have everything on one chip to avoid incompatibilities or avoid having to make custom chips. I suppose that 20 bits is really just 5 4 bit ram chips instead of 4 though. In my phase circuit design, each byte was to be 10, 100 or maybe 1000 values,
    to further enhance decimal usage. I probably mentioned this already, that it's naive too, due to sickness I missed the estimate in my head about the error rate. Perfectly no error rate, but things happen, and any error can gum up processing.

    Concerning the other post. I'm taking a combination of CDS and DMSO and mimosa pudica I think it's called. Seems to be having an effect. I'm in bed sick at the moment, so can't stay unfortunately.

    For other readers, a Hamming ECC encoder and decoder generator is at http://idoka.ru/verilog-ecc-generator/ and the MATLAB/Octave version has a presence on Github. Very useful.

    ECC is becoming important as bus speeds go up and voltages go down. Have I seen BER be a problem? Not really. But still, ECC is getting to be a thing in automotive and in servers. If you have a multi-core Forth chip actively using off-chip SDRAM, I would
    not count on 0 BER.

    25-bit words can be stored along with ECC bits in 32-bit memory. SDRAM with a 16-bit data bus is the most economical these days. Or, 11-bit words in 16-bit memory (with double-bit-error detection). The latter's decoder logic tree is not as deep, so a 22-bit cell is a good size for a Forth chip. 20-bit is also good.
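    Those widths follow from standard Hamming arithmetic: a SECDED code needs r parity bits with 2^r >= k + r + 1, plus one overall parity bit. A quick Python check of the budgets quoted (illustrative only, nothing chip-specific):

```python
# Total memory width needed to store k data bits with a SECDED
# (single-error-correct, double-error-detect) Hamming code.

def secded_width(k: int) -> int:
    """Smallest word width holding k data bits plus SECDED parity."""
    r = 1
    while (1 << r) < k + r + 1:   # Hamming bound for single-error correction
        r += 1
    return k + r + 1              # data + Hamming parity + overall parity bit

# 11 data bits exactly fill a 16-bit word; 26 exactly fill 32 bits,
# so 25-bit words in 32-bit memory even leave one bit spare.
```

    So two 16-bit words protect a 22-bit cell, and a 32-bit word protects up to 26 data bits, matching the 25-bit and 11-bit figures above.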

  • From S@21:1/5 to Brad Eckert on Sat Oct 29 02:45:48 2022
    I know where you're coming from, Brad. On the retro chip, which is really an audio visual level chipset, a few MHz could do, and on more advanced forms, up to several hundred MHz would do on one level, 1-2 GHz, maybe even 5 GHz. On the MCU chip, same sort of
    thing, except lower end super cheap chips, mainly in a lower speed range. Both with in-package memory, to keep price and complexity down for the market place. If I want to go faster, MQDCA, other magnetic, and optical are coming, with whatever they have,
    and a field programmable version of those might render making my own design redundant. I'm going to practice error suppression techniques on silicon too. I suppose I'm still stuck in the old Chuck idea of ultra low memory usage, but still, keeping the
    amount of embedded RAM down is going to help with the price.

    For a more mission critical version, it's possible to add the extra bits and circuitry. It would be designed to be completely transparent to the 10 to 20+ bit models.

    So, at what bus speeds does this become important? I don't really know what to think. My long term multi-function MCU-like array was aimed at 5 GHz, to take on many tasks. The retro design was some strange little hybrid that took on portions of that.
    Maybe it will just replace it. I'm actually thinking of doing a macro circuit type and seeing if it can get to a few MHz, without a fab.

    Anyway, got to go.


    Sorry, I'm not completely with it today, sweating chips from this treatment.

    Hopefully this treatment doesn't kill me tonight. But then again! I'm just sick of it.


    On Saturday, October 29, 2022 at 10:17:19 AM UTC+10, Brad Eckert wrote:
    On Thursday, October 27, 2022 at 5:18:36 PM UTC-7, S wrote:
    On Friday, October 28, 2022 at 9:47:44 AM UTC+10, Brad Eckert wrote:
    On Sunday, October 23, 2022 at 8:17:44 AM UTC-7, Marcel Hendrix wrote:
    On Sunday, October 23, 2022 at 9:54:49 AM UTC+2, Anton Ertl wrote:
    dxforth <dxf...@gmail.com> writes:
    [..]
    A very nice listing that shows how diverse word size was in early years before it converged on power-of-two.
    I thought the original idea was to use the ECC bits of a standard
    DRAM module to get 20 bit memory. A typical Forthian idea :--)

    -marcel
    Hamming(15,11) is compatible with 16-bit and 32-bit memory chips, so a 22-bit Forth cell would make sense. So how about 11-bit bytes?

    Every process shrink makes ECC more desirable.
    ..

    ECC is becoming important as bus speeds go up and voltages go down. Have I seen BER be a problem? Not really. But still, ECC is getting to be a thing in automotive and in servers. If you have a multi-core Forth chip actively using off-chip SDRAM, I
    would not count on 0 BER.

    25-bit words can be stored along with ECC bits in 32-bit memory. SDRAM with a 16-bit data bus is the most economical these days. Or, 11-bit words in 16-bit memory (with double-bit-error detection). The latter's decoder logic tree is not as deep so a 22-bit cell is a good size for a Forth chip. 20-bit is also good.
