So, I've basically forgotten a lot from my university days to do with digital electronics, and want to do some things a bit complex on the proposed processor design. So, are there any resources out there, a useful simplified guide, for doing a simple 1000 transistor-plus core, and designs for memory, ROM and storage memory, on the same process? I am only looking at this because I learnt the basics of digital electronic circuit design, and it shouldn't be any more difficult, with the right software.
I want to explore crossing paths to reuse transistors, with the path depending on selection, maybe by source and destination to establish the path, and some other tricks to inactivate alternative paths. I know this is a path to possible problems, especially with age or environmental deterioration. This is for a compacted design. I'm also interested in progressively waking and turning off the circuit (or at least sleeping it) as the signal moves through it, for energy.
On Thursday, July 7, 2022 at 9:58:39 AM UTC-4, Wayne morellini wrote:
<SNIP>
This sounds a bit like async logic design. I don't know about the "crossing paths"; that sounds like something that is extremely hard to implement, coming under the heading of extreme optimization. As you may know, optimization is the enemy of flexibility. Optimize in this manner and find you have a simple change to make to fix an error, and you have to do the optimization all over again.
However, if you have nothing but time on your hands, there's no reason to not give it a go. I suggest working with some very simple design first, to solidify your concepts. They certainly can use some degree of solidification.
One power optimization that many people don't understand is to put enabled registers at the input of each section of logic, rather than at the output of each section. Power is dissipated by the logic elements changing. If a logic section's output is not going to be used, it is the inputs that need to remain stable to prevent power waste in pointless calculations. So enabled registers at the input of each logic section saves that wasted power.
These days, it's not so important to make tiny processors into even smaller processors. Even if you want to put 1000s of them on a single chip, it is better to design for short term goals, than to try to bite off the whole cow at once.
Just some friendly advice. If you follow it, you might just start to get somewhere rather than always talking about a new direction you wish to go.
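[Editor's note] The input-register gating point above can be illustrated with a toy simulation. This is plain Python, not an HDL, and the logic function, enable pattern, and toggle-counting metric are all invented for illustration; dynamic power roughly tracks node toggles, so we just count how often the section's output changes.

```python
import random

def logic_section(a, b, c):
    """Some combinational function; a new evaluation with changed inputs means toggling."""
    return (a & b) ^ c

def run(cycles, enable_pattern, inputs, gate_inputs):
    """Count output toggles with (gate_inputs=True) or without an enabled input register."""
    toggles = 0
    held = inputs[0]                        # value the logic section actually sees
    prev_out = logic_section(*held)
    for i in range(cycles):
        if gate_inputs:
            if enable_pattern[i]:           # input register loads only when enabled
                held = inputs[i]
        else:
            held = inputs[i]                # inputs ripple straight into the logic
        out = logic_section(*held)
        if out != prev_out:
            toggles += 1
        prev_out = out
    return toggles

random.seed(0)
N = 1000
ins = [(random.getrandbits(1), random.getrandbits(1), random.getrandbits(1))
       for _ in range(N)]
en = [i % 10 == 0 for i in range(N)]        # the result is only needed 10% of the time

free_running = run(N, en, ins, gate_inputs=False)
input_gated  = run(N, en, ins, gate_inputs=True)
print(free_running, input_gated)            # the gated version toggles far less
```

The gated run can only toggle on enabled cycles, so its toggle count is bounded by the number of enables, while the free-running inputs make the section recompute every cycle.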
--
Rick C.
- Get 1,000 miles of free Supercharging
- Tesla referral code - https://ts.la/richard11209
On 7/7/22 8:22 PM, Wayne morellini wrote:
> Comp.lang.forth? Are you from around here?
That depends on where "here" is.
--
http://davesrocketworks.com
David Schultz
On 7/7/22 8:58 AM, Wayne morellini wrote:
> So, any resources out there, useful simplified guide for doing a simple 1000
> transistor-plus core, and designs for memory, rom and storage memory, on the
> same process?
You will need more transistors than that. An ALU, some registers, and control logic. It adds up fast. A RAM/register cell is 4 transistors in CMOS plus the access and decode logic.
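[Editor's note] A back-of-envelope budget makes the point concrete. The per-cell transistor counts below are rough textbook figures assumed for illustration (6T static RAM bit, ~28T static CMOS full adder, ~6T transmission-gate 2:1 mux per bit), not numbers from this thread:

```python
# Rough transistor budget for a "1000-transistor" 16-bit core (assumed figures).
T_PER_SRAM_BIT  = 6    # 4T cross-coupled inverters + 2 access transistors
T_PER_ADDER_BIT = 28   # classic static CMOS full adder
T_PER_MUX2_BIT  = 6    # 2:1 mux, transmission-gate style, per bit

WIDTH = 16

alu   = WIDTH * (T_PER_ADDER_BIT + 2 * T_PER_MUX2_BIT)  # adder plus operand muxing
regs  = 4 * WIDTH * T_PER_SRAM_BIT                      # just four 16-bit registers
total = alu + regs

print("ALU:", alu)         # 16 * (28 + 12) = 640
print("Registers:", regs)  # 4 * 16 * 6 = 384
print("Total:", total)     # 1024 -- already over 1000, with no control logic at all
```

Even under these generous assumptions, an ALU and four registers alone exceed the 1000-transistor target before any decode or control logic is added, which is the "it adds up fast" point.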
My course work is now more than 20 years old but way back when the
standard text was: https://www.amazon.com/CMOS-VLSI-Design-Circuits-Perspective/dp/0321547748
Then of course there was the now wildly outdated course in computer architecture.
--
http://davesrocketworks.com
David Schultz
Are you from around here?
On Friday, July 8, 2022 at 11:28:47 AM UTC+10, David Schultz wrote:
On 7/7/22 8:22 PM, Wayne morellini wrote:
> Comp.lang.forth? Are you from around here?
That depends on where "here" is.
--
http://davesrocketworks.com
David Schultz
But maybe there is some circuit to hold the output or inputs stable, but I do want them to turn off completely with the section idea.
Does anybody know if there are restrictions on short haul video
transmission to an in-room TV via a TV digital channel, like they
allowed with analogue TV channels?
Wayne morellini <waynemo...@gmail.com> writes:
> Does anybody know if there are restrictions on short haul video
> transmission to an in-room TV via a TV digital channel, like they
> allowed with analogue TV channels?
Nobody cares about that any more. Everyone has cable and TV sets have
HDMI inputs.
On Thu, 7 Jul 2022 18:00:34 -0700 (PDT)
Wayne morellini <waynemo...@gmail.com> wrote:
> But maybe there is some circuit to hold the output or inputs stable,
> but I do want them to turn off completely with the section idea.
Transparent latch? (Not available in most FPGAs)

Thanks Jan.
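[Editor's note] The transparent (level-sensitive) latch suggested above can be modelled in a few lines. This is a plain Python behavioural sketch, not any particular device or HDL: while the enable is high the latch is "transparent" (output follows input); when enable drops, it holds the last value.

```python
class TransparentLatch:
    """Behavioural model of a level-sensitive latch."""
    def __init__(self):
        self.q = 0

    def step(self, d, en):
        if en:                 # enable high: transparent, Q follows D
            self.q = d
        return self.q          # enable low: hold the stored value

latch = TransparentLatch()
trace = []
for d, en in [(1, 1), (0, 1), (1, 0), (0, 0), (1, 1)]:
    trace.append(latch.step(d, en))
print(trace)  # [1, 0, 0, 0, 1] -- holds 0 while enable is low
```

This is exactly the behaviour that lets a latch hold a logic section's inputs stable while the section is idle, at lower cost than a full edge-triggered register.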
On Friday, July 8, 2022 at 12:38:05 AM UTC+10, gnuarm.del...@gmail.com wrote:
<SNIP>
Thanks Rick. The stable outputs makes sense. A register doesn't. But maybe there is some circuit to hold the output or inputs stable, but I do want them to turn off completely with the section idea. However, there was that three state logic Chuck was using which was stable, but that would be part of a special node process, not implementable in a normal process?

I look at doing optimisation because I'm aiming at a low cost 1000+ transistor design instead of a large complex design. So, the optimisation can be done a lot quicker to a higher perfection. Competition is cost, performance, low energy and reliability. I'm worried about reliability here of an optimised design. It's something I can't prove. Products will last in the field certain times.

In this small design, there are ways to design to maximise performance or minimise energy, or design more costly for each. It's a matter of picking one of the first two for me. The smaller processor allows for faster processing times and lower costs. I am interested in using space for memory. Where GA144 uses a segmented space and lots of processors, I'm interested in a low segmented space with flatter addressing, with each segment tied to one or more processors tied to input or code work (thanks fur

I didn't go and work at ITV, because they wanted me to self fund going there, and I didn't trust my family, and that turned out to be so true. Otherwise I could have been over there designing processors too by now.

A 1000+ transistor core on its own is pretty useless, as the 16 bit+ lines are going to make it cost. So, adding relatively sized internal memory and IO bus configuration and processing makes it a marketable product. I'm looking at single core design, but also repeating the then existing core, IO pin configuration blocks and memories, to make a multiple core design simply. It's all tedious work to make things right. If you start out wrong, you can be compromised for the life of the product. So, I don'
On Thursday, July 7, 2022 at 9:00:35 PM UTC-4, Wayne morellini wrote:
<SNIP>
Instead of a large, complex design (but no more complex than other designs, so not really complex at all), you wish to create a small, complex design, that is complex in ways most people can't even imagine.
Lots of MCUs have very, very little memory, a fraction of 1 kB RAM and only a handful of kB of Flash. So you don't need to use much memory to design a useful processor. In fact, the real question comes back to application. What is the target application? That will strongly impact the memory size, since the memory has the biggest area impact on other small MCUs, and at 1000 transistors, the memory will totally dominate this design, and so, the cost.
In fact, you will see the memory size will be the only important cost controlling feature. Whether your CPU is 1,000 or 2,000 or even 4,000 transistors will barely matter at all if you have 8 kB of Flash and 1 kB of RAM.
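[Editor's note] This point is easy to check with rough numbers. The per-bit transistor counts below are assumptions for illustration (6T SRAM cell, roughly one floating-gate transistor per Flash bit), not figures from the thread:

```python
# How much of the chip is CPU vs. memory, under assumed per-bit costs?
T_PER_SRAM_BIT  = 6   # assumption: 6T SRAM cell
T_PER_FLASH_BIT = 1   # assumption: ~1 floating-gate transistor per bit

ram_t   = 1 * 1024 * 8 * T_PER_SRAM_BIT    # 1 kB RAM  -> 49,152 transistors
flash_t = 8 * 1024 * 8 * T_PER_FLASH_BIT   # 8 kB Flash -> 65,536 transistors
memory  = ram_t + flash_t                  # 114,688 transistors in memory alone

for cpu in (1_000, 2_000, 4_000):
    share = cpu / (cpu + memory)
    print(f"CPU {cpu:>5} transistors -> {share:5.1%} of the chip")
```

Under these assumptions, even a 4,000-transistor CPU is only a few percent of the total, so quadrupling the core barely moves the cost; the memory dominates.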
It's only in a multi-CPU design where the processor size would be optimized usefully.
If you add more significant memory to each node, you are back to the memory dominating chip size. If you try to share a single memory between all the nodes, you end up with a chip dominated by routing.
Why do you think no one has been able to improve on this to date? Do you think no one is smart enough?
As I've said, many, many times. Until you select a target application you won't be able to pick the optimal trade off between the many, many factors involved.
--
Rick C.
In article <9c5abe4c-d255-43bd...@googlegroups.com>,
Rick C <gnuarm.del...@gmail.com> wrote:
<SNIP>
> As I've said, many, many times. Until you select a target application,
> you won't be able to pick the optimal trade off between the many, many
> factors involved.
I'm interested to read your comments, but could you please
snip the text you respond to.
Especially because Waynotelli does not restrict line length,
so that proper quoting doesn't work, and I'm tricked into reading
his text, again.
On Friday, July 8, 2022 at 3:17:24 PM UTC-4, none albert wrote:
> I'm interested to read your comments, but could you please
> snip the text you respond to.
I'm more inclined to not reply at all. I have no idea what the guy is thinking. As soon as he starts to make a bit of sense, he goes back into wide open, doing everything while doing nothing mode. I can't find much to respond to.
On Saturday, July 9, 2022 at 6:27:38 AM UTC+10, gnuarm.del...@gmail.com wrote:
> I'm more inclined to not reply at all. I have no idea what the guy is thinking.
It's much to do with the talent of the reader. I certainly never have had 10-100x more talented people react in such ways; usually the opposite. I have, however, had more talented people say they don't come here because of this behaviour. Viva le death of Forth, guys.
One doesn't just obstinately dig a little tiny microscopic trench in obscure territory, and declare "There is no gold here!". You have to dig for it, everywhere you can, and declare what you actually can find with reason. Google groups is certainly an issue. You can't seem to get through to them, and so the mobile view continues to have no reply icon for many years, and using the desktop version just flashes and jumps around as you try to edit things. So, you have to write it outside to speed up

You could also take responsibility for reading design factors. But where is your original thought to justify commenting, and where is your responsibility for reading to understand? As I said, actual superior people don't act like I read here; they are humble and read to understand, rather than not to dismiss. As I said, most better types will not turn up because of this. I get to communicate behind the scenes. But, this is virtually the only place left to have discussion that I know of, and it's dominated
On Fri, 8 Jul 2022 10:00:52 -0700 (PDT)
Wayne morellini <waynemo...@gmail.com> wrote:
[]
> [Teacher] told us that perpetual motion machines weren't possible, giving the spinning axis motion stuff. Within 30 seconds I came up with a way it could be fine, and within 15 minutes or so, how to do it with conventional technology, and within 45 minutes, after class, I concluded it would disrupt the economy and energy sectors, and decided it wasn't a good idea. It seems related to other ones out there. I used to annotate what was wrong with
[]
Wait: you've invented a PMM and won't tell us?
--
Bah, and indeed Humbug.
impractical for very small portable devices. That's why I'm trying to come up with an HDMI attached relay alternative, but in the meantime a digital TV signal is an existing standard that can be adopted. It's all very messy, but a digital TV signal is a compromise. So, the question was, what restrictions are there?
On Sat, 9 Jul 2022 04:34:27 -0700 (PDT)
Wayne morellini <waynemo...@gmail.com> wrote:
> On Saturday, July 9, 2022 at 8:16:56 PM UTC+10, Kerr-Mudd, John wrote:
> > Wait: you've invented a PMM and won't tell us?
>
> Yes. It's a misunderstanding of science. But when you say it like that.. We even had a local guy here decades later, who lived on the same street as a friend, who was famous for inventing something, but it went nowhere, and people died. Just saw a video last night of somebody making a device from an old oil company patent, trying to prove to everybody it has no this, that or the other. I'm sitting there also going, "what about this, that or the other" way of faking it. They covered most areas. I used to examine sceptic claims for validity (often very fanciful people, looking for what they want to find to discredit, instead of what may be the case), so we don't want to reveal the complete secret until we hit 100 million subscribers. But the "we don't want to get sued for copying a 50-year-old patent" is what gets me. Unless there is a special classification of military classified patent I don't know of, these people should know that doesn't make sense, which might be a strong indicator of a fake.
>
> You get to have these holographic universe experts running around, and I point to local independent interaction of forces unless there is a method of remote local interactions, which sub-implied a way to instantly interact over distance, or speed limits (maybe implying that light is speed limited by remote interaction speed). But it is all "if"s in these hypotheses. Lost my train of thought, a new message notification turned up. I point to diffuse information at distance not interacting.
Wow.
On Friday, July 8, 2022 at 9:15:14 AM UTC-4, Wayne morellini wrote:
<SNIP>
You might be late for that product concept...
https://www.amazon.ca/Dummy-Headless-Display-Emulator-Generation/dp/B07FB4VJL9/ref=asc_df_B07FB4VJL9/?tag=googleshopc0c-20&linkCode=df0&hvadid=335118530598&hvpos=&hvnetw=g&hvrand=12878053212799315034&hvpone=&hvptwo=&hvqmt=&hvdev=c&hvdvcmdl=&hvlocint=&hvlocphy=9001065&hvtargid=pla-901877952212&th=1
> Yes. It's a misunderstanding of science.
On your part.
On 07/07/2022 14:58, Wayne morellini wrote:
> So, I've basically forgotten a lot from my university days to do with digital electronics, and want to do some things a bit complex on the proposed processor design.
Does that mean you've no experience of digital hardware design at all? You're full of ideas on advanced features to incorporate into a processor; if you've no experience, how can you evaluate what is practical or not?

No doubt you'll interpret this as negativity, but some negativity is
part of the design process, e.g. you design something and should ask yourself "why won't this work" - both for hardware and software.
On 7/9/22 6:34 AM, Wayne morellini wrote:
> Yes. It's a misunderstanding of science.
On your part.
If it were possible to create a PMM, someone would have done it long
ago. It isn't like people aren't trying in spite of the science. In
order for it to work, big chunks of science as known and applied every
day has to be wrong.
https://web.archive.org/web/20171112054010/http://www.lhup.edu:80/~dsimanek/museum/unwork.htm
--
http://davesrocketworks.com
David Schultz
But sometimes, and maybe especially here in this NG, you have to get down and apply things.
(He says, still unable to get a proper understanding of Forth; yes I've looked at 'Starting Forth', and I'm ok with simple stack stuff, but it seems one has to know a lot of 'words' before you^I can understand a program.)
On Saturday, July 9, 2022 at 4:06:58 PM UTC+10, Gerry Jackson wrote:
> Does that mean you've no experience of digital hardware design at all?
I didn't say either. You study this stuff, and forget a lot of it, but it's fairly simple concepts. Which is how people design CPUs on bread boards, with active high res video graphics even. Analogue is a lot harder, but still deceptively oversimplified, actually working according to Maxwell Equations etc, and here requires other quantum effects and material sciences at chemistry. So, I'm just looking at high level digital design (why do you think I want to use GA's Glow, or Okad, if a high school kid can

Any decent process is going to have models for all these circuit events.

> No doubt you'll interpret this as negativity, but some negativity is
> part of the design process
That is just reality. The next question is how do you make it work, otherwise you won't be very good at it. You have to weigh up everything. Why do you think I'm working from simplest down? Nobody else here is bothering to do anything.

The basic set of Forth language and misc words were worked out a long time ago (though multiplication is one thing I would like). To get extra code density or performance, you have to figure out how to design the ISA to perform this set of functions. But here, I'm looking at using a subset of those as straightforward opcodes, with a few opcodes replaced by the counter DMA system, and one of my ISA techniques. The thing about the negative people around here, when their objectives aren't sincere,
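[Editor's note] The "small set of Forth words as straight opcodes" idea can be sketched as a toy stack machine. The word set and encoding below are invented for illustration; they are not the ISA discussed in this thread:

```python
# Toy stack machine: a handful of Forth-like primitives executed as opcodes.
def run(program):
    """Interpret a flat list of words; 'lit' is followed by its literal value."""
    stack = []
    pc = 0
    while pc < len(program):
        op = program[pc]
        pc += 1
        if op == "lit":
            stack.append(program[pc]); pc += 1
        elif op == "dup":
            stack.append(stack[-1])
        elif op == "drop":
            stack.pop()
        elif op == "swap":
            stack[-2], stack[-1] = stack[-1], stack[-2]
        elif op == "+":
            b, a = stack.pop(), stack.pop(); stack.append(a + b)
        elif op == "xor":
            b, a = stack.pop(), stack.pop(); stack.append(a ^ b)
        else:
            raise ValueError(f"unknown word {op!r}")
    return stack

# Forth: 3 4 + dup +   (i.e. (3+4)*2)
result = run(["lit", 3, "lit", 4, "+", "dup", "+"])
print(result)  # [14]
```

The interesting design question, as the post says, is which subset of such words to harden into opcodes and which to synthesize from the others; this sketch just shows how small the core interpreter loop is.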
On Friday, July 8, 2022 at 10:38:24 PM UTC-4, Wayne morellini wrote:
<SNIP>
As usual, you fail to understand. That's ok. I don't really care much about what you talk about. It's really just talk as far as I can tell. The fact that you waste your time responding with such long, involved posts when it's just to defend yourself, in spite of your not caring what I say, says you do care a lot. The point being you are very easily distracted from whatever your goals are.
Stop being weird about what I post that you don't like. Ignore it. Then you will do much better here. Or waste your breath discussing what is not worth discussing and fail to get on with what ever it is you are trying to do.
What I have observed here is that the people who talk about the life or death of Forth accomplish little. Others, who just get on with it, do very well. I don't use Forth a lot. I have designed stack processors over the years and have some ideas I'd like to work on, but have many other priorities at this time. My lack of experience with Forth actually holds me back in not being able to design a good software development tool for one of the processors I want to extend.
On Saturday, July 9, 2022 at 8:00:09 AM UTC-4, Kerr-Mudd, John wrote:
> But sometimes, and maybe especially here in this NG, you have to get down and apply things.
New here, huh? This has been going on for a while in many posts, across many threads.
On Saturday, July 9, 2022 at 9:20:43 AM UTC-4, Wayne morellini wrote:
On Saturday, July 9, 2022 at 4:06:58 PM UTC+10, Gerry Jackson wrote:
On 07/07/2022 14:58, Wayne morellini wrote:
That is mostly because no one understands what you are talking about.
What I don't understand is why you spend so much time responding to posts here,
Well. It looks like another 3 to 4 hours of life wasted by people with nothing to do, but complain that things are not happening, while stopping people from doing urgent things, stopping them from doing things, as they actually are trying to do them. Which is strange.
On Saturday, July 9, 2022 at 11:19:12 AM UTC-4, Wayne morellini wrote:
> On Sunday, July 10, 2022 at 12:11:19 AM UTC+10, gnuarm.del...@gmail.com wrote:
> > That is mostly because no one understands what you are talking about.
> Be truthful. I have never known you to be intelligent, or are you pretending?
> > What I don't understand is why you spend so much time responding to posts here,
> Because of the disruptive deception trying to undermine good work.
Ah, I get it now. Paranoia. Whether or not anyone is "trying to undermine" your work, there's nothing anyone can do. These are just words thrown across the aether and have no impact unless someone reads them and takes them to heart. They can be read and ignored, or just ignored, and they have no impact.

Whatever. It's a rainy Saturday and I'm just waiting for my dinner to cook. Oops, I waited too long to hit send and the rain came roaring back, so the sat signal will be lost. This message will have to wait to be posted.
You run Arius, or are you somebody pretending to be that person? It's extremely strange behaviour for somebody in business to be doing this for years, or days.
https://www.arius.com/
On Saturday, July 9, 2022 at 11:33:53 AM UTC-4, Wayne morellini wrote:
> Well. It looks like another 3 to 4 hours of life wasted by people with nothing to do, but complain that things are not happening, while stopping people from doing urgent things, stopping them from doing things, as they actually are trying to do them. Which is strange.
No one is complaining. Just pointing out the facts. I don't especially care what you do with your life. If you are not getting anything done, that is not because of me. That is purely on you.
Yes, it is a bit odd to bother with someone who is playing around with processor designs, or more accurately, playing around with the idea of designing a processor.
Whatever. I just finished my shrimp dinner and was checking for something interesting in this group. I guess that will need to wait a while longer.
So what is your next step on the road to your stack processor? Or are you just going to harangue me as your only accomplishment today?
--
Rick C.
-+- Get 1,000 miles of free Supercharging
-+- Tesla referral code - https://ts.la/richard11209
So, I've basically forgotten a lot from my university days to do with digital electronics, and want to do some things a bit complex on the proposed processor design. So, are there any resources out there, or a useful simplified guide, for doing a simple 1000-transistor-plus core, and designs for memory, ROM and storage memory, on the same process? I am only looking at this because I learnt the basics of digital electronic circuit design, and it shouldn't be any more difficult, with the right software.
I want to explore crossing paths to reuse transistors, with the path depending on selection, maybe by source and destination to establish the path, and some other tricks to inactivate alternative paths. I know this is a path to possible problems, especially with age or environmental deterioration. This is for a compacted design. I'm also interested in progressively waking and turning off the circuit (or at least sleeping it) as the signal moves through it, for energy.
Thanks again.
Wayne.
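The "progressively waking and turning off the circuit as the signal moves through it" idea is essentially data-driven clock/power gating. A minimal behavioural sketch of the concept (plain Python, all names invented for illustration, no relation to any real design flow): each pipeline stage's enable is simply the valid flag of the stage before it, so simulated energy follows the data through the pipe.

```python
# Toy behavioural model of "wake as the signal arrives" gating.
# A stage only burns (simulated) full energy on cycles where it holds
# live data; otherwise it sleeps at a small idle cost. Illustrative only.

def run_pipeline(stages, inputs, idle_cost=0.05, active_cost=1.0):
    """stages: list of 1-arg functions; returns (outputs, energy)."""
    n = len(stages)
    valid = [False] * n          # does stage i hold live data this cycle?
    data = [None] * n
    outputs, energy = [], 0.0
    stream = list(inputs)
    # run until the work has drained out of the pipe
    for _cycle in range(len(stream) + n):
        if valid[-1]:                        # drain the last stage
            outputs.append(data[-1])
        for i in range(n - 1, 0, -1):        # shift forward, last first
            valid[i], data[i] = valid[i - 1], data[i - 1]
            if valid[i]:
                data[i] = stages[i](data[i])
        if stream:                           # feed the first stage
            valid[0], data[0] = True, stages[0](stream.pop(0))
        else:
            valid[0] = False
        # energy: active stages cost more than gated (sleeping) ones
        energy += sum(active_cost if v else idle_cost for v in valid)
    return outputs, energy

outs, e = run_pipeline([lambda x: x + 1, lambda x: x * 2], [1, 2, 3])
# outs == [4, 6, 8]; e is well below the all-stages-always-on cost
```

The point of the sketch: only the stages a datum has reached are "awake", which is the behaviour a clock-enable chain or power-gating cells would give in silicon.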
On Monday, July 11, 2022 at 6:05:45 PM UTC-4, Wayne morellini wrote:
On Monday, July 11, 2022 at 10:59:31 AM UTC+10, gnuarm.del...@gmail.com wrote:
..
Yes, it is a bit odd to bother with someone who is playing around with processor designs, or more accurately, playing around with the idea of designing a processor.
Whatever. I just finished my shrimp dinner and was checking for something interesting in this group. I guess that will need to wait a while longer.
So what is your next step on the road to your stack processor? Or are you just going to harangue me as your only accomplishment today?
--
Rick C.
Anything that is not productive, you can ignore. Why do you continue to participate in non-productive conversations??? I don't understand you at all.
-+- Get 1,000 miles of free Supercharging
As we can all see, this is deceptive, as this is the research and discussion portion of design, and I often am harangued around here, faced with strange diversionary comments and questions.
-+- Tesla referral code - https://ts.la/richard11209
I had hoped for change.
--
Rick C.
-++ Get 1,000 miles of free Supercharging
-++ Tesla referral code - https://ts.la/richard11209
On Thursday, July 7, 2022 at 11:58:39 PM UTC+10, Wayne morellini wrote:
So, I've basically forgotten a lot from my university days to do with digital electronics, and want to do some things a bit complex on the proposed processor design. So, are there any resources out there, or a useful simplified guide, for doing a simple 1000-transistor-plus core, and designs for memory, ROM and storage memory, on the same process? I am only looking at this because I learnt the basics of digital electronic circuit design, and it shouldn't be any more difficult, with the right software.
I want to explore crossing paths to reuse transistors, with the path depending on selection, maybe by source and destination to establish the path, and some other tricks to inactivate alternative paths. I know this is a path to possible problems, especially with age or environmental deterioration. This is for a compacted design. I'm also interested in progressively waking and turning off the circuit (or at least sleeping it) as the signal moves through it, for energy.
Thanks again.
Ok. There have been too many repeated attempts to interfere with the business of things, which were invalid. This will probably have to go to a closed group eventually for security, and, as the design is at risk itself from interference, potentially an open-source license that includes only limited parties, for security, which nobody that does business outside of those parties could use. I do hope to eventually displace most all FPGA that doesn't have a license, which would lower amortisation per unit.
Wayne.
On Monday, July 11, 2022 at 10:59:31 AM UTC+10, gnuarm.del...@gmail.com wrote:
..
Yes, it is a bit odd to bother with someone who is playing around with processor designs, or more accurately, playing around with the idea of designing a processor.
Whatever. I just finished my shrimp dinner and was checking for something interesting in this group. I guess that will need to wait a while longer.
So what is your next step on the road to your stack processor? Or are you just going to harangue me as your only accomplishment today?
--
Rick C.
-+- Get 1,000 miles of free Supercharging
As we can all see, this is deceptive, as this is the research and discussion portion of design, and I often am harangued around here, faced with strange diversionary comments and questions.
-+- Tesla referral code - https://ts.la/richard11209
I had hoped for change.
On Tuesday, July 12, 2022 at 9:28:22 AM UTC+10, gnuarm.del...@gmail.com wrote:
On Monday, July 11, 2022 at 6:05:45 PM UTC-4, Wayne morellini wrote:
On Monday, July 11, 2022 at 10:59:31 AM UTC+10, gnuarm.del...@gmail.com wrote:
..
Yes, it is a bit odd to bother with someone who is playing around with processor designs, or more accurately, playing around with the idea of designing a processor.
Whatever. I just finished my shrimp dinner and was checking for something interesting in this group. I guess that will need to wait a while longer.
So what is your next step on the road to your stack processor? Or are you just going to harangue me as your only accomplishment today?
--
Rick C.
Anything that is not productive, you can ignore. Why do you continue to participate in non-productive conversations??? I don't understand you at all.
-+- Get 1,000 miles of free Supercharging
As we can all see, this is deceptive, as this is the research and discussion portion of design, and I often am harangued around here, faced with strange diversionary comments and questions.
-+- Tesla referral code - https://ts.la/richard11209
I had hoped for change.
--
Rick C.
-++ Get 1,000 miles of free Supercharging
Everybody. Here is another one. Participating in, and starting, non-productive conversations, now accuses the one who has to deal with them in order to keep things on track and address falsehoods. What is evident is that the one producing such disruption should review their own usefulness.
-++ Tesla referral code - https://ts.la/richard11209
Please also note how, when shown wrong, they divert and change to something else, and when nothing noteworthy-sounding can be said, these sorts of statements are used. Also note that when their business is pointed out, suddenly they become more reserved. If the way they act and what they say is valid, then why does it matter?
Please note, this person has been wasting many hours, weeks and years, dragging themselves into such "non-productive conversations" wasting their time.
On Monday, July 11, 2022 at 8:00:26 PM UTC-4, Wayne morellini wrote:..
Everybody. Here is another one. Participating in, and starting, non-productive conversations, now accuses the one who has to deal with them in order to keep things on track and address falsehoods. What is evident is that the one producing such disruption should review their own usefulness.
In this regard, you fit in perfectly here. We used to have a Mad Max type who would argue with anyone at the drop of a hat. Then we have Juergen and Peter Forth (haven't seen much of him lately) who would burn down the woods to save the animals orwhatever analogy is appropriate for someone who creates a greater disturbance than the person he is arguing with.
Absolutely nothing I've posted
at any time deserves any response from you..
Please also note how, when shown wrong, they divert and change to something else, and when nothing noteworthy-sounding can be said, these sorts of statements are used. Also note that when their business is pointed out, suddenly they become more reserved. If the way they act and what they say is valid, then why does it matter?
Please note, this person has been wasting many hours, weeks and years, dragging themselves into such "non-productive conversations" wasting their time.
What have I said that I am "wrong" about? Why do you continue to rail about my posts when, at this point, they are pretty much all about the fact that you can't help yourself from responding?
I'm ordering you to stop responding to my posts!!!
Yes, indeed. I think you are just the right level of crazy to be in this group.
--
Rick C.
+-- Get 1,000 miles of free Supercharging
+-- Tesla referral code - https://ts.la/richard11209
Please note, this person has been wasting many hours, weeks and years, dragging themselves into such "non-productive conversations" wasting their time.
In article <4786725b-4598-49d3...@googlegroups.com>,
Wayne morellini <waynemo...@gmail.com> wrote:
Please note, this person has been wasting many hours, weeks and years, dragging themselves into such "non-productive conversations" wasting their time.
Please can you restrict your lines to 72 as per the usenet etiquette. (Undoubtedly you are asked before.)
Groetjes Albert
--
"in our communism country Viet Nam, people are forced to be
alive and in the western country like US, people are free to
die from Covid 19 lol" duc ha
albert@spe&ar&c.xs4all.nl &=n http://home.hccnet.nl/a.w.m.van.der.horst
On Tuesday, July 12, 2022 at 6:01:55 PM UTC+10, none albert wrote:
Please can you restrict your lines to 72 as per the usenet etiquette. (Undoubtedly you are asked before.)
Groetjes Albert
Never heard of this made-up verse of etiquette. But it was fewer than 72 lines, and virtually every reply to an individual assertion is fewer than 72 lines.
On Tuesday, July 12, 2022 at 6:01:55 PM UTC+10, none albert wrote:
In article <4786725b-4598-49d3...@googlegroups.com>,
Wayne morellini <waynemo...@gmail.com> wrote:
Please note, this person has been wasting many hours, weeks and years, dragging themselves into such "non-productive conversations" wasting their time.
Please can you restrict your lines to 72 as per the usenet etiquette. (Undoubtedly you are asked before.)
Groetjes Albert
Ok, you must mean 72 characters per line. I'm a Google user, so that's practically impossible. You can do
--
"in our communism country Viet Nam, people are forced to be
alive and in the western country like US, people are free to
die from Covid 19 lol" duc ha
albert@spe&ar&c.xs4all.nl &=n http://home.hccnet.nl/a.w.m.van.der.horst
what I do, and contact them about putting in options to line-break and auto-wrap posts, but I find the Google Groups people are among the ones who rarely listen, and the email people a bit.
However, once you start quoting people, the indentations are going to go over.
So, you can take responsibility, and get news readers that wrap lines. Also, explain yourself better. Look at all the time I spend on examining myself, and how rude some are who don't bother to make the effort to read the explanation built for them.
Read the following article on the basis of this etiquette, and how some of those things are not really current:
https://www.theguardian.com/technology/2001/apr/26/onlinesupplement10
In article <69d17083-30c2-4d70...@googlegroups.com>,
Wayne morellini <waynemo...@gmail.com> wrote:
On Tuesday, July 12, 2022 at 6:01:55 PM UTC+10, none albert wrote:
In article <4786725b-4598-49d3...@googlegroups.com>,
Wayne morellini <waynemo...@gmail.com> wrote:
Please note, this person has been wasting many hours, weeks and years, dragging themselves into such "non-productive conversations" wasting their time.
Please can you restrict your lines to 72 as per the usenet etiquette.
(Undoubtedly you are asked before.)
Groetjes Albert
Ok, you must mean 72 characters per line. I'm a Google user, so that's practically impossible. You can do what I do, and contact them about putting in options to line-break and auto-wrap posts, but I find the Google Groups people are among the ones who rarely listen, and the email people a bit.
Conclusion, don't use Google.
https://www.theguardian.com/technology/2001/apr/26/onlinesupplement10
This note confirms the use of 80-character lines, preferably 72.
Groetjes Albert
--
"in our communism country Viet Nam, people are forced to be
alive and in the western country like US, people are free to
die from Covid 19 lol" duc ha
albert@spe&ar&c.xs4all.nl &=n http://home.hccnet.nl/a.w.m.van.der.horst
On Tuesday, July 12, 2022 at 9:16:02 PM UTC+10, none albert wrote:
In article <69d17083-30c2-4d70...@googlegroups.com>,
Wayne morellini <waynemo...@gmail.com> wrote:
On Tuesday, July 12, 2022 at 6:01:55 PM UTC+10, none albert wrote:
In article <4786725b-4598-49d3...@googlegroups.com>,
Wayne morellini <waynemo...@gmail.com> wrote:
Please note, this person has been wasting many hours, weeks and years, dragging themselves into such "non-productive conversations" wasting their time.
Please can you restrict your lines to 72 as per the usenet etiquette.
(Undoubtedly you are asked before.)
Groetjes Albert
Ok, you must mean 72 characters per line. I'm a Google user, so that's practically impossible. You can do what I do, and contact them about putting in options to line-break and auto-wrap posts, but I find the Google Groups people are among the ones who rarely listen, and the email people a bit.
Conclusion, don't use Google.
https://www.theguardian.com/technology/2001/apr/26/onlinesupplement10
This note confirms the use of 80-character lines, preferably 72.
Conclusion, fit in with Google. My account's tied to it. You can petition them for change; I just asked for an auto line-clip feature. Or maybe you can petition the developers of what software you have to put in an auto-wrap. It's their job.
Groetjes Albert
--
"in our communism country Viet Nam, people are forced to be
alive and in the western country like US, people are free to
die from Covid 19 lol" duc ha
albert@spe&ar&c.xs4all.nl &=n http://home.hccnet.nl/a.w.m.van.der.horst
@Jan, it was already answered in the previous post. Albert had meant 72 characters per line; there is no 72-line rule.
@Jpit, yes, indeed silly. I would have presumed it was my software's fault if it didn't wrap, and not been consumed about it.
You only let things go off screen for no reason if you can't do your job, don't want to, or just want to annoy people.
Sorry, I have no control over mine or their software, but mine does its job and wraps the paragraphs.
Rather than post for the never-ending fashionable unjust whims of certain people, just post for the information and objectives of the thread. They are going to waste your time somehow, rather than get decent software which auto-wraps.
Counting 72 characters as you type is just not credible. Most people can't write properly and edit each character; then you end up with a mess. Just get decent software, or don't bother reading. It's more likely that one end is using soft returns for line breaks and hard returns for paragraph breaks, and the other is using them some other way, and it's really an incomplete compatibility issue. Anyway, if people don't truly care about the content of messages, they should retire. There are enough faux intellectuals around who always go on about trivial form and lack substantial substance. One will make one trivial and trivialising, to drag the conversation back down to their level; the other will make you substantive.
I wish I knew the Latin for this, but: read what can be read.
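For what it's worth, a post can be pre-wrapped to 72 columns before pasting it into a client that won't wrap. A minimal sketch using Python's standard textwrap module; `wrap_post` is a made-up helper name, not a real tool:

```python
# Pre-wrap a post to 72 columns, preserving blank-line paragraph breaks
# and re-prefixing "> "-quoted paragraphs. Illustrative sketch only.
import textwrap

def wrap_post(text, width=72):
    out = []
    for para in text.split("\n\n"):
        # keep a "> " quote prefix if the paragraph is quoted material
        prefix = "> " if para.lstrip().startswith(">") else ""
        # rejoin the paragraph into one logical line, stripping old prefixes
        body = " ".join(line.lstrip("> ").strip() for line in para.splitlines())
        out.append(textwrap.fill(body, width=width,
                                 initial_indent=prefix,
                                 subsequent_indent=prefix))
    return "\n\n".join(out)

wrapped = wrap_post("word " * 40)
assert all(len(line) <= 72 for line in wrapped.splitlines())
```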
On Tuesday, 12 July 2022 at 13:36:44 UTC+1, Wayne morellini wrote:
On Tuesday, July 12, 2022 at 9:16:02 PM UTC+10, none albert wrote:
In article <69d17083-30c2-4d70...@googlegroups.com>,
Wayne morellini <waynemo...@gmail.com> wrote:
On Tuesday, July 12, 2022 at 6:01:55 PM UTC+10, none albert wrote:
In article <4786725b-4598-49d3...@googlegroups.com>,
Wayne morellini <waynemo...@gmail.com> wrote:
Please note, this person has been wasting many hours, weeks and years, dragging themselves into such "non-productive conversations" wasting their time.
Please can you restrict your lines to 72 as per the usenet etiquette. (Undoubtedly you are asked before.)
Groetjes Albert
Ok, you must mean 72 characters per line. I'm a Google user, so that's practically impossible. You can do what I do, and contact them about putting in options to line-break and auto-wrap posts, but I find the Google Groups people are among the ones who rarely listen, and the email people a bit.
Conclusion, don't use Google.
https://www.theguardian.com/technology/2001/apr/26/onlinesupplement10
This note confirms the use of 80-character lines, preferably 72.
Groetjes Albert
Conclusion, fit in with Google. My account's tied to it. You can petition them for change; I just asked for an auto line-clip feature. Or maybe you can petition the developers of what software you have to put in an auto-wrap. It's their job.
--
"in our communism country Viet Nam, people are forced to be
alive and in the western country like US, people are free to
die from Covid 19 lol" duc ha
albert@spe&ar&c.xs4all.nl &=n http://home.hccnet.nl/a.w.m.van.der.horst
@Jan, it was already answered in the previous post. Albert had meant 72 characters per line; there is no 72-line rule.
@Jpit, yes, indeed silly. I would have presumed it was my software's fault if it didn't wrap, and not been consumed about it.
You only let things go off screen for no reason if you can't do your job, don't want to, or just want to annoy people.
Sorry, I have no control over mine or their software, but mine does its job and wraps the paragraphs.
Rather than post for the never-ending fashionable unjust whims of certain people, just post for the information and objectives of the thread. They are going to waste your time somehow, rather than get decent software which auto-wraps.
Counting 72 characters as you type is just not credible. Most people can't write properly and edit each character; then you end up with a mess.
AND ANOTHER SILLY POST OF YOURS.
Just get decent software, or don't bother reading. It's more likely that one end is using soft returns for line breaks and hard returns for paragraph breaks, and the other is using them some other way, and it's really an incomplete compatibility issue. Anyway, if people don't truly care about the content of messages, they should retire. There are enough faux intellectuals around who always go on about trivial form and lack substantial substance. One will make one trivial and trivialising, to drag the conversation back down to their level; the other will make you substantive.
I wish I knew the Latin for this, but: read what can be read.
AND AS ONE CAN SEE LINES TOO LONG AGAIN.
You just don't give a monkey's, so live with the reactions.
On Tuesday, 12 July 2022 at 12:16:02 UTC+1, none albert wrote:..
In article <69d17083-30c2-4d70...@googlegroups.com>,
Wayne morellini <waynemo...@gmail.com> wrote:
On Tuesday, July 12, 2022 at 6:01:55 PM UTC+10, none albert wrote:
In article <4786725b-4598-49d3...@googlegroups.com>,
Wayne morellini <waynemo...@gmail.com> wrote:
Groetjes Albert
1234567890 1234567890 1234567890 1234567890 1234567890 1234567890 1234567890 12
--
"in our communism country Viet Nam, people are forced to be
alive and in the western country like US, people are free to
die from Covid 19 lol" duc ha
albert@spe&ar&c.xs4all.nl &=n http://home.hccnet.nl/a.w.m.van.der.horst
Why change the setup I use.
This is how long about 72 characters are.
On Tuesday, July 12, 2022 at 10:41:13 PM UTC+10, jpit...@gmail.com wrote:
On Tuesday, 12 July 2022 at 12:16:02 UTC+1, none albert wrote:..
In article <69d17083-30c2-4d70...@googlegroups.com>,
Wayne morellini <waynemo...@gmail.com> wrote:
On Tuesday, July 12, 2022 at 6:01:55 PM UTC+10, none albert wrote:
In article <4786725b-4598-49d3...@googlegroups.com>,
Wayne morellini <waynemo...@gmail.com> wrote:
Groetjes Albert
1234567890 1234567890 1234567890 1234567890 1234567890 1234567890 1234567890 12
--
"in our communism country Viet Nam, people are forced to be
alive and in the western country like US, people are free to
die from Covid 19 lol" duc ha
albert@spe&ar&c.xs4all.nl &=n http://home.hccnet.nl/a.w.m.van.der.horst
Why change the setup I use.
That line is 79 characters long!
This is how long about 72 characters are.
On Tuesday, 12 July 2022 at 14:07:33 UTC+1, Wayne morellini wrote:
On Tuesday, July 12, 2022 at 10:41:13 PM UTC+10, jpit...@gmail.com wrote:
On Tuesday, 12 July 2022 at 12:16:02 UTC+1, none albert wrote:..
In article <69d17083-30c2-4d70...@googlegroups.com>,
Wayne morellini <waynemo...@gmail.com> wrote:
On Tuesday, July 12, 2022 at 6:01:55 PM UTC+10, none albert wrote:
In article <4786725b-4598-49d3...@googlegroups.com>,
Wayne morellini <waynemo...@gmail.com> wrote:
Groetjes Albert
1234567890 1234567890 1234567890 1234567890 1234567890 1234567890 1234567890 12
--
"in our communism country Viet Nam, people are forced to be
alive and in the western country like US, people are free to
die from Covid 19 lol" duc ha
albert@spe&ar&c.xs4all.nl &=n http://home.hccnet.nl/a.w.m.van.der.horst
Why change the setup I use.
That line is 79 characters long!
This is how long about 72 characters are.
Well, I tried to make it easy for you to get to 72 - and it worked, as I can see, so a real 72 is slightly shorter, you might be able to fathom.
What I have observed here is that the people who talk about the life or death of Forth accomplish little. Others, who just get on with it, do very well.
Well said! And InMyOpinion, the smaller the design committee, the better. (I'm not good at politics.)
What enables me is cheap FPGA eval kits and free Icarus Verilog design and simulation tools (and doc).
Trade secret1: You don't need to be a Grand Master Of Verilog to express the desired results.
Trade secret2: Test your original hardware modules with at least 1 test bench before you trust them.
These pertain to the entire design, whether CPU, (initialized)RAM, or IO modules.
I posit that since I have designed reliable SoCs on FPGA eval boards, anyone else can, with due diligence.
Due diligence starts with knowing what you want, and what the board, FPGA fabric and tools you are using provide.
We have vast opportunities.
Jimbo is not James Bond.
- Myron Plichota
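Trade secret 2 above is a general discipline rather than anything Verilog-specific. A sketch of the same idea in plain Python (so it runs without a simulator; the module and names are invented for illustration): model the hardware at bit level, then exhaustively check it against a trusted reference before using it anywhere.

```python
# A bit-level model of a 4-bit ripple-carry adder, plus its "test bench":
# every input combination is checked against Python's own arithmetic.

def full_adder(a, b, cin):
    """One-bit full adder: returns (sum_bit, carry_out)."""
    s = a ^ b ^ cin
    cout = (a & b) | (cin & (a ^ b))
    return s, cout

def ripple_add4(x, y):
    """4-bit ripple-carry adder; returns (sum, carry_out)."""
    s, c = 0, 0
    for i in range(4):
        bit, c = full_adder((x >> i) & 1, (y >> i) & 1, c)
        s |= bit << i
    return s, c

# the "test bench": exhaustive for a module this small
for x in range(16):
    for y in range(16):
        s, c = ripple_add4(x, y)
        assert (c << 4) | s == x + y, (x, y)
```

In Verilog the same discipline is a testbench module driving the DUT and comparing against a behavioural reference; the point is that the module is never trusted until a bench has exercised it.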
What I have observed here is that the people who talk about the life or death of Forth accomplish little. Others, who just get on with it, do very well.
On Thursday, July 14, 2022 at 2:17:26 AM UTC+10, Myron Plichota wrote:
What I have observed here is that the people who talk about the life or death of Forth accomplish little. Others, who just get on with it, do very well.
Well said! And InMyOpinion, the smaller the design committee, the better. (I'm not good at politics.)
What enables me is cheap FPGA eval kits and free Icarus Verilog design and simulation tools (and doc).
Trade secret1: You don't need to be a Grand Master Of Verilog to express the desired results.
Trade secret2: Test your original hardware modules with at least 1 test bench before you trust them.
These pertain to the entire design, whether CPU, (initialized)RAM, or IO modules.
I posit that since I have designed reliable SoCs on FPGA eval boards, anyone else can, with due diligence.
Due diligence starts with knowing what you want, and what the board, FPGA fabric and tools you are using provide.
We have vast opportunities.
Jimbo is not James Bond.
You realise who has done zero toward doing a public Forth processor, outside undermining and holding people back?
- Myron Plichota
That you aren't good at politics; otherwise you would recognise the proverbial Nazis in the room, who have hooked you on a leash. Seriously, who are the only people who haven't really contributed anything much?
Nobody much is interested in a virtually unsellable FPGA, where a proper Forth design would be much better. If you want FPGA, there are processors on some, and smaller FPGA CPU images to use than Forth. If you want to make an FPGA into a Forth processor, you have a very slow, expensive, power-hungry Forth processor. An FPGA is often about the functionality programmed in, requiring little in an administrative soft CPU. In most instances this CPU would sell, and an FPGA one wouldn't, because of their inefficient, sloppy style.
On Wednesday, July 13, 2022 at 6:17:26 PM UTC+2, Myron Plichota wrote:
What enables me is cheap FPGA eval kits and free Icarus Verilog design
and simulation tools (and doc).
Trade secret1: You don't need to be a Grand Master Of Verilog to express the desired results.
Trade secret2: Test your original hardware modules with at least 1 test bench before you trust them.
These pertain to the entire design, whether CPU, (initialized)RAM, or IO modules.
In your opinion: would it be worth the (i.e. my) effort to translate algorithms into an FPGA implementation? I need FP (double precision) and it would need to be faster than a current PC (e.g. very-wide instructions, and/or > 64 mini-cores, cheaper than a PC with equivalent throughput).
I have wanted to do this since 1985, and the necessity has not diminished since then.
-marcel
I doubt that any FPGA results would surpass Big Silicon FPU performance :)
In your opinion: would it be worth the (i.e. my) effort to translate algorithms into an FPGA implementation? I need FP (double precision)
and it would need to be faster than a current PC
Marcel Hendrix <m...@iae.nl> writes:
In your opinion: would it be worth the (i.e. my) effort to translate algorithms into an FPGA implementation? I need FP (double precision) and it would need to be faster than a current PC.
There are definitely applications where using an FPGA is a big win, because of the high amount of parallelism available. But you likely have to design the algorithm specifically for FPGA implementation. It may be simpler to use a GPU, if your algorithm can be arranged to make good use of one.
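One concrete example of "designing the algorithm for the hardware": a sequential dot product carries a dependency through its accumulator, while a tree reduction exposes the log-depth parallelism an FPGA or GPU can actually exploit. A toy Python sketch of the restructuring (illustrative only, not a performance claim):

```python
# Same dot product, two shapes: the loop-carried version serialises on
# the accumulator; the tree version does all multiplies independently
# and then log2(n) adder stages, which maps onto parallel hardware.

def dot_sequential(a, b):
    acc = 0.0
    for x, y in zip(a, b):      # each step waits on the previous one
        acc += x * y
    return acc

def dot_tree(a, b):
    terms = [x * y for x, y in zip(a, b)]   # all multiplies independent
    while len(terms) > 1:                   # log2(n) reduction stages
        if len(terms) % 2:
            terms.append(0.0)               # pad odd lengths
        terms = [terms[i] + terms[i + 1] for i in range(0, len(terms), 2)]
    return terms[0] if terms else 0.0

a, b = [1.0, 2.0, 3.0, 4.0], [5.0, 6.0, 7.0, 8.0]
assert dot_sequential(a, b) == dot_tree(a, b) == 70.0
```

(Note that for floating point the two orderings can differ slightly in rounding; on an FPGA that reassociation is exactly the freedom the designer has to take.)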
On Thursday, July 14, 2022 at 2:54:15 AM UTC+2, Paul Rubin wrote:
Marcel Hendrix <m...@iae.nl> writes:
In your opinion: would it be worth the (i.e. my) effort to translate algorithms into an FPGA implementation? I need FP (double precision) and it would need to be faster than a current PC.
There are definitely applications where using an FPGA is a big win, because of the high amount of parallelism available. But you likely have to design the algorithm specifically for FPGA implementation. It may be simpler to use a GPU, if your algorithm can be arranged to make good use of one.
I would need a sponsor (NVIDIA H100 GPU == $36,550).
Marcel Hendrix <m...@iae.nl> writes:
I would need a sponsor (NVIDIA H100 GPU == $36,550).
Can you use a smaller GPU? Or several? Look on tensordock.com for rentals starting at 0.32 USD/hour.
On 14/07/2022 15:52, Marcel Hendrix wrote:
On Thursday, July 14, 2022 at 2:54:15 AM UTC+2, Paul Rubin wrote:
Marcel Hendrix <m...@iae.nl> writes:
In your opinion: would it be worth the (i.e. my) effort to translate algorithms into an FPGA implementation? I need FP (double precision) and it would need to be faster than a current PC.
There are definitely applications where using an FPGA is a big win, because of the high amount of parallelism available. But you likely have to design the algorithm specifically for FPGA implementation. It may be simpler to use a GPU, if your algorithm can be arranged to make good use of one.
I would need a sponsor (NVIDIA H100 GPU == $36,550).
iForth Pty Ltd
I would need a sponsor (NVIDIA H100 GPU == $36,550).
On Wednesday, July 13, 2022 at 12:43:41 PM UTC-4, Wayne morellini wrote:
On Thursday, July 14, 2022 at 2:17:26 AM UTC+10, Myron Plichota wrote:
What I have observed here is that the people who talk about the life or death of Forth accomplish little. Others, who just get on with it, do very well.
Well said! And InMyOpinion, the smaller the design committee, the better. (I'm not good at politics.)
What enables me is cheap FPGA eval kits and free Icarus Verilog design and simulation tools (and doc).
Trade secret1: You don't need to be a Grand Master Of Verilog to express the desired results.
Trade secret2: Test your original hardware modules with at least 1 test bench before you trust them.
These pertain to the entire design, whether CPU, (initialized)RAM, or IO modules.
I posit that since I have designed reliable SoCs on FPGA eval boards, anyone else can, with due diligence.
Due diligence starts with knowing what you want, and what the board, FPGA fabric and tools you are using provide.
We have vast opportunities.
Jimbo is not James Bond.
- Myron Plichota

You realise who has done zero toward a public Forth processor, beyond undermining and holding people back? That you aren't good at politics, otherwise you would recognise the proverbial Nazis in the room, who have hooked you on a leash. Seriously, who are the only people who haven't really contributed anything much?
Nobody much is interested in a virtually unsellable FPGA, where a proper Forth design would be much better. If you want an FPGA, there are processors on some, and smaller FPGA CPU images to use than Forth. If you want to make an FPGA into a Forth processor, you have a very slow, expensive, power-hungry Forth processor. An FPGA is often about the functionality programmed in, requiring little in an administrative soft CPU. In most instances this CPU would sell, and an FPGA one wouldn't, because of their inefficient, sloppy style.

More fool me, I thought I was being 100% positive about grassroots effort in Forth CPU design.
If you (or anyone else) wish to peruse one of my entirely public (and copyright free) designs, unzip
https://drive.google.com/file/d/1cWZmDik5PlWaEd-srekTiF51chDR8b7_/view?usp=sharing
Jimbo is not James Bond.
- Myron Plichota
On Thursday, July 14, 2022 at 4:00:10 AM UTC+10, Myron Plichota wrote:
On Wednesday, July 13, 2022 at 12:43:41 PM UTC-4, Wayne morellini wrote:
On Thursday, July 14, 2022 at 2:17:26 AM UTC+10, Myron Plichota wrote:
I love you man! I agree with
your effort, but my point was that you shouldn't be positive about people so positive that they hold positive people back. We see this in most threads on the subject; it has just been killing things for decades. Imagine hundreds of billions of dollars just vanishing into the distance. That's the difference negative people have made to Forth hardware. Now it's nearly pointless. GA should make a different language and call it Glow! (Glow for schools, including some icon- and object-based programming modelling.) Just to try to stay away from this sort of negativity: people with little real forward talent, who just want to jump in on things.
Now, for FPGA. Until a programmable circuit costs close to the same as custom silicon in high volume, and performs just as well, they are two different markets: FPGA as the lower entry point, often for lower volumes, and custom for lower price at higher volumes with higher performance. FPGA is there; I'm more interested in the custom silicon. If I can get close enough on metrics with a programmable circuit device, that would be OK then. If the differences in energy, speed and price combined were 10 times or less (for example, each about 2.15 times different, since 2.15³ ≈ 10), that would be great. Which is why I'm thinking a more conventional single programmable hard gate array device (with a performance difference similar to what they achieved with the Novix) might get into that range. You could literally manufacture hundreds of millions of blanks, everybody buys a pack according to the size they need, and keeps using the same pack for projects for years. When I did my research into doing in-house fabrication years ago, I determined such a scheme was the way to go.
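The "combined" arithmetic above can be sanity-checked. Assuming the three differences (energy, speed, price) multiply into one combined factor (my assumption; the 10x budget and the three axes come from the post), a 10x combined budget allows about 2.15x per axis, while 3.33x per axis would combine to roughly 37x:

```python
# Per-axis factor allowed under a combined budget, assuming the
# energy, speed and price penalties multiply together.
combined_budget = 10.0   # the "10 times or less" figure from the post
axes = 3                 # energy, speed, price

per_axis = combined_budget ** (1 / axes)
print(round(per_axis, 2))       # 2.15x allowed on each axis
print(round(3.33 ** axes, 1))   # 36.9x: what 3.33x per axis combines to
```

If instead the penalties were meant to add, 3.33x each would indeed total about 10; the multiplicative reading is the stricter and more usual one.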
It's a shame Chuck didn't just go to MOS Technology years ago. Sure, Jack would probably have tried to give him a pretty resistible deal, but they could have afforded to upgrade the plant to smaller-node processes, manufacture cheaply, and sell a stack of chips, at 10 MHz maybe, on the old lines, while keeping up to a 3x clock advantage on a new process. The whole bottom end of the industry could still be dominated by its superior efficiency.
When we look at what can be done cheaper, it's pretty poor. As a matter of fact, those 4-bit Chinese processor factories could be used to make an ultra-low-spec MISC as part of the product line. They could literally make smartwatch MCU lines cheaper than Wear OS devices. People lack inventive, practical imaginations. Such a design would be suitable for wireless autonomous interactive smart tags, starting at around a cent or something. Program and install, and you are off. An advanced tag is an Access or Info Panel, even a $1000 one, controlled by the same sort of technology; military, for a $10,000 all-endurance panel.
The truth of the matter: you ask some people, and you won't make a single dollar, and will even lose millions. They will say this won't succeed, that won't succeed, all in their superiority, which has no evidence. We have all seen tech companies fall over left, right and centre due to incompetence over the decades. You sit back and ask for years why they are doing this or that, and they fall over, unable to see how wrong those decisions were, and failing to select the right decisions that made other companies succeed. I've done enough with companies to know that even when you hand them things on a platter, they still manage to stuff things up. You can't even help them for their own good.
It may be simpler to use a GPU, if your algorithm can be arranged to make good use of one.
I would need a sponsor (NVIDIA H100 GPU == $36,550).
On Thursday, 14 July 2022 at 08:23:54 UTC+1, Wayne morellini wrote:
On Thursday, July 14, 2022 at 4:00:10 AM UTC+10, Myron Plichota wrote:
On Wednesday, July 13, 2022 at 12:43:41 PM UTC-4, Wayne morellini wrote:
On Thursday, July 14, 2022 at 2:17:26 AM UTC+10, Myron Plichota wrote:
Can you please reset your mental counter from 25 to 75 characters per line? Lines will then be 3x longer than yours, and normal people can more easily read them.
Thank you.
Or are you doing this on purpose to annoy us?
Marcel Hendrix <m...@iae.nl> writes:
It may be simpler to use a GPU, if your algorithm can be arranged to make good use of one.
I would need a sponsor (NVIDIA H100 GPU == $36,550).
You made me curious... is it an April Fool's joke?
https://wccftech.com/nvidia-h100-hopper-gpu-monster-graphics-card-with-100-billion-transistors-across-2-dies-43008-cuda-cores-and-48-gb-hbm4-memory/
On Thursday, July 14, 2022 at 6:04:58 PM UTC+10, jpit...@gmail.com wrote:
On Thursday, 14 July 2022 at 08:23:54 UTC+1, Wayne morellini wrote:
On Thursday, July 14, 2022 at 4:00:10 AM UTC+10, Myron Plichota wrote:
On Wednesday, July 13, 2022 at 12:43:41 PM UTC-4, Wayne morellini wrote:
On Thursday, July 14, 2022 at 2:17:26 AM UTC+10, Myron Plichota wrote:
Can you please reset your mental counter from 25 to 75 characters per line? Lines will then be 3x longer, and normal people can more easily read them.
Thank you.
Or are you doing this on purpose to annoy us?
No, it fits the screen. It is easy to cut. It makes it easier to read.
On Thursday, 14 July 2022 at 17:04:10 UTC+1, Wayne morellini wrote:
On Thursday, July 14, 2022 at 6:04:58 PM UTC+10, jpit...@gmail.com wrote:
On Thursday, 14 July 2022 at 08:23:54 UTC+1, Wayne morellini wrote:
On Thursday, July 14, 2022 at 4:00:10 AM UTC+10, Myron Plichota wrote:
On Wednesday, July 13, 2022 at 12:43:41 PM UTC-4, Wayne morellini wrote:
On Thursday, July 14, 2022 at 2:17:26 AM UTC+10, Myron Plichota wrote:
Can you please reset your mental counter from 25 to 75 characters per line? Lines will then be 3x longer, and normal people can more easily read them.
Thank you.
Or are you doing this on purpose to annoy us?
No, it fits the screen. It is easy to cut. It makes it easier to read.
So you are saying that you already have a very narrow screen for an even narrower Forth chip? Can we still take you seriously?
So, I've basically forgotten a lot from my university days to do with digital electronics, and want to do some things a bit complex on the proposed processor design. So, are there any resources out there, a useful simplified guide, for doing a simple 1000-transistor-plus core, and designs for memory, ROM and storage memory, on the same process? I am only looking at this because I learnt the basics of digital electronic circuit design, and it shouldn't be any more difficult with the right software.
I want to explore crossing paths to reuse transistors, with the path depending on selection, maybe by source and destination to establish the path, and some other tricks to inactivate alternative paths. I know this is a path to possible problems, especially with age or environmental deterioration. This is for a compacted design. I'm also interested in progressively waking and turning off the circuit (or at least putting it to sleep) as the signal moves through it, for energy.
Thanks again.
Wayne.
On Thursday, July 7, 2022 at 11:58:39 PM UTC+10, Wayne morellini wrote:
Syncing forth processor project threads.
On Sunday, September 4, 2022 at 3:06:16 PM UTC+10, Wayne morellini wrote:
Syncing forth processor project threads.

Forth processor project:
Is it time for another Forth chip?
https://groups.google.com/u/2/g/comp.lang.forth/c/6adve-Z1ppU
Designing a Forth Processor?
https://groups.google.com/u/2/g/comp.lang.forth/c/9lpG9yey_NQ
A low cost chip prototyping technique.
https://groups.google.com/u/2/g/comp.lang.forth/c/s27tSebmF-I
Comments: ColorForth binary in JavaScript!
https://groups.google.com/u/2/g/comp.lang.forth/c/3py7TwKu6b0
Looking for some advice on Offete p8, p16, p24, p32, p64. Ep16, ep24, ep32, and others.
https://groups.google.com/u/2/g/comp.lang.forth/c/EMgCYdV8NR8