• More precision of my philosophy about the weakness of Generative Pre-trained Transformer

    From Amine Moulay Ramdane@21:1/5 to All on Sun Jan 1 12:30:54 2023
    Hello,


    More precision of my philosophy about the weakness of Generative Pre-trained Transformer and more of my thoughts..

    I am a white Arab from Morocco, and I think I am smart, since I have also invented many scalable algorithms and other algorithms..


    I think I am highly smart, since I have passed two certified IQ tests and I have scored "above" 115 IQ, and I mean that it is "above" 115 IQ,
    so I think I am discovering, with my fluid intelligence, the pattern that explains the weakness of a Generative Pre-trained Transformer like ChatGPT: ChatGPT can discover patterns by using
    the existing patterns in its training data or knowledge, so it is like using
    the smartness of the data, but ChatGPT can not use the smartness of the human brain, which also comes with human consciousness that brings more smartness, so it can not invent highly smart patterns or things the way a highly smart human does
    from his brain. So I think that ChatGPT will still not be capable of this kind of highly smart creativity, but it remains really powerful and really useful, so I invite you to read my following previous thoughts that make you understand my views:


    More precision of my philosophy about the mechanisms of attention and self-attention of Transformers AI models and more of my thoughts..


    I think I am highly smart, since I have passed two certified IQ tests and I have scored "above" 115 IQ, and I mean that it is "above" 115 IQ. I think I understand deep learning, and I say that Transformers are deep learning + self-attention and
    attention, and this attention and self-attention permit the model to grasp "context" and "antecedents", for example when you say the following sentence:


    "The animal didn't cross the street because it was too tired"


    So we can ask how the artificial intelligence of ChatGPT, which uses a
    Generative Pre-trained Transformer, will understand that the "it" in
    the above sentence refers not to the street but to the animal. I say that
    it is with the self-attention and attention mechanisms, and with training on more and more data, that the transformer can "detect" the pattern that the "it" refers to the "animal" in the above sentence. So the self-attention and
    attention of the artificial intelligence of ChatGPT, that we call Generative Pre-trained Transformer, permit it to grasp "context" and "antecedents" too; it is like logically inferring the patterns, using self-attention and attention, from the context of
    the many many sentences in the data. And since the data is growing exponentially, and since the artificial intelligence is also generative, I think it will make the artificial intelligence of the transformer much more powerful. So as you
    notice, the data is King, and the "Generative" word of Generative Pre-trained Transformer refers to the model's ability to generate text, and of course we are now noticing that this makes ChatGPT really useful and powerful, and of course I say
    that ChatGPT will still improve much more. And read my following previous thoughts so that you understand my views about it:
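
    To make the self-attention idea above concrete, here is a minimal
    sketch of single-head scaled dot-product attention in Python/NumPy.
    The embeddings and projection matrices are random toy stand-ins for
    learned weights, so the attention pattern here is not meaningful; in a
    trained model, the row of weights for "it" would concentrate on
    "animal":

```python
import numpy as np

# Minimal single-head self-attention sketch. The weights below are
# random toy stand-ins for learned parameters, not a trained model.
rng = np.random.default_rng(0)

tokens = ["The", "animal", "didn't", "cross", "the", "street",
          "because", "it", "was", "too", "tired"]
d_model = d_k = 16

X = rng.normal(size=(len(tokens), d_model))   # toy token embeddings
W_q = rng.normal(size=(d_model, d_k))         # query projection
W_k = rng.normal(size=(d_model, d_k))         # key projection
W_v = rng.normal(size=(d_model, d_k))         # value projection

Q, K, V = X @ W_q, X @ W_k, X @ W_v

# Scaled dot-product attention: every token scores every other token...
scores = Q @ K.T / np.sqrt(d_k)
# ...and a softmax turns each row of scores into a distribution.
weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
weights = weights / weights.sum(axis=-1, keepdims=True)
output = weights @ V

print(output.shape)  # (11, 16): one context-mixed vector per token
```

    Each output row is a weighted mixture of all the value vectors, which
    is exactly how a token like "it" can pull in information from
    "animal" elsewhere in the sentence.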


    More of my philosophy about transformers and about the next GPT-4 and about ChatGPT and more of my thoughts..



    The capabilities of transformer architectures, as in the GPT of ChatGPT, which stands for Generative Pre-trained Transformer, are truly remarkable, as they allow machine learning models to rival, and on some benchmarks even surpass, human reading comprehension.
    These models are trained on massive amounts of text data, including entire corpora such as the English Wikipedia or large portions of the internet, which enables them to become highly advanced language models (LMs) with a deep understanding of language and the
    ability to perform complex predictive analytics based on text analysis. The result is a model that is able to approximate human-level text cognition, or reading, to an exceptional degree - not just simple comprehension, but also the ability to make
    sophisticated connections and interpretations about the text, because the Transformer network pays "attention" to multiple sentences, enabling it to grasp "context" and "antecedents". These transformer models represent a significant advancement
    in the field of natural language processing and have the potential to revolutionize how we interact with and understand language.
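
    As a toy illustration of the "generative" side described above -
    predicting the next token from what came before - here is a tiny
    bigram model with greedy decoding. The corpus and the counting table
    are my own made-up stand-ins for a trained Transformer's learned
    next-token distribution:

```python
from collections import Counter, defaultdict

# Toy "training data"; a real model sees billions of sentences.
corpus = ("the animal crossed the street because "
          "the animal was tired").split()

# Count which word follows each word: a crude stand-in for a
# Transformer's learned next-token distribution.
bigrams = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    bigrams[prev][nxt] += 1

def generate(start, n_steps):
    """Greedy autoregressive decoding: always pick the likeliest next word."""
    out = [start]
    for _ in range(n_steps):
        candidates = bigrams[out[-1]]
        if not candidates:
            break  # no continuation seen in the toy corpus
        out.append(candidates.most_common(1)[0][0])
    return " ".join(out)

print(generate("the", 4))
```

    A real Transformer replaces the bigram table with attention over the
    whole context, and usually samples from the distribution instead of
    always taking the top word, but the loop - predict, append, repeat -
    is the same.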

    GPT-4 is expected to be significantly larger and more powerful than GPT-3; a widely circulated but unconfirmed rumor puts it at 170 trillion parameters, compared to GPT-3's 175 billion parameters (and even GPT-3.5 of the new ChatGPT has 175 billion parameters). This would allow GPT-4 to process and generate text with
    greater accuracy and fluency, so with feedback from users, a more powerful GPT-4 model coming up, and training on a substantially larger amount of data, the ChatGPT that will use GPT-4 may "significantly" improve in the future. So I think ChatGPT
    will still become much more powerful. And I invite you to read my previous thoughts about my experience with the new ChatGPT:


    More of my philosophy about my experience with ChatGPT and about artificial intelligence and more of my thoughts..


    I think I am highly smart, since I have passed two certified IQ tests and I have scored "above" 115 IQ, and I mean that it is "above" 115 IQ.
    In these last two days I have tested ChatGPT to see whether
    this new artificial intelligence, launched by OpenAI in November 2022, is efficient, and I think that it is really useful: by testing it, I think that it can score well relative to average human smartness, but if you want it to be highly smart by inventing
    highly smart things, it will not be able to do it; however, if you want ChatGPT to be highly smart on what it has learned from the existing smartness of the human knowledge it has been trained on, I think it can also score high much of the time, and
    ChatGPT can often make far fewer errors than humans. So I think that ChatGPT is really useful, and I think that ChatGPT will be improved much more by increasing the size of
    its transformer (a transformer is a deep learning model that adopts the mechanism of self-attention), and I also think that ChatGPT will be
    improved much more when it is trained on a substantially larger amount of data, considering an article that DeepMind has published demonstrating that the performance of these models can be drastically improved by scaling data more
    aggressively than parameters ( Read it here: https://arxiv.org/pdf/2203.15556.pdf ), and that is
    why I am optimistic about the performance of ChatGPT and I think that it will be much more improved.
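
    The finding in that DeepMind paper (often called "Chinchilla") can be
    turned into back-of-the-envelope arithmetic. The C ~ 6*N*D training
    FLOPs estimate and the roughly 20-tokens-per-parameter ratio below
    are the commonly cited approximations of the paper's result, not
    exact numbers taken from it:

```python
def compute_optimal(flops_budget, tokens_per_param=20.0):
    """Split a compute budget into model size N and training tokens D.

    Uses the approximations C ~ 6 * N * D (training FLOPs) and
    D ~ 20 * N (compute-optimal data-to-parameter ratio), which give
    N ~ sqrt(C / (6 * 20)).
    """
    n_params = (flops_budget / (6.0 * tokens_per_param)) ** 0.5
    n_tokens = tokens_per_param * n_params
    return n_params, n_tokens

# Roughly the Chinchilla-scale budget (~5.76e23 FLOPs) gives about
# 70 billion parameters and about 1.4 trillion tokens - close to the
# 70B-parameter, 1.4T-token model the paper actually trained.
n, d = compute_optimal(5.76e23)
print(f"params ~ {n:.2e}, tokens ~ {d:.2e}")
```

    The point of the arithmetic is the one made above: for a fixed
    compute budget, growing the data along with (or faster than) the
    parameter count pays off more than growing parameters alone.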


    Thank you,
    Amine Moulay Ramdane.

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From V@21:1/5 to All on Thu Mar 30 08:30:08 2023
    You talk too much......
    Try to be more introvertive.

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)