• Possibly OT: HDMI to VGA

    From scbs29@21:1/5 to All on Mon Aug 14 12:29:16 2023
    Hello all
    I have just had to purchase a new monitor which only has inputs for HDMI and VGA.
    I have never been really satisfied with HDMI, because this PC with 2 different monitors, together with my previous PC
    and 2 different monitors, all reproduced text of low quality even with ClearType being used. The text
    seemed slightly out of focus to me, and at times I found it somewhat difficult to read. With my previous
    monitor, using either DVI to DVI or HDMI to DVI cured this problem, with the text then being very
    crisp and clear.
    Since, as I mentioned above, my new monitor only has HDMI and VGA, can anyone advise me as to the quality
    of graphics I would expect if I use an HDMI to VGA cable?
    TIA

    Intel Core i7 10700F @ 2.90GHz (16 CPUs) Comet Lake 14nm Technology
    Memory: Kingston Fury Renegade 32768MB RAM 32.0GB Dual-Channel
    DDR4 @ 1196MHz (17-17-17-39)
    12Gb NVIDIA GeForce RTX 3060 (Gigabyte)
    Windows 10 Pro, 64-bit
    Mobo: Gigabyte Technology Co., Ltd. Model: H410M S2H V2
    GameMax GP Series 550w PSU
    DirectX 12

    --
    remove fred before emailing

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Andy Burns@21:1/5 to All on Mon Aug 14 13:38:45 2023
    scbs29 wrote:

    I have just had to purchase a new monitor which only has inputs for
    HDMI and VGA. Since I have never been really satisfied with HDMI
    because this pc and 2 different monitors together with my previous pc
    and 2 different monitors all reproduced text of low quality

    HDMI (or DVI or DP) shouldn't ever be worse than VGA; in general, most
    would consider them clearer.

    even with ClearType being used. The text to me seemed slightly out of
    focus and at times I found it somewhat difficult to read. With my
    previous monitor using either DVI to DVI or HDMI to DVI cured this
    problem with the text then being very crisp and clear.

    DVI-D and HDMI video signals are identical, so that a passive adapter
    (i.e. simple plugs/sockets/cable) is all that's needed to convert
    between them in either direction (excluding any audio signal).

    Since as I mentioned above my new monitor only has HDMI and VGA, can
    anyone advise me as to the quality of graphics I would expect if I
    use an HDMI to VGA cable ?

    I'd say it can't be better than just using the HDMI, and likely a little
    worse because you're using a DAC inside an active cable, followed by an
    ADC within the monitor.

  • From Ed Cryer@21:1/5 to Andy Burns on Mon Aug 14 15:10:41 2023
    Andy Burns wrote:

    scbs29 wrote:

    I have just had to purchase a new monitor which only has inputs for
    HDMI and VGA. Since I have never been really satisfied with HDMI
    because this pc and 2 different monitors together with my previous pc
    and 2 different monitors all reproduced text of low quality

    HDMI (or DVI or DP) shouldn't ever be worse than VGA; in general, most
    would consider them clearer.

    even with ClearType being used. The text to me seemed slightly out of
    focus and at times I found it somewhat difficult to read. With my
    previous monitor using either DVI to DVI or HDMI to DVI cured this
    problem with the text then being very crisp and clear.

    DVI-D and HDMI video signals are identical, so that a passive adapter
    (i.e. simple plugs/sockets/cable) is all that's needed to convert
    between them in either direction (excluding any audio signal).

    Since as I mentioned above my new monitor only has HDMI and VGA, can
    anyone advise me as to the quality of graphics I would expect if I
    use an HDMI to VGA cable ?

    I'd say it can't be better than just using the HDMI, and likely a little
    worse because you're using a DAC inside an active cable, followed by an
    ADC within the monitor.


    I'll definitely back that from my own experience of using HDMI on TVs.
    Even now I use Samsung, JVC & Panasonic TVs as monitors, and the text
    looks very clear and sharp on all of them.

    There's something amiss here that we've not yet fathomed.

    Ed

  • From Paul@21:1/5 to All on Mon Aug 14 10:50:38 2023
    On 8/14/2023 7:29 AM, scbs29 wrote:
    Hello all
    I have just had to purchase a new monitor which only has inputs for HDMI and VGA.
    Since I have never been really satisfied with HDMI because this pc and 2 different monitors together with my previous pc
    and 2 different monitors all reproduced text of low quality even with ClearType being used. The text
    to me seemed slightly out of focus and at times I found it somewhat difficult to read. With my previous
    monitor using either DVI to DVI or HDMI to DVI cured this problem with the text then being very
    crisp and clear.
    Since as I mentioned above my new monitor only has HDMI and VGA, can anyone advise me as to the quality
    of graphics I would expect if I use an HDMI to VGA cable ?
    TIA

    Intel Core i7 10700F @ 2.90GHz (16 CPUs) Comet Lake 14nm Technology
    Memory: Kingston Fury Renegade 32768MB RAM 32.0GB Dual-Channel
    DDR4 @ 1196MHz (17-17-17-39)
    12Gb NVIDIA GeForce RTX 3060 (Gigabyte)
    Windows 10 Pro, 64-bit
    Mobo: Gigabyte Technology Co., Ltd. Model: H410M S2H V2
    GameMax GP Series 550w PSU
    DirectX 12


    I have a few different HDMI to VGA and DP to VGA adapters here.
    These are active adapters that run off the +5V pin on the digital
    output connector. The VGA end is most likely to be missing its +5V pin
    (video cards stopped passing +5V on VGA at some point; it could be used
    to power the EDID read-only flash chip inside the monitor).

    Running HDMI to VGA seems to work fine at 1280x1024 and 1440x900. I don't
    currently have anything set up here to test higher than that. The monitors
    on the Daily Driver and Test Machine are tiny.

    The highest VGA resolution right off a video card may have been 2048x2048
    or so. Nobody ran stuff like that; there weren't really monitors for it.
    There were some over-cranked Trinitron setups, pushed past their scan-rate
    specs at 85Hz or so, and it's a wonder they didn't catch fire. Perhaps
    1920x1200 is about as cheeky as any user would have got with VGA. There
    would be too many VGA cable effects to be trying 2048x2048. It would be
    an unusable setting, even if there were a way to view it.

    I can't speak to quality reports for anything higher than my tiny monitors.

    When I bought my first HDMI to VGA adapter, I asked the clerk if there were
    any returns on these for quality reasons. His answer was no, and that
    visually "they seemed to be fine". And that's not a "wine expert" opinion;
    that's a "beer drinker" "yes, I could still read the screen" type of report.
    For the use-cases I put these various adapters to, I've not seen anything
    to refute his statement. "They seem to be fine" is how I would rate them
    to date.

    DVI-I = DVI-D + VGA signals, all on the same connector (use a DVI-I to VGA
    adapter for VGA DB15 out)

    DVI-D = just the digital signals, 165 MHz max (1650 Mbit/sec per gun).
    Dual-link is available, for 2560x1600 resolution or less, still 165 MHz
    per link.

    HDMI = 340 MHz from HDMI 1.3 onward (roughly equivalent to dual-link
    DVI-D, capable of 2560x1600 @ 60Hz or so; see the Wikipedia DVI article
    for details). Later versions of HDMI support much higher clocks than that,
    and the same goes for DisplayPort.
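    [Editor's note] The clock figures above can be sanity-checked with a rough
    pixel-clock estimate. This is a sketch, not an exact VESA timing
    calculation: it pads the active pixel rate by ~25% for blanking intervals
    (CVT-style; reduced-blanking timings would come in lower).

```python
def pixel_clock_mhz(width, height, refresh_hz=60, blanking=1.25):
    """Rough pixel clock for a video mode: active pixels per second,
    padded by ~25% for blanking intervals (CVT-style timings)."""
    return width * height * refresh_hz * blanking / 1e6

def fits_single_link_dvi(width, height, refresh_hz=60):
    """Single-link DVI-D tops out at a 165 MHz pixel clock."""
    return pixel_clock_mhz(width, height, refresh_hz) <= 165.0

print(fits_single_link_dvi(1280, 1024))   # True: ~98 MHz
print(fits_single_link_dvi(2560, 1600))   # False: ~307 MHz, needs dual-link
```

    With these assumptions, 1920x1200 @ 60Hz lands just above 165 MHz, which
    is why that mode historically needed reduced-blanking timings to fit
    single-link DVI.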

    Transmission errors on HDMI or DP (from using too long a digital cable)
    cause colored snow. There are no 2D X-Y distortion effects. It is VGA that
    has 2D ringing, ghosting, and tilted or distorted HSYNC/VSYNC pictures.
    Analog transmission (VGA) is a "quality bitch".

    In addition, if you use a 25ft VGA cable, the high frequencies roll off
    and the fonts become... "soft". The easiest way to make a fuzzy font is a
    too-long VGA cable. If you use a 25ft VGA cable with a 1024x768 native
    monitor, "all will seem fine" :-/

    This is different from using the digital scaler inside the monitor to
    adapt a 1280x1024 signal to a 1366x768 panel. The geometry of the fonts
    will then be ruined, because the scaler can only make integer-size
    decisions: if the scaling should have made a font 13.6 pixels high, the
    scaler can do 13 or 14, but nothing in between. Some of the newest scaling
    technologies (inside video cards) are adaptive and recognize content
    types, and the way the scaling is done can be modified for better results.
    The scaler inside a monitor has no hints whatsoever to work with. It's not
    going to pull a rabbit out of a hat and make shitty fonts look fine. This
    is why we run monitors native! So the scaler will NOT engage. There will
    still be a four-frame time delay through the monitor, but the scaler
    operation will be NULL. Same pixels in as out. That's the way we want it
    to run. A mistake in the mode line formulation (when speccing the native
    mode) might trigger the monitor's digital scaler to do something stupid.
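    [Editor's note] The fractional-pixel constraint above can be made
    concrete. A small illustrative sketch (the 18 px font is a hypothetical
    example, not from the post): scaling between resolutions gives each glyph
    a fractional ideal height, but the panel can only light whole pixels.

```python
def scaled_height_px(font_px, source_rows, panel_rows):
    """Ideal vs. achievable height for a font scaled between resolutions.
    The monitor's scaler can only emit whole pixels, so a fractional
    ideal height gets snapped to an integer, distorting the glyphs."""
    ideal = font_px * panel_rows / source_rows
    return ideal, round(ideal)

# A 1024-row desktop squeezed onto a 768-row panel: 0.75 scale factor.
ideal, actual = scaled_height_px(18, 1024, 768)
print(ideal, actual)   # ideally 13.5 px, but the panel must draw 13 or 14
```

    Running the panel at native resolution makes the scale factor exactly 1,
    so no rounding ever happens.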

    Paul

  • From VanguardLH@21:1/5 to scbs29@fred.talktalk.net on Mon Aug 14 11:33:49 2023
    scbs29 <scbs29@fred.talktalk.net> wrote:

    I have just had to purchase a new monitor which only has inputs for
    HDMI and VGA. Since I have never been really satisfied with HDMI
    because this pc and 2 different monitors together with my previous pc
    and 2 different monitors all reproduced text of low quality even with
    ClearType being used. The text to me seemed slightly out of focus and
    at times I found it somewhat difficult to read. With my previous
    monitor using either DVI to DVI or HDMI to DVI cured this problem
    with the text then being very crisp and clear.

    Since as I mentioned above my new monitor only has HDMI and VGA, can anyone advise me as to the quality
    of graphics I would expect if I use an HDMI to VGA cable ?
    TIA

    Intel Core i7 10700F @ 2.90GHz (16 CPUs) Comet Lake 14nm Technology
    Memory: Kingston Fury Renegade 32768MB RAM 32.0GB Dual-Channel
    DDR4 @ 1196MHz (17-17-17-39)
    12Gb NVIDIA GeForce RTX 3060 (Gigabyte)
    Windows 10 Pro, 64-bit
    Mobo: Gigabyte Technology Co., Ltd. Model: H410M S2H V2
    GameMax GP Series 550w PSU
    DirectX 12

    What is the brand and model of monitor you currently have connected to
    your PC (which is producing fuzzy text)?

    At what screen resolution are you running that monitor? Regardless of
    whether your video card can run a higher resolution, you should use the
    native resolution of the monitor. Anything else will produce side
    effects, like blurred pixels (due to interpolation) or color tinging.

    For your NVIDIA GeForce RTX 3060, I see specs listing resolutions of
    1920x1080, 2560x1440, and 3840x2160. NVIDIA states 7680x4320, but they
    refer to their RTX 3060 "family", so there may be submodels with
    differing maximum resolutions. There's no way to look up the *native*
    resolution of an unidentified monitor.

    If you buy a video card that exceeds the native resolution of your
    monitor, you got an underpowered monitor. Or, conversely, you didn't
    get a monitor to meet the full potential of your video card. No matter
    how mismatched the video card is to the monitor, always run the monitor
    at its native resolution. Any other resolution causes artifacts.

    The video card you got is capable of 8K. Did you also get an 8K monitor?
    An 8K monitor costs $3800, whereas a 4K monitor is far cheaper.

    8K: https://www.newegg.com/dell-up3218k-5-8k-uhd/p/0JC-0004-00SU5
    4K: https://www.newegg.com/p/pl?N=100160979%20601305587&Order=1

    I've never been drawn by the market glitz of 4K and 8K. No need to
    spend the money on superfluous higher resolution that I can't recognize
    as different. My old Dell S2719DM monitor (2560x1440 native) is what
    is called 3K: higher than 2K (2048x1080), but lower than 4K (3840x2160).
    My old AMD RX 580 video card can go up to a max of 7680x4320 on its DP
    port, but only up to 4096x3112 on its HDMI port. It doesn't have a VGA
    port, but if it did, VGA would be capped at 2048x1536. Although my video
    card's resolution can exceed my monitor's for both DP and HDMI connects,
    I run the video card at 2560x1440 to be at the *native* resolution of
    the monitor. Anything else causes blurriness, color tinging, or other
    artifacts. You do NOT want your video card or monitor trying to use
    interpolation on mismatched resolutions.

    ClearType cannot overcome a mismatch between the video card's output
    resolution and the monitor's native resolution.

  • From VanguardLH@21:1/5 to VanguardLH on Mon Aug 14 11:39:43 2023
    VanguardLH <V@nguard.LH> wrote:

    ... You DO NOT want your video card or monitor trying to use
    interpolation on mismatched resolutions. ...

    ClearType cannot overcome a mismatch between the video card's output
    resolution and the monitor's native resolution.

    Use a video card that can meet or exceed the native resolution of your
    monitor, but run that video card at the monitor's native resolution.

    As noted above, the DP and HDMI ports on my card have different max
    resolutions, but both exceed my monitor's, so I can use either port to run
    the video card at my monitor's native resolution. You do NOT want to go
    to a VGA port (capped at 2048x1536) unless that is the native resolution
    of your monitor, or its native resolution is lower and the VGA port can
    be set to it.

  • From ...w¡ñ§±¤ñ@21:1/5 to All on Mon Aug 14 14:05:59 2023
    scbs29 wrote on 14-Aug-23 7:29 AM:
    Hello all
    I have just had to purchase a new monitor which only has inputs for HDMI and VGA.
    Since I have never been really satisfied with HDMI because this pc and 2 different monitors together with my previous pc
    and 2 different monitors all reproduced text of low quality even with ClearType being used. The text
    to me seemed slightly out of focus and at times I found it somewhat difficult to read. With my previous
    monitor using either DVI to DVI or HDMI to DVI cured this problem with the text then being very
    crisp and clear.
    Since as I mentioned above my new monitor only has HDMI and VGA, can anyone advise me as to the quality
    of graphics I would expect if I use an HDMI to VGA cable ?
    TIA

    Intel Core i7 10700F @ 2.90GHz (16 CPUs) Comet Lake 14nm Technology
    Memory: Kingston Fury Renegade 32768MB RAM 32.0GB Dual-Channel
    DDR4 @ 1196MHz (17-17-17-39)
    12Gb NVIDIA GeForce RTX 3060 (Gigabyte)
    Windows 10 Pro, 64-bit
    Mobo: Gigabyte Technology Co., Ltd. Model: H410M S2H V2
    GameMax GP Series 550w PSU
    DirectX 12


    One piece (or more) of info is missing:
    => Your monitor's model and max resolution.

    In almost all scenarios HDMI should be superior to VGA.
    - VGA resolution is capped at 2048×1536
    - VGA is not true high definition
    - Early VGA hardware did not support 1080p or higher



    --
    ...w¡ñ§±¤ñ

  • From Paul@21:1/5 to VanguardLH on Mon Aug 14 18:49:21 2023
    On 8/14/2023 12:39 PM, VanguardLH wrote:
    VanguardLH <V@nguard.LH> wrote:

    ... You DO NOT want your video card or monitor trying to use
    interpolation on mismatched resolutions. ...

    Clear Text cannot overcome mismatched resolution from the video card to
    the monitor's native resolution.

    Use a video card that can meet or exceed the native resolution of your monitor, but run that video card at the monitor's native resolution.

    As noted above, the DP and HDMI ports on my card have different max resolutions, but both exceed my monitor, so I can use either port to run
    the video card at my monitor's native resolution. You do NOT want to go
    to a VGA port (capped at 2048x1536) unless that is the native resolution
    of your monitor, or its native resolution is lower to which you can set
    your video card's VGA port.


    It's an RTX 3060. That's fairly recent vintage stuff. Better
    than what I've got.

    https://www.techpowerup.com/gpu-specs/geforce-rtx-3060-12-gb.c3682

    Outputs:
    1x HDMI 2.1
    3x DisplayPort 1.4a

    https://web.archive.org/web/20191117065644/https://www.hdmi.org/spec/hdmi2_1

    8K @ 60p and 4K @ 120p

    That will drive "anything in the showroom, at native resolution".
    The DP is quite similar.

    Displayport 1.4a

    1.4a Year 2018 4K @ 120 Hz and 8K @ 60 Hz (with DSC)

    *******

    OK, I tested it :-)

    iCAN HDMI v1.3 to VGA + 3.5mm Audio (HY-205-V0)
    Headphone jack, TOSLink (I don't use either)
    Couldn't find a Startech (and now they'd be too expensive),
    so that was the one I grabbed at the time. Grabbed to suit
    new video card on Daily Driver machine.

    1080p @ 60FPS max.

    Monitor - Acer 1920x1080 pseudo-IPS

    Results:

    1) On VGA, there is no geometry distortion, no twitching or twinkling.
    Cranking down the brightness on the monitor, it looks pretty good.
    I took pictures, but generally, pictures are useless (aliasing
    between camera sensor pixels and Moire from screen pixels).

    2) On HDMI, picture was garishly bright. Checked monitor OSD and setting
    was as used in (1) so nothing changed.

    This means the NVIDIA control panel will need gamma and
    brightness/contrast adjustment, to make the picture as nice as the VGA
    converter's gamma and such.

    The HDMI picture was just as stable as (1). Only the color and brightness
    need to be looked into (and I'm not messing with that, because I need to
    put the normal monitor back on the machine). The machine is dual control,
    so a lot of the cabling remained undisturbed, and it's all back as it was.

    Summary: "The only thing we have to fear, is fear itself, or, tripping on a cable"

    Paul

  • From Newyana2@21:1/5 to scbs29@fred.talktalk.net on Mon Aug 14 20:42:38 2023
    "scbs29" <scbs29@fred.talktalk.net> wrote

    | The text to me seemed slightly out of focus and at times I found it
    | somewhat difficult to read. With my previous monitor, using either
    | DVI to DVI or HDMI to DVI cured this problem, with the text then
    | being very crisp and clear.

    Is it possible that the resolution is not set optimally?
    I don't know about quality differences, but I have noticed
    that the 3 connection types show up with different resolution options
    on the same computer with the same monitor. If the
    resolution is not a good fit for the monitor, you can get
    fuzziness.

  • From scbs29@21:1/5 to All on Tue Aug 15 10:22:25 2023
    Thank you for all of the advice.
    I am now using a Samsung S24C366EAU 24" widescreen monitor set at 1920x1080, which is the native resolution. Before this
    I used a Samsung monitor, but cannot remember the model, again at 1920x1080 native resolution. As I said, I experienced
    the problem with the old monitor as well as the new one.
    According to the monitor spec, the connections are:
    HDMI
    "Connects to a PC using the D-SUB cable"
    It appears that I am better off sticking with HDMI, combined with possibly playing with the ClearType settings to
    try and improve the text.

    --
    remove fred before emailing

  • From Ed Cryer@21:1/5 to All on Tue Aug 15 18:19:18 2023
    scbs29 wrote:
    Thank you for all of the advice.
    I am now using a Samsung S24C366EAU 24" widescreen monitor set at 1920x1080, which is the native resolution. Before this
    I used a Samsung monitor, but cannot remember the model, again at 1920x1080 native resolution. As I said, I experienced
    the problem with the old monitor as well as the new one.
    According to the monitor spec, the connections are:
    HDMI
    "Connects to a PC using the D-SUB cable"
    It appears that I am better off sticking with HDMI, combined with possibly playing with the ClearType settings to
    try and improve the text.


    Now that you've got HDMI working OK, try Windows display scaling at 175%,
    which is what I stay with. The text is even clearer here than at 100%.

    Ed

  • From Brian Gregory@21:1/5 to All on Tue Aug 15 21:26:18 2023
    On 15/08/2023 10:22, scbs29 wrote:
    Thank you for all of the advice.
    I am now using a Samsung S24C366EAU 24" widescreen monitor set at 1920x1080, which is the native resolution. Before this
    I used a Samsung monitor, but cannot remember the model, again at 1920x1080 native resolution. As I said, I experienced
    the problem with the old monitor as well as the new one.
    According to the monitor spec, the connections are:
    HDMI
    "Connects to a PC using the D-SUB cable"
    It appears that I am better off sticking with HDMI, combined with possibly playing with the ClearType settings to
    try and improve the text.

    Also check the settings on the monitor.
    There may be a silly default turned on, for something like "sharpness" or
    "sharpening", which is messing up your text when using HDMI.

    --
    Brian Gregory (in England).

  • From VanguardLH@21:1/5 to Paul on Wed Aug 16 11:30:22 2023
    Paul <nospam@needed.invalid> wrote:

    It's an RTX 3060. That's fairly recent vintage stuff. Better
    than what I've got.

    Yep, already noted that, and the resolutions it can support. The OP, in another subthread, said he is using a Samsung S24C366EAU. That is a 24"
    curved monitor, but with only 1920x1080 (full HD) for its native
    resolution. His video card is far more robust than is his monitor.

    HD 1280 x 720 pixels
    Full HD: 1920 x 1080 pixels <-- OP has this on 24"
    2K 2048 x 1080 pixels
    4K or Ultra HD 3840 x 2160 pixels
    8K 7680 x 4320 pixels
    10K 10240 x 4320 pixels


    A big monitor with low resolution (relative to the monitor size) means
    the pixels will be large. The monitor is okay for viewing at a distance,
    but close up the pixels will be too large for sharp text. With its curved
    surface, it's meant for immersion when watching video, but its large size
    and relatively low native resolution mean it isn't great for text.

    When visiting a brick-and-mortar computer store, you'll see all the
    monitors hooked to a video source that is constantly changing. That makes
    the monitors look better, since showing a static image, like text, would
    reveal which monitors are poor for reading text.

    I remember seeing a monitor at a great price with good specs, but it was
    fuzzy on text. They had it hooked to a common video source, so the
    screen was constantly changing. I changed the hook-up to a computer
    with a static screen showing windows, some of them with text. I kept
    adjusting the monitor's controls, figuring someone had set it up wrong.
    Then eventually I noticed the 0.41 mm dot size in the specs. Geez, way
    too big for use as a desktop monitor, where you're viewing text from
    maybe 19" away, or less. There were other folks watching me do the test,
    and when I realized the dot size was so huge, I said something like "No
    wonder it's so fuzzy. You won't ever get this monitor to look sharp for
    text." Everyone walked away from this great "deal". No one bought it
    after seeing what I found out. Of course, the store reconnected the
    crappy monitor to the video source to hide how poor this cheap monitor
    was at displaying static images or text.

    A 16:9 24" monitor is probably 20.92" wide by 11.77" tall. 1920 pixels
    across 20.92" is about 92 pixels per inch, or about 0.28 mm dot size.
    Not bad, but the bigger the monitor the fuzzier it will look at the same resolution (1920x1080). My monitor is 27" with 2560x1440 native
    resolution. It's a 16:9 ratio, so width is 23.53" giving a dot size of
    about 2560 pixels over 23.53", or about 109 pixels per inch, or 0.00919"
    per pixel, or 0.23 mm dot size. If I were to buy a new monitor now, I'd
    want 0.21 or even down to 0.19 mm dot size.

    Oops, the measurement I gave is called dot or pixel pitch: the distance
    between pixel centers. That is the maximum dot size; the dots could be
    smaller if the pixels had gaps between them. At 8 feet, a 1 mm dot size
    is okay for viewing moving images. The lower the pixel pitch, the closer
    you can sit to the monitor; the higher the pixel pitch, the farther away
    you should view it.
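    [Editor's note] The arithmetic above generalizes to any monitor. A small
    sketch (assuming square pixels) that derives pixels-per-inch and dot
    pitch from a monitor's diagonal size and native resolution, reproducing
    the 24" and 27" figures quoted:

```python
import math

def ppi_and_pitch_mm(diagonal_in, res_w, res_h):
    """Pixels per inch and dot (pixel) pitch in mm, from diagonal size
    and native resolution, assuming square pixels."""
    aspect = res_w / res_h
    height_in = diagonal_in / math.sqrt(1 + aspect ** 2)  # d^2 = w^2 + h^2
    width_in = height_in * aspect
    ppi = res_w / width_in
    return ppi, 25.4 / ppi  # 25.4 mm per inch

print(ppi_and_pitch_mm(24, 1920, 1080))   # ~92 PPI, ~0.28 mm pitch
print(ppi_and_pitch_mm(27, 2560, 1440))   # ~109 PPI, ~0.23 mm pitch
```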

    Another problem users encounter is that they buy a higher resolution
    monitor (not much different in size from their old monitor), but all the
    text gets smaller. That's because the same number of pixels is used to
    form the text characters, but at the higher resolution the pixels are
    smaller, and so are all the text characters. Users have to up the DPI
    scaling to make the text big enough to be legible. There are still some
    old, and even new, apps that are not DPI-aware, which means an element in
    which text is displayed is not resizable, so the bigger text inside the
    same-sized element ends up lopping off the top or bottom of the text, or
    truncating it.

    Higher pixel density costs more. The Samsung S24C366EAU is their
    "Essential Monitor", costing about $190 MSRP from Samsung, but down to
    $120 at other e-tailers. Not sure why, but curved screens seem to come
    with lower native resolutions, so get a flat screen. A 24" 2560x1440
    monitor, staying with the OP's choice of the Samsung brand, will cost
    another $180 to $260 more (Newegg prices,
    https://www.newegg.com/p/pl?N=101702297%20600030619%204814%20600557170%2050001077).

    As with PSUs, don't go cheap on monitors, either. Quality costs more.

  • From Paul@21:1/5 to Brian Gregory on Wed Aug 16 17:57:38 2023
    On 8/15/2023 4:26 PM, Brian Gregory wrote:
    On 15/08/2023 10:22, scbs29 wrote:
    Thank you for all of the advice.
    I am now using a Samsung S24C366EAU 24" widescreen monitor set at 1920x1080, which is the native resolution. Before this
    I used a Samsung monitor, but cannot remember the model, again at 1920x1080 native resolution. As I said, I experienced
    the problem with the old monitor as well as the new one.
    According to the monitor spec, the connections are:
    HDMI
    "Connects to a PC using the D-SUB cable"
    It appears that I am better off sticking with HDMI, combined with possibly playing with the ClearType settings to
    try and improve the text.

    Check also settings on the monitor.
    There may be a silly default on for something like "sharpness" or "sharpening" which is messing up your text when using HDMI.


    Manual

    https://downloadcenter.samsung.com/content/UM/202304/20230403035244001/S36C_WEB_EU_L25_20221227.zip

    Agreed. The unit has too many settings.
    Fiddler on the roof.

    Paul
