Second monitor resolution issues

theterminator

Wise Old Owl
In my office I connect my laptop to a 20” Acer V206HQL monitor. The connection goes through an HDMI-to-VGA converter, since the laptop has no VGA port and the monitor has no HDMI. My laptop's native resolution is 1366×768 and the monitor's is 1600×900, but on selecting either 1366×768 or 1600×900 the monitor says "Input type not supported".
When I go into the monitor's adapter settings in Windows 10 on the laptop, both the active resolution and the input signal show 1024×768.

*uploads.tapatalk-cdn.com/20210603/16126e2c2ccae72540d66843be157802.jpg

*uploads.tapatalk-cdn.com/20210603/9396f1215246550ab683a1e7bb3526a4.jpg

How can I set the monitor's resolution to 1600×900 so that I can use the bigger screen?

Any resolution higher than 1024×768 that I select from the "List all modes" option results in the "Input type not supported" message on the monitor.
 

Vyom

The Power of x480
Staff member
Admin
Have you tried lowering the refresh rate while setting the higher resolution?
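
For what it's worth, here is a minimal Python sketch (Windows-only, standard-library ctypes) of what the "List all modes" dialog and this suggestion amount to under the hood: enumerate every mode the graphics driver advertises, then dry-run 1600×900 at 60 Hz with the CDS_TEST flag so nothing actually changes on screen. The 60 Hz target is just an example value.

```python
import ctypes
from ctypes import wintypes

user32 = ctypes.windll.user32

# Subset of winuser.h constants used below.
CDS_TEST = 0x00000002
DISP_CHANGE_SUCCESSFUL = 0
DM_PELSWIDTH = 0x00080000
DM_PELSHEIGHT = 0x00100000
DM_DISPLAYFREQUENCY = 0x00400000

class DEVMODEW(ctypes.Structure):
    # Field layout follows the DEVMODEW definition in wingdi.h
    # (the printer/display union is flattened to the display variant).
    _fields_ = [
        ("dmDeviceName", wintypes.WCHAR * 32),
        ("dmSpecVersion", wintypes.WORD),
        ("dmDriverVersion", wintypes.WORD),
        ("dmSize", wintypes.WORD),
        ("dmDriverExtra", wintypes.WORD),
        ("dmFields", wintypes.DWORD),
        ("dmPositionX", ctypes.c_long),
        ("dmPositionY", ctypes.c_long),
        ("dmDisplayOrientation", wintypes.DWORD),
        ("dmDisplayFixedOutput", wintypes.DWORD),
        ("dmColor", ctypes.c_short),
        ("dmDuplex", ctypes.c_short),
        ("dmYResolution", ctypes.c_short),
        ("dmTTOption", ctypes.c_short),
        ("dmCollate", ctypes.c_short),
        ("dmFormName", wintypes.WCHAR * 32),
        ("dmLogPixels", wintypes.WORD),
        ("dmBitsPerPel", wintypes.DWORD),
        ("dmPelsWidth", wintypes.DWORD),
        ("dmPelsHeight", wintypes.DWORD),
        ("dmDisplayFlags", wintypes.DWORD),
        ("dmDisplayFrequency", wintypes.DWORD),
        ("dmICMMethod", wintypes.DWORD),
        ("dmICMIntent", wintypes.DWORD),
        ("dmMediaType", wintypes.DWORD),
        ("dmDitherType", wintypes.DWORD),
        ("dmReserved1", wintypes.DWORD),
        ("dmReserved2", wintypes.DWORD),
        ("dmPanningWidth", wintypes.DWORD),
        ("dmPanningHeight", wintypes.DWORD),
    ]

mode = DEVMODEW()
mode.dmSize = ctypes.sizeof(DEVMODEW)

# Equivalent of "List all modes": every mode the driver advertises
# for the primary display, with its refresh rate.
i = 0
while user32.EnumDisplaySettingsW(None, i, ctypes.byref(mode)):
    print(f"{mode.dmPelsWidth}x{mode.dmPelsHeight} @ {mode.dmDisplayFrequency} Hz")
    i += 1

# Dry-run the target mode: with CDS_TEST the driver only reports
# whether it would accept the mode, nothing changes on screen.
mode.dmPelsWidth, mode.dmPelsHeight, mode.dmDisplayFrequency = 1600, 900, 60
mode.dmFields = DM_PELSWIDTH | DM_PELSHEIGHT | DM_DISPLAYFREQUENCY
result = user32.ChangeDisplaySettingsW(ctypes.byref(mode), CDS_TEST)
print("driver accepts 1600x900 @ 60 Hz" if result == DISP_CHANGE_SUCCESSFUL
      else f"mode rejected (return code {result})")
```

Note this only tests what the graphics driver will emit; the monitor can still reject the analog signal the converter produces, which is exactly the symptom here.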
 

RumbaMon19

Feel Pain.
HDMI is digital, while VGA is analog, so an HDMI-to-VGA converter has to convert the digital signal to an analog one. Over plain HDMI, the available resolutions depend on what the display reports, and only those options are offered; here, though, they depend on the converter, and most converters only support standard resolutions such as 1080p, 1440p, or 2160p. In your case, when the graphics card is told to raise the resolution it tries to, but the signal it generates is not supported by the monitor. So either the monitor does not support the resolution (most VGA monitors top out at 1080p) or the refresh rate is too high for it. These converters usually don't support changing refresh rates (that is what happens in my case), so you need to scale instead. The resolutions you are setting, like 1600×900 and 1366×768, are not that standard.
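
To put rough numbers on that: the analog pixel clock the converter's DAC has to generate scales with the total pixels per frame and the refresh rate. A quick Python estimate follows; the ~25% blanking overhead is a typical assumption, not an exact VESA timing (the real standard 1024×768 @ 60 Hz mode, for comparison, uses a 65 MHz pixel clock).

```python
# Back-of-envelope pixel clock estimate: active pixels plus a typical
# ~25% blanking overhead, times the refresh rate. Real VESA modelines
# differ slightly per mode; this only shows the trend.
BLANKING_OVERHEAD = 1.25  # assumption, not a VESA constant

def approx_pixel_clock_mhz(width, height, refresh_hz):
    """Rough pixel clock a mode needs, in MHz."""
    return width * height * refresh_hz * BLANKING_OVERHEAD / 1e6

for w, h in [(1024, 768), (1366, 768), (1600, 900)]:
    print(f"{w}x{h} @ 60 Hz needs roughly {approx_pixel_clock_mhz(w, h, 60):.0f} MHz")
```

So 1600×900 at 60 Hz needs on the order of 100 MHz of analog bandwidth, versus 65 MHz for 1024×768, which is where a cheap dongle can run out of headroom.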
 
OP

theterminator

Wise Old Owl

The monitor's native resolution is 1600×900, and my office computer is set to that. I want the same when I connect my laptop to it, but it changes to 1024×768. Even at 1366×768 it says not supported.
 