EzDevInfo.com

VGA interview questions

Top VGA frequently asked interview questions

Dual-monitors with one VGA port?

Would it be possible to use some sort of splitter to make Windows display to the other monitor as if I had two ports (both outputs unique, not cloned from the primary)?


Source: (StackOverflow)

HDMI vs Component vs VGA vs DVI vs DisplayPort

What are the pros and cons of each of these different display adapters and cables?

From what I can understand, HDMI offers the ability to send audio along the same cable as well as the ability to do progressive scan.

I've Googled but I can't seem to find any real answers. Why would someone care to run 1280x1024 over HDMI or DVI instead of VGA? What about component?

All I hear is that one is digital and one is analog, but I can't find what that means from a feature/benefit standpoint.


Source: (StackOverflow)

Maximum resolution through VGA/DVI/HDMI(/etc)?

I know the original VGA standard was meant to output 640x480, and that later standards over the same VGA connector were developed to output higher resolutions (SVGA, XGA, etc.). But I was wondering whether there's a specific limit to the resolution the VGA connector can carry.

Furthermore, are DVI and HDMI, for example, limited in resolution, and if so, how?
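
For comparison, digital links are usually specified by a maximum pixel clock rather than a maximum resolution: single-link DVI, for instance, is capped at a 165 MHz pixel clock, so whether a mode fits depends on total (active plus blanking) pixels per frame times the refresh rate. A rough sketch in Python (the blanking figures here are approximations I've assumed, loosely based on CVT reduced blanking, not exact timings):

```python
# Back-of-the-envelope pixel-clock estimate: (active + blanking) pixels per
# line, times (active + blanking) lines per frame, times refresh rate.
# The default blanking overheads are rough assumptions, not exact timings.
def pixel_clock_mhz(h_active, v_active, refresh_hz, h_blank=160, v_blank=35):
    h_total = h_active + h_blank
    v_total = v_active + v_blank
    return h_total * v_total * refresh_hz / 1e6

for w, h, hz in [(1920, 1200, 60), (2560, 1600, 60)]:
    print(f"{w}x{h}@{hz}: ~{pixel_clock_mhz(w, h, hz):.0f} MHz")
```

Under these assumptions 1920x1200@60 lands near 154 MHz, inside the single-link DVI cap, while 2560x1600@60 needs roughly 267 MHz and therefore a dual-link DVI connection.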


Source: (StackOverflow)

How to Use 3 Monitors

Right now my setup has a nice big 24" flatscreen in the center with a 19" flatscreen to the left. But I have a big gaping hole on the right.

I have a third monitor to put there, but I'm not sure how to get the computer to recognize it. Do I need a graphics card with 3 ports? Can I span the monitors over non-SLI-linked graphics cards? Is it possible to plug my third monitor into the on-board VGA port and have it work?


Source: (StackOverflow)

Is VGA port hot-pluggable?

In meetings, I often see people detaching the VGA connector from one running laptop and connecting it to another, while the projector is still on.

Is this 100% risk free, and OK by design of the VGA standard? If there's a risk involved in hot-plugging VGA, can it be removed by turning off or suspending either laptop, display, or both?

I see this being done all the time without causing disaster, so I'm clearly not interested in answers stating "we do it all the time, so it should be OK!".

I want to know if there's a risk - real or in theory - that something breaks when doing this.

EDIT: I did an internet search on the topic, and I never found a clear statement as to why it is safe or unsafe to hot-swap VGA devices. The typical result is a forum thread asking basically the same question as mine, followed by these types of statements:

  • Yes it's hot swappable! I do it all the time!
  • It involves some kind of risk, so don't do it!
  • You're some kind of moron if you think there's a risk, so just do it!

But no explanation as to why it is or isn't safe...

Joe Taylor's answer below contains a link to a forum post whose answers basically give me the same statements as those above. But again, no good explanation of why.

So I looked for an actual manual for a projector, and found "Lenovo C500 Projector User’s Guide". It states on page 3-1:

Connecting devices

Computers and video devices can be connected to the projector at the same time. Check the user’s manual of the connecting device to confirm that it has the appropriate output connector.

[image]

Attention: As a safety precaution, disconnect all power to the projector and devices before making connections.

But again, no good explanation.


Source: (StackOverflow)

How can I connect a DVI monitor to the VGA port in my laptop?

Is there a device I can buy that will convert the analog signal to digital?


Source: (StackOverflow)

Monitor (DVI-D) to Laptop (VGA)

I am simply trying to use my DVI monitor with my laptop, which only accepts VGA. I've been researching adapters from DVI female to VGA male, and it seems that not only do I need an adapter, but I also need a converter, which costs around 100 dollars.

Is there another alternative? I have an LG monitor that uses DVI-D and currently has a DVI cable attached. Is it possible for me to find and use a cable that is DVI to VGA that would work with my laptop?

I have also read somewhere that sometimes the video card does the converting so people only need the adapters. I am currently using an NVIDIA Geforce GT 540M, so would just purchasing the adapter and not a converter be okay for my purpose?


Source: (StackOverflow)

What is the pixel clock setting on my monitor actually doing?

I am experiencing display interference on a Dell 24" flat-panel monitor. I find that if I adjust the pixel clock setting up or down in the monitor's on-screen menus, the interference goes away for a while.

The monitor is attached to a MacBook Pro using a Mini DisplayPort-to-VGA adapter. I have found that in a different house I get the interference problem less often, so it might be related to the electricity supply or possibly even Ethernet powerline (total guess).

What does the pixel clock setting actually do, and does this behaviour point to a likely cause of the interference?
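
For what it's worth, the pixel clock setting tells the monitor's analog-to-digital converter how many samples to take across each scan line of the incoming VGA waveform. If that rate is slightly off from the pixel clock the laptop is actually generating, the sample points drift across pixel boundaries and alias against the image, which can look like moving bands or shimmer. A toy calculation of that drift (the 2080-pixel line width and the 50 ppm error are made-up illustrative numbers, not real timings):

```python
# Toy model: how fast a small pixel-clock mismatch walks the ADC's sample
# points across pixel boundaries. All numbers are illustrative.
h_total = 2080            # assumed total pixels per scan line, incl. blanking
error_ppm = 50            # hypothetical 50 ppm clock mismatch

drift_per_line = h_total * error_ppm / 1e6   # pixels of drift per scan line
lines_per_slip = 1 / drift_per_line          # lines until sampling slips a full pixel
print(f"drift: {drift_per_line:.3f} px/line, "
      f"slips one pixel every {lines_per_slip:.0f} lines")
```

Even a tiny mismatch slips a full pixel within a handful of scan lines, which is why nudging the setting up or down can visibly change the artifact.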


Source: (StackOverflow)

Installing new OS without screen?

I got an old laptop which works perfectly except for a broken screen. I would like to install Linux on it and use it as a home server. The problem is that the VGA output works only after it boots into the OS, so I cannot access the BIOS to change the boot sequence.

Any ideas?

Thanks in advance


Source: (StackOverflow)

3 displays with Dell Laptop (2 monitors + laptop display)

I have a Dell Latitude 8420. It is connected to a dock.

Is there any way to have two monitors connected to this setup and also use the laptop screen too? I want the desktop extended on the 3 displays. I am using Windows 7 Professional.

How would things change if I wanted to have 3 external monitors and not use the laptop display?


Source: (StackOverflow)

How to force Fedora 12 to start to text mode?

I have a problem with my newly installed ATI driver, so my Fedora 12 boots into a frozen graphical display.

I need to force it to boot into a text-mode interface so I can start adjusting my X configuration. What should I do?

This is a new Fedora 12 installation without any change to GRUB etc.

Thanks!
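
For a Fedora 12-era setup (GRUB legacy, SysV runlevels), one common approach is to append `3` to the kernel line, either once at the GRUB menu (press a key to stop the countdown, then `a` to append) or permanently in `/boot/grub/grub.conf`, which boots the system to runlevel 3, a text-mode multi-user target. A sketch of such an entry, where the version and root device are placeholders, not real values:

```
title Fedora (text mode)
        root (hd0,0)
        kernel /vmlinuz-<version> ro root=<root-device> 3
        initrd /initramfs-<version>.img
```

From the resulting text console, X can be reconfigured or the ATI driver removed before returning to runlevel 5.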


Source: (StackOverflow)

HDMI to VGA and Display Port to VGA adapters

I ran into a problem after I upgraded my hardware. I was excited about the upgrade, but then I realized I can't connect my monitors, which disappointed me.

I have 3 Full HD 22-inch monitors with a resolution of 1920x1080, all 3 with VGA-only inputs (yeah, I bought them more than 6 years ago), and a Sapphire Radeon 7970 with

  • 1 x DVI-I
  • 1 x DVI-D
  • 1 x Display Port
  • 1 x HDMI

ports (before that I had a simpler Radeon with two DVI-I ports and a VGA port). From what I've seen around the internet, there is no way to connect these monitors to the DVI-D, DisplayPort, or HDMI outputs with passive adapters. Or can I? I am not afraid of doing some DIY to make it work, because I don't really want to sell these monitors and find ones with modern inputs; that means a lot of time and some extra expense.

For now I am using only one monitor, connected to the DVI-I port with a DVI-I-to-VGA adapter, and it works fine, which leaves the other two to connect. What can be the solution for now?


Source: (StackOverflow)

Monitor image became offset to the right. How to fix this?

Got this odd problem where the image on the monitor is offset to the right. It's offset a lot.

I have an external monitor, which is connected to a laptop. I don’t use the laptop’s display, only the external monitor. The monitor is Asus AL2216W (found a manual). It’s connected through VGA. The video card is Mobile Intel 965 Express Chipset.

Today, I needed to connect the monitor to another laptop. It worked. But when I connected the monitor back to my main laptop, the image on the monitor was offset.

Here's what I've tried so far:

  1. My monitor has a button for auto adjustment. It didn't remove the offset.
  2. My monitor has a menu for manual adjustment for the horizontal position. But, the offset is so large that this adjustment doesn’t have enough range.
  3. I have unplugged and powered down the monitor. The offset is still there.
  4. If I change the resolution to 1600 by 1200, the image becomes properly centered, and there is no offset. But, when I change it back to 1680 by 1050, the offset is back.
  5. Connected the monitor to yet another laptop. The offset is still there; same amount of offset. That implies that the problem is in the monitor, and not in the video card.

Are there more things I could try?

update: I've bought another monitor with 1680 by 1050 resolution. It's a different model, although I don't know if that matters. It worked right away. The mystery of the offset will, probably, remain unsolved.


Source: (StackOverflow)

How can I get my dual monitors to work in Windows 7, after no online solutions work?

I will start by saying this is the first time I have tried to perform this particular task, however, I am an IT professional, and I do know my way around a computer.

The problem is I can't "extend" my desktop to two monitors, no matter what I try.

Things I've Tried:

Plugging both VGA connectors (one needs a DVI converter) into the motherboard video card.

Fails Because--Windows 7 Doesn't detect a second monitor.

Plugging both VGA connectors (one needs a DVI converter) into an external video card (MSI RX300HM(V032)).

Fails Because--Windows 7 Doesn't detect a second monitor.

Plugging one VGA connector into the external video card and the other VGA connector into the motherboard video card (using no DVI converters).

Fails Because--Windows 7 Doesn't detect a second monitor.

Making sure I have the most updated drivers for the video card (and monitors, where Windows will detect it), then trying all of the above, again.

Fails Because--All of the same reasons, stated above.

Checking the monitors and making sure they work separately.

They both work fine. In fact, when I plug them both into the external video card, they both display the same "duplicate" screen.

Forcing Windows to output to "No Display Detected", but it just puts what it thinks is the second monitor off in the blank void to the left of my other two monitors.

Downloading ATI CCC and trying to manage it that way.

Fails Because--There is no "Desktop Manager" link (the link I am led to believe leads to the configuration screen for dual monitors and such). And that is not the only place therein that I checked; I scoured that interface for settings suitable for changing such configurations, but all to no avail.

It may be relevant (although, hopefully not) to note that the two monitors aren't the same brand (one is Acer and the other is Philips).

Also, it is important to note that I "rebooted" my machine between all of the major steps listed above.

What else could be the issue?

Don't all relatively new video cards support dual monitors these days? (I am beginning to think they don't, as that seems the only reasonable explanation for my dilemma.)


Source: (StackOverflow)

How much video quality is lost over VGA vs DVI?

I have a netbook that I want to connect an external monitor to. The netbook only has a VGA output. How much quality is lost by using a VGA interface, as opposed to a DVI one? Will the display be noticeably more "blurry"?

If so, can someone explain why this is? A pixel is still a pixel, so is the color information getting lost in the D->A then A->D conversion?
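
One way to see where softness can come from, under a toy model (all numbers are illustrative, not real VGA timings): the laptop's DAC emits one analog level per pixel, and the monitor's ADC re-samples that waveform. If the monitor's sampling phase is off, each sample lands between two pixels and reads a weighted mix of neighbours:

```python
# Toy resampling model of the VGA path: phase 0.0 samples dead-centre on
# each pixel; phase 0.5 samples on the boundary between two pixels.
def resample(pixels, phase):
    # Each output sample is a linear mix of two neighbouring input pixels.
    return [(1 - phase) * pixels[i] + phase * pixels[i + 1]
            for i in range(len(pixels) - 1)]

stripes = [0, 255, 0, 255, 0, 255]     # alternating black/white columns
print(resample(stripes, 0.0))          # in phase: original values survive
print(resample(stripes, 0.5))          # worst case: everything averages to grey
```

A digital link like DVI skips both conversions, so each pixel value arrives unchanged; over VGA the color information isn't so much quantized away as smeared across neighbouring pixels by imperfect sampling and cable bandwidth.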


Source: (StackOverflow)