
Hi,

I am thinking of purchasing a new video card. The one I am looking at has the right specs, but it is "dual display". I only want to use it with a single monitor. How does this all work? I am new to this sort of stuff.

Thanks so much!

2007-06-10 08:58:16 · 5 answers · asked by Anonymous in Computers & Internet > Hardware > Desktops

5 answers

The term "dual display" just means that it supports up to two monitors. Most mid to high end cards do these days. But it'll work just fine if you only have one monitor.

2007-06-10 09:06:13 · answer #1 · answered by Anonymous · 2 0

It supports dual displays, which means you can output to an additional display if you want to. It's not a "has to be dual display" option; it's just there for people who want to use it, say for hooking a media center PC up to a TV output (which I do to display mine on my projector screen at an amazing 109" size). Remember that most higher-end gaming and video cards include the dual or multiple display option.
One of the reasons that computers are getting cheaper is that monitors are getting cheaper, and the video hardware that drives them is more affordable than ever before. Microsoft recognized that when it released Windows 98 and built in support for multiple monitors. That support continued with 98SE, 2000, and ME. If you use NT, you're not out of luck, but this article won't help you; your best alternative is a dual-headed card like the Matrox G450, one with two monitor outputs.

Getting dual monitors to work correctly seems to be a hit-or-miss affair. In many cases, it's just a matter of adding a second video card and monitor and rebooting the computer. In other cases, you need to do some cyber gymnastics to get things working correctly.

The process is the same no matter which version of Windows you are using. First, be sure that your system is working correctly: boot into safe mode and verify that only one video adapter and one monitor show up in the Device Manager. If more are listed than you actually have, remove the extras.
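
On a modern Windows system, a rough scripted equivalent of that Device Manager check is to list the installed video controllers through WMI. A minimal sketch, assuming Python on Windows and the wmic tool (deprecated on recent builds, but still widely present):

    import subprocess

    # List video adapters roughly as Device Manager shows them.
    output = subprocess.run(
        ["wmic", "path", "Win32_VideoController", "get", "Name"],
        capture_output=True, text=True, check=True,
    ).stdout

    adapters = [line.strip() for line in output.splitlines()[1:] if line.strip()]
    print("%d video adapter(s) found:" % len(adapters))
    for name in adapters:
        print(" -", name)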

Once you're sure that your video setup is working fine, turn off the system and look inside. Remember that you'll need a video card for each monitor that you plan to run under Windows, which means your system must have a free expansion slot for each one. You may run into problems with IRQ sharing or DMA channels, depending on your motherboard and video card, although many video cards seem relatively tolerant of IRQ and DMA conflicts. Install your video card in the appropriate slot, and then connect the monitor.

Restart your system. If all goes well, your computer will boot the same way that it always has…the second monitor will still be dark. If your system prompts you to log in, do it. You may need to install drivers for your second video card, depending on whether Windows has built-in drivers for it. Just follow the on-screen instructions.
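
To see whether Windows has actually picked up the second adapter after that reboot, you can enumerate the display devices directly. Here is a sketch using Python's ctypes against the documented Win32 EnumDisplayDevicesW call; the struct layout and flag values are the standard ones from winuser.h, but treat it as illustrative rather than part of the original article:

    import ctypes
    from ctypes import wintypes

    # Documented StateFlags bits from winuser.h.
    DISPLAY_DEVICE_ATTACHED_TO_DESKTOP = 0x00000001
    DISPLAY_DEVICE_PRIMARY_DEVICE = 0x00000004

    class DISPLAY_DEVICEW(ctypes.Structure):
        _fields_ = [
            ("cb", wintypes.DWORD),
            ("DeviceName", wintypes.WCHAR * 32),
            ("DeviceString", wintypes.WCHAR * 128),
            ("StateFlags", wintypes.DWORD),
            ("DeviceID", wintypes.WCHAR * 128),
            ("DeviceKey", wintypes.WCHAR * 128),
        ]

    i = 0
    while True:
        dev = DISPLAY_DEVICEW()
        dev.cb = ctypes.sizeof(dev)  # must be set before each call
        if not ctypes.windll.user32.EnumDisplayDevicesW(None, i, ctypes.byref(dev), 0):
            break  # no more adapters
        active = bool(dev.StateFlags & DISPLAY_DEVICE_ATTACHED_TO_DESKTOP)
        primary = bool(dev.StateFlags & DISPLAY_DEVICE_PRIMARY_DEVICE)
        status = ("active, primary" if primary else "active") if active else "inactive"
        print("%s: %s (%s)" % (dev.DeviceName, dev.DeviceString, status))
        i += 1

A second card that installed correctly should show up in this list even while its output is still dark; in those versions of Windows you then enable it from Display Properties (the "Extend my Windows desktop onto this monitor" checkbox).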

2007-06-10 11:19:44 · answer #2 · answered by nottheone4u 2 · 0 0

I am thinking about doing a dual display myself. I believe you can run just one monitor.
http://www.datpcstore.com/shop/video-cards-c-69.html

2007-06-10 09:33:35 · answer #3 · answered by datpcstore 2 · 0 0

You can't have 3 displays going at once unless you have an SLI-capable motherboard, which lets you run 2 graphics cards (higher end); that means you could have 4 displays going at once, and you could also use just 3 if you wanted to. But to do that you need a new PC, or an upgrade.

2016-11-10 00:48:29 · answer #4 · answered by ? 4 · 0 0

You can have just one monitor in use; you don't need to have a second one connected.

2007-06-10 09:23:16 · answer #5 · answered by Carling 7 · 0 0
