EzDevInfo.com

intel-graphics interview questions

Top intel-graphics frequently asked interview questions

Is Intel/Nvidia Optimus technology supposed to switch instantly between GPUs?

I have a brand new Sony Vaio Flip 15, and it takes at least 1 second to switch between GPUs, during which time my screen goes off. Is this expected? Or should the switch be instant and happen between frames?

Suppose I didn't know anything about computers (in which case I probably wouldn't even be asking here): I would think my laptop was broken or glitchy, because every time I plug in or unplug the power adapter the screen goes off for a whole second (being oblivious to the fact that the GPUs are switching).

Is the switch supposed to be fast?


Source: (StackOverflow)

Installing Nvidia drivers on Linux Mint 15

I'm a Linux newbie and any help and links are appreciated.

My problem is that I cannot find any reliable way to install the proprietary NVIDIA drivers on my system (Linux Mint 15). Is there an existing tutorial on how to do this with a simple apt-get install?

UPDATE: Just to clarify, if it matters, "System Info" says that I have two video cards:

  • Graphics Card 0: Intel Corporation 3rd Gen Core processor Graphics Controller
  • Graphics Card 1: NVIDIA Corporation GF108M [GeForce GT 635M]
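
A rough sketch of what an apt-get based install can look like on Mint 15 (Ubuntu 13.04 base) follows; the exact package name varies by release (nvidia-current, nvidia-304, and so on), and with a dual Intel/NVIDIA setup like the one above an Optimus helper such as Bumblebee is usually needed as well:

# Confirm which adapters are present
lspci | grep -i vga

# Install the proprietary driver and its settings tool from the repositories
sudo apt-get update
sudo apt-get install nvidia-current nvidia-settings

# Reboot so the new kernel module is loaded
sudo reboot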

Source: (StackOverflow)

How do I run the integrated video adaptor alongside the GPU on an Ivy Bridge system?

I've got an Asus P8Z77 motherboard with a Core i3775 processor and a GeForce 660 video card. I'm running Windows 8.

While I primarily use the latter, there are situations where I want to use the HD 4000 adaptor built into the processor, for example to take advantage of Quick Sync. How would I do this?


Source: (StackOverflow)

DisplayPort monitor turns off when 2 monitors are used

I have the motherboard "Gigabyte Z77x-ud3h" with a 3rd Gen Intel CPU with HD4000 graphics. I am using Ubuntu 12.10.

When I connect one monitor (Dell 3011) with DisplayPort and a second monitor (Iiyama) with HDMI, after a while the Dell goes into power-saving mode.

When I connect just the Dell monitor over DisplayPort, it works; it also works when the Dell is connected by DVI, but then I don't get the full resolution.

Why does the DisplayPort monitor go to sleep after a while when a second monitor is connected at the same time over HDMI?
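
One way to narrow this down is to check what X thinks the outputs are doing when the Dell blanks, and to try re-enabling it explicitly. This is only a sketch; the output names (DP1, HDMI1) depend on the driver and may differ:

# List all outputs, their modes and which ones are currently active
xrandr --query

# Try to bring the DisplayPort output back and place it next to the HDMI one
xrandr --output DP1 --auto --right-of HDMI1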


Source: (StackOverflow)

Is there a fix for the Intel graphics driver problem with SP1 on Windows 7?

Windows Update still hasn't offered me SP1, and I have the igdumd64.dll driver. Is there a fix yet?


Source: (StackOverflow)

Can I force my laptop to use NVIDIA instead of Intel?

I have a Dell N5110 laptop with an NVIDIA GeForce GT 525M and integrated Intel HD Graphics 3000. My problem is that the default adapter is the Intel one, and it gives very poor performance with Windows 8. Can I force the NVIDIA one to be used all the time, including for the Desktop Window Manager?


Source: (StackOverflow)

Is it possible to use three screens with Intel HD 3000 integrated graphics?

I'm getting a new Lenovo T520 with Intel integrated graphics.

Is it possible to use three screens with Intel HD 3000 (the laptop internal screen + two external screens)?


Source: (StackOverflow)

How do I know which of my laptop GPUs are running games?

My laptop has two GPUs: one is the default Intel HD Graphics 3000, and the other is the NVIDIA GeForce GT 630M. Obviously I want to use the 630M to play my games. How can I make sure this is happening? When I go to Start → Run → dxdiag and look at the Display tab, it says that I am using the Intel HD Graphics Family for the display. Does this mean it is being used for games too?


Source: (StackOverflow)

Intel HD4000 with 4k HDMI output on Linux

I wonder if I can output 4k resolution (3840 × 2160) at 30 Hz with Intel HD4000 graphics using Linux and a single HDMI output.

I have a Thinkpad Edge E330 with HDMI 1.4. The CPU is an i5-3210M.

I know that there is a problem with the pixel clock (which is limited to 165 MHz), which limits the maximum output resolution. I also know that there are patches for Mac OS X.

Where is the pixel clock limited? In the kernel? In the graphics driver? Can it be removed/fixed in general?

Is the HD4000 powerful enough to be able to drive 4k @ 30Hz in general?
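
For experimenting, a custom 30 Hz mode can be generated with cvt and added with xrandr. This is only a sketch (HDMI1 is a placeholder output name), and as far as I know the mode will still be rejected as long as the driver enforces its pixel-clock limit when validating modes:

# Generate a modeline for 3840x2160 at 30 Hz
cvt 3840 2160 30

# Add the Modeline values that cvt prints and assign the mode to the HDMI output
xrandr --newmode "3840x2160_30.00" <timings printed by cvt>
xrandr --addmode HDMI1 "3840x2160_30.00"
xrandr --output HDMI1 --mode "3840x2160_30.00"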


Source: (StackOverflow)

Laptop + 2 Displays Issue

I'd like to extend the display of my ASUS N53SV-SZ038V to two external monitors.

The laptop has one HDMI output and one VGA output. I have each connected to an external monitor. The problem is that it only allows two displays to show at a time (either laptop display + 1 monitor or 2 monitors). This is highlighted below:

[Screenshot: Display Settings]

The laptop has integrated Intel HD Graphics 3000 on the i7-2630QM as well as an NVIDIA GT540M. Whilst I don't think that either can support 3 displays directly, I was hoping that I might be able to offload one of the displays (or more) onto the graphics card.

After going into the NVIDIA control panel, I saw the below and thought there might be a way to rearrange it, but it appears not:

[Screenshot: NVIDIA Control Panel]

It would be much appreciated if someone could provide some guidance on whether or not this can be achieved, or indeed any other suggestions.


Source: (StackOverflow)

Connecting a 2560x1440 display to a laptop?

Having read Jeff Atwood's blog post on Korean 27" IPS LCDs, I've been wondering to what extent these are useful in a notebook + large display situation.

I own a Lenovo Thinkpad Edge E320 with 2nd-gen integrated Intel graphics. According to Intel's spec, this should support HDMI version 1.4 and, over DisplayPort, resolutions up to 2560x1600. HDMI 1.4 supports resolutions up to 4096×2160; however, according to c't (German), the HDMI interface used with Intel chips only supports up to 1920x1200. The same goes for the DVI output: dual-link DVI-D, apparently, is not supported by Intel.

It would appear that my laptop cannot digitally drive this kind of resolution. Now what about other laptops?

According to the article in c't above, AMD's integrated graphics chips have the same limitation as Intel's.

NVIDIA graphics cards, apparently, only offer resolutions up to 1920x1200 over HDMI out of the box, but it's possible, at least when using Linux, to trick the driver into enabling higher resolutions. Is this still true? What's the situation on Windows and OS X?

I found no information on whether discrete AMD chips support ultra-high resolutions over HDMI.

Owners of laptops with (Mini) DisplayPort / Thunderbolt won't have any issues with displays this large, but if you're planning to go for a display with dual-link DVI-D input only (like the Korean ones), you're going to need an adapter, which will set you back something like €70-€100 (since the protocols are incompatible).

The big question mark in this equation is VGA: a lot of laptops have it, and I don't see any reason to think this resolution is not supported by the hardware (an oft-quoted figure appears to be 2048x1536@75Hz, so 2560x1440@60Hz should be possible, right?), but are the drivers likely to cause problems?
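
The pixel-clock arithmetic behind that comparison can be sanity-checked with cvt; the figures below are CVT estimates of the required pixel clock, not statements about what any particular VGA output is rated for:

# Clock required for the oft-quoted analog figure
cvt 2048 1536 75

# Clock required for the target resolution, with normal and reduced blanking
cvt 2560 1440 60      # roughly 312 MHz
cvt -r 2560 1440 60   # roughly 241 MHz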

Perhaps more critically, you'd need a VGA to dual-link DVI-D adapter that converts analog to digital signals. Do these exist? How good are they? How expensive are they? Is there a performance penalty involved?

Please correct me if I'm wrong on any points.

In summary, what are the requirements on a laptop to drive an external LCD at 2560x1440, in particular one that supports dual-link DVI-D only, and what tools and adapters can be used to lower the bar?


Source: (StackOverflow)

Intel hd or Mobility Radeon? [closed]

I have decided to get a new Dell Studio 15 laptop. There is an option to add an "ATI Mobility Radeon™ HD 4570, 512MB" card for an extra $100 instead of the included "Intel HD Graphics".

How do the two compare? The laptop comes with an Intel Core i3 processor; would the Intel graphics perhaps be more compatible with the processor, since they're made by the same manufacturer?


Source: (StackOverflow)

In Linux, how do I correctly configure display geometry with multiple monitors on multiple GPUs (Intel and nVidia)?

I want a triple monitor setup to work correctly.

My setup is as follows:

  • Linux Mint 16 x64
  • Intel Core i5-2500k
  • GeForce GTX 560 Ti Cu II
  • A monitor on the far right connected to the motherboard (integrated graphics on the i5)
  • A central monitor connected to the graphics card
  • A monitor on the far left connected to the graphics card

I'm using the following xorg.conf:

Section "ServerFlags"
    Option "DefaultServerLayout" "PrimaryLayout"
    Option "Xinerama" "off"
EndSection

Section "Module"
    Load "glx"
EndSection

Section "InputDevice"
    Identifier     "Mouse"
    Driver         "mouse"
    Option         "Protocol" "auto"
    Option         "Device" "/dev/psaux"
    Option         "Emulate3Buttons" "no"
    Option         "ZAxisMapping" "4 5"
EndSection

Section "InputDevice"
    Identifier     "Keyboard"
    Driver         "kbd"
EndSection

Section "Device"
    Identifier "Intel HD Graphics 3000"
    Driver     "intel"
EndSection

Section "Device"
    Identifier     "Geforce GTX 560 Ti"
    Driver         "nvidia"
    VendorName     "NVIDIA Corporation"
    Screen 0
EndSection

Section "Monitor"
    Identifier "AOC"
    Option "Primary" "true"
EndSection

Section "Monitor"
    Identifier "Samsung"
EndSection

Section "Monitor"
    Identifier "ViewSonic"
EndSection

Section "Screen"
    Identifier "Samsung"
    Device "Intel HD Graphics 3000"
    Monitor "Samsung"
    SubSection "Display"
        Depth 24
    EndSubSection
EndSection

Section "Screen"
    Identifier "AOC"
    Device "Geforce GTX 560 Ti"
    Monitor "AOC"
    SubSection "Display"
        Depth 24
    EndSubSection
EndSection

Section "Screen"
    Identifier "ViewSonic"
    Device "Geforce GTX 560 Ti"
    Monitor "ViewSonic"
    SubSection "Display"
        Depth 24
    EndSubSection
EndSection

Section "ServerLayout"
    Identifier    "PrimaryLayout"
    Screen        "AOC" 0 0 
    Screen        "ViewSonic" LeftOf "AOC"
    Screen        "Samsung" RightOf "AOC"
    InputDevice   "Keyboard" "CoreKeyboard"
    InputDevice   "Mouse" "CorePointer"
EndSection

Section "ServerLayout"
    Identifier "SingleLayout"
    Screen "AOC" 0 0
    InputDevice "Keyboard" "CoreKeyboard"
    InputDevice "Mouse" "CorePointer"
EndSection

This has the following effect:

  • The far right monitor doesn't work
  • The central and left monitors work as expected

A little bit more info:

  • I'm on kernel 3.11.0-12-generic
  • I'm using nvidia proprietary driver version 331.67
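
One thing a multi-GPU xorg.conf usually needs, and which is missing above, is an explicit BusID in each Device section so that each driver binds to the intended card. A sketch of what that could look like is below; the PCI addresses are only examples and should be taken from lspci | grep -i vga on the actual machine:

Section "Device"
    Identifier "Intel HD Graphics 3000"
    Driver     "intel"
    # example address: the integrated GPU is typically 00:02.0
    BusID      "PCI:0:2:0"
EndSection

Section "Device"
    Identifier     "Geforce GTX 560 Ti"
    Driver         "nvidia"
    VendorName     "NVIDIA Corporation"
    # example address: a discrete card is often 01:00.0
    BusID          "PCI:1:0:0"
    Screen 0
EndSection

Whether the intel and nvidia drivers will actually cooperate in a single layout also depends on the driver versions involved; the BusID entries only make sure each Device section refers to a specific card.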

Source: (StackOverflow)

What GPU does the Intel N3700 have?

I have an Intel N3700 CPU, but I can't figure out which GPU I have.

CPU-Z says "Intel HD Graphics," but which version?

Here is a screenshot from CPU-Z:

[Screenshot: CPU-Z]


Source: (StackOverflow)

Fix overscan in Linux with Intel graphics Vizio HDTV

I am connecting my server to my HDTV so that I can conveniently use it there. My VIZIO HDTV cuts off all four edges of the picture. I realize it is not optimal to run a GUI on a server; this server will not have much external traffic, so I prefer it for convenience.

I have already spent countless hours searching for a fix, but everything I could find either required an ATI or NVIDIA graphics card or didn't work. In Windows, the Intel driver has a setting for underscan, though it seems to be accessible only through a glitch.

Here’s my specs:

  • Ubuntu Linux (Quantal 12.10) (Likely to switch to Arch)
  • This is a home server computer, with KDE for managing it (for now, at least)
  • Graphics: Intel HD Graphics 4000 from Ivy Bridge
  • Motherboard: ASRock Z77 Extreme4
  • CPU: Intel Core i5-3450

My monitors:

  1. Dell LCD monitor
  2. Vizio VX37L_HDTV10A 37" on HDMI input

I have tried all of the following with both HDMI⇨HDMI and DVI⇨HDMI cables connected to the ports on my motherboard:

  1. Setting properties in xrandr
  2. Making sure drivers are all up to date
  3. Trying several different modes

The TV was “cheap”; max resolution 1080i. I am able to get a 1920x1080 modeline, in both GNU/Linux and Windows, without difficulty. There is no setting in the menu to fix the overscan (I have tried all of them, I realize it’s not always called overscan). I have been in the service menu for the TV, which still does not contain an option to fix it. No aspect ratio settings, etc. The TV has a VGA connector but I am unsure if it would fix it, as I don’t have a VGA cable long enough, and am not sure it would get me the 1920x1080 resolution which I want. Using another resolution does not fix the problem.

I tried custom modelines with the dimensions of my screen’s viewable area, but it wouldn’t let me use them.
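
(For reference, the usual custom-modeline sequence looks roughly like the following; the output name and the reduced dimensions are only placeholders for the actual viewable-area values.)

# Generate timings for a mode slightly smaller than 1920x1080
cvt 1824 1000 60

# Add the Modeline values cvt prints and switch the TV output to the new mode
xrandr --newmode "1824x1000_60.00" <timings printed by cvt>
xrandr --addmode HDMI1 "1824x1000_60.00"
xrandr --output HDMI1 --mode "1824x1000_60.00"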

Ubuntu apparently doesn't automatically generate an xorg.conf file. I read somewhere that creating and modifying one might help. I tried X -configure several times (with reboots, etc.), but it consistently gave the following error messages:

In log file:


(WW) Falling back to old probe method for vesa
Number of created screens does not match number of detected devices.
Configuration failed.

In output:


(++) Using config file: "/root/xorg.conf.new"
(==) Using system config directory "/usr/share/X11/xorg.conf.d"
Number of created screens does not match number of detected devices.
Configuration failed.
Server terminated with error (2).
Closing log file.

I tried using the 'overscan' property in xrandr:

root@xxx:/home/xxx# xrandr --output HDMI1 --set overscan off
X Error of failed request: BadName (named color or font does not exist)
Major opcode of failed request: 140 (RANDR)
Minor opcode of failed request: 11 (RRQueryOutputProperty)
Serial number of failed request: 42
Current serial number in output stream: 42

I also tried 'overscan on', 'underscan off', and 'underscan on'.

I originally tried this with Ubuntu 12.04 but failed, so I updated to 12.10 when it was released. All software is up to date.

Update: I just bought a new TV, and the new one has plenty of options for fixing this, so for me it's resolved. I'm still interested in a solution to this problem, though, since it really shouldn't exist.


Source: (StackOverflow)