GPU interview questions
Top frequently asked GPU interview questions
I have a laptop with a screen resolution of 1366x768, as most laptops have. Is there any way to increase it? The laptop is an HP Pavilion dv6 with Intel HD 3000 graphics and a Radeon HD 6490M GPU.
Edit: I want to increase the resolution beyond 1366x768 because I have an application that requires it, and I want to run that application on this laptop only. The application requires a resolution of 1280x900 or higher.
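If the laptop happens to run Linux with X11, one commonly suggested workaround is to render the desktop at a higher virtual resolution and scale it down onto the 1366x768 panel with xrandr. A minimal sketch (the output name LVDS1 is an assumption; check the listing from the first command for the real name):
xrandr                                    # list outputs; the built-in panel is assumed to be LVDS1
xrandr --output LVDS1 --scale 1.25x1.25   # ~1707x960 virtual desktop, downscaled to the panel
This gives a virtual desktop of roughly 1707x960 (1366 and 768 each multiplied by 1.25), which satisfies a 1280x900 requirement at the cost of some blurriness. On Windows, the rough equivalent is a custom or scaled resolution set in the Intel or AMD driver control panel, if the driver offers one.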
Source: (StackOverflow)
Are there any programs available for monitoring and measuring GPU temperature? I'd prefer something free, and for either Windows XP or Windows 7.
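If the card happens to be an NVIDIA one with a reasonably recent driver, the driver ships a command-line tool, nvidia-smi, that can report the temperature without extra software; a quick sketch (flag support depends on the driver version installed):
nvidia-smi --query-gpu=temperature.gpu --format=csv,noheader
For a vendor-neutral graphical option on Windows, free tools such as GPU-Z or HWMonitor also display GPU temperature.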
Source: (StackOverflow)
As far as I can tell GPUs can easily run at much higher temperatures than CPUs without problems. Aren't they both made out of the same materials? Why are GPUs capable of operating at temperatures that would kill CPUs?
Source: (StackOverflow)
I temporarily disabled ECC memory protection on an NVIDIA K20m (device 0 in my node) and now I cannot get it working again.
Before that, it was working properly with ECC enabled.
So, here is what I did:
I disabled ECC with
nvidia-smi -i 0 --ecc-config=0
and rebooted. When the node came back up, the device showed 100% GPU utilization and wouldn't start any kernels (it actually failed already while creating the context). The reported reason was a double-bit ECC error. I reset the error counters with
nvidia-smi -i 0 --reset-ecc-errors=0
and rebooted the node. After the reboot the device utilization was 0% and I could start jobs as usual. A few hours later the device again showed 100% GPU utilization. This time it didn't report a double-bit error (not even a single-bit error). Since I couldn't run any jobs, I rebooted the node again, but it came back up with 100% GPU utilization: I can't use it, yet it reports no bit errors. What is wrong with it? The full nvidia-smi -q output for the device follows:
GPU 0000:02:00.0
    Product Name : Tesla K20m
    Display Mode : Disabled
    Persistence Mode : Enabled
    Driver Model
        Current : N/A
        Pending : N/A
    Serial Number : 0324512044699
    GPU UUID : GPU-9bfe1aba-1628-a406-3ed5-2af49462a997
    VBIOS Version : 80.10.11.00.0B
    Inforom Version
        Image Version : 2081.0208.01.07
        OEM Object : 1.1
        ECC Object : 3.0
        Power Management Object : N/A
    GPU Operation Mode
        Current : Compute
        Pending : Compute
    PCI
        Bus : 0x02
        Device : 0x00
        Domain : 0x0000
        Device Id : 0x102810DE
        Bus Id : 0000:02:00.0
        Sub System Id : 0x101510DE
        GPU Link Info
            PCIe Generation
                Max : 2
                Current : 2
            Link Width
                Max : 16x
                Current : 16x
    Fan Speed : N/A
    Performance State : P0
    Clocks Throttle Reasons
        Idle : Not Active
        User Defined Clocks : Not Active
        SW Power Cap : Not Active
        HW Slowdown : Not Active
        Unknown : Not Active
    Memory Usage
        Total : 4799 MB
        Used : 12 MB
        Free : 4787 MB
    Compute Mode : Default
    Utilization
        Gpu : 100 %
        Memory : 0 %
    Ecc Mode
        Current : Enabled
        Pending : Enabled
    ECC Errors
        Volatile
            Single Bit
                Device Memory : 0
                Register File : 0
                L1 Cache : 0
                L2 Cache : 0
                Texture Memory : 0
                Total : 0
            Double Bit
                Device Memory : 0
                Register File : 0
                L1 Cache : 0
                L2 Cache : 0
                Texture Memory : 0
                Total : 0
        Aggregate
            Single Bit
                Device Memory : 0
                Register File : 0
                L1 Cache : 0
                L2 Cache : 0
                Texture Memory : 0
                Total : 0
            Double Bit
                Device Memory : 0
                Register File : 0
                L1 Cache : 0
                L2 Cache : 0
                Texture Memory : 0
                Total : 0
    Temperature
        Gpu : 30 C
    Power Readings
        Power Management : Supported
        Power Draw : 49.51 W
        Power Limit : 225.00 W
        Default Power Limit : 225.00 W
        Min Power Limit : 150.00 W
        Max Power Limit : 225.00 W
    Clocks
        Graphics : 758 MHz
        SM : 758 MHz
        Memory : 2600 MHz
    Applications Clocks
        Graphics : 705 MHz
        Memory : 2600 MHz
    Max Clocks
        Graphics : 758 MHz
        SM : 758 MHz
        Memory : 2600 MHz
    Compute Processes : None
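Since the output above shows ECC enabled again and no logged errors, yet utilization stuck at 100 %, two checks worth running against the same device are sketched below (same nvidia-smi syntax generation as used in the question; the reset requires root and nothing else using the card, and is supported on Tesla-class boards):
nvidia-smi -i 0 -q -d ECC,UTILIZATION   # re-check error counters alongside the stuck utilization reading
nvidia-smi -i 0 --gpu-reset             # attempt an in-place GPU reset instead of a full node reboot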
Source: (StackOverflow)
I am curious whether I could slap a high-end video card into a server and use it as a high-end gaming machine.
The server is an HP DL360 G5 with two Intel quad-core Xeon processors, 16 GB of RAM, and six 72 GB SAS drives.
The reason I'm curious is that the server has an x16 PCIe slot, and I'm pretty sure I could fit a full-size graphics card into it.
Is it even possible to use a server as a gaming machine? Would the performance be better or worse than a "gaming" machine's?
Source: (StackOverflow)
Sometimes my video card driver (Intel GMA X4500HD) crashes and gets restarted automatically. But more often (almost every time I watch Flash-based online video for a while) it just gets slower and slower until it is extremely slow and I have to restart my computer. This looks like a resource leak in the video driver code, and I'd like to try restarting just the driver without restarting the whole system. How do I restart it manually (or cause it to crash immediately so the OS restarts it automatically)?
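One commonly suggested way to bounce a display driver on Windows without a reboot is Microsoft's devcon utility (the command-line counterpart of Device Manager, shipped with the Windows Driver Kit). A sketch, assuming the Intel adapter is the only device in the Display class:
devcon find =display      # list display-class devices and their IDs
devcon restart =display   # restart the display adapter(s); expect the screen to flicker
Disabling and re-enabling the adapter from Device Manager achieves the same effect through the GUI.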
Source: (StackOverflow)
I have a laptop with a working Ethernet port, but I always use Wi-Fi. I am wondering whether it is possible to run and use a graphics card (with an external power supply) connected to the Ethernet port, with some kind of PCI emulation so that the Ethernet-attached GPU appears as a PCI device.
A Cat6 cable can do 10 Gbps, which should be enough for a GPU to run and play games.
Could this be possible?
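For a rough sense of the bandwidth gap: 10 Gbps Ethernet moves about 1.25 GB/s, whereas the PCIe 2.0 x16 slot a desktop GPU normally sits in provides roughly 8 GB/s (and PCIe 3.0 x16 about 15.75 GB/s). The Ethernet link is therefore closer to two or three PCIe 2.0 lanes than to a full slot, even before considering latency and the fact that Ethernet does not carry PCIe transactions natively.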
Source: (StackOverflow)
There's DOSBox for really old games, and some games work fine on Windows 8, but for games from the Windows 95/98/XP era we've been somewhat out of luck if the game needs to use the GPU.
With the Hyper-V system on Windows 8, can we virtualize older versions of Windows well enough to play these games with a decent framerate, utilizing the host hardware?
Source: (StackOverflow)
I have VMware Workstation running on an Ubuntu host with an Ubuntu guest.
Is it possible to directly access the GPU from the VM?
I want to run CUDA in the VM.
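Whether this works depends on the physical GPU actually being exposed to the guest; VMware Workstation's emulated graphics adapter generally does not expose CUDA. A quick way to check from inside the guest is a minimal CUDA device query; a sketch, assuming the CUDA toolkit is installed there (compile with nvcc check.cu -o check):
// check.cu - verify that the CUDA runtime can see a device inside the guest
#include <cstdio>
#include <cuda_runtime.h>

int main() {
    int count = 0;
    cudaError_t err = cudaGetDeviceCount(&count);
    if (err != cudaSuccess) {
        std::printf("CUDA error: %s\n", cudaGetErrorString(err));
        return 1;
    }
    std::printf("CUDA devices visible: %d\n", count);
    for (int i = 0; i < count; ++i) {
        cudaDeviceProp prop;
        cudaGetDeviceProperties(&prop, i);
        std::printf("  %d: %s (compute capability %d.%d)\n", i, prop.name, prop.major, prop.minor);
    }
    return 0;
}
If this reports an error or zero devices, the VM is only seeing the virtual adapter, not the real GPU.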
Source: (StackOverflow)
What is the difference between these two things? Are they the same? Is there a difference between their capabilities?
Source: (StackOverflow)
I have an ATI Radeon HD 4870 graphics card. I have read that I can use it to significantly speed up encoding DVDs to AVI.
Does anyone know how this can be done?
Source: (StackOverflow)
I know what a CPU is (I think). It's the thing whose speed is measured in gigahertz (these days).
However, you hear a lot about the GPU: letting the GPU take over, having the GPU do the work instead of the CPU, GPU-based rendering, and so on.
What is this GPU anyway? How can I access it and use it to my advantage? What am I missing out on here?
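One concrete way to "access it and use it to your advantage" on NVIDIA hardware is CUDA, which lets an ordinary program hand data-parallel work to the GPU as thousands of lightweight threads. A minimal sketch (assumes an NVIDIA card and the CUDA toolkit; compile with nvcc add.cu -o add):
// add.cu - add two large arrays on the GPU, one thread per element
#include <cstdio>
#include <cuda_runtime.h>

__global__ void add(const float* a, const float* b, float* c, int n) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;   // global index of this thread
    if (i < n) c[i] = a[i] + b[i];
}

int main() {
    const int n = 1 << 20;                       // about a million elements
    float *a, *b, *c;
    cudaMallocManaged(&a, n * sizeof(float));    // memory visible to both CPU and GPU
    cudaMallocManaged(&b, n * sizeof(float));
    cudaMallocManaged(&c, n * sizeof(float));
    for (int i = 0; i < n; ++i) { a[i] = 1.0f; b[i] = 2.0f; }
    add<<<(n + 255) / 256, 256>>>(a, b, c, n);   // ~4096 blocks of 256 threads each
    cudaDeviceSynchronize();                     // wait for the GPU to finish
    std::printf("c[0] = %.1f\n", c[0]);          // expect 3.0
    cudaFree(a); cudaFree(b); cudaFree(c);
    return 0;
}
The same idea is available on other vendors' hardware through OpenCL or compute shaders; CUDA is just the NVIDIA-specific route.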
Source: (StackOverflow)
It seems to me that these days lots of calculations are done on the GPU. Obviously graphics are done there, but using CUDA and the like, AI, hashing algorithms (think bitcoins) and others are also done on the GPU. Why can't we just get rid of the CPU and use the GPU on its own? What makes the GPU so much faster than the CPU?
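A rough back-of-the-envelope comparison shows where the gap comes from (ballpark figures, not exact specs): a quad-core desktop CPU at ~3 GHz with 8-wide SIMD peaks at very roughly 3 GHz × 4 cores × 16 single-precision FLOPs per cycle ≈ 200 GFLOPS, while a mid-range GPU with ~1500 shader cores at ~1 GHz, each doing a fused multiply-add per cycle, reaches ~1500 × 2 × 1 GHz ≈ 3 TFLOPS. The catch is that the GPU only gets there when the work splits into thousands of independent, identical operations; for branchy, serial, latency-sensitive work (and for running the operating system) the CPU's few fast cores still win.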
Source: (StackOverflow)
This is quite possibly a stupid question, but the GPU cooler I bought leaves me little choice but to position the heat sinks such that they straddle and bridge multiple voltage regulators. Is this a bad (i.e. disastrous) path to take?
The heat sinks are standard aluminium, so they will obviously conduct electricity. However, the voltage regulators seem to have some kind of cover over them.
The instructions that came with the cooler are completely unhelpful in this regard.
Here is a picture of the voltage regulators on my board:
And here is how I was planning to configure the heat sinks:
Source: (StackOverflow)
A little more than a week ago, I acquired an ASUS Strix GTX 970 4GB OC. I've been noticing a lot of sporadic stuttering while playing games and I'm puzzled as to what it could be. Very frustrating.
I've monitored my system with HWiNFO64 and saw no abnormalities in CPU, GPU, or RAM usage, nor in CPU and GPU temperatures. A friend of mine said I should look for temperature abnormalities, as he suspected throttling was the issue. I've also run FurMark but did not observe any faults there either.
What I am noticing while booting up or playing games is that the VRAM usage indicator in ASUS's GPU Tweak 2 goes no further than 6% and most of the time just idles at 0%, which is why I suspect the GPU is not functioning correctly. Could there be something I've missed?
Could it be the power supply that I recently purchased? I felt the need to buy a new power supply because my old one delivered only 25 A on the 12 V rail; the minimum seemed to be 28 A, and the card's box said 38 A was recommended. I had no money for a really good one, so I temporarily went with a Corsair CX600M. I know the chance of the power supply being the culprit is very slim, but I thought I should mention it in case it is relevant.
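For reference, converting those rail ratings to watts using the question's own figures: 25 A × 12 V = 300 W was available on the old unit's 12 V rail, against 28 A × 12 V = 336 W minimum and 38 A × 12 V ≈ 456 W recommended on the card's box, so the old supply was indeed below spec for the card.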
I'll state the specifications of my system below to give you guys more clarity on this matter and help you answer my (very) vague question:
- Antec GX300 case
- ASUS P8H67 Motherboard
- Intel i5 2400 processor
- ASUS Strix GTX970 4GB OC
- Mushkin 240GB SSD
- Samsung 500GB 5400rpm HDD
- Corsair CX600M PSU
The system was recently clean-installed with Windows 7 Ultimate edition. Games tested include GTA V, MS Flight Simulator X and CS:GO.
EDIT
As suggested by Psycogeek, I checked the current Link Width in the Mainboard section of CPU-Z and saw something that might be notable:
The card is installed in the lower PCI-E slot of the motherboard. Could this slot be defective? Should I try the other slot?
Source: (StackOverflow)