Size interview questions
Top frequently asked interview questions about size
I've just finished my Android widget. Now I need different sizes of this widget for the user to choose from: for example, a small, a medium and a large widget. So when the user installs the app, long-presses the home screen and chooses "widget", I want him to see three widgets in the widget menu with the same app name but with the size appended, something like this:
helloSmall
helloMedium
helloLarge
I have the medium one ready, but how can I build the small and the large ones in the same app, given that all three sizes contain exactly the same data and actions and only the size and the background differ?
Thanks.
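One common approach (a sketch; the class, file and layout names here are all hypothetical) is to declare one widget receiver per size in AndroidManifest.xml, each pointing at its own appwidget-provider XML that defines that size's dimensions and layout:

```xml
<!-- AndroidManifest.xml: one receiver per widget size (all names hypothetical) -->
<receiver android:name=".HelloWidgetSmall" android:label="helloSmall">
    <intent-filter>
        <action android:name="android.appwidget.action.APPWIDGET_UPDATE" />
    </intent-filter>
    <meta-data android:name="android.appwidget.provider"
               android:resource="@xml/widget_small" />
</receiver>
<!-- repeat for .HelloWidgetMedium / .HelloWidgetLarge,
     pointing at @xml/widget_medium / @xml/widget_large -->

<!-- res/xml/widget_small.xml: this file sets the actual size and layout -->
<appwidget-provider xmlns:android="http://schemas.android.com/apk/res/android"
    android:minWidth="72dp"
    android:minHeight="72dp"
    android:updatePeriodMillis="0"
    android:initialLayout="@layout/widget_small" />
```

Since all three widgets share the same data and actions, the three receiver classes can simply extend one common AppWidgetProvider subclass.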
Source: (StackOverflow)
In Python, how big can an array/list get? I need a list of about 12,000 elements. Is that okay? Will I still be able to run list methods such as sorting on it?
Thanks so much,
Ed
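For scale: a Python list is limited only by available memory (its length is capped at sys.maxsize entries), so 12,000 elements is tiny and all the usual methods work. A quick sketch:

```python
import random
import sys

# A 12,000-element list is tiny by Python standards: list length is
# capped only by memory, up to sys.maxsize entries.
nums = [random.randint(0, 10**6) for _ in range(12000)]
nums.sort()                   # in-place sort works as usual
print(len(nums))              # 12000
print(nums[0] <= nums[-1])    # True
print(sys.maxsize)            # the theoretical length cap
```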
Source: (StackOverflow)
Suppose I'm writing a couple of files to disk, between 2 MB and 5 GB.
What are sensible buffer values for the FileStream?
Is it sensible to work with buffer sizes of several megabytes, or should I stick to kilobyte buffers?
Source: (StackOverflow)
In C++, I'm wondering why the bool type is 8 bits long (on my system), when only one bit is needed to hold a boolean value.
I used to believe it was for performance reasons, but on a 32-bit or 64-bit machine, where registers are 32 or 64 bits wide, what's the performance advantage?
Or is it just one of those 'historical' reasons?
Source: (StackOverflow)
In C programming, you can pass any kind of pointer you like as an argument to free(); how does it know the size of the allocated memory to free? Whenever I pass a pointer to some function, I also have to pass the size (i.e. an array of 10 elements needs to receive 10 as a parameter so the function knows the size of the array), but I do not have to pass the size to the free function. Why not, and can I use this same technique in my own functions to save me from needing to cart around the extra variable of the array's length?
Source: (StackOverflow)
How do I use jQuery to determine the size of the browser viewport, and to re-detect it when the page is resized? I need to size an IFRAME to fit this space (coming in a little on each margin).
For those who don't know, the browser viewport is not the size of the document/page. It is the visible size of your window before scrolling.
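A sketch of one way to approach this (the helper name and selector are made up): compute the IFRAME size from the viewport with a small pure function, and re-run it from jQuery's resize event:

```javascript
// Pure helper (hypothetical name): iframe size = viewport minus a margin
// on each side.
function iframeSize(vpWidth, vpHeight, margin) {
  return { width: vpWidth - 2 * margin, height: vpHeight - 2 * margin };
}

// In the browser, with jQuery (sketch): $(window).width() and
// $(window).height() give the viewport size, and the resize event
// fires whenever it changes:
//
// function fit() {
//   var s = iframeSize($(window).width(), $(window).height(), 10);
//   $('#myframe').css(s);
// }
// $(window).on('resize', fit);
// fit();

console.log(iframeSize(1024, 768, 10)); // { width: 1004, height: 748 }
```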
Source: (StackOverflow)
Possible Duplicate:
How do you determine the size of a file in C?
How can I find out the size of a file I opened with an application written in C? I would like to know the size because I want to put the content of the loaded file into a string, which I allocate using malloc(). Just writing malloc(10000*sizeof(char)) is IMHO a bad idea.
Source: (StackOverflow)
Is there a built-in function for getting the size of a file object in bytes? I see some people do something like this:
def getSize(fileobject):
    fileobject.seek(0, 2)  # move the cursor to the end of the file
    size = fileobject.tell()
    return size

file = open('myfile.bin', 'rb')
print(getSize(file))
But from my experience with Python, it has a lot of helper functions, so I'm guessing there may be a built-in one.
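There is indeed a built-in way: os.path.getsize() for a path, or os.fstat() on the descriptor of an already-open file object (which avoids moving the cursor). A small self-contained sketch:

```python
import os

# Create a small file so the example is self-contained.
with open('myfile.bin', 'wb') as f:
    f.write(b'\x00' * 123)

# By path:
print(os.path.getsize('myfile.bin'))        # 123

# From an already-open file object, without moving the cursor:
with open('myfile.bin', 'rb') as f:
    print(os.fstat(f.fileno()).st_size)     # 123
```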
Source: (StackOverflow)
I can't seem to find a definitive answer on this and I want to make sure I understand this to the "n'th level" :-)
a = { "a" => "Hello", "b" => "World" }
a.count # 2
a.size # 2
a.length # 2
a = [ 10, 20 ]
a.count # 2
a.size # 2
a.length # 2
So which should I use? If I just want to know whether a has more than one element, it doesn't seem to matter, but I want to make sure I understand the real difference. This applies to arrays too; I get the same results.
Also, I realize that count/size/length have different meanings with ActiveRecord. I'm mostly interested in pure Ruby (1.9.2) right now, but if anyone wants to chime in on the difference AR makes, that would be appreciated as well.
Thanks!
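For pure Ruby, the observable difference is that size and length are aliases, while count can also take an argument or a block and then counts only the matching elements (walking the collection to do so). A quick sketch:

```ruby
a = [10, 20, 20]

# size and length are aliases:
puts a.size      # 3
puts a.length    # 3

# count with no argument behaves the same...
puts a.count     # 3

# ...but count can also filter, which size/length cannot:
puts a.count(20)             # 2  (elements equal to 20)
puts a.count { |x| x > 15 }  # 2  (elements matching the block)
```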
Source: (StackOverflow)
I can understand that many years ago there would have been this kind of limitation, but nowadays this limit could surely be increased easily. We have naming conventions for objects, but there is always a case where we hit this limit - especially when naming foreign keys.
Does anybody actually know why this isn't a bigger size - or is it bigger in 11g?
Apparently the answer is that it would break existing scripts that aren't defensively coded. I find that a very worrying argument: Oracle is trying to be the database, and surely this is the kind of thing you must constantly improve, otherwise your product will die the death of a thousand cuts.
Whenever I see this kind of objection in-house, I think it is time to bite the bullet and sort it out. If people are running scripts that they do not check or maintain when they upgrade Oracle versions, then let them suffer the consequences of that choice. Provide them with a compatibility flag, up the size to 4,000, and save me the wasted time of constantly counting to 30 to check that a name is OK whenever I'm creating objects.
Source: (StackOverflow)
I know I can get the size of the primary screen by using
System.Windows.SystemParameters.PrimaryScreenWidth;
System.Windows.SystemParameters.PrimaryScreenHeight;
But how do I get the size of the current screen? (Multi-screen users do not always use the primary screen, and not all screens use the same resolution, right?)
It would be nice to be able to access the size from XAML, but doing so from code (C#) would suffice.
Source: (StackOverflow)
I have a background that I need to fit all screen sizes. I have three drawable folders, hdpi, ldpi and mdpi, but in the emulator there isn't any reference to what resolutions hdpi, mdpi and ldpi correspond to.
Source: (StackOverflow)
This question goes out to the C gurus out there:
In C, it is possible to declare a pointer as follows:
char (* p)[10];
.. which basically states that this pointer points to an array of 10 chars. The neat thing about declaring a pointer like this is that you get a compile-time error if you try to assign a pointer to an array of a different size to p. It will also give you a compile-time error if you try to assign the value of a simple char pointer to p. I tried this with gcc and it seems to work with ANSI, C89 and C99.
It looks to me like declaring a pointer like this would be very useful - particularly, when passing a pointer to a function. Usually, people would write the prototype of such a function like this:
void foo(char * p, int plen);
If you were expecting a buffer of a specific size, you would simply test the value of plen. However, you cannot be guaranteed that the person who passes p to you will really give you plen valid memory locations in that buffer. You have to trust that the caller of this function is doing the right thing. On the other hand:
void foo(char (*p)[10]);
..would force the caller to give you a buffer of the specified size.
This seems very useful, but I have never seen a pointer declared like this in any code I have ever run across.
My question is: is there any reason why people do not declare pointers like this? Am I missing some obvious pitfall?
Thanks in advance
Source: (StackOverflow)
Simple question here: I'm just trying to make the legend in my matplotlib.pyplot plot smaller (i.e., the text smaller). The code I'm using goes something like this:
import matplotlib.pyplot as plot

plot.figure()
plot.scatter(k, sum_cf, color='black', label='Sum of Cause Fractions')
plot.scatter(k, data[:, 0], color='b', label='Dis 1: cf = .6, var = .2')
plot.scatter(k, data[:, 1], color='r', label='Dis 2: cf = .2, var = .1')
plot.scatter(k, data[:, 2], color='g', label='Dis 3: cf = .1, var = .01')
plot.legend(loc=2)
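One way to shrink the legend text (a sketch with made-up data, since k, sum_cf and data are not shown in the question) is the fontsize argument to legend(); passing prop={'size': 8} works as well:

```python
import matplotlib
matplotlib.use("Agg")   # headless backend so this runs without a display
import matplotlib.pyplot as plot

# k, sum_cf and data are not shown in the question, so fake two series:
k = list(range(10))
plot.figure()
plot.scatter(k, [0.9 * x for x in k], color='black',
             label='Sum of Cause Fractions')
plot.scatter(k, [0.6 * x for x in k], color='b',
             label='Dis 1: cf = .6, var = .2')

# Shrink the legend text with fontsize= (prop={'size': 8} also works):
leg = plot.legend(loc=2, fontsize=8)
print(leg.get_texts()[0].get_fontsize())
```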
Source: (StackOverflow)
template<typename T, size_t n>
size_t array_size(const T (&)[n])
{
    return n;
}
The part that I don't get is the parameters of this template function. What happens with the array when I pass it through there that gives n as the number of elements in the array?
Source: (StackOverflow)