screen
Web site screenshot tool based on PHP and PhantomJS
The new iPhone 5 display has a new aspect ratio and a new resolution (640 x 1136 pixels).
What is required to develop new applications for the new screen size, or to transition existing ones to it?
What should we keep in mind to make applications "universal" for both the older displays and the new widescreen aspect ratio?
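For illustration only, here is a minimal sketch of one common runtime check (the helper name is mine, not from the question); apps often branch on the screen's point height, or better, rely on autoresizing so the same layout stretches on both screens:

#import <UIKit/UIKit.h>

// Hedged sketch: the 4-inch iPhone 5 screen is 568 points tall (1136 pixels at
// the @2x Retina scale); earlier iPhones report 480 points.
static BOOL IsFourInchScreen(void) {
    return [[UIScreen mainScreen] bounds].size.height == 568.0;
}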
Source: (StackOverflow)
Let's say I have a currently running screen session that I am interacting with through PuTTY. I've realized that the scrollback buffer is too small and would like to increase it without starting a new screen session.
Is there a way to do this?
Source: (StackOverflow)
I would like to pipe standard output of a program while keeping it on screen.
Here is a simple example (the use of echo here is just for illustration):
$ echo 'ee' | foo
ee    <- the output I would like to see
I know tee can copy stdout to a file, but that's not what I want.
$ echo 'ee' | tee output.txt | foo
I tried
$ echo 'ee' | tee /dev/stdout | foo
but it does not work, since what tee writes to /dev/stdout is piped to foo as well.
Source: (StackOverflow)
I have a document-based app, where each document has one window with an NSScrollView that does some (fairly continuous) drawing using only Cocoa.
To drive the drawing, I am using a CVDisplayLink, as outlined in the code below:
- (void)windowControllerDidLoadNib:(NSWindowController *)aController {
    //other stuff...
    [self prepareDisplayLink]; //For some reason putting this in awakeFromNib crashes
}

//Prep the display link.
- (void)prepareDisplayLink {
    CVDisplayLinkCreateWithActiveCGDisplays(&displayLink);
    CVDisplayLinkSetCurrentCGDisplay(displayLink, (CGDirectDisplayID)[[[[[self windowForSheet] screen] deviceDescription] objectForKey:@"NSScreenNumber"] intValue]);
    CVDisplayLinkSetOutputCallback(displayLink, &MyDisplayLinkCallback, self);
}

//Callback to draw frame
static CVReturn MyDisplayLinkCallback(CVDisplayLinkRef displayLink, const CVTimeStamp* now, const CVTimeStamp* outputTime, CVOptionFlags flagsIn, CVOptionFlags* flagsOut, void* displayLinkContext)
{
    NSAutoreleasePool *pool = [[NSAutoreleasePool alloc] init];
    CVReturn result = [(ScrollView *)displayLinkContext getFrameForTime:outputTime];
    [pool drain];
    return result;
}

//Drawing function:
- (CVReturn)getFrameForTime:(const CVTimeStamp *)outputTime
{
    [scrollView lockFocusIfCanDraw];
    [self addToCurrentPostion:(dist/time) * CVDisplayLinkGetActualOutputVideoRefreshPeriod(displayLink)]; //Redraws the scrollview
    [scrollView unlockFocus];
    return kCVReturnSuccess;
}

//Set the display when the window moves:
- (void)windowDidMove:(NSNotification *)notification {
    if ([notification object] == [self windowForSheet]) {
        CVDisplayLinkSetCurrentCGDisplay(displayLink, (CGDirectDisplayID)[[[[[self windowForSheet] screen] deviceDescription] objectForKey:@"NSScreenNumber"] intValue]);
    }
}

//Start or stop the animation:
- (IBAction)toggleAnim:(id)sender {
    if (CVDisplayLinkIsRunning(displayLink)) {
        CVDisplayLinkStop(displayLink);
    } else {
        CVDisplayLinkStart(displayLink);
    }
}

Rendering Code:
- (void)addToCurrentPostion:(float)amnt {
    fCurrentPosition += amnt; //fCurrentPosition is a float ivar
    if (scrollView) [[scrollView contentView] scrollToPoint:NSMakePoint(0, (int)fCurrentPosition)];
    if (scrollView) [scrollView reflectScrolledClipView:[scrollView contentView]];
}
This works great, and the animation is buttery smooth... on one screen.
As soon as I move one document off the main screen, onto a second monitor, the animation becomes about as smooth as a car with square wheels.
The animation becomes poor in all documents when any one (or more) documents are on the second screen.
Even with no documents on the main screen and all of them on the secondary screen, the animation still degrades.
I've tried this on multiple types of monitors, and multiple Macs, always ending in these results.
To make sure this was not a CVDisplayLink-related issue, I also tried rendering with an NSTimer (to which the CVDisplayLink is preferable), with the same results.
What am I doing wrong?
Any help is greatly appreciated.
EDIT: I have tried using thread-based drawing too, again with the same results.
EDIT: I've made some progress, in that my thread-based drawing (basically a while loop) works very well on only one screen (either the second or the first).
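As a side note (an assumption on my part, not something from the post): AppKit also tells the window's delegate when the window actually lands on a different display, via windowDidChangeScreen:, so the display link could be rebound there instead of on every move. A minimal sketch:

- (void)windowDidChangeScreen:(NSNotification *)notification {
    //Rebind the display link to whatever screen now hosts the window.
    NSScreen *screen = [[notification object] screen];
    CGDirectDisplayID displayID = (CGDirectDisplayID)[[[screen deviceDescription] objectForKey:@"NSScreenNumber"] intValue];
    CVDisplayLinkSetCurrentCGDisplay(displayLink, displayID);
}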
Source: (StackOverflow)
I'm diving into iOS development and am building my own alarm clock app to become familiar with the platform and SDK. I've noticed that some alarm clock apps in the App Store keep the screen from dimming and/or turning off while they are running. How is this implemented?
Thanks so much in advance for all your help!
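For what it's worth, a minimal sketch of the usual approach (placing these overrides in the alarm view controller is my assumption, not part of the question): UIKit's idle timer is what dims and locks the screen, and it can be disabled while the alarm UI is frontmost.

//In the alarm screen's view controller (hypothetical placement):
- (void)viewDidAppear:(BOOL)animated {
    [super viewDidAppear:animated];
    //Keep the screen from dimming or auto-locking while the alarm UI is visible.
    [UIApplication sharedApplication].idleTimerDisabled = YES;
}

- (void)viewWillDisappear:(BOOL)animated {
    [super viewWillDisappear:animated];
    //Restore normal idle behavior as soon as this screen goes away.
    [UIApplication sharedApplication].idleTimerDisabled = NO;
}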
Source: (StackOverflow)
Because I think a lot of people will be asking this question (because I was)...
Now that the latest version of Xcode allows us to test our programs in an iPhone 5 simulator, you may have noticed that your app has been "letterboxed," showing black borders on the top/bottom (or left/right, depending on orientation). There is a very simple fix for this (though it is not the proper solution) that will automatically treat your application as though it were built for the new iPhone.
1) Create a new app project.
2) Locate the "Default-568h@2x.png" file inside that project's folder.
3) Add that file to your project.
That's it. Interestingly enough, if you then remove the splash image, it will still work correctly (which tells us that it is not the correct solution, but it does make the correct solution happen in the project). Of course, you could customize that image to whatever you want, but basically that is how you tell iPhone 5 to display the app "fullscreen."
Then, to fix the view sizing for both kinds of devices, make sure you are using
[[UIScreen mainScreen] bounds];
instead of explicitly stating the screen size. I.e., if your view frame was
CGRectMake(0, 0, 310, 420);
it should be
CGRectMake(0, 0, bounds.size.width - 10, bounds.size.height - 60);
And just in case you don't know, the new iPhone screen size is 640x1136 pixels (cut those numbers in half for the point size, 320x568).
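To make the fragment above self-contained, here is a minimal sketch (it assumes it runs inside a view controller; the subview and the 10- and 60-point insets are only illustrative, carried over from the snippet above):

//Size a subview from the screen bounds instead of hard-coding 320x480,
//so the same code fills both 3.5-inch and 4-inch screens.
CGRect bounds = [[UIScreen mainScreen] bounds];
UIView *contentView = [[UIView alloc] initWithFrame:CGRectMake(0, 0, bounds.size.width - 10, bounds.size.height - 60)];
[self.view addSubview:contentView];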
The more you know...
PS: someone with a higher reputation level, please add an iphone5 tag to this.
Source: (StackOverflow)
I'm developing an application for my final thesis in computer science, and I need to collect and log accelerometer data. I need to acquire it for a whole day, so there are serious battery constraints (for instance, I cannot leave the screen on). Also, this isn't a market-targeted application, so some serious hacking, even low-level C/C++ coding, is perfectly acceptable if required.
It is well known that on many devices the accelerometer listeners stop generating events when the screen goes off (some links regarding this problem: http://code.google.com/p/android/issues/detail?id=3708 , Accelerometer stops delivering samples when the screen is off on Droid/Nexus One even with a WakeLock). I have searched thoroughly for alternatives; some of them involve workarounds that do not work on my device (LG P990, stock ROM).
So what happens is this:
When you register an event listener for the Android accelerometer sensor in a Service, it works fine until the screen is turned off. I have already tried registering the event listener in a Service and in an IntentService, and tried acquiring WakeLocks. Regarding wake locks, I can verify from the LogCat output that the service is still running, but it seems the accelerometer is put into sleep mode. One of the workarounds presented in some of the links is to unregister and re-register the event listener periodically from the thread of an IntentService, as in the code snippet below:
synchronized private static PowerManager.WakeLock getLock(Context context) {
    if (lockStatic == null) {
        PowerManager mgr = (PowerManager) context.getSystemService(Context.POWER_SERVICE);
        lockStatic = mgr.newWakeLock(PowerManager.PARTIAL_WAKE_LOCK, NAME);
        lockStatic.setReferenceCounted(true);
    }
    return lockStatic;
}

@Override
protected void onHandleIntent(Intent intent) {
    sensorManager = (SensorManager) getSystemService(SENSOR_SERVICE);
    sensorManager.unregisterListener(this);
    sensorManager.registerListener(this, sensorManager.getDefaultSensor(Sensor.TYPE_ACCELEROMETER), SensorManager.SENSOR_DELAY_NORMAL);
    synchronized (this) {
        boolean run = true;
        while (run) {
            try {
                wait(1000);
                getLock(AccelerometerService.this).acquire();
                sensorManager = (SensorManager) getSystemService(SENSOR_SERVICE);
                sensorManager.unregisterListener(this);
                sensorManager.registerListener(this, sensorManager.getDefaultSensor(Sensor.TYPE_ACCELEROMETER), SensorManager.SENSOR_DELAY_NORMAL);
                Log.d("Accelerometer service", "tick!");
            } catch (Exception e) {
                run = false;
                Log.d("Accelerometer service", "interrupted; cause: " + e.getMessage());
            }
        }
    }
}

@Override
public void onSensorChanged(SensorEvent event) {
    Log.d("accelerometer event received", "xyz: " + event.values[0] + "," + event.values[1] + "," + event.values[2]);
}
which indeed makes onSensorChanged be called every time we unregister/register the listener. The problem is that the event received always contains the same values, regardless of how much I shake the device.
So, basically, my questions are (bear with me, I'm almost done :P):
is it possible to get low-level (C/C++) access to the accelerometer hardware WITHOUT registering an event listener?
is there any other workaround or hack?
could anyone with a more up-to-date phone kindly test whether the problem persists on Android 3.0 and above?
[UPDATE]
Unfortunately, it seems to be a bug with some cellphones. More details in my answer.
Source: (StackOverflow)
I'm really feeling confused. From the docs at developer.android.com, it seems that in order to keep my images scaled correctly (with the right aspect ratio, too) across all current Android devices, I need all of the layout directories below. Is that really what everyone is doing? Am I missing something, or should I be going about this a different way?
Low density Small screens QVGA 240x320
------------------------------------------------
layout-small-ldpi
layout-small-land-ldpi
Low density Normal screens WVGA400 240x400 (x432)
------------------------------------------------
layout-ldpi
layout-land-ldpi
Medium density Normal screens HVGA 320x480
------------------------------------------------
layout-mdpi
layout-land-mdpi
Medium density Large screens HVGA 320x480
------------------------------------------------
layout-large-mdpi
layout-large-land-mdpi
High density Normal screens WVGA800 480x800 (x854)
------------------------------------------------
layout-hdpi
layout-land-hdpi
Xoom (medium density large but 1280x800 res)
------------------------------------------------
layout-xlarge
layout-xlarge-land
Source: (StackOverflow)
Is there a way to programmatically find out whether the device the app is installed on is a 7-inch tablet or a 10-inch tablet?
Source: (StackOverflow)
I have a background that I need to fit all screen sizes. I have three drawable folders, hdpi, ldpi and mdpi, but in the emulator there isn't any reference to what resolution hdpi is and what mdpi and ldpi are.
Source: (StackOverflow)
As we know, Android runs on a wide variety of devices with different features, resolutions and screen sizes, so when developing an application that has to support multiple (small and large) screens, size and layout become a real obstacle.
This leads to many different combinations of screen sizes, resolutions and DPIs, and creates quite a challenge when designing and developing for Android devices. While some other (non-Android) manufacturers ship devices with different resolutions and DPIs, those devices share the same screen size and their resolutions follow the same aspect ratio, so a single image can be created to fit the non-Android devices.
My question is: is there a proper workflow or architecture that one should follow to meet this requirement?

Remember that we also have tablets of different sizes and resolutions.
I'm aware that the Android Developer documentation contains this information, but my question is about the implementation side.
From what I understand, even the programmer must know the design concepts in order to produce Android graphics.
Source: (StackOverflow)