
Using your broken-screen Android phone

    Yesterday I accidentally dropped my phone and the screen cracked, leaving the phone unusable (as you can see from the photo). The phone itself is still functional, but the display and touch screen don't work at all. This leaves me phone-less while I wait for my new phone to arrive in a few days. But how can I function without a phone!? For programming I can use the emulator on my computer, but it would be nice if I could at least text until my new phone gets here. Well, I can!

    Luckily my phone was already in developer mode, since I often use it to test my applications. I also have adb (Android Debug Bridge) installed, which comes with the Android SDK and lets you deploy apps and control your phone from your computer. From there I referenced this blog post, which led me to an application, Android Control Screen, that mirrors your Android phone's screen on your computer. It's still hard to navigate the phone, since the application uses a virtual keyboard and a gesture-detector section for that purpose. Fortunately I already had the Pushbullet app installed, which displays your phone's notifications on your computer and even lets you send text messages from your computer through your phone. So, using the Android Control Screen app, I navigated to and logged into the Pushbullet app, and now I can text while I'm at my computer! That should hopefully hold me over until my new phone arrives.
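    For the curious, here is a rough sketch of the kind of blind navigation adb makes possible even without a mirroring app. This assumes USB debugging was enabled before the screen broke (as it was in my case); the tap coordinates are placeholders you would adjust for your own screen resolution.

```shell
# Confirm the phone is visible to adb over USB
adb devices

# Grab a screenshot of the (dead) display so you can see where to tap
adb exec-out screencap -p > screen.png

# Simulate a tap at screen coordinates x=540, y=960
adb shell input tap 540 960

# Type text into whatever field currently has focus
# (%s stands in for a space in the input text command)
adb shell input text "On%smy%sway"

# Press Enter (keycode 66) to send it
adb shell input keyevent 66
```

    Screen-mirroring tools essentially automate this screenshot-then-inject-input loop for you, which is why they work even when the touch digitizer is completely dead.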

