Traffcams

Almost 5 years ago, I built an app that put every NC DOT traffic camera in your hand.  At that time, there was no such thing as looking up traffic conditions in Google Maps.  I was also in the early days of being a professional Android developer and used this app as a way to learn the ins and outs of building, marketing, and publishing an app on Google Play.

Today, the app is still used by a handful of users, but I noticed its usage is very seasonal.  People were using NC Traffic Cams (the name at the time) to check not only traffic congestion, but also road and travel conditions.  It did not surprise me to see spikes in usage around snow storms, tropical storms, or hurricanes.  Things have stalled in recent years, but I am revisiting this app in hopes of building a really cool app and platform for viewing images from traffic cameras.  The first step starts today.  I’m re-launching NC Traffic Cams as, simply, Traffcams.

Traffcams

Traffcams has been completely rebuilt and redesigned from the ground up.  It is built on modern technology and given a more modern design.  Traffcams contains all of the same features people came to love, plus a few more:

  • View cameras around you using your location
  • View cameras on a map
  • Favorite cameras for easy access
  • Search for traffic cameras by city, street, highway, or interstate name
  • Pin your searches for easy access


Traffcams is launching with traffic camera data for North Carolina, and I hope to add support for other states soon.  You can get Traffcams, for free, on Google Play today.
I have a roadmap of features to implement that are going to be really useful for commuters.  As with all side projects I work on, Traffcams will let me learn new patterns and technologies for mobile development and really dive into machine learning and deep learning.

Share the Cache

A lot of Android apps make viewing images a core part of their user experience.  Many of these apps use image caching libraries, like Glide, to make image caching easy, robust, and configurable.  Sharing these cached images can be a bit tricky.  A few questions arose when I tried:

  • How do I access files in the cache?
  • Do other apps have access to these files?
  • What are the needed permissions if I need to manually copy the file somewhere else?

Those are just a few of the questions that came up, each with a pretty simple answer.  My first pass at sharing images in my cache involved re-caching the image to a public directory (i.e. the root of the external storage device), generating a URI, and passing that URI to an Intent to share with other apps.  This sounds simple, but it is complicated by the fact that I needed to ask the user for permission to read and write external storage on their device if they were running Android 6.0 Marshmallow or above.  This flow worked, but I was looking for something much simpler for the user and for myself, the developer.  Enter the FileProvider.

The FileProvider API

The FileProvider API, part of the Android support-v4 library, is a special subclass of ContentProvider that allows URI-based sharing of files relevant to your application.  It can temporarily grant access (read and/or write) to the file at the URI, and you do not need to copy the file to a more accessible location on the user’s device.  Setting up the FileProvider is very straightforward.

Setting up a FileProvider

Setting up the FileProvider is a three-step process.

First, just as we would with a ContentProvider, we add a <provider> entry to our AndroidManifest.xml file.


<provider 
    android:name="android.support.v4.content.FileProvider" 
    android:authorities="com.yourdomain.android.fileprovider" 
    android:grantUriPermissions="true" 
    android:exported="false">

    <meta-data 
        android:name="android.support.FILE_PROVIDER_PATHS" 
        android:resource="@xml/filepaths" />
</provider>

This provider entry has a <meta-data> element that points to an XML file defining which paths, within our application’s file structure, we want to expose with our FileProvider.  This is important: you can only share files under the paths contained in this XML file.  I created an XML file in the “res/xml” folder.


<paths>
    <external-files-path name="image_files" path="." />
</paths>

In my particular instance, I am caching images to the external files directory (which can be located on the SD card or on a portion of internal memory simulating an SD card).  Other tags you can use here include <files-path>, <cache-path>, <external-path>, and <external-cache-path>, depending on where your files live.

The name attribute is a string that will be used as a URI path segment in the URI generated by the FileProvider.  The path attribute is the relative path to the folder containing the files you would like to share.  In my case, I’m fine with sharing the root because it only contains cached images, but you should be as granular as possible if you have multiple directories or file types in this directory.

Now that my FileProvider is set up and configured, how do I share files out of my cache?

Sharing Cached Files

The specifics of sharing cached image files are highly dependent on the image caching library used and how it’s configured.  At a high level, the process is:

  1. Cache an image (via your image caching library) to a location that falls within the paths specified in the XML configuration file given to the FileProvider.
  2. Get a reference to that cached image using the java.io.File API.
  3. Pass the File reference to FileProvider.getUriForFile() to get a URI you can pass to other apps.
  4. Pack that URI into an Android Intent to start sharing with other apps.

Very straightforward.  Now this is how I did it with Glide.

My first step was to specify a location for Glide to store cached images.  I needed to subclass DiskLruCacheFactory to make this happen (Note: you don’t need to subclass DiskLruCacheFactory if you want to use the Cache directory).


private static class DiskCacheFactory extends DiskLruCacheFactory {

    DiskCacheFactory(final Context context, final String diskCacheName, long diskCacheSize) {
        super(new CacheDirectoryGetter() {
            @Override
            public File getCacheDirectory() {
                // Cache to the app's external files directory instead of Glide's default cache directory
                return new File(context.getExternalFilesDir(null), diskCacheName);
            }
        }, (int) diskCacheSize);
    }
}

As you can see from the code snippet, I pass in a reference to the External Files directory.  By default, Glide uses the Cache directory.

I implement a GlideModule so that I can customize caching behavior in my application.  I specify my subclassed DiskLruCacheFactory class in my GlideModule.


public class MyGlideModule implements GlideModule {
    private static final int IMAGE_CACHE_SIZE = 200_000_000;

    @Override
    public void applyOptions(Context context, GlideBuilder builder) {
        builder.setDiskCache(new DiskCacheFactory(context, ".", IMAGE_CACHE_SIZE));
    }

    @Override
    public void registerComponents(Context context, Glide glide) {
        // ...
    }
}

Now, I tell Glide where it can find my GlideModule by adding a <meta-data> tag to the AndroidManifest.xml file, inside the <application> element.


<meta-data android:name="com.yourdomain.android.MyGlideModule" android:value="GlideModule"/>

Next, I manually cache a file with Glide.


// downloadOnly(...).get() blocks, so this must run on a background thread
File file = Glide.with(context)
    .load(uri) // uri to the location on the web where the image originates
    .downloadOnly(Target.SIZE_ORIGINAL, Target.SIZE_ORIGINAL)
    .get();

This next part is specific to Glide and worth mentioning.  When Glide caches a file to disk, the file name consists of a generated key with an integer appended to it, and there is no extension.  This is important because sharing an extension-less image makes it difficult for the apps you are sharing with to determine how the file should be handled (despite the MIME type being sent along with the image in the Intent).

Contents of a Glide image cache folder


I worked around this issue by simply copying that file, appending an extension to the duplicate in the process.  Since I know I am always handling JPEGs, I give these files the “.jpg” extension.  This may not work in cases where you are dealing with different types of images like GIFs, PNGs, WebP, etc.
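
Here is a minimal sketch of that copy step (the helper and class names are mine, not Traffcams code).  The important detail is that the copy lands in the same directory the FileProvider already exposes:

import java.io.File;
import java.io.FileInputStream;
import java.io.FileOutputStream;
import java.io.IOException;

// Hypothetical helper: duplicates Glide's extension-less cache file as a sibling ending in ".jpg".
final class SharedImageCopier {

    static File copyWithJpgExtension(File cachedFile) throws IOException {
        // The copy stays in the same folder, so it is still covered by the FileProvider's <paths> entry
        File withExtension = new File(cachedFile.getParentFile(), cachedFile.getName() + ".jpg");
        try (FileInputStream in = new FileInputStream(cachedFile);
             FileOutputStream out = new FileOutputStream(withExtension)) {
            byte[] buffer = new byte[8 * 1024];
            int bytesRead;
            while ((bytesRead = in.read(buffer)) != -1) {
                out.write(buffer, 0, bytesRead);
            }
        }
        return withExtension; // hand this to FileProvider.getUriForFile(...)
    }

    private SharedImageCopier() {}
}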

Finally, I can get a URI from the FileProvider for my cached image by calling FileProvider.getUriForFile().

// be sure to use the authority given to your FileProvider in the AndroidManifest.xml file
String authority = "com.yourdomain.android.fileprovider";
Uri uri = FileProvider.getUriForFile(context, authority, file);
Intent intent = new Intent(Intent.ACTION_SEND);
intent.putExtra(Intent.EXTRA_STREAM, uri);
intent.setType("image/jpeg");
// grant the receiving app temporary permission to read the file at this URI
intent.addFlags(Intent.FLAG_GRANT_READ_URI_PERMISSION);
context.startActivity(Intent.createChooser(intent, "Share via"));

The Uri generated by the FileProvider looks like:

content://com.yourdomain.android.fileprovider/image_files/bb65e6a364264255b4833d34e____some_key.0.jpg

Once the Intent makes its way to the target app, things should look just like you are sharing a picture you’ve just taken.

Sharing an image from Traffcams to Gmail

#Dassit

Diving into Android Things

I’ve tinkered with electronics since my teens.  I went to school and graduated with a Computer Engineering degree with a focus on hardware (embedded systems, ASIC design, etc.).  I somehow ended up in software after graduation and am now an Android developer at RadioPublic.  When Google announced Android Things in late 2016, I was beyond excited because it gave me a reason to break out my old breadboard, resistors, LEDs, and power regulators.  It also gave me a reason to buy a Raspberry Pi.  With Android Things, I’m finally able to leverage my expertise in Android development in a more embedded paradigm.

I’m not going to cover a ton of Android Things fundamentals here because a lot of other really good developers have already done a great job at that.

I’m going to share a project I started on Friday, February 10th and finished prototyping on Monday, February 13th.  When exploring something new, it’s important for me to find a practical application for it.  I own a house built in the early ’90s, and it could use some home-built tech from 2017.  A superficial problem I and other members of my household have is parking correctly (and in alignment) in the garage.  Either we park too close to the wall and can’t walk around both sides of the car, or we aren’t sure if the car will get caught in the garage door.

Hello CantParkRight

My first Android Things project is an assistive parking device that uses several sensors to help drivers park correctly in the garage.  Think of the signals you see when you enter a carwash.  Normally, there are two or three lights.  When you first enter, the light is green, which instructs you to keep driving forward.  When you have driven far enough, the light turns red to alert you to stop.  I want this in my garage.

Image from Signal Tech


The first step of this is prototyping CantParkRight.

Prototyping the Hardware

A huge advantage that Raspberry Pi-like devices provide is the ability to quickly and cheaply prototype assistive devices like the one I’m building.  The fact that I can officially leverage Android APIs (and down the road, Google APIs) is a big plus.

The supplies I used for my prototype include:

  • Raspberry Pi Model 3 running Android Things Preview 1
  • HC-SR04 Ultrasonic proximity sensor
  • 2 resistors, 10KΩ and 20KΩ
  • 3 LEDs (Red, Yellow, Green)
  • A breadboard
  • Assortment of jumper wire

I had most of my supplies already.  I bought a Raspberry Pi some time ago and recently bought a pack of 5 HC-SR04 ultrasonic sensors from Amazon.  I settled on the HC-SR04 after quite a bit of research.  Here’s how the HC-SR04 works: you send a 10µs (microsecond) pulse to the TRIGGER pin.  Sometime in the next few milliseconds, the HC-SR04 sends a burst of eight 40kHz sound waves.  If an object is in range, the signal bounces back and is detected by the receiver portion of the sensor.  The HC-SR04 then outputs a variable-length pulse on the ECHO pin to any attached device.  The length of this pulse is determined by the distance the signal traveled before returning to the sensor.  The HC-SR04 has a range of around 400cm (~13 feet).  Perfect.  Note: check out the datasheet on the HC-SR04 here.
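
To turn that pulse width into a distance, you divide by two (the sound travels out and back) and multiply by the speed of sound.  A quick sketch of the math (the class and method names here are just illustrative, not from the project):

// Speed of sound is roughly 343 m/s at room temperature, i.e. about 0.0343 cm per microsecond.
public class EchoMath {

    private static final double CM_PER_MICROSECOND = 0.0343;

    // Converts an ECHO pulse width, in microseconds, to a distance in centimeters.
    static double pulseWidthToCentimeters(double pulseWidthMicros) {
        // Halve it because the pulse width covers the round trip to the object and back
        return (pulseWidthMicros * CM_PER_MICROSECOND) / 2.0;
    }

    public static void main(String[] args) {
        // A ~1,160 µs echo works out to roughly 20 cm
        System.out.println(pulseWidthToCentimeters(1_160));
    }
}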

After a lot of experimentation, here is how my circuit is arranged on my breadboard.

CantParkRight prototype schematic

 

A few hardware gotchas:

  • The accuracy varies greatly between sensors, especially the “knockoffs”.  Out of the pack of 5, some sensors were more sensitive to object movements while others exhibited less variation. 
  • The signal output on the ECHO pin is at 5V.  The GPIO ports on the Raspberry Pi are rated for 3.3V.  You can damage them by sending too high a voltage, so I use the resistors as a voltage divider to step the signal down to 3.3V (a quick check of the divider math follows below).
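
A back-of-the-envelope check of that divider, assuming the ECHO output feeds the 10KΩ resistor and the 20KΩ resistor runs from the GPIO tap to ground (that wiring is my assumption from the parts list, not pulled from the schematic):

// Quick sanity check of the assumed voltage divider values.
public class DividerCheck {

    public static void main(String[] args) {
        double vEcho = 5.0;        // volts coming out of the HC-SR04 ECHO pin
        double r1 = 10_000.0;      // ohms, between ECHO and the GPIO tap
        double r2 = 20_000.0;      // ohms, between the GPIO tap and ground
        double vGpio = vEcho * r2 / (r1 + r2);
        System.out.printf("GPIO sees about %.2f V%n", vGpio); // ≈ 3.33 V, close to the Pi's 3.3 V logic level
    }
}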

Prototyping the Software

The best part of this project was writing the software in Android Studio, deploying it via ADB (over WiFi), and seeing the results play out in front of my eyes.  I based the implementation on:

Over the course of the article, Daniel builds several implementations, some synchronous and some asynchronous, using while loops, callbacks, and threads.  I decided I wanted to build upon that, but use RxJava to implement asynchronous handling of sensor data.  I’ve used RxJava in most of the Android apps that I’ve built.  It offers quick and convenient ways to build, reuse, and arrange pieces of logic around the flow of data from one end to the other, which is basically perfect for CantParkRight.

Disclaimer: I am NOT an RxJava expert.  There are likely ways to do what I did using RxJava in a more efficient manner.

The critical piece is how I go about initiating the TRIGGER and waiting for an ECHO.  My first implementation used an RxJava Observable that essentially wraps a few while loops (check out my repository, then go to the first commit).

The process was:

  • Send the 10µs pulse to the TRIGGER
  • Start a while loop that runs until the ECHO goes high, then record the start time
  • Start a while loop that runs until the ECHO goes low, then record the end time and calculate the pulse width, which is used to calculate the distance

It worked, sometimes, but often, for reasons I’m still researching, the sensor would stop responding (i.e. the ECHO never went high after a TRIGGER).  The improvement came when I used a GpioCallback.  A GpioCallback allows you to listen for edge triggers (signal going high, signal going low, etc.) asynchronously.  I combined my implementation of a GpioCallback with an RxJava Observable (more specifically, an Emitter).  From what I’ve read, the advantage of the Emitter over a plain Observable (using Observable.create) is that it forces you to specify a backpressure strategy, which is important when reading values pushed from a sensor.  CantParkRight uses the BUFFER BackpressureMode.  Using RxJava allows me to start the distance detection process simply by subscribing to the correct Observable.  Using an Emitter also allows me to write code to unregister my GpioCallback when I unsubscribe in onDestroy(…).  This prevents future memory leaks.
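
Here’s a rough sketch of that pattern, assuming RxJava 1.x and the Android Things Preview peripheral APIs.  The pin name and class names are placeholders, and this is not CantParkRight’s exact implementation:

import android.os.SystemClock;

import com.google.android.things.pio.Gpio;
import com.google.android.things.pio.GpioCallback;
import com.google.android.things.pio.PeripheralManagerService;

import java.io.IOException;

import rx.Emitter;
import rx.Observable;
import rx.functions.Action1;
import rx.functions.Cancellable;

// Emits a timestamp (in nanoseconds) for every edge seen on the ECHO pin.
public class EchoEdgeSource {

    public static Observable<Long> edgeTimestamps(final String echoPinName) {
        return Observable.create(new Action1<Emitter<Long>>() {
            @Override
            public void call(final Emitter<Long> emitter) {
                try {
                    final Gpio echo = new PeripheralManagerService().openGpio(echoPinName);
                    echo.setDirection(Gpio.DIRECTION_IN);
                    echo.setEdgeTriggerType(Gpio.EDGE_BOTH); // fire on both rising and falling edges

                    final GpioCallback callback = new GpioCallback() {
                        @Override
                        public boolean onGpioEdge(Gpio gpio) {
                            emitter.onNext(SystemClock.elapsedRealtimeNanos());
                            return true; // keep the callback registered
                        }
                    };
                    echo.registerGpioCallback(callback);

                    // Clean up when the subscriber unsubscribes (e.g. in onDestroy())
                    emitter.setCancellation(new Cancellable() {
                        @Override
                        public void cancel() throws Exception {
                            echo.unregisterGpioCallback(callback);
                            echo.close();
                        }
                    });
                } catch (IOException e) {
                    emitter.onError(e);
                }
            }
        }, Emitter.BackpressureMode.BUFFER);
    }
}

Subscribing kicks off the stream of edge events; pairing each rising timestamp with the next falling one gives the pulse width to feed into the distance math above.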

What’s Next

For CantParkRight, I’m working up to building an actual device I can easily mount in my garage.  With the prototype complete, I turn my efforts to making that happen.  

In the meantime, you can check out the source code for CantParkRight on GitHub.  Be sure to follow me on Twitter or (cough) Google+ for updates on CantParkRight.  I intend to post the finished project here in the coming months, but watching the repository is a great way to keep up.

Audio Playback on Android

Playing music, podcasts, or other audio is one of the most common activities for smartphones in 2016. Most of the time, audio plays in the background while we are driving, cleaning, working out, or cooking. Architecting your application to support background audio playback is standard fare whether you are incorporating the standard Android MediaPlayer API or using a library, like ExoPlayer.

I want to briefly walk through how I architected PremoFM, an open source podcast player, to play audio in the background using ExoPlayer. It’s not perfect, but it’s a good starting point for the transition to ExoPlayer 2. If you want to learn more about ExoPlayer 2, check out my previous post, Exploring ExoPlayer 2.

The Architecture

In order to play audio in the background (or do anything in the background), the process that manages playback should be based on the Service class.  Services, on Android, allow background work to be done without needing a user interface in the foreground.  Naturally, I based the background audio playback of PremoFM on a Service, the PodcastPlayerService.  It is obviously doing a lot: it manages audio playback, updates the database, listens for events like a headphone disconnection, and manages the persistent notification.  Initially, most of the code directly managing the ExoPlayer was also embedded in this service.  This led to a bloated class and a highly coupled design.  I re-architected things when I added Google Cast support by creating a generic MediaPlayer abstract class.

The abstract class provided a common set of methods for interacting with the underlying player: playing a media file, fast forward and rewind, getting playback state information, and changing the playback speed.  All I needed to do was extend my MediaPlayer abstract class using ExoPlayer.  This resulted in all of my ExoPlayer code living in one class, LocalMediaPlayer.
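
The shape of that abstraction looks roughly like this (method and constant names here are illustrative, not necessarily PremoFM’s exact API):

import android.net.Uri;

// Illustrative sketch of a player-agnostic abstraction; ExoPlayer specifics live in a subclass.
public abstract class MediaPlayer {

    // Playback states a UI or notification might care about
    public static final int STATE_IDLE = 0;
    public static final int STATE_BUFFERING = 1;
    public static final int STATE_PLAYING = 2;
    public static final int STATE_PAUSED = 3;

    public abstract void play(Uri mediaUri);        // play a media file
    public abstract void pause();
    public abstract void stop();
    public abstract void fastForward(long millis);  // seek forward
    public abstract void rewind(long millis);       // seek backward
    public abstract int getState();                 // playback state information
    public abstract long getCurrentPosition();
    public abstract long getDuration();
    public abstract void setPlaybackSpeed(float speed);
}

// All of the ExoPlayer code then lives in one subclass:
// public class LocalMediaPlayer extends MediaPlayer { ... }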

This will make my upgrade to ExoPlayer 2 significantly easier than if I had continued the previous architecture. All of the code that needs to change exists in one place. In my next article I will get into the nitty gritty of my migration.

Feel free to take a swing at it before I do. Check out the source code for PremoFM from GitHub and hack away.

Follow me on Twitter or visit my website for more Android Development related articles like this.

OMG Kotlin

I finally got around to watching Jake Wharton’s talk on how developers can use Kotlin to build their Android apps.  It’s a must watch.  For Android developers.

My initial thoughts on Kotlin were:

  1. Hmmm, reminds me of Swift (of Apple fame).
  2. Oh my goodness, take my money.
  3. Oh, oooh, ooooooooooooh.
  4. Nice.
  5. Kotlin is compiled to Java bytecode.  You still need reasonably deep knowledge of the compiler to avoid creating unnecessary objects or having inner classes hold onto references to the outer class.  Fail…
  6. …but, you need to have reasonably deep knowledge of the Java compiler anyway, even if you are writing just Java code (kind of invalidating point 5).
  7. This is amazing, this is the future.

One thing I have begun to realize as I learn Swift, and now that I have seen Kotlin: Java is VERY verbose, with tons of ceremony.  Swift, Kotlin, and other similar languages are doing a good job of stripping all of that away and making coding genuinely fun and clean.