TrafficFlow – Classifying Data

As a follow-up to my initial TrafficFlow post, I have built some more software to help me classify the dataset I collected over the past few weeks.

TrafficFlow is a project where I develop an algorithm that can “look” at a still image pulled from a traffic camera and determine whether or not traffic is congested.  I am using the deep learning framework TensorFlow to build the model that will house this algorithm.

Over the past few weeks, I have collected 4,966 still images from an NCDOT traffic camera.  I wrote a Python script that takes a snapshot, and I cron’d it to run every 4 minutes.  Now that I have all of this data, how can I efficiently classify it?  A few ideas came to mind:

  • Python script that loaded the image in a picture viewer and presented a question in the terminal.  This worked, but the picture viewer grabbed focus, and I couldn’t close it automatically.  I determined that the extra interaction involved would make classifying the data this way inefficient.  It also limited me to classifying data on my MacBook Pro only.
  • AngularJS web app that allowed me to classify images in a desktop web browser.  This was interesting, but I didn’t know a ton of Angular, and it still limited me to classifying data on my MacBook Pro only.

I’m an Android developer by day (check out RadioPublic 😉 ).  I figured I’d just build an Android app that would allow me to classify the data, so I did.  But first, I needed to gather the collected data into a format that is easily usable in the app.  So I wrote a Python script:

This script simply reads a list of files from a directory, creates an entry in a dictionary (saving some additional metadata in the process), and exports that object to JSON.

A snippet of the exported data looks roughly like the following (the field names here are illustrative, not necessarily the exact ones I used):
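
{
  "1490702640": {
    "file_name": "1490702640.jpg",
    "timestamp": 1490702640,
    "classification": -1
  }
}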

Next, I uploaded this JSON file to Firebase.  Firebase is a backend-as-a-service that allows app developers to quickly build apps without needing to spin up servers or turn into “devops”.  Best of all, it’s free to get started and use.

Finally, I uploaded 4,966 images to my web server so that my app can access them.

Now on to the app.  It’s nothing special, and it’s particularly ugly, but it works.

It allows me to quickly classify an image as congested (1), not congested (0), or ditch/don’t include (-1).  Once I classify an image, it saves the result (and my progress) to Firebase, then automatically loads the next one.  It turns this exercise into a single-tap adventure; well, a 4,966-part series-of-single-taps adventure.
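
Saving a classification with the Firebase Realtime Database SDK is essentially a one-liner.  Here’s a minimal sketch, assuming the exported entries live under an “images” node keyed as in the JSON above (the node and field names are my assumptions, not necessarily the Classify app’s actual schema):

// sketch: persist the classification for the current image to Firebase
DatabaseReference imageRef = FirebaseDatabase.getInstance()
        .getReference("images")
        .child(imageKey); // e.g. "1490702640", the key from the exported JSON
imageRef.child("classification").setValue(1); // 1 = congested, 0 = not congested, -1 = ditch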

I’ve uploaded the Python script and Classify Android app to GitHub (https://github.com/emuneee/trafficflow).  I hope to make my dataset available soon as well.

Now on to classification.

New Android Stuff Part 1 😍

I’m barely into the things that were released or announced at Google I/O 2017, and I’ve already got a list of stuff that I need to watch and review.  It’s a lot, and it’s only day 1!

What’s New In Android

After watching the Google I/O Keynote, this is normally the video I watch next.

Kotlin is Officially Supported for Android Development

I’ve been holding off on doing anything major in Kotlin until it was blessed with official support from the Android team.  Well, I’m out of excuses.  Kotlin is an officially supported language for Android development.  Its necessary dependencies and plugins are being integrated into Android Studio, beginning with version 3.0.

Kotlin and Android | Android Developers

New Android Studio Profilers

There are a ton of redesigned profilers for CPU, memory, and network operations in Android Studio 3.0.  I’ll let the pictures do the talking (all taken from Android Developers).

I’m especially pumped about the network profiler!

Android Studio 3.0 Canary 1 | Android Developers

Android O Beta

The next version of the Android O Beta was released today.  If you have a Nexus 5X, 6P, Pixel, Pixel XL, Nexus Player, or Pixel C, you can enroll your device at android.com/beta.  I’ve been using it for a few hours.  The only issues I’ve seen are that Android Pay doesn’t work (it politely lets you know with a splash screen) and that the Google Play Music playback notification re-appears from time to time.

Android O Developer Preview | Android Developers

Android Architecture Components

The Android team has started putting together new tools and guidelines to help Android developers properly architect their apps to prevent memory leaks, make lifecycle management easier (!), and reduce boilerplate code.

There’s also a new SQLite object mapper from the Android team, called Room.
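
From what was shown in the talk, Room usage looks roughly like this (the library was announced in alpha, so treat this as a sketch rather than final API):

@Entity
public class Episode {
    @PrimaryKey
    public int id;
    public String title;
}

@Dao
public interface EpisodeDao {
    // Room verifies the SQL in @Query annotations at compile time
    @Query("SELECT * FROM episode")
    List<Episode> getAll();

    @Insert
    void insert(Episode episode);
}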

Screenshot from the Architecture Components Talk


Android Architecture Components | Android Developers

These are just a few of the things that immediately stood out to me as an Android developer.  I’m looking forward to doing a deeper dive into all of it.

Share the Cache

A lot of Android apps make viewing images a core part of their user experience.  Many of these apps use image caching libraries, like Glide, to make image caching easy, robust, and configurable.  Sharing these cached images can be a bit tricky.  A few questions arose when I tried:

  • How do I access files in the cache?
  • Do other apps have access to these files?
  • What are the needed permissions if I need to manually copy the file somewhere else?

Those are just a few of the questions that came up, each with a pretty simple answer.  A first pass at sharing images in my cache involved re-caching the image to a public directory (i.e. the root of the external storage device), generating a URI, and passing that URI to an Intent to share with other apps.  This sounds simple, but it is complicated by the fact that I needed to ask the user for permission to read and write to the external storage on their device if they were running Android 6.0 Marshmallow or above.  This flow worked, but I was looking for something much simpler for the user and for myself, the developer.  Enter the FileProvider.

The FileProvider API

The FileProvider API, added to the Android Support Library in version 22.0.0, is a ContentProvider-like API that allows URI-specific sharing of files relevant to your application.  It can temporarily grant access (read and/or write) to the file at the URI, and you do not need to copy the file to a more accessible location on the user’s device.  Setting up the FileProvider is very straightforward.

Setting up a File Provider

Setting up the FileProvider is a three-step process.

First, as we would with a ContentProvider, we add a <provider> entry to our AndroidManifest.xml file.


<provider 
    android:name="android.support.v4.content.FileProvider" 
    android:authorities="com.yourdomain.android.fileprovider" 
    android:grantUriPermissions="true" 
    android:exported="false">

    <meta-data 
        android:name="android.support.FILE_PROVIDER_PATHS" 
        android:resource="@xml/filepaths" />
</provider>

This provider entry has a <meta-data> element that points to an XML file defining which paths within our application’s file structure we want to expose with our FileProvider.  This is important: you can only share files in the paths contained in this XML file.  I created an XML file in the “res/xml” folder.


<paths>
    <external-files-path name="image_files" path="." />
</paths>

In my particular instance, I am caching images to the External Files directory (which can be located on the SD card or on a portion of internal memory simulating an SD card).  Other tags you can use here include <files-path> (the internal files directory), <cache-path> (the internal cache directory), <external-path> (the root of external storage), and <external-cache-path> (the external cache directory).

The name attribute is a string that will be used as a URI path segment in the URI generated by FileProvider.  The path attribute is the relative path to the folder containing the files you would like to share.  In my case, I’m fine with sharing the root because it only contains cached images, but you should be as granular as possible if you have multiple directories or file types in this directory.

Now that my FileProvider is set up and configured, how do I share files out of my cache?

Sharing Cached Files

The specifics of sharing cached image files are highly dependent on the image caching library used and how it’s configured.  At a high level, the process is:

  1. Cache an image (done by an image caching library) to a location that falls within the paths specified in the XML configuration file given to the FileProvider.
  2. Get a reference to that cached image using the java.io.File API.
  3. Pass the File reference to FileProvider.getUriForFile() to get a URI you can pass to other apps.
  4. Pack that URI into an Android Intent to start sharing with other apps.

Very straightforward.  Now this is how I did it with Glide.

My first step was to specify a location for Glide to store cached images.  I needed to subclass DiskLruCacheFactory to make this happen (Note: you don’t need to subclass DiskLruCacheFactory if you want to use the Cache directory).


private static class DiskCacheFactory extends DiskLruCacheFactory {

    DiskCacheFactory(final Context context, final String diskCacheName, long diskCacheSize) {
        super(new CacheDirectoryGetter() {
            @Override
            public File getCacheDirectory() {
                // cache images in the External Files directory instead of Glide's default Cache directory
                return new File(context.getExternalFilesDir(null), diskCacheName);
            }
        }, (int) diskCacheSize);
    }
}

As you can see from the code snippet, I pass in a reference to the External Files directory.  By default, Glide uses the Cache directory.

I implement a GlideModule so that I can customize caching behavior in my application.  I specify my subclassed DiskLruCacheFactory class in my GlideModule.


public class MyGlideModule implements GlideModule {
    private static final int IMAGE_CACHE_SIZE = 200_000_000;

    @Override
    public void applyOptions(Context context, GlideBuilder builder) {
        builder.setDiskCache(new DiskCacheFactory(context, ".", IMAGE_CACHE_SIZE));
    }

    @Override
    public void registerComponents(Context context, Glide glide) {
        // ...
    }
}

Now, I instruct Glide where it can find my GlideModule by adding a <meta-data> tag to the AndroidManifest.xml file, inside the <application> element.


<meta-data android:name="com.yourdomain.android.MyGlideModule" android:value="GlideModule"/>

Next, I manually cache a file with Glide.


// downloadOnly() returns a FutureTarget and get() blocks until the
// download completes, so this needs to run on a background thread
File file = Glide.with(context)
    .load(uri) // uri to the location on the web where the image originates
    .downloadOnly(Target.SIZE_ORIGINAL, Target.SIZE_ORIGINAL)
    .get();

This next part is specific to Glide and worth mentioning.  When Glide caches a file to disk, the file name is composed of a generated key with an integer appended to it.  There is no extension.  This is important because sharing an extension-less image will make it difficult for the apps you are sharing with to determine how the file should be handled (despite the MIME type being sent along with the image in the Intent).

Contents of a Glide image cache folder


I worked around this issue by simply copying that file, appending an extension to the duplicate in the process.  Since I know I am always handling JPEGs, I give these files the “.jpg” extension.  This may not work in cases where you are dealing with different types of images, like GIFs, PNGs, WebP, etc.
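
A minimal sketch of that copy step (the helper name is mine, and this assumes you’re already on a background thread):

// copy the extension-less Glide cache file to a sibling file with a ".jpg"
// extension; the duplicate stays inside the directory exposed by the FileProvider
private static File copyWithJpgExtension(File cachedFile) throws IOException {
    File shareable = new File(cachedFile.getParentFile(), cachedFile.getName() + ".jpg");
    try (InputStream in = new FileInputStream(cachedFile);
         OutputStream out = new FileOutputStream(shareable)) {
        byte[] buffer = new byte[8192];
        int bytesRead;
        while ((bytesRead = in.read(buffer)) != -1) {
            out.write(buffer, 0, bytesRead);
        }
    }
    return shareable;
}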

Finally, I can get a URI from the FileProvider for my cached image by calling FileProvider.getUriForFile().

// be sure to use the authority given to your FileProvider in the AndroidManifest.xml file
String authority = "com.yourdomain.android.fileprovider";
Uri uri = FileProvider.getUriForFile(context, authority, file);
Intent intent = new Intent(Intent.ACTION_SEND);
intent.putExtra(Intent.EXTRA_STREAM, uri);
intent.setType("image/jpeg");
// grant the receiving app temporary read access to the shared URI
intent.addFlags(Intent.FLAG_GRANT_READ_URI_PERMISSION);
context.startActivity(Intent.createChooser(intent, "Share via"));

The Uri generated by the FileProvider looks like:

content://com.yourdomain.android.fileprovider/external_files/bb65e6a364264255b4833d34e____some_key.0.jpg

Once the Intent makes its way to the target app, things should look just like you are sharing a picture you’ve just taken.

Sharing an image from Traffcams to Gmail

#Dassit

Diving into Android Things

I’ve tinkered with electronics since my teens.  I went to school and graduated with a Computer Engineering degree with a focus on hardware (embedded systems, ASIC design, etc.).  I somehow got into software after graduation and am now an Android developer at RadioPublic.  When Google announced Android Things in late 2016, I was beyond excited because it gave me a reason to break out my old breadboard, resistors, LEDs, and power regulators.  It also gave me a reason to buy a Raspberry Pi.  With Android Things, I’m finally able to leverage my expertise in Android development in a more embedded paradigm.

I’m not going to cover a ton of Android Things fundamentals here, because a lot of other really good developers have already done a great job at that.

I’m going to share a project I began working on on Friday, February 10th and finished prototyping on Monday, February 13th.  When exploring something new, it’s important for me to find a practical application for it.  I’m the owner of a house built in the early 90s, and it can use some home-built tech from 2017.  A superficial problem I and other members of my household have is parking (correctly and in alignment) in the garage.  Either we park too close to the wall and can’t walk around both sides of the car, or we aren’t sure if the car will get caught in the garage door.

Hello CantParkRight

My first Android Things project is to build an assistive parking device that uses several sensors to help drivers park correctly in the garage.  Think of the signals you see when you enter a carwash.  Normally, there are two or three lights.  When you first enter, the light is green, which instructs you to keep driving forward.  When you have driven far enough, the light turns red to alert you to stop.  I want this in my garage.

Image from Signal Tech


The first step of this is prototyping CantParkRight.

Prototyping the Hardware

A huge advantage that Raspberry Pi-like devices provide is the ability to quickly and cheaply prototype assistive devices like the one I’m building.  The fact that I can officially leverage Android APIs (and down the road, Google APIs) is a big plus.

The supplies I used for my prototype include:

  • Raspberry Pi Model 3 running Android Things Preview 1
  • HC-SR04 Ultrasonic proximity sensor
  • 2 resistors, 10KΩ and 20KΩ
  • 3 LEDs (Red, Yellow, Green)
  • A breadboard
  • Assortment of jumper wire

I had most of my supplies already.  I bought a Raspberry Pi some time ago and recently bought a pack of 5 HC-SR04 ultrasonic sensors from Amazon.  I settled on the HC-SR04 after quite a bit of research.  Here is how the HC-SR04 works: you send a 10µs (microsecond) pulse to the TRIGGER pin.  Sometime in the next few milliseconds, the HC-SR04 sends a burst of 8 40kHz sound waves.  If an object is in range, the signal bounces back and is detected by the receiver portion of the sensor.  The HC-SR04 then sends a variable-length pulse to any device attached to the ECHO pin.  The length of this pulse is determined by the distance the signal traveled before returning to the sensor.  The HC-SR04 has a range of around 400cm (~13 feet).  Perfect.  Note: check out the datasheet on the HC-SR04 here.
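
Converting that ECHO pulse width into a distance is simple arithmetic: sound travels at roughly 343m/s (0.0343cm/µs) at room temperature, and the pulse covers a round trip out to the object and back, so the measured time is halved.

// convert the ECHO pulse width (µs) into a one-way distance (cm)
// speed of sound ≈ 0.0343 cm/µs; divide by 2 for the round trip
static double pulseWidthToCentimeters(long pulseWidthMicros) {
    return (pulseWidthMicros * 0.0343) / 2.0;
}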

After a lot of experimentation, here is how my circuit is arranged on my breadboard.

CantParkRight prototype schematic


A few hardware gotchas:

  • The accuracy varies greatly between sensors, especially the “knockoffs”.  Out of the pack of 5, some sensors were more sensitive to object movements, while others exhibited less variation.
  • The signal sent to the ECHO pin is at 5V, but the GPIO ports on the Raspberry Pi are rated for 3.3V.  You can damage them by sending too high a voltage, so I use the two resistors as a voltage divider to step the 5V down to 3.3V.

Prototyping the Software

The best part of this project was writing the software in Android Studio, deploying it via ADB (over WiFi), and seeing the results play out in front of my eyes.  I based the implementation on an article by Daniel.

Over the course of the article, Daniel builds several implementations, some synchronous and some asynchronous, using while loops, callbacks, and threads.  I decided I wanted to build upon that, but use RxJava to implement asynchronous handling of the sensor data.  I’ve used RxJava in most of the Android apps that I’ve built.  It offers quick and convenient ways to build, reuse, and arrange pieces of logic that leverage the flow of data from one end to the other, which is basically perfect for CantParkRight.

Disclaimer: I am NOT an RxJava expert.  There are likely ways to do what I did using RxJava in a more efficient manner.

The critical piece is how I go about initiating the TRIGGER and waiting for an ECHO.  My first implementation used an RxJava Observable that essentially wraps a few while loops (check out my repository, then go to the first commit).

The process was (a rough code sketch follows the list):

  • Send the 10µs pulse to the TRIGGER
  • Start a while loop that runs until the ECHO goes high, and record the start time
  • Start a while loop that runs until the ECHO goes low, then record the end time and calculate the pulse width, which is used to calculate the distance
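
Here’s a rough sketch of that polling approach (the pin names are my assumptions, IOException handling is omitted, and this is not the exact code from that first commit):

// hold TRIGGER high for ~10µs; Thread.sleep() is far too coarse here,
// so a busy-wait on System.nanoTime() is used instead
triggerGpio.setValue(true);
long pulseStart = System.nanoTime();
while (System.nanoTime() - pulseStart < 10_000) { }
triggerGpio.setValue(false);

// poll until the ECHO goes high, then time how long it stays high
while (!echoGpio.getValue()) { }
long echoStartMicros = System.nanoTime() / 1_000;
while (echoGpio.getValue()) { }
long pulseWidthMicros = (System.nanoTime() / 1_000) - echoStartMicros;

double distanceCm = pulseWidthToCentimeters(pulseWidthMicros);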

It worked, sometimes, but often, for reasons I’m still researching, the sensor would stop responding (i.e. the ECHO never went high after a TRIGGER).  The improvement came when I used a GpioCallback.  A GpioCallback allows you to listen to edge triggers (signal going high, signal going low, etc.) asynchronously.  I combined my implementation of a GpioCallback with an RxJava Observable (more specifically, an Emitter).  From what I’ve read, the advantage of the Emitter over a plain Observable (using Observable.create) is that it forces you to specify a backpressure strategy, which is important when reading values pushed from a sensor.  CantParkRight uses the BUFFER BackpressureMode.  Using RxJava allows me to start the distance detection process simply by subscribing to the correct Observable.  Using an Emitter also allows me to write code to unregister my GpioCallback when I unsubscribe in onDestroy(…).  This prevents memory leaks.
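
Here’s a hedged sketch of that combination (RxJava 1.x and the Android Things GPIO API; it assumes echoGpio is already configured with Gpio.EDGE_BOTH and reuses the conversion helper from earlier; it is not the exact CantParkRight code):

Observable<Double> distances = Observable.create(emitter -> {
    GpioCallback echoCallback = new GpioCallback() {
        private long echoStartMicros;

        @Override
        public boolean onGpioEdge(Gpio gpio) {
            try {
                if (gpio.getValue()) {
                    // rising edge: the echo pulse just started
                    echoStartMicros = System.nanoTime() / 1_000;
                } else {
                    // falling edge: measure the pulse width and emit a distance
                    long widthMicros = (System.nanoTime() / 1_000) - echoStartMicros;
                    emitter.onNext(pulseWidthToCentimeters(widthMicros));
                }
            } catch (IOException e) {
                emitter.onError(e);
            }
            return true; // keep receiving edge events
        }
    };

    // unregister the callback when the subscriber unsubscribes (e.g. in onDestroy)
    emitter.setCancellation(() -> echoGpio.unregisterGpioCallback(echoCallback));

    try {
        echoGpio.registerGpioCallback(echoCallback);
    } catch (IOException e) {
        emitter.onError(e);
    }
}, Emitter.BackpressureMode.BUFFER);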

What’s Next

For CantParkRight, I’m working up to building an actual device I can easily mount in my garage.  With the prototype complete, I turn my efforts to making that happen.  

In the meantime, you can check out the source code for CantParkRight on GitHub.  Be sure to follow me on Twitter or (cough) Google+ for updates on CantParkRight.  I intend on posting the finished project here in the coming months, but watching the repository is a great way to keep up.

Audio Playback on Android

Playing music, podcasts, or other audio is one of the most common activities for smartphones in 2016. Most of the time, audio plays in the background while we are driving, cleaning, working out, or cooking. Architecting your application to support background audio playback is standard fare, whether you are incorporating the standard Android MediaPlayer API or using a library like ExoPlayer.

I want to briefly walk through how I architected PremoFM, an open source podcast player, to play audio in the background using ExoPlayer. It’s not perfect, but it’s a good starting point for the transition to ExoPlayer 2. If you want to learn more about ExoPlayer 2, check out my previous post, Exploring ExoPlayer 2.

The Architecture

In order to play audio in the background (or do anything in the background), the process that manages playback should be based on the Service class. Services, on Android, allow background work to be done without needing to have a user interface in the foreground. Naturally, I based the background audio playback of PremoFM on a Service, the PodcastPlayerService. It obviously does a lot: it manages audio playback, updates the database, listens for events like a headphone disconnection, and manages the persistent notification. Initially, most of my code involving direct management of the ExoPlayer was also embedded directly in this service. This led to a bloated class and a highly coupled design. I re-architected things when I added Google Cast support by creating a generic MediaPlayer abstract class.
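
A stripped-down skeleton of that kind of service might look like this (a sketch of the shape, not PremoFM’s actual code):

public class PodcastPlayerService extends Service {

    private MediaPlayer mediaPlayer; // the abstract player wrapper described below

    @Override
    public int onStartCommand(Intent intent, int flags, int startId) {
        // translate incoming intents (play, pause, seek) into player calls,
        // update the persistent notification, and write state to the database
        return START_STICKY; // keep the service alive while audio plays
    }

    @Override
    public IBinder onBind(Intent intent) {
        return null; // this sketch is a started (not bound) service
    }
}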

The abstract class provided a common set of methods for interacting with ExoPlayer: playing a media file, fast forward & rewind, getting playback state information, and changing the playback speed. All I needed to do was extend my MediaPlayer abstract class using ExoPlayer. This resulted in all of my ExoPlayer code existing in one class, LocalMediaPlayer.
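
A hedged sketch of what that abstract class exposes (the method names are illustrative, not PremoFM’s exact API):

public abstract class MediaPlayer {

    // start playback of a media file
    public abstract void play(Uri mediaUri);

    public abstract void pause();

    // fast forward & rewind
    public abstract void fastForward();
    public abstract void rewind();

    // playback state information
    public abstract int getPlaybackState();
    public abstract long getCurrentPosition();

    // change the playback speed
    public abstract void setPlaybackSpeed(float speed);
}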

This will make my upgrade to ExoPlayer 2 significantly easier than if I had kept the previous architecture. All of the code that needs to change exists in one place. In my next article, I will get into the nitty-gritty of my migration.

Feel free to take a swing at it before I do. Check out the source code for PremoFM from GitHub and hack away.

Follow me on Twitter or visit my website for more Android Development related articles like this.