Share the Cache

A lot of Android apps make viewing images a core part of their user experience.  Many of these apps use image caching libraries, like Glide, to make image caching easy, robust, and configurable.  Sharing these cached images can be a bit tricky.  A few questions arose when I tried:

  • How do I access files in the cache?
  • Do other apps have access to these files?
  • What are the needed permissions if I need to manually copy the file somewhere else?

Those are just a few of the questions that came up, each with a pretty simple answer.  A first pass at sharing images in my cache involved re-caching the image to a public directory (i.e. the root of the external storage device), generating a URI, and passing that URI to an Intent to share with other apps.  This sounds simple, but it is complicated by the fact that I needed to ask the user for permission to read and write external storage on their device if they were running Android 6.0 Marshmallow or above.  This flow worked, but I was looking for something much simpler for the user and for myself, the developer.  Enter the FileProvider.

The FileProvider API

The FileProvider API, added to the Android Support libraries in version 22.0.0, is a ContentProvider subclass that allows URI-based sharing of files belonging to your application.  It can temporarily grant access (read and/or write) to the file at the URI, and you do not need to copy the file to a more accessible location on the user’s device.  Setting up the FileProvider is very straightforward.

Setting up a File Provider

Setting up the FileProvider is a three-step process.

First, just as we would with a ContentProvider, we add a <provider> entry to our AndroidManifest.xml file.


<provider 
    android:name="android.support.v4.content.FileProvider" 
    android:authorities="com.yourdomain.android.fileprovider" 
    android:grantUriPermissions="true" 
    android:exported="false">

    <meta-data 
        android:name="android.support.FILE_PROVIDER_PATHS" 
        android:resource="@xml/filepaths" />
</provider>

This provider entry has a <meta-data> element that points to an XML file that defines which paths, within our application’s file structure, we want to expose with our FileProvider.  This is important: you can only share files that live under the paths listed in this XML file.  I created an XML file in the “res/xml” folder.


<paths>
    <external-files-path name="image_files" path="." />
</paths>

In my particular instance, I am caching images to the external files directory (which can be located on the SD card or on a portion of internal memory simulating an SD card), so I use the <external-files-path> tag.  Other tags you can use here include <files-path>, <cache-path>, <external-path>, and <external-cache-path>.

The name attribute is a string that will be used as a path segment in the URI generated by FileProvider.  The path attribute is the relative path to the folder containing the files you would like to share.  In my case, I’m fine with sharing the root because it only contains cached images, but you should be as granular as possible if you have multiple directories or file types in this directory.

Now that my FileProvider is set up and configured, how do I share files out of my cache?

Sharing Cached Files

The specifics of sharing cached image files are highly dependent on the image caching library used and how it’s configured.  At a high level, the process is:

  1. Cache an image (done by an image caching library) to a location that falls within the paths specified in the XML configuration file given to the FileProvider.
  2. Get a reference to that cached image using the java.io.File API.
  3. Pass the File reference to FileProvider.getUriForFile() to get a URI you can pass to other apps.
  4. Pack that URI into an Android Intent to start sharing with other apps.

Very straightforward.  Now this is how I did it with Glide.

My first step was to specify a location for Glide to store cached images.  I needed to subclass DiskLruCacheFactory to make this happen (Note: you don’t need to subclass DiskLruCacheFactory if you want to use the Cache directory).


private static class DiskCacheFactory extends DiskLruCacheFactory {

    DiskCacheFactory(final Context context, final String diskCacheName, long diskCacheSize) {
        super(new CacheDirectoryGetter() {
            @Override
            public File getCacheDirectory() {
                // cache to a sub-folder of the external files directory
                return new File(context.getExternalFilesDir(null), diskCacheName);
            }
        }, (int) diskCacheSize);
    }
}

As you can see from the code snippet, I pass in a reference to the External Files directory.  By default, Glide uses the Cache directory.

I implement a GlideModule so that I can customize caching behavior in my application.  I specify my subclassed DiskLruCacheFactory class in my GlideModule.


public class MyGlideModule implements GlideModule {
    private static final int IMAGE_CACHE_SIZE = 200_000_000;

    @Override
    public void applyOptions(Context context, GlideBuilder builder) {
        builder.setDiskCache(new DiskCacheFactory(context, ".", IMAGE_CACHE_SIZE));
    }

    @Override
    public void registerComponents(Context context, Glide glide) {
        // ...
    }
}

Now, I instruct Glide where it can find my GlideModule by adding a <meta-data> tag to the AndroidManifest.xml file, inside the <application> element.


<meta-data android:name="com.yourdomain.android.MyGlideModule" android:value="GlideModule"/>

Next, I manually cache a file with Glide.


File file = Glide.with(context)
    .load(uri) // uri to the location on the web where the image originates
    .downloadOnly(Target.SIZE_ORIGINAL, Target.SIZE_ORIGINAL)
    .get(); // get() blocks until the download completes, so call this off the main thread

This is specific to Glide and worth mentioning: when Glide caches a file to disk, the file name consists of a generated key with an integer appended to it.  There is no extension.  This is important because sharing an extension-less image will make it difficult for the apps you are sharing with to determine how the file should be handled (despite the MIME type being sent along with the image in the Intent).

Contents of a Glide image cache folder

 

I worked around this issue by simply copying the file and appending an extension to the duplicate in the process.  Since I know I am always handling JPEGs, I give these files the “.jpg” extension.  This may not work in cases where you are dealing with different types of images like GIFs, PNGs, WebP, etc.
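
Here is a minimal sketch of that copy step, assuming plain java.io streams; the helper name and its location are illustrative rather than the exact code from my app.

private static File copyWithExtension(File cachedFile, String extension) throws IOException {
    // duplicate the extension-less cache file next to the original, e.g. "<key>.0.jpg"
    File copy = new File(cachedFile.getParent(), cachedFile.getName() + extension);

    try (InputStream in = new FileInputStream(cachedFile);
         OutputStream out = new FileOutputStream(copy)) {
        byte[] buffer = new byte[8192];
        int read;
        while ((read = in.read(buffer)) != -1) {
            out.write(buffer, 0, read);
        }
    }
    return copy;
}

// usage: File shareable = copyWithExtension(file, ".jpg");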

Finally, I can get a URI from the FileProvider for my cached image by calling FileProvider.getUriForFile().

// be sure to use the authority given to your FileProvider in the AndroidManifest.xml file
String authority = "com.yourdomain.android.fileprovider";
Uri uri = FileProvider.getUriForFile(context, authority, file);
Intent intent = new Intent(Intent.ACTION_SEND);
intent.putExtra(Intent.EXTRA_STREAM, uri);
intent.setType("image/jpeg");
// grant the receiving app temporary read access to the URI
intent.addFlags(Intent.FLAG_GRANT_READ_URI_PERMISSION);
context.startActivity(Intent.createChooser(intent, "Share via"));

The Uri generated by the FileProvider looks like:

content://com.yourdomain.android.fileprovider/image_files/bb65e6a364264255b4833d34e____some_key.0.jpg

Once the Intent makes its way to the target app, things should look just like you are sharing a picture you’ve just taken.

Sharing an image from Traffcams to Gmail

#Dassit

Diving into Android Things

I’ve tinkered with electronics since my teens.  I went to school and graduated with a Computer Engineering degree with a focus on hardware (embedded systems, ASIC design, etc.).  I somehow got into software after graduation and am now an Android developer at RadioPublic.  When Google announced Android Things in late 2016, I was beyond excited because it gave me a reason to break out my old breadboard, resistors, LEDs, and power regulators.  It also gave me a reason to buy a Raspberry Pi.  With Android Things, I’m finally able to leverage my expertise in Android development in a more embedded paradigm.

I’m not going to cover a ton of Android Things fundamentals here because a lot of other really good developers have already done a great job at that.

I’m going to share a project I began working on Friday, February 10th and finished prototyping on Monday, February 13th.  When exploring something new, it’s important for me to find a practical application for it.  I own a house built in the early ’90s, and it can use some home-built tech from 2017.  A superficial problem I and other members of my household have is parking (correctly and in alignment) in the garage.  Either we park too close to the wall and can’t walk around both sides of the car, or we aren’t sure if the car will get caught in the garage door.

Hello CantParkRight

My first Android Things project is an assistive parking device that uses several sensors to help drivers park correctly in the garage.  Think of the signals you see when you enter a carwash.  Normally, there are two or three lights.  When you first enter, the light is green, which instructs you to keep driving forward.  When you have driven far enough, the light turns red to alert you to stop.  I want this in my garage.

Image from Signal Tech

 

The first step of this is prototyping CantParkRight.

Prototyping the Hardware

A huge advantage that Raspberry Pi-like devices provide is the ability to quickly and cheaply prototype assistive devices like the one I’m building.  The fact that I can officially leverage Android APIs (and down the road, Google APIs) is a big plus.

The supplies I used for my prototype include:

  • Raspberry Pi Model 3 running Android Things Preview 1
  • HC-SR04 Ultrasonic proximity sensor
  • 2 resistors, 10KΩ and 20KΩ
  • 3 LEDs (Red, Yellow, Green)
  • A breadboard
  • Assortment of jumper wire

I had most of my supplies already.  I bought a Raspberry Pi some time ago and recently bought a pack of 5 HC-SR04 ultrasonic sensors from Amazon.  I settled on the HC-SR04 after quite a bit of research.  Here is how the HC-SR04 works: you send a 10µs (microsecond) pulse to the TRIGGER pin.  Within the next few milliseconds, the HC-SR04 emits a burst of 8 ultrasonic pulses at 40kHz.  If an object is in range, the burst bounces back and is detected by the receiver portion of the sensor.  The HC-SR04 then sends a variable-length pulse to any device attached to the ECHO pin, and the length of this pulse is determined by the distance the signal traveled before returning to the sensor.  The HC-SR04 has a range of around 400cm (~13 feet).  Perfect.  Note: check out the datasheet on the HC-SR04 here.

After a lot of experimentation, here is how my circuit is arranged on my breadboard.

CantParkRight prototype schematic

 

A few hardware gotchas:

  • The accuracy varies greatly between sensors, especially the “knockoffs”.  Out of the pack of 5, some sensors were more sensitive to object movements while others exhibited less variation. 
  • The signal sent to the ECHO pin is at 5V, but the GPIO ports on the Raspberry Pi are rated for 3.3V.  You can damage the Pi by feeding it too high a voltage, so I use the 10KΩ and 20KΩ resistors as a voltage divider to step the ECHO signal down to about 3.3V (5V × 20K / (10K + 20K) ≈ 3.3V).

Prototyping the Software

The best part of this project was writing the software in Android Studio, deploying it via ADB (over WiFi), and seeing the results play out in front of my eyes.  I based the implementation on:

Over the course of the article, Daniel builds several implementations, some synchronous and some asynchronous using while loops, callbacks, and threads.  I decided I wanted to build upon that, but use RxJava to implement asynchronous handling of sensor data.  I’ve used RxJava in most of the Android apps that I’ve built.  It offers quick and convenient ways to build, reuse, and arrange pieces of logic that leverage the flow of data from one end to the next, basically perfect for CantParkRight.

Disclaimer: I am NOT an RxJava expert.  There are likely ways to do what I did using RxJava in a more efficient manner.

The critical piece is how I go about initiating the TRIGGER and waiting for an ECHO.  My first implementation of this used an RxJava Observable that essentially wraps a few while loops (check out my repository, then go to the first commit).

The process was:

  • Send the 10µs pulse to the TRIGGER pin
  • Start a while loop that runs until the ECHO pin goes high, then record the start time
  • Start a while loop that runs until the ECHO pin goes low, then record the end time and calculate the pulse width, which is used to calculate the distance (a sketch of this approach follows below)
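
Here is a rough sketch of that polling approach, assuming GPIO pins opened through the Android Things PeripheralManagerService; the method and parameter names are illustrative rather than the exact code in my repository.

// trigger and echo are com.google.android.things.pio.Gpio instances, already opened and configured
double readDistanceCm(Gpio trigger, Gpio echo) throws IOException {
    // 1. send the ~10µs pulse to TRIGGER
    trigger.setValue(true);
    long pulseEnd = System.nanoTime() + 10_000; // 10µs in nanoseconds
    while (System.nanoTime() < pulseEnd) { /* busy wait */ }
    trigger.setValue(false);

    // 2. wait for ECHO to go high and record the start time
    while (!echo.getValue()) { /* busy wait */ }
    long start = System.nanoTime();

    // 3. wait for ECHO to go low and record the end time
    while (echo.getValue()) { /* busy wait */ }
    long end = System.nanoTime();

    // 4. pulse width -> distance: sound travels ~343 m/s and the pulse covers the
    //    round trip, which works out to roughly (pulse width in µs) / 58 centimeters
    double pulseWidthMicros = (end - start) / 1000.0;
    return pulseWidthMicros / 58.0;
}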

It worked, sometimes, but often, for reasons I’m still researching, the sensor would stop responding (i.e. the ECHO never went high after a TRIGGER).  The improvement came when I used a GpioCallback.  A GpioCallback allows you to listen for edge triggers (signal going high, signal going low, etc.) asynchronously.  I combined my implementation of a GpioCallback with an RxJava Observable (more specifically, an Emitter).  From what I’ve read, the advantage of the Emitter over a plain Observable (using Observable.create) is that it forces you to specify a backpressure strategy, which is important when reading values pushed from a sensor.  CantParkRight uses the BUFFER BackpressureMode.  Using RxJava allows me to start the distance detection process simply by subscribing to the correct Observable.  Using an Emitter also allows me to write code to unregister my GpioCallback when I unsubscribe in onDestroy(…).  This prevents memory leaks.
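
To make that concrete, here is a rough sketch of wrapping a GpioCallback in an RxJava 1.x Emitter, assuming the Preview-era Android Things GPIO API.  Here, echoGpio is an already-opened Gpio for the ECHO pin, and the names are illustrative rather than the exact code from the repository.

Observable<Long> echoPulses = Observable.fromEmitter(emitter -> {
    final long[] riseTime = new long[1];

    final GpioCallback callback = new GpioCallback() {
        @Override
        public boolean onGpioEdge(Gpio gpio) {
            try {
                if (gpio.getValue()) {
                    riseTime[0] = System.nanoTime();                 // ECHO went high
                } else {
                    emitter.onNext(System.nanoTime() - riseTime[0]); // pulse width in nanoseconds
                }
            } catch (IOException e) {
                emitter.onError(e);
            }
            return true; // keep listening for future edges
        }
    };

    try {
        echoGpio.setEdgeTriggerType(Gpio.EDGE_BOTH);
        echoGpio.registerGpioCallback(callback);
    } catch (IOException e) {
        emitter.onError(e);
    }

    // unregister the callback when the subscriber unsubscribes (e.g. in onDestroy)
    emitter.setCancellation(() -> echoGpio.unregisterGpioCallback(callback));
}, Emitter.BackpressureMode.BUFFER);

Subscribing to echoPulses starts the edge listening, each emission is a raw pulse width that can be mapped to a distance, and unsubscribing tears the callback down.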

What’s Next

For CantParkRight, I’m working up to building an actual device I can easily mount in my garage.  With the prototype complete, I turn my efforts to making that happen.  

In the meantime, you can check out the source code for CantParkRight on GitHub.  Be sure to follow me on Twitter or (cough) Google+ for updates on CantParkRight.  I intend to post the finished project here in the coming months, but watching the repository is a great way to keep up.

Audio Playback on Android

Playing music, podcasts, or other audio is one of the most common activities for smartphones in 2016. Most of the time, audio plays in the background while we are driving, cleaning, working out, or cooking. Architecting your application to support background audio playback is standard fare whether you are incorporating the standard Android MediaPlayer API or using a library, like ExoPlayer.

I want to briefly walk through how I architected PremoFM, an open source podcast player, to play audio in the background using ExoPlayer. It’s not perfect, but it’s a good starting point for the transition to ExoPlayer 2. If you want to learn more about ExoPlayer 2, check out my previous post, Exploring ExoPlayer 2.

The Architecture

In order to play audio in the background (or do anything in the background), the process that manages playback should be based on the Service class. Services, on Android, allow background work to be done without a user interface in the foreground. Naturally, I based the background audio playback of PremoFM on a Service, the PodcastPlayerService. It does a lot: it manages audio playback, updates the database, listens for events like a headphone disconnection, and manages the persistent notification. Initially, most of my code involving direct management of the ExoPlayer was also embedded directly in this service. This led to a bloated class and a highly coupled design. I re-architected things when I added Google Cast support by creating a generic MediaPlayer abstract class.

The abstract class provided a common set of methods for interacting with ExoPlayer, like playing a media file, fast forwarding and rewinding, getting playback state information, and changing the playback speed. All I needed to do was extend my MediaPlayer abstract class with an implementation backed by ExoPlayer. This resulted in all of my ExoPlayer code existing in one class, LocalMediaPlayer.
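
As a rough illustration, such an abstraction might look like the following. The method names are hypothetical approximations, not the exact API from PremoFM.

public abstract class MediaPlayer {

    // start playback of a media file, local or remote
    public abstract void play(Uri mediaUri);

    public abstract void pause();

    // used to implement fast forward & rewind
    public abstract void seekTo(long positionMs);

    public abstract long getCurrentPosition();

    // e.g. playing, paused, buffering, ended
    public abstract int getPlaybackState();

    public abstract void setPlaybackSpeed(float speed);
}

// LocalMediaPlayer extends MediaPlayer and holds all of the ExoPlayer-specific code;
// a Cast-backed implementation can extend the same abstract class for Google Cast support.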

This will make my upgrade to ExoPlayer 2 significantly easier than if I had continued the previous architecture. All of the code that needs to change exists in one place. In my next article I will get into the nitty gritty of my migration.

Feel free to take a swing at it before I do. Check out the source code for PremoFM from GitHub and hack away.

Follow me on Twitter or visit my website for more Android Development related articles like this.

Exploring ExoPlayer 2

ExoPlayer is an extensible, application-level media player for Android apps. It’s an alternative to the high level Android MediaPlayer API. MediaPlayer is built on several low level media APIs like AudioTrack and MediaDRM. Developers can also use these low level APIs to build their own media player with its own custom behavior. ExoPlayer is built on these low level APIs and has the additional benefit of being open source. You don’t need to build your own media player from scratch to get the behavior you need. You can extend ExoPlayer instead.

ExoPlayer was created and is maintained by Google. Out of the box, it can play a wide range of audio and video formats such as:

  • MP3
  • MP4
  • WAV
  • MKV
  • MPEG-TS
  • Ogg

Remember, ExoPlayer is open source, so it can, with some extension, decode and play any format, as long as you build the capability.

Just a Few ExoPlayer Basics & Components

ExoPlayer component diagram

Source: ExoPlayer Documentation on GitHub

ExoPlayer

The ExoPlayer class is the actual media player. It depends on a few other components for media loading, buffering, decoding, and track selection. When all of the required components are configured, your app will interact with the ExoPlayer class to control the playback of your media. You can register listeners with ExoPlayer to be notified of certain events like buffering or the conclusion of a track.

MediaSource

The MediaSource class is charged with controlling what media will be played and how it will be loaded. The MediaSource class is used directly by the ExoPlayer class. MediaSource enables a ton of different behaviors. For example, you can merge multiple MediaSource instances together to show video along with captions, or you can use multiple MediaSource instances to create playlists where the transitions between those sources are seamless (gapless).
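
For example, ExoPlayer 2 ships composite MediaSource implementations for both of these cases. A minimal sketch, assuming firstSource, secondSource, videoSource, and subtitleSource are MediaSource objects you have already built:

// a seamless, two-item playlist
MediaSource playlist = new ConcatenatingMediaSource(firstSource, secondSource);

// video merged with side-loaded captions
MediaSource videoWithCaptions = new MergingMediaSource(videoSource, subtitleSource);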

There are several prebuilt MediaSource classes available out of the box in ExoPlayer to support many common use cases like playing normal media files or streaming DASH content from a remote server. Of course, you can implement your own to support your application’s use case.

DataSource

The DataSource class provides samples of data to a MediaSource. These samples of data can originate from a file on the SD card, a resource in the assets directory, and even a remote server. You can use one of the prebuilt DataSource classes or build your own to read data in a way necessary to support your use case. For example, maybe your application will stream media on a company intranet. You can use a custom DataSource to define the rules and protocols that allow this to happen securely.

TrackSelector

The TrackSelector class dictates which track is selected for rendering and playback.

Renderer

The Renderer class decodes media and renders it. An example implementation is the MediaCodecAudioRenderer, which decodes audio data and renders it using several lower level ExoPlayer APIs.

LoadControl

The LoadControl class defines the buffering behavior of a particular MediaSource.
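
To see how these components fit together, here is a minimal sketch that wires them up to play a single audio file. It follows the early ExoPlayer 2.x APIs from the developer guide, so class names and constructors may differ in later releases, and the URL is a placeholder.

// TrackSelector and LoadControl: the defaults are fine for simple playback
TrackSelector trackSelector = new DefaultTrackSelector();
LoadControl loadControl = new DefaultLoadControl();
SimpleExoPlayer player = ExoPlayerFactory.newSimpleInstance(context, trackSelector, loadControl);

// DataSource: reads bytes from local files or remote servers
DataSource.Factory dataSourceFactory = new DefaultDataSourceFactory(context, "MyApp");

// MediaSource: describes what to play and how to load it
MediaSource mediaSource = new ExtractorMediaSource(
        Uri.parse("https://example.com/episode.mp3"),
        dataSourceFactory,
        new DefaultExtractorsFactory(),
        null,   // event handler (optional)
        null);  // event listener (optional)

player.prepare(mediaSource);
player.setPlayWhenReady(true); // start playback once enough media is buffered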

Finally

At this point, I know as much about ExoPlayer 2 as you do. I have some pretty extensive knowledge of ExoPlayer 1.X because I’ve used it on several Android projects that I’ve worked on. This series on ExoPlayer 2 will document my journey of learning about ExoPlayer 2 and upgrading an app to ExoPlayer 2 that is currently using ExoPlayer 1.5.9. I will probably make mistakes, but I hope this series will help a few other developers in their effort to implement ExoPlayer 2 in a real world app.

The app I will be using for demonstrating this upgrade is PremoFM. PremoFM is an open-source podcast player that I started building almost two years ago. The source code for the app is on GitHub (https://github.com/emuneee/premofm). I will be using a branch (https://github.com/emuneee/premofm/tree/exoplayer_2) for all of my ExoPlayer 2 upgrade work. I invite you to follow along. I’ll be back next week to discuss the structure of a typical audio playing app and how ExoPlayer fits in.

Please follow me on Twitter (@emuneee).

Finally, I’m working with a great team at RadioPublic to build an awesome podcast experience for Android and iOS. Hop on the beta today.

Some resources to review in the meantime:

ExoPlayer on GitHub

https://github.com/google/ExoPlayer

ExoPlayer — Developer Guide

https://google.github.io/ExoPlayer/guide.html

ExoPlayer on Medium

https://medium.com/google-exoplayer

Android Developer Backstage 48: ExoPlayer

http://androidbackstage.blogspot.com/2016/05/episode-48-exoplayer.html

Testing Your ContentProvider with Robolectric

You like testing?  I love testing, especially unit testing.  ContentProviders are the underpinnings of many data layer implementations in Android apps and, obviously, an important thing to test.  I added some new code to a ContentProvider in the RadioPublic app and wanted to verify that the ContentProvider and model code worked.  I spent an hour looking through the documentation and searching online.  I also wanted to use the Robolectric test framework already set up in the app.  After concluding my research, I found what I was looking for, and it’s very straightforward.

First of all, if you haven’t already done so, add the JUnit and Robolectric dependencies to the dependencies section of your module’s build.gradle file:

testCompile 'junit:junit:4.12'
testCompile 'org.robolectric:robolectric:3.1.1'

In your unit test class, add these class annotations:

@RunWith(RobolectricTestRunner.class)
@Config(constants = BuildConfig.class, sdk = 18)

Register your ContentProvider with the appropriate Authority:

private static final String AUTHORITY = "com.example.debug";

@Before
public void setup() {
    YourProvider provider = new YourProvider();
    provider.onCreate();
    ShadowContentResolver.registerProvider(
            AUTHORITY, provider
    );
}

Get a reference to the ContentResolver, since this is, most likely, how you’ll be interacting with your provider:

ContentResolver contentResolver = RuntimeEnvironment.application.getContentResolver();

Finally, test your ContentProvider:

@Test
public void getSomeData() {
   ...
   cursor = contentResolver.query(Test.MyUri, null, null, null, null);
   ...
}
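
For a more complete picture, a test might insert a row through the resolver and then assert on the query results.  This is a hypothetical example; the URI path, column name, and values are illustrative and depend entirely on your provider.

@Test
public void insertThenQueryReturnsRow() {
    ContentValues values = new ContentValues();
    values.put("title", "My Favorite Podcast");

    Uri itemUri = contentResolver.insert(Uri.parse("content://" + AUTHORITY + "/episodes"), values);
    assertNotNull(itemUri);

    Cursor cursor = contentResolver.query(itemUri, null, null, null, null);
    assertNotNull(cursor);
    assertEquals(1, cursor.getCount());
    cursor.close();
}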

That’s it!  Hopefully this will save you time and encourage you to write tests for your ContentProviders.