Almost 5 years ago, I built an app that put every NC DOT traffic camera in your hand. At the time, you couldn't look up traffic conditions in Google Maps. I was also early in my days as a professional Android developer and used this app as a way to learn the ins and outs of building, marketing, and publishing an app on Google Play.
Today, the app is still used by a handful of users, but I noticed its usage is very seasonal. People were using NC Traffic Cams (the name at the time) to check not only traffic congestion, but also road and travel conditions. It did not surprise me to see spikes in usage around snow storms, tropical storms, and hurricanes. Things have stalled in recent years, but I am revisiting the app in hopes of building a really cool app and platform for viewing images from traffic cameras. The first step starts today: I'm re-launching NC Traffic Cams as, simply, Traffcams.
Traffcams has been completely rebuilt and redesigned from the ground up, on modern technology and with a more modern design. It contains all of the features people came to love, plus a few more:
View cameras around you using your location
View cameras on a map
Favorite cameras for easy access
Search for traffic cameras by city, street, highway, or interstate name
Pin your searches for easy access
Traffcams is launching with traffic camera data for North Carolina. I hope to add support for other states soon. You can get Traffcams, for free, on Google Play today.
I have a roadmap of features to implement that are going to be really useful for commuters. As with all side projects I work on, Traffcams will let me learn new patterns and technologies for mobile development and really dive into machine and deep learning.
Playing music, podcasts, or other audio is one of the most common smartphone activities in 2016. Most of the time, audio plays in the background while we are driving, cleaning, working out, or cooking. Architecting your application to support background audio playback is standard fare, whether you are using the standard Android MediaPlayer API or a library like ExoPlayer.
I want to briefly walk through how I architected PremoFM, an open source podcast player, to play audio in the background using ExoPlayer. It’s not perfect, but it’s a good starting point for the transition to ExoPlayer 2. If you want to learn more about ExoPlayer 2, check out my previous post, Exploring ExoPlayer 2.
In order to play audio in the background (or do anything in the background), the process that manages playback should be based on the Service class. Services, on Android, allow background work to be done without a user interface in the foreground. Naturally, I based PremoFM's background audio playback on a Service, the PodcastPlayerService. It does a lot: it manages audio playback, updates the database, listens for events like a headphone disconnection, and manages the persistent notification. Initially, most of the code that directly managed ExoPlayer was also embedded in this service. This led to a bloated class and a highly coupled design. I re-architected things when I added Google Cast support by creating a generic MediaPlayer abstract class.
The abstract class provides a common set of methods for interacting with a media player: playing a media file, fast forwarding and rewinding, getting playback state information, and changing the playback speed. All I needed to do was extend the MediaPlayer abstract class with an ExoPlayer-backed implementation. As a result, all of my ExoPlayer code lives in one class, LocalMediaPlayer.
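As a rough sketch of what that abstraction might look like (the method names here are illustrative, not PremoFM's actual API):

```java
// Illustrative abstraction over a concrete media player implementation.
// Method names are hypothetical; PremoFM's actual MediaPlayer class may differ.
public abstract class MediaPlayer {

    public enum State { IDLE, PLAYING, PAUSED, STOPPED }

    /** Begin playback of the media at the given URL. */
    public abstract void play(String mediaUrl);

    /** Pause playback, keeping the current position. */
    public abstract void pause();

    /** Seek to an absolute position in milliseconds. */
    public abstract void seekTo(long positionMs);

    /** The current playback state. */
    public abstract State getState();

    /** e.g. 1.0f for normal speed, 1.5f for faster playback. */
    public abstract void setPlaybackSpeed(float speed);

    // Conveniences shared by every implementation (ExoPlayer, Cast, ...)

    public void fastForward(long currentPositionMs) {
        seekTo(currentPositionMs + 30_000); // skip ahead 30 seconds
    }

    public void rewind(long currentPositionMs) {
        seekTo(Math.max(0, currentPositionMs - 10_000)); // jump back 10 seconds
    }
}
```

An ExoPlayer-backed LocalMediaPlayer (or a Cast-backed player) then only has to implement the abstract methods; the rest of the service talks to the MediaPlayer type and never touches ExoPlayer directly.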
This will make my upgrade to ExoPlayer 2 significantly easier than if I had continued the previous architecture. All of the code that needs to change exists in one place. In my next article I will get into the nitty gritty of my migration.
Feel free to take a swing at it before I do. Check out the source code for PremoFM from GitHub and hack away.
ExoPlayer is an extensible, application-level media player for Android apps. It's an alternative to the high-level Android MediaPlayer API. MediaPlayer is built on several low-level media APIs like AudioTrack and MediaDRM. Developers can also use these low-level APIs to build their own media player with its own custom behavior. ExoPlayer is built on these same low-level APIs, with the additional benefit of being open source. You don't need to build your own media player from scratch to get the behavior you need; you can extend ExoPlayer instead.
ExoPlayer was created and is maintained by Google. Out of the box, it can play a wide range of audio and video formats, including local media like MP4, MP3, and WebM, as well as adaptive streaming formats like DASH, SmoothStreaming, and HLS.
Remember, ExoPlayer is open source, so with some extension it can decode and play any format, as long as you build the capability.
Just a Few ExoPlayer Basics & Components
The ExoPlayer class is the actual media player. It depends on a few other components for media loading, buffering, decoding, and track selection. When all of the required components are configured, your app will interact with the ExoPlayer class to control the playback of your media. You can register listeners with ExoPlayer to be notified of certain events like buffering or the conclusion of a track.
The MediaSource class is charged with controlling what media will be played and how it will be loaded, and it is used directly by the ExoPlayer class. MediaSources enable a ton of different behaviors. For example, you can merge multiple MediaSources to play video along with its captions, or combine multiple MediaSources into playlists where the transitions between sources are seamless (gapless).
There are several prebuilt MediaSource classes available out of the box in ExoPlayer to support many common use cases like playing normal media files or streaming DASH content from a remote server. Of course, you can implement your own to support your application’s use case.
The DataSource class provides samples of data to a MediaSource. These samples can originate from a file on the SD card, a resource in the assets directory, or even a remote server. You can use one of the prebuilt DataSource classes or build your own to read data in whatever way your use case requires. For example, maybe your application will stream media on a company intranet; a custom DataSource can define the rules and protocols that allow this to happen securely.
The TrackSelector class dictates which track is selected for rendering and playback.
The Renderer class decodes media and renders it. An example implementation is the MediaCodecAudioRenderer, which decodes audio data and renders it using several lower level ExoPlayer APIs.
The LoadControl class defines the buffering behavior of a particular MediaSource.
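To make those components concrete, here is a rough sketch of how they might fit together using the ExoPlayer 2 defaults. This is based on the early ExoPlayer 2 API; class names and exact signatures may vary between releases, and "MyApp" and the URL are placeholders:

```java
// Select tracks and buffering policy with the library defaults
TrackSelector trackSelector = new DefaultTrackSelector();
LoadControl loadControl = new DefaultLoadControl();
SimpleExoPlayer player =
        ExoPlayerFactory.newSimpleInstance(context, trackSelector, loadControl);

// A DataSource.Factory that can read from http(s), local files, assets, etc.
DataSource.Factory dataSourceFactory =
        new DefaultDataSourceFactory(context, Util.getUserAgent(context, "MyApp"));

// ExtractorMediaSource handles "regular" media files, like an MP3 episode
MediaSource mediaSource = new ExtractorMediaSource(
        Uri.parse("https://example.com/episode.mp3"),
        dataSourceFactory,
        new DefaultExtractorsFactory(),
        null,  // optional Handler for events
        null); // optional event listener

player.prepare(mediaSource);
player.setPlayWhenReady(true); // start playback once buffering allows
```

Swapping in a different MediaSource, DataSource, or LoadControl changes the behavior without touching the rest of the setup, which is the point of the component design.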
At this point, I know as much about ExoPlayer 2 as you do. I have some pretty extensive knowledge of ExoPlayer 1.X because I've used it on several Android projects. This series on ExoPlayer 2 will document my journey of learning ExoPlayer 2 and upgrading an app that currently uses ExoPlayer 1.5.9. I will probably make mistakes, but I hope this series will help a few other developers in their effort to implement ExoPlayer 2 in a real-world app.
The app I will be using for demonstrating this upgrade is PremoFM. PremoFM is an open-source podcast player that I started building almost two years ago. The source code for the app is on GitHub (https://github.com/emuneee/premofm). I will be using a branch (https://github.com/emuneee/premofm/tree/exoplayer_2) for all of my ExoPlayer 2 upgrade work. I invite you to follow along. I’ll be back next week to discuss the structure of a typical audio playing app and how ExoPlayer fits in.
You like testing? I love testing, especially unit testing. ContentProviders are the underpinnings of many data layer implementations in Android apps and, obviously, an important thing to test. I added some new code to a ContentProvider in the RadioPublic app and wanted to verify that the ContentProvider and model code worked. I spent an hour looking through the documentation and searching online; I also wanted to use the Robolectric test framework already set up in the app. After concluding my research, I found what I was looking for, and it's very straightforward.
First of all, if you haven't already done so, add Robolectric to the dependencies section of your module's build.gradle file:
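Something along these lines (the version number is just an example; use the current Robolectric release):

```groovy
dependencies {
    // example version; check for the latest Robolectric release
    testCompile "org.robolectric:robolectric:3.1"
}
```

With that in place, a Robolectric test can stand up a ContentProvider without a device. This is a sketch under Robolectric 3.x assumptions, and MyProvider, the authority string, and the column name are all hypothetical:

```java
@RunWith(RobolectricTestRunner.class)
public class MyProviderTest {

    private ContentResolver resolver;

    @Before
    public void setUp() {
        // Create the provider and register it with Robolectric's shadow resolver
        MyProvider provider = new MyProvider();
        provider.onCreate();
        ShadowContentResolver.registerProvider("com.example.provider", provider);
        resolver = RuntimeEnvironment.application.getContentResolver();
    }

    @Test
    public void insertedRowReturnsAUri() {
        ContentValues values = new ContentValues();
        values.put("name", "camera-42");
        Uri uri = resolver.insert(
                Uri.parse("content://com.example.provider/items"), values);
        assertNotNull(uri);
    }
}
```

The shadow resolver routes insert/query/update/delete calls straight to your provider instance, so the test exercises your real provider code on the JVM.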
The next version of NC Traffic Cams (version 3.0, but with a different name) is nearly feature complete. I completely over-engineered this project, but purposefully. Since I stopped actively building PremoFM, I wanted to start on a project that would teach me a few advanced Android development tricks. I'm building the next version of NC Traffic Cams with the following in mind:
Model-View-Presenter, similar to Model-View-Controller, enforces the separation of Android-specific logic from business logic, making the business logic more testable. I'm also using Loaders to keep Presenters around. This is great because it means I have to do no work to persist the app's state during a device rotation.
Lots of unit tests.
Lots of RxJava. In the last released version of NC Traffic Cams, I wrote a ton of AsyncTasks; this time around, zero AsyncTasks. I have RxJava to thank.
OrmLite for the data layer: no more writing SQL selects and inserts by hand. (I actually tried to integrate Realm, but I threw it out because of the thread-management issues I encountered.)
OkHttp / Retrofit / Gson for the API layer: no more writing HttpUrlConnection or JSON-parsing logic. I have a simple API set up for NC Traffic Cams, and Retrofit made it ridiculously easy to get that data into the app.
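As an illustration of that OkHttp / Retrofit / Gson / RxJava stack, a hypothetical API definition might look like the following. The endpoint, base URL, query parameter, and Camera model are all made up; the real NC Traffic Cams API certainly differs:

```java
// Hypothetical Retrofit 2 interface; endpoint and model are illustrative only
public interface TrafficCamApi {
    @GET("cameras")
    Observable<List<Camera>> getCameras(@Query("state") String state);
}

// Wiring it up with OkHttp, Gson, and the RxJava call adapter
Retrofit retrofit = new Retrofit.Builder()
        .baseUrl("https://api.example.com/")
        .client(new OkHttpClient())
        .addConverterFactory(GsonConverterFactory.create())
        .addCallAdapterFactory(RxJavaCallAdapterFactory.create())
        .build();

TrafficCamApi api = retrofit.create(TrafficCamApi.class);

// Fetch on a background thread, deliver results on the main thread
api.getCameras("NC")
        .subscribeOn(Schedulers.io())
        .observeOn(AndroidSchedulers.mainThread())
        .subscribe(cameras -> { /* bind the list to the UI */ });
```

The interface plus the converter and call adapter replace all of the hand-written HttpUrlConnection and JSON-parsing code, which is where most of the line-count savings came from.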
I wrote substantially less code this time around and the app is better, more stable, and more performant. Can’t wait to show you all what it looks like. It’ll be done soon™.
Here’s NC Traffic Cams v1 & v2 since it’s #ThrowbackThursday