A better way to implement Back Button in Sencha Touch’s NavigationView

If you’re building a Sencha Touch 2 app and have deployed it to Android, expect to get hit up about making the device’s hardware back button work. Sencha will push you towards using routes, as evidenced in their docs on history support, but to me that’s invasive and goes against the event-driven nature that has made this framework so popular. Moreover, it gets more complicated when you use NavigationView as your main mechanism for moving back and forth on a given tab, especially in my case, where the application had multiple navigation tabs.

To that end, I decided it would be much easier to use the browser’s history to manage the back button. You simply push the browser’s state as the user moves forward in your app and pop the state as the user moves backward. It was also important that different tabs not interfere with each other’s state. Let’s take a look at how I did that, assuming an MVC-based application.

Step 1: Add refs and controls to your navigation view’s controller for your view and your navbar with your application’s back button (different from your browser or device back button):

        refs: {
            portal: 'portal',             // Portal tab panel
            myContainer: 'MyContainer',   // NavigationView
            navBar: 'myContainer #navbar' // itemId for the navigationBar on the NavigationView
        },
        control: {
            myContainer: {
                push: 'onPush'
            },
            navBar: {
                back: 'onBack'  // trap the back event for the app's back button
            }
        }
Here we’re just establishing references to the view components and binding the push and back events to methods we will implement in the next step. Notice that it’s important to trap your app’s back button so it pops the state similar to how the browser or device back button would.

Step 2: Add the implementation for onPush and onBack:

    onPush: function (view, item) {
        history.pushState(null, null);  // push a new state as the user moves forward
    },

    onBack: function () {
        history.back();  // pop the state to trigger the listener in step 3
        return false;    // return false so the listener will take care of this
    }

Here we leverage the JavaScript history object to push a new state on the stack as the user moves forward in the app’s NavigationView and pop the state from the stack as the user moves back.

Step 3: Add a “popstate” event listener to the window in your Controller’s launch method:

    launch: function () {
        var that = this;
        window.addEventListener('popstate', function () {
            var portal = that.getPortal();  // won't have portal until app is initialized
            if (portal) {
                var container = getTabContainer(portal.getActiveItem());
                if (container && container.getItemId() === "MyTab"
                    && container.getActiveItem().getItemId() !== "baseView") {
                    container.pop();  // step this NavigationView back one view
                }
            }
        }, false);
    }

Here we add a “popstate” event listener to the window so that we can pop the last view off the stack as the user moves back. Notice I do a few checks: one to be sure the portal has been instantiated, and another to confirm the container I’m on is the one for this NavigationView (the “MyTab” check). In an app with multiple tabs, you want to make sure the other tab controllers don’t respond to the event when the user presses the device back button (which is not tied to a controller, just the popstate event). The final check is whether I am on the “baseView”, because there is no need to pop the container if I’m already at the root of a particular NavigationView.

That’s all there is to it. No need to re-architect your app to use Sencha routes, and no complicated code to manage each NavigationView’s state. Just implement this same code in each of your NavigationView tabs and you’re all set.

Thanks to Greg from Sencha Touch support for pointing out that this is a viable alternative!


Complementing MongoDB with Real-time Solr Search


I’ve been a long-time user and evangelist of Solr, given its amazing ability to full-text index large amounts of structured and unstructured data. I’ve successfully used it on a number of projects to add both Google-like search and faceted (or filtered) search to applications. I was quite pleased to find that MongoDB has a connector for Solr, bringing that same type of searching to my MongoDB-backed application. In this blog post, we’ll explore how to configure MongoDB and Solr and demonstrate the result with the application I wrote several months back, outlined in my blog post Mobile GeoLocation App in 30 minutes – Part 1: Node.js and MongoDB.

Mongo-Connector: Realtime access to MongoDB with Solr

I stumbled upon mongo-connector during my research. It was exactly the sort of thing I was looking for, namely because it hooks into MongoDB’s oplog (somewhat similar to a transaction log in Oracle) and updates Solr in real time based on any create-update-delete operations made to the system. The oplog is critical to MongoDB’s master-slave replication, so mongo-connector requires that MongoDB be set up as a replica set (one primary, n secondaries; in my case 2). I followed the instructions here to set up a developer replica set. Once established, I started each mongod instance as follows so they would run in the background (--fork) and use minimal space due to my disk space limitations (--smallfiles).

% mongod --port 27017 --dbpath /srv/mongodb/rs0-0 --replSet rs0 --smallfiles --fork --logpath /srv/mongodb/rs0-0.log

% mongod --port 27018 --dbpath /srv/mongodb/rs0-1 --replSet rs0 --smallfiles --fork --logpath /srv/mongodb/rs0-1.log

% mongod --port 27019 --dbpath /srv/mongodb/rs0-2 --replSet rs0 --smallfiles --fork --logpath /srv/mongodb/rs0-2.log

Once you have MongoDB configured and running, you need to install mongo-connector separately. It relies on Python; if you don’t already have it, install version 2.7 or 3.x. I installed mongo-connector as a package with pip:

% pip install mongo-connector

After it is installed, you can run it as follows so that it runs in the background as well, using nohup (hold off on running this until after the next section):

% nohup sudo python mongo_connector.py -m localhost:27017 -t http://solr-pet.xxx.com:9650/solr-pet -d ./doc_managers/solr_doc_manager.py > mongo-connector.out 2>&1 &

A couple of things to note here: the -m option points to the host and port of the primary node in the MongoDB replica set. The -t option is the base URL of the Solr server and context; in my case, a remote instance of Solr. There is also a -n option (not shown above) that takes the namespace of the Mongo database and collection you wish to have indexed by Solr; without it, the entire database is indexed. Finally, the -d option indicates which doc_manager to use, which in my case is the Solr one. There is a doc manager for Elasticsearch as well, if you choose to use that instead.
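For instance, had I restricted indexing to a single collection, the invocation would have looked roughly like this (the pettracker.pets namespace is a made-up example; substitute your own database.collection):

```shell
nohup sudo python mongo_connector.py -m localhost:27017 \
    -t http://solr-pet.xxx.com:9650/solr-pet \
    -n pettracker.pets \
    -d ./doc_managers/solr_doc_manager.py > mongo-connector.out 2>&1 &
```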

With this in place, your MongoDB instance is configured to start pushing updates to Solr in real time. However, before running the connector, let’s look at what we need to do on the Solr side of things.

Configuring Solr to work with Mongo-Connector

Before we run mongo-connector, there are a few things we need to do in Solr to get it to work properly. First, for mongo-connector to post documents to Solr, you must be sure the Solr REST service is available for update operations. Second, you must configure schema.xml with the specific fields mongo-connector requires, as well as any fields you are storing in Mongo. On the first point, we need to be sure the following line exists in solrconfig.xml:

<requestHandler name="/update" class="solr.UpdateRequestHandler"/>

As of version 4.0 of Solr, this request handler supports XML, JSON, CSV and javabin, and it allows mongo-connector to send data to the REST interface for incremental indexing. Regarding the schema, you must include a field for each attribute you have (or are going to add) in your Mongo documents. Here’s an example of what my schema.xml looks like:

<schema name="solr-suggest-box" version="1.5">
        <fieldType name="string" class="solr.StrField" sortMissingLast="true" omitNorms="true"/>
        <fieldType name="long" class="solr.TrieLongField" precisionStep="0" omitNorms="true" positionIncrementGap="0"/>
        <fieldType name="text_wslc" class="solr.TextField" positionIncrementGap="100">
                <analyzer type="index">
                        <tokenizer class="solr.WhitespaceTokenizerFactory"/>
                        <filter class="solr.LowerCaseFilterFactory"/>
                </analyzer>
                <analyzer type="query">
                        <tokenizer class="solr.WhitespaceTokenizerFactory"/>
                        <filter class="solr.LowerCaseFilterFactory"/>
                </analyzer>
        </fieldType>
        <fieldType name="tdouble" class="solr.TrieDoubleField" precisionStep="8" positionIncrementGap="0"/>
        <fieldType name="location" class="solr.LatLonType" subFieldSuffix="_coordinate"/>
        <fieldType name="tdate" class="solr.TrieDateField" omitNorms="true" precisionStep="6" positionIncrementGap="0"/>

        <field name="_id" type="string" indexed="true" stored="true" required="true"/>
        <field name="name" type="text_wslc" indexed="true" stored="true"/>
        <field name="description" type="text_wslc" indexed="true" stored="true"/>
        <field name="date" type="tdate" indexed="true" stored="true"/>
        <field name="nmdsc" type="text_wslc" indexed="true" stored="true" multiValued="true"/>
        <field name="coordinate" type="location" indexed="true" stored="true"/>
        <field name="_version_" type="long" indexed="true" stored="true"/>
        <field name="_ts" type="long" indexed="true" stored="true"/>
        <field name="_ns" type="string" indexed="true" stored="true"/>
        <field name="ns" type="string" indexed="true" stored="true"/>
        <field name="coords" type="string" indexed="true" stored="true" multiValued="true"/>
        <dynamicField name="*" type="string" indexed="true" stored="true"/>

        <uniqueKey>_id</uniqueKey>
        <defaultSearchField>nmdsc</defaultSearchField>

        <!-- we don't want too many results in this usecase -->
        <solrQueryParser defaultOperator="AND"/>

        <copyField source="name" dest="nmdsc"/>
        <copyField source="description" dest="nmdsc"/>
</schema>

I found that the underscore fields (_id, _version_, _ts and _ns) were all required to get this working correctly. To future-proof the configuration, I added a catch-all dynamicField so that the Mongo schema can change without affecting the Solr configuration; a flexible schema is, after all, a tenet of MongoDB. Finally, I use copyField to pull in only the fields I wish to search against; for my use case, only name and description were of interest. The combined nmdsc field is used as the default search field for the UI, which I will go into next.

After your config is in place and you start the Solr server, you can launch mongo-connector successfully, and it will continuously push any changes saved to Mongo into Solr in real time. I used nohup to kick it off in the background, as shown above.

Using Solr in the DogTags Application

To tie this all together, we need to alter the UI of the original application to allow Solr searching. See my original blog post for a refresher: Mobile GeoLocation App in 30 minutes – Part 2: Sencha Touch. Recall that this is a Sencha Touch MVC application, so all I needed to do was add a new store for the Solr REST/JSONP service I will call for searching, and update the UI to give the user a control to conduct a search. Let’s take a look at each of these:

Ext.define('MyApp.store.PetSearcher', {
    extend: 'Ext.data.Store',
    requires: [
        'MyApp.model.Pet'
    ],
    config: {
        autoLoad: true,
        model: 'MyApp.model.Pet',
        storeId: 'PetSearcher',
        proxy: {
            type: 'jsonp',
            url: 'http://solr-pet.xxx.com:9650/solr-pet/select/',
            callbackKey: 'json.wrf',
            limitParam: 'rows',
            extraParams: {
                wt: 'json',
                'json.nl': 'arrarr'
            },
            reader: {
                root: 'response.docs',
                type: 'json'
            }
        }
    }
});
Above is the new store I’m using to call Solr and map its results back to the original model I used before. Note the differences from the original store that are specific to Solr, namely the URL and several of the proxy parameters (the JSONP callbackKey, the rows limit parameter, and the wt and json.nl extra params). The collection of docs is buried a bit in the response, so I set the reader’s root to response.docs accordingly.

The next thing I need to do is add a control to my view so the user can interact with the search service. In my case I chose to use a search field docked at the top and have it update the list based on the search term. In my view, the code looks as follows:

Ext.define('MyApp.view.PetPanel', {
    extend: 'Ext.Panel',
    alias: 'widget.petListPanel',
    config: {
        layout: {
            type: 'fit'
        },
        items: [
            {
                xtype: 'toolbar',
                docked: 'top',
                title: 'Dog Tags'
            },
            {
                xtype: 'searchfield',
                docked: 'top',
                name: 'query',
                id: 'SearchQuery'
            },
            {
                xtype: 'list',
                store: 'PetTracker',
                id: 'PetList',
                itemId: 'petList',
                emptyText: "<div>No Dogs Found</div>",
                loadingText: "Loading Pets",
                itemTpl: [
                    '<div>{name} is a {description} and is located at {latitude} (latitude) and {longitude} (longitude)</div>'
                ]
            }
        ],
        listeners: [
            {
                fn: 'onPetsListItemTap',
                event: 'itemtap',
                delegate: '#PetList'
            },
            {
                fn: 'onSearch',
                event: 'change',
                delegate: '#SearchQuery'
            },
            {
                fn: 'onReset',
                event: 'clearicontap',
                delegate: '#SearchQuery'
            }
        ]
    },
    onPetsListItemTap: function (dataview, index, target, record, e, options) {
        this.fireEvent('petSelectCommand', this, record);
    },
    onSearch: function (field, newValue, oldValue, eOpts) {
        this.fireEvent('petSearch', this, newValue, oldValue, eOpts);
    },
    onReset: function () {
        this.fireEvent('reset', this);
    }
});

The searchfield item adds the control, and the listeners config binds the events to handler functions, which in turn fire events for my controller. The controller supports those events as follows:

    onPetSearch: function (view, value, oldvalue, opts) {
        if (value) {
            var store = Ext.getStore('PetSearcher');
            var list = this.getPetList();
            store.load({
                params: {q: value},
                callback: function () {
                    console.log("we searched");
                }
            });
            list.setStore(store);
        }
    },

    onReset: function (view) {
        var store = Ext.getStore('PetTracker');
        var list = view.down("#petList");
        store.load();
        list.setStore(store);
    }

Since the model is essentially the same between Mongo and Solr, all I have to do is swap the stores and reload them to get the results updated accordingly. In onPetSearch you can see where I pass the dynamic search term as the q parameter so that the PetSearcher store loads with that value. When the search value is cleared, I go back to the original PetTracker store and reload the full results. In both handlers, I point the list component at the corresponding store so the list shows the results from whichever store it has been set to.


In this short example, we established that we can provide real-time search with Solr against MongoDB and augment an existing application with a search control to use it. This has the potential to be a great complement to Mongo, because it saves us from adding extra indexes to MongoDB for searching, which carries a performance cost, especially as the record set grows. Solr removes this burden from Mongo and maintains an incremental index that can be updated in real time for extremely fast queries. I see this approach being very powerful for modern applications.