
Is There A Limit To How Much Data I Should Cache In Browser Memory?

I need to load a couple thousand records of user data (user contacts in a contact-management system, to be precise) from a REST service and run a search on them. Unfortunately, the

Solution 1:

Storing several megabytes of JavaScript data should cause no problems; memory leaks will. Think about how much RAM modern computers have - a few megabytes is a drop in the proverbial bucket.


Solution 2:

Be careful when doing anything client side if you intend your users to use mobile devices. While desktops won't have an issue, Mobile Safari will stop working at (I believe) 10 MB of JavaScript data. (See this article for more info on Mobile Safari.) Other mobile browsers are likely to have similar memory restrictions. Figure out the minimal set of info you can return that still lets the user perform the search, then lazy-load the richer records from the REST API as you need them.
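The idea above can be sketched in plain JavaScript: keep only lightweight summaries in memory for searching, and fetch the full record on demand. This is a minimal sketch; `fetchFullRecord`, `ContactCache`, and the record shapes are hypothetical stand-ins, not part of any real API.

```javascript
// Sketch: hold small searchable summaries in memory and lazy-load
// full records only when the user actually opens one.
class ContactCache {
  constructor(fetchFullRecord) {
    this.summaries = new Map();   // id -> { id, name } (small, searchable)
    this.fullRecords = new Map(); // id -> full record, loaded lazily
    this.fetchFullRecord = fetchFullRecord; // stand-in for a real REST call
  }

  addSummaries(list) {
    for (const s of list) this.summaries.set(s.id, s);
  }

  // Search runs against the small summary set only, so memory use
  // stays bounded no matter how rich the full records are.
  search(term) {
    const t = term.toLowerCase();
    return [...this.summaries.values()]
      .filter(s => s.name.toLowerCase().includes(t));
  }

  // Full details are fetched once and cached for subsequent views.
  async getFull(id) {
    if (!this.fullRecords.has(id)) {
      this.fullRecords.set(id, await this.fetchFullRecord(id));
    }
    return this.fullRecords.get(id);
  }
}
```

With this split, the in-memory footprint is roughly proportional to the summary set, and only the handful of records the user actually inspects are held in full.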

As an alternative, proxy the REST service in question and create your own search on a server that you control. You could do this pretty quickly and easily with Python + Django + XML Models. No doubt there are equally simple ways to do this in whatever your preferred dev language is. (On re-reading, I see that you can't do server-side caching, which may make this point moot.)


Solution 3:

You can manage tens of thousands of records safely in the browser. I'm running search & sorting benchmarks with jOrder (http://github.com/danstocker/jorder) on such datasets with no problem.
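To illustrate why tens of thousands of records are manageable, here is a minimal sketch (plain JavaScript, not jOrder itself) of the indexing idea such libraries use: building a `Map` keyed on a field turns repeated O(n) scans into O(1) lookups. The field names and record shape are made up for the example.

```javascript
// Sketch: index a large in-memory table on one field.
function buildIndex(rows, field) {
  const index = new Map();
  for (const row of rows) {
    const key = row[field];
    if (!index.has(key)) index.set(key, []);
    index.get(key).push(row);
  }
  return index;
}

// 50,000 synthetic records -- well within what a browser handles.
const rows = Array.from({ length: 50000 }, (_, i) =>
  ({ id: i, city: "city" + (i % 100) }));

const byCity = buildIndex(rows, "city");
// byCity.get("city7") returns the 500 matching rows
// without scanning all 50,000.
```

Building the index is a one-time O(n) cost paid at load; every search after that touches only the matching bucket.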


Solution 4:

I would look at a distributed server-side cache. If you keep the data in the browser, then as the system grows you will have to increase the browser cache lifetime to keep traffic down.

