Brandon Konkle

Principal Engineer, type system nerd, Rust enthusiast, supporter of social justice, loving husband & father, avid comic & manga reader, studying Japanese.

I’m a Software Architect with more than 15 years of experience creating high-performance server and front-end applications targeting web and mobile platforms, & today I lead a team at Formidable Labs.

The mobile web app dilemma

Update: This is a perspective I held for a short time in 2013, when I was heading in the direction of the front end but was concerned about the viability of single-page apps on mobile. I now disagree with the opinion here, and fully believe that front-end-driven apps are the way of the future.

Drew Crawford posted a very thorough and convincing argument on his blog a couple of days ago about why mobile web apps are slow. He did some benchmarking and is equipped with real numbers, and his perspective makes a lot of sense. It also solidly defeats the assumption I've been working under for the last few months.

Recently, I've been very concerned about mobile web apps. I believe that mobile devices like tablets are the future of computing for the general public. Ars Technica reported today that PC sales are in their 5th consecutive quarter of serious decline.

So what are people buying instead? Tablets. (Duh.)

The problem is that the web app experience on mobile is very much inferior to that of native apps: web apps are nowhere near as responsive or usable on mobile devices. My opinion over the last few months has been that this is because of network access. Web apps typically have to phone home because they need the server to handle the logic behind most user actions. I believed that if we could move most of that logic to the device itself using JavaScript MVC frameworks, we could begin to approach the responsiveness of native apps. After reading Drew Crawford's article, I now believe I was wrong.
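To make that concrete, the approach I had been banking on looks roughly like this: update an in-memory model, re-render immediately, and sync to the server in the background. This is only an illustrative sketch — the /api/todos endpoint and the render() helper are placeholders, not code from any real app.

```js
// Handle the user's action entirely on the device: update local state,
// re-render right away, and push the change to the server afterwards.
var todos = [];

function addTodo(title) {
  var todo = { title: title, done: false };
  todos.push(todo);   // update the in-memory model first
  render(todos);      // instant UI feedback, no server round-trip

  // Fire-and-forget sync; the UI never waits on the network.
  var xhr = new XMLHttpRequest();
  xhr.open('POST', '/api/todos');
  xhr.setRequestHeader('Content-Type', 'application/json');
  xhr.send(JSON.stringify(todo));
}

function render(items) {
  document.getElementById('todo-list').innerHTML = items
    .map(function (t) { return '<li>' + t.title + '</li>'; })
    .join('');
}
```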

Two of the biggest issues plaguing mobile web apps today are garbage collection and the CPU.

Garbage Collection

As long as you have about 6 times as much memory as you really need, you’re fine. But woe betide you if you have less than 4x the required memory.

This is the biggest problem. Crawford's article shows that garbage collection demonstrably falls down once you have less than 4x the memory your app needs to run. This is why garbage collection has been deprecated in Objective-C in favor of tools for managing memory manually. JavaScript, by design, gives you no access to manual memory management at all.
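To illustrate that last point (this is my own minimal example, not code from Crawford's article), the closest JavaScript gets to freeing memory is dropping a reference and waiting for the collector:

```js
// JavaScript has no free()-style equivalent; all you can do is drop
// references and let the engine decide when, and for how long, to
// collect. The allocation size here is arbitrary.
var buffer = new ArrayBuffer(32 * 1024 * 1024); // ~32 MB

// ... use the buffer for image data, parsed JSON, etc. ...

buffer = null; // only a hint: the memory is reclaimed whenever the GC runs

// On a desktop with plenty of headroom that's fine. On a phone where the
// app is already close to its memory ceiling, an ill-timed collection
// pause is exactly the kind of slowdown Crawford describes.
```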

He cites the RubyMotion project, which set out to make Ruby usable on mobile devices, in part by stripping garbage collection out of the language. The result is that apps built with it have been plagued by crashes and memory leaks that the project's creators have yet to solve.

His conclusion is that stripping garbage collection out of JavaScript would be just as hard, and not worth pursuing. I'm not convinced, though; I think a community with the right people behind it could pull something like that off. It doesn't look like that will happen any time soon, however.

CPU

A 1.5 GHz ARM processor is still about 10 times slower than a 1.5 GHz x86 processor, because ARM has to conserve battery power. On top of that, the author's benchmarks put JavaScript at roughly 5 times slower than native code on a mobile device. That gap can only be closed with an optimized platform and optimized application code, and even then, the restrictions you have to accept to get optimizable JavaScript make it quite difficult to write by hand. Just look at what Mozilla had to cut out of JavaScript to make asm.js perform well: there's no object structure at all, which severely limits its usefulness as a hand-coded application language.
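Here's a small asm.js-style sketch — my own example, not taken from Mozilla — of what that restricted subset looks like: every value is explicitly coerced to an int or a double, and "memory" is a flat typed-array heap, with no objects, closures, or strings inside the module.

```js
// An asm.js-style module: numbers and a flat heap only. The |0 and >>2
// annotations are the type hints that let the engine compile it ahead
// of time; if validation fails, it still runs as ordinary JavaScript.
function SumModule(stdlib, foreign, heap) {
  "use asm";

  var HEAP32 = new stdlib.Int32Array(heap);

  // Sum `count` 32-bit ints starting at byte offset `offset`.
  function sum(offset, count) {
    offset = offset | 0;            // parameter type: int
    count = count | 0;
    var total = 0;
    var i = 0;
    for (i = 0; (i | 0) < (count | 0); i = (i + 1) | 0) {
      total = (total + (HEAP32[(offset + (i << 2)) >> 2] | 0)) | 0;
    }
    return total | 0;               // return type: int
  }

  return { sum: sum };
}

// Usage: the heap is a plain ArrayBuffer that the caller manages by hand.
var heap = new ArrayBuffer(64 * 1024);
var mod = SumModule(window, null, heap);
```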

If you are a Java, Ruby, Python, C# developer, think about iPhone 4S web development in the following way. It’s a computer that runs 10x slower than you expect (since ARM) and performance degrades exponentially if your memory usage goes above 35MB at any point, because that is how garbage collectors behave on the platform. Also, you get killed if at any point you allocate 213MB. And nobody will give you any information about this at runtime “by design”. Oh, and people keep asking you to write high-memory photo-processing and video applications in this environment.

The Future

The most viable path for it to get faster is by pushing the hardware to desktop-level performance. This might be viable long-term, but it’s looking like a pretty long wait.

It's difficult to see what the future holds. For now, if we want to provide the best possible mobile experience, I think we need to focus on providing great APIs for native code to interact with. This is something Python and Django excel at, so those of us in the Django community are starting in a good place. Unfortunately, I now believe we need a native app on the phone acting as a client to our awesome API, so my focus will be shifting to how my APIs can best serve iOS and Java applications.

For apps that don't require heavy front-end logic, the responsive HTML/CSS approach, with Django on the server acting as the workhorse, still sounds like the best solution. Yes, you wait a few seconds each time you load a new page, but an app like that isn't complex enough to justify going native.

I'd love to hear your perspective after reading through Drew Crawford's comprehensive article. Let me know in the comments below.
