DALVIK vs ART (Android Runtime)

Android [4.4] KitKat users would have noticed the new option to choose the default run-time environment in Android (Dalvik or ART). Dalvik has been Android's run-time environment since the very beginning. Although ART was added only as an experimental feature in KitKat, it has replaced Dalvik entirely since Android [5.0] Lollipop.


WHAT IS A RUN-TIME?

When a software program is executed, it is in a run-time state. During this state, the program sends instructions to the computer's processor to access system resources. The run-time environment is the layer that carries out these instructions while a program is running, translating the program's intermediate code (bytecode) into machine code that the processor can understand. In simpler terms, this means that Android application files (APKs) contain bytecode rather than machine code; the final translation happens on the device.
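As a rough analogy (using Python's own virtual machine, not Android's toolchain), the sketch below shows the same two-step idea: source code is first compiled to bytecode, and a VM then executes that bytecode. DEX bytecode inside an APK plays the same role for Dalvik/ART that Python bytecode plays here.

```python
# Analogy only: Python, like Android, compiles source to bytecode
# that a virtual machine executes (it is NOT machine code).
import dis

source = "result = 2 + 3"
code = compile(source, "<demo>", "exec")          # source -> bytecode

# Inspect the bytecode instructions the VM will run.
instructions = list(dis.get_instructions(code))

# The VM executes the bytecode; only now does real work happen.
namespace = {}
exec(code, namespace)
# namespace["result"] is now 5
```

The point of the analogy is that the bytecode is portable across processors; it is the VM on each device that bridges the gap to the actual hardware.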

WHY DOES ANDROID USE A VIRTUAL MACHINE?

Android uses a virtual machine as its run-time environment to compile and run its applications. Unlike VirtualBox, this virtual machine does not emulate an entire computer! Using a virtual machine ensures that application execution is isolated from the core operating system, so even if an application contains malicious code, it cannot directly affect the system. This provides stability and reliability for the operating system. It also improves compatibility: since different processors use different instruction sets and architectures, compiling on the device ensures the code matches that specific device.

HOW DOES DALVIK WORK? WHY WAS IT REPLACED?

Dalvik uses a JIT (Just-In-Time) compiler for its process virtual machine. Applications need a lot of resources to run, and taking up too many resources can slow down the system. With JIT, resources are fetched only when they are needed: an application gets compiled when it is launched and is then loaded into RAM. But compiling the entire code at launch takes a lot of time, which translates into what we call lag. In practice, the entire code is not compiled on every launch; only the part of the code needed to run the application is compiled each time, and it gets stored in a cache called the dalvik-cache so it can be reused. This cache gets optimized with every compilation over time, building something like a tree of dependencies on every device.
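The compile-on-first-use-then-cache behaviour described above can be sketched as a toy memoization scheme. This is purely an analogy in Python, not the real Dalvik JIT or the real dalvik-cache format; the cache dictionary and `jit_compile` helper are invented names for illustration.

```python
# Toy analogy of JIT compilation with a cache (NOT the real Dalvik JIT):
# code is only "compiled" the first time it is needed, and the compiled
# result is kept in a cache (standing in for the dalvik-cache) for reuse.
compiled_cache = {}

def jit_compile(name, source):
    """Compile lazily; later requests for the same name hit the cache."""
    if name not in compiled_cache:                 # first launch: compile (slow)
        compiled_cache[name] = compile(source, name, "exec")
    return compiled_cache[name]                    # repeat launch: cache hit (fast)

code_first = jit_compile("greet", "message = 'hello'")
code_again = jit_compile("greet", "message = 'hello'")
assert code_first is code_again                    # same cached object reused
```

This also mirrors the wipe-the-cache situation described next: if `compiled_cache` were cleared (as when flashing a new ROM), every entry would have to be compiled again from scratch.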

Seasoned Android users would know that they have to wipe the dalvik-cache before installing a new ROM on their device; that is because this tree of dependencies has to be reconstructed for the new system in the ROM.

But if a compiled application loaded into RAM is manually killed, the whole compilation process has to be done again. Over time, the dalvik-cache grows and never gets cleaned up, taking up a lot of storage and slowing down the device.

HOW DOES ART SOLVE THIS?

ART uses an AOT (Ahead-Of-Time) compiler. When an application is installed, the AOT compiler translates its entire code into machine code. This means the application doesn't need to be compiled again and again, which makes launching and using it faster and smoother: the pre-compiled machine code just needs to be executed, and all the resources are readily available. Reducing the number of compilations also improves the device's battery life. But compiling the entire code means installing the application takes more time and more storage.
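Continuing the same toy analogy (again in Python, not ART's actual dex2oat step), AOT shifts all compilation to install time, so a launch only ever executes code that was already compiled. The `install` and `launch` helpers below are invented names for illustration.

```python
# Toy analogy of AOT compilation (NOT the real ART/dex2oat): every part of
# the app is compiled once at "install" time; launching only executes.
app_sources = {
    "main": "launched = True",
    "helper": "value = 42",
}

def install(sources):
    """Compile the whole app up front: slower install, bigger footprint."""
    return {name: compile(src, name, "exec") for name, src in sources.items()}

def launch(compiled_app, name):
    """Launching just runs already-compiled code; no compile step, no lag."""
    namespace = {}
    exec(compiled_app[name], namespace)
    return namespace

compiled_app = install(app_sources)       # the one-time ahead-of-time cost
state = launch(compiled_app, "main")      # fast launch: nothing left to compile
```

The trade-off in the paragraph above is visible here: `install` does strictly more work than any single JIT launch would, in exchange for every subsequent `launch` doing strictly less.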

DOES IT REALLY MAKE A DIFFERENCE?

Although on paper ART smokes Dalvik, it doesn't make as huge a difference as you might expect. Apps do launch faster, and performance is a tad better with ART.


Here's a slide shown during one of the Google I/O keynotes.

ART may use the "Ahead-Of-Time" method of compilation, but I personally don't think it is ahead of its time! Yes, it makes more sense, and it might be the right next step towards a better Android. But it still relies on a virtual machine, and running applications through a VM will never be faster than running applications written in native code!

This article featured in the June issue of the Open Source For You magazine.
