This blog offers a concise overview of the procedures we use to detect memory leaks in our Android application and to confirm that the fixes we implement are effective.
Our approach involves utilising two primary tools: the LeakCanary library and the Memory Profiler in Android Studio.
Memory allocations, which come from creating new objects in code, can cause significant system work. Not only do the allocations themselves require effort from the Android Runtime (ART), but freeing these objects later (garbage collection) also requires time and effort.
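To make this concrete, the following plain-Java sketch (JVM rather than ART, but the allocation and reclamation mechanics are analogous) shows that an object stays alive as long as a strong reference to it exists, and that reclaiming it once the reference is dropped is where the collector spends its effort. The `WeakReference` lets us observe the collection without keeping the object alive ourselves:

```java
import java.lang.ref.WeakReference;

public class AllocationDemo {
    public static void main(String[] args) {
        // Allocating costs work up front: the runtime must find and zero this memory.
        byte[] buffer = new byte[4 * 1024 * 1024]; // ~4 MB
        WeakReference<byte[]> tracker = new WeakReference<>(buffer);

        // While a strong reference exists, the collector must keep the object alive.
        System.gc();
        boolean reachableWhileHeld = tracker.get() != null;
        buffer[0] = 1; // touch the buffer so the reference is still live here

        // Dropping the last strong reference makes the object collectible,
        // and reclaiming it is where the GC spends its time.
        buffer = null;
        System.gc();
        boolean reclaimed = tracker.get() == null;

        System.out.println("reachable while held: " + reachableWhileHeld);
        System.out.println("reclaimed after release: " + reclaimed);
    }
}
```

A memory leak is exactly the first case persisting by accident: some long-lived object keeps a strong reference, so the collector can never reclaim the memory.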
Given its high effectiveness and ease of use, LeakCanary serves as our main tool for detecting leaks. The process entails adding the library to our project's build.gradle file, running the application, and exercising various user-level scenarios and interactions.
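Setting up LeakCanary 2.x is a one-line change to the app module's build.gradle; the version number below is illustrative, so check the project's releases page for the latest one:

```groovy
dependencies {
    // debugImplementation keeps LeakCanary out of release builds;
    // the library installs itself automatically when the debug app starts.
    debugImplementation 'com.squareup.leakcanary:leakcanary-android:2.12'
}
```

No initialisation code is needed: on app start the library begins watching destroyed activities, fragments, and view models for retained references.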
Once a leak is detected, the library provides a leak trace for each leak: the chain of references that is keeping the leaked object in memory. This helps us identify exactly where undesired references to the leaked objects are being held.
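A common culprit such a trace points to is an anonymous listener that implicitly captures its enclosing object. The plain-Java sketch below is a hypothetical stand-in (the `Screen` class plays the role of an Activity, and the `Listener` that of a callback held by a long-lived component): it shows the hidden outer reference the compiler generates, and a static nested class holding only a `WeakReference` as the usual fix:

```java
import java.lang.ref.WeakReference;
import java.lang.reflect.Field;

public class LeakTraceDemo {
    interface Listener { void onEvent(); }

    // Hypothetical stand-in for an Activity.
    static class Screen {
        int eventCount = 0;

        // Leaky: the anonymous class captures Screen.this to reach eventCount,
        // so any long-lived holder of this listener also retains the Screen.
        Listener leakyListener() {
            return new Listener() {
                @Override public void onEvent() { eventCount++; }
            };
        }
    }

    // Safer: a static nested class holds the Screen only weakly,
    // so the Screen can be collected even while the listener is registered.
    static class SafeListener implements Listener {
        private final WeakReference<Screen> screenRef;
        SafeListener(Screen screen) { this.screenRef = new WeakReference<>(screen); }
        @Override public void onEvent() {
            Screen s = screenRef.get();
            if (s != null) s.eventCount++;
        }
    }

    // Detects the compiler-generated reference to the enclosing instance.
    static boolean capturesOuter(Object o) {
        for (Field f : o.getClass().getDeclaredFields()) {
            if (f.getName().startsWith("this$")) return true;
        }
        return false;
    }

    public static void main(String[] args) {
        Screen screen = new Screen();
        System.out.println("leaky captures outer: " + capturesOuter(screen.leakyListener()));
        System.out.println("safe captures outer: " + capturesOuter(new SafeListener(screen)));
    }
}
```

In a leak trace this shows up as the `this$0` (or equivalent) reference linking the long-lived listener back to the destroyed screen.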
The official documentation explains how to read these leak traces quite effectively: https://square.github.io/leakcanary/fundamentals-how-leakcanary-works/
There are two main reasons the Memory Profiler is an essential complement to the LeakCanary library:
The following is a step-by-step guide to using the Memory Profiler:
The official Android documentation covers the Memory Profiler in more depth than our use case requires:
https://developer.android.com/studio/profile/memory-profiler

Our first iteration of this process focused on detecting and resolving issues in the most frequently accessed user flows. This established a lower memory baseline as well as a significant decline in memory retention during a typical usage session. The subsequent graph illustrates these enhancements.
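The before/after comparison behind such a graph can be illustrated in plain Java (JVM rather than Android, purely to show the idea of reading heap usage around a forced garbage collection, much like the profiler's timeline and "force garbage collection" button):

```java
import java.lang.management.ManagementFactory;
import java.lang.management.MemoryMXBean;
import java.util.ArrayList;
import java.util.List;

public class HeapWatchDemo {
    public static void main(String[] args) {
        MemoryMXBean memory = ManagementFactory.getMemoryMXBean();

        // Simulate a session that retains memory: fill a cache-like list.
        List<byte[]> retained = new ArrayList<>();
        for (int i = 0; i < 50; i++) retained.add(new byte[1024 * 1024]); // ~50 MB

        memory.gc(); // analogous to the profiler's "force garbage collection"
        long whileRetained = memory.getHeapMemoryUsage().getUsed();

        retained.clear(); // releasing the references is the "fix"
        memory.gc();
        long afterRelease = memory.getHeapMemoryUsage().getUsed();

        System.out.println("retained bytes dropped: " + (afterRelease < whileRetained));
    }
}
```

In our workflow the same comparison is done visually: capture heap usage during the flow before the fix, apply the fix, and confirm in the profiler that usage returns to the baseline after the flow ends.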
Author: Abdullah Mujtaba, SDE-II, Head Digital Works