INTELLIGENT WIDGET RECONFIGURATION FOR MOBILE PHONES: SYSTEM ARCHITECTURE

The system architecture shown in Figure 1 comprises the following modules: GUI Manager, Learning Engine, RMS (Record Management System), and Rule-Base Engine.
Figure 1: System Architecture

The mobile phone user interacts with the screen, and this interaction is normally captured by the phone OS. The proposed system can be viewed as interposed between the screen and the phone OS. An External Inputs module simulates the supply of sensor- and location-based information to the system, and a timer module (not shown) simulates the passage of time for performance analysis and evaluation. Simulated external inputs include GPS location, time of day, traffic conditions and heart condition. The GUI Manager records user interaction when application widgets are accessed. Together, these form the contextual information that is passed to the learning engine.
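The contextual information described above can be represented as a simple record passed from the GUI Manager to the learning engine. A minimal sketch (the class and field names are illustrative assumptions, not taken from the paper):

```java
// Hypothetical container for the simulated context handed to the learning engine.
public class ContextInfo {
    public final double latitude;       // GPS location
    public final double longitude;
    public final int hourOfDay;         // time of day, 0-23
    public final String traffic;        // e.g. "HEAVY", "LIGHT"
    public final String heartCondition; // e.g. "NORMAL", "ELEVATED"
    public final String widgetAccessed; // last widget the user interacted with

    public ContextInfo(double latitude, double longitude, int hourOfDay,
                       String traffic, String heartCondition, String widgetAccessed) {
        this.latitude = latitude;
        this.longitude = longitude;
        this.hourOfDay = hourOfDay;
        this.traffic = traffic;
        this.heartCondition = heartCondition;
        this.widgetAccessed = widgetAccessed;
    }
}
```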

Interface reconfiguration is driven by commands from the learning engine, which reference rules held in the rule-base engine. The RMS handles rule storage on the mobile phone.
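Since RMS stores each record as a byte array, a rule must be flattened before storage. A hedged sketch of such serialization in plain Java (the condition/action rule shape and the class name are assumptions; on-device code would pass the resulting bytes to the RMS record store):

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.DataInputStream;
import java.io.DataOutputStream;
import java.io.IOException;

// Illustrative serializer for a context rule destined for an RMS record.
public class RuleCodec {
    // Flatten a (condition, action) pair into the byte[] form RMS records require.
    public static byte[] encode(String condition, String action) throws IOException {
        ByteArrayOutputStream bytes = new ByteArrayOutputStream();
        DataOutputStream out = new DataOutputStream(bytes);
        out.writeUTF(condition);
        out.writeUTF(action);
        out.flush();
        return bytes.toByteArray();
    }

    // Recover the (condition, action) pair from a stored record.
    public static String[] decode(byte[] record) throws IOException {
        DataInputStream in = new DataInputStream(new ByteArrayInputStream(record));
        return new String[] { in.readUTF(), in.readUTF() };
    }
}
```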

Figure 2: Learning Process Flow

LEARNING PROCESS

The learning engine adds widgets, removes widgets or maintains the current screen state based on the results of its learning algorithm. It communicates with the rule-base engine for rule updates and references. The rule-base engine accepts parameters from the learning engine, fires the appropriate rules and returns the resultant action of each fired rule. The learning process flow is illustrated in Figure 2.
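The fire-and-return cycle described above can be sketched as a lookup over condition/action pairs. This is a minimal illustration only; the paper does not specify the actual rule language or matching strategy:

```java
import java.util.HashMap;
import java.util.Map;

// Minimal sketch of the rule-base engine's fire-and-return behaviour.
public class RuleBaseEngine {
    // Maps a context condition (e.g. "traffic=HEAVY") to a screen action.
    private final Map<String, String> rules = new HashMap<>();

    public void addRule(String condition, String action) {
        rules.put(condition, action);
    }

    // Fire the rule matching the supplied parameter; default to keeping
    // the current screen state when no rule applies.
    public String fire(String condition) {
        return rules.getOrDefault(condition, "MAINTAIN_SCREEN");
    }
}
```

The default of maintaining the current screen state mirrors the three outcomes listed above: add a widget, remove a widget, or leave the screen unchanged.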

The learning engine is an integral part of the simulation process, whose flow is illustrated in Figure 3. Besides simulating time progression, the simulator continuously captures the user's widget interactions and passes this information to the learning engine for processing. Changes to the context cause the corresponding context rules in the rule-base engine to be updated.
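The simulation loop might look like the following sketch, which advances simulated time and feeds each widget interaction to a toy learning rule (the timer step, the usage threshold and all names here are assumptions for illustration, not the paper's algorithm):

```java
import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;

// Illustrative simulation driver: advance simulated time, feed widget
// interactions to the learning step, and collect resulting screen actions.
public class SimulationDriver {
    public static List<String> run(List<String> interactions) {
        Map<String, Integer> usage = new HashMap<>();
        List<String> actions = new ArrayList<>();
        int tick = 0; // simulated clock, stands in for the timer module
        for (String widget : interactions) {
            tick++;
            int count = usage.merge(widget, 1, Integer::sum);
            // Toy learning rule: promote a widget after repeated use.
            actions.add(count >= 3 ? "ADD_WIDGET:" + widget : "MAINTAIN_SCREEN");
        }
        return actions;
    }
}
```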