Kivy - Architecture



Read this chapter to understand the design architecture of the Kivy framework. At one end, Kivy provides various widgets that let the user interact with the application; at the other end, it interacts with hardware devices such as the mouse, TUIO surfaces, and audio and video streams. The middle layer consists of providers for processing touch input, audio and video, graphics instructions, and text input.

This is the official architectural diagram of the Kivy framework −

[Image: Kivy architecture diagram]

Core Providers

An important feature of Kivy's architecture is its modularity and abstraction. Actions such as opening a window, reading audio and video streams, and loading images are core tasks in any graphical application. Kivy abstracts these core tasks behind an easy-to-use API, delegating the actual work to providers that talk to the drivers controlling the hardware.

Kivy uses providers specific to the operating system on which your app runs. Each operating system (Windows, Linux, macOS, etc.) has its own native APIs for the different core tasks. These providers act as an intermediate communication layer between the operating system on one side and Kivy on the other. Kivy thus fully leverages the functionality exposed by the operating system to maximise efficiency.

Use of platform-specific libraries reduces the size of the Kivy distribution and makes packaging easier. This also makes it easier to port Kivy to other platforms. The Android port benefited greatly from this.

Input Providers

An input provider is a piece of code that adds support for a specific input device. The input devices with built-in support in Kivy include −

  • Android Joystick Input Provider
  • Apple's trackpads
  • TUIO (Tangible User Interface Objects)
  • Mouse emulator
  • HIDInput

To add support for a new input device, write a class that reads the input data from your device and transforms it into Kivy's basic events.
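The shape of such a provider can be sketched in plain Python (this is a hypothetical illustration, not Kivy's actual provider API): read raw device data, translate it into basic events, and hand them to a dispatcher.

```python
# Hypothetical sketch of an input provider: raw device samples go in,
# basic events come out. Names here are illustrative, not Kivy's API.

class FakeDeviceProvider:
    """Turns raw (x, y, pressed) tuples into 'down'/'up' events."""

    def __init__(self, dispatch):
        self.dispatch = dispatch   # callback receiving (etype, x, y)

    def update(self, raw_samples):
        for x, y, pressed in raw_samples:
            etype = "down" if pressed else "up"
            self.dispatch(etype, x, y)

events = []
provider = FakeDeviceProvider(lambda *e: events.append(e))
provider.update([(10, 20, True), (10, 25, False)])
# events == [("down", 10, 20), ("up", 10, 25)]
```

A real provider would additionally be registered with Kivy's input system and would emit proper MotionEvent objects rather than tuples.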

Graphics

OpenGL is the base of Kivy's entire graphics API. Kivy issues hardware-accelerated drawing commands as OpenGL instructions, and it hides the difficult parts of writing raw OpenGL by providing simple-to-use drawing functionality.

Kivy uses OpenGL ES 2.0 (OpenGL for Embedded Systems), which makes cross-platform development possible.

Core Library

The high-level abstraction is provided by the following constituents of the Kivy framework −

  • Clock − the Clock API helps you to schedule timer events. Both one-shot timers and periodic timers are supported.

  • Gesture Detection − An important requirement of multitouch interfaces. The gesture recognizer detects various kinds of strokes, such as circles or rectangles. You can even train it to detect your own strokes.

  • Kivy Language − The Kivy language is used to describe user interfaces easily and efficiently. This separates the app's design from its application logic.

  • Properties − Kivy's property classes (different from Python's built-in property) link your widget code with the user interface description.
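The two timer styles the Clock API supports can be sketched in plain Python (this is an illustrative toy, not Kivy's actual Clock implementation): a one-shot callback fires once and is removed, while a periodic callback fires on every tick.

```python
# Toy sketch of one-shot vs. periodic scheduling, in the spirit of
# Kivy's Clock.schedule_once / Clock.schedule_interval. Hypothetical
# names; real Kivy timers are driven by the frame loop and take a
# delay/interval argument.

class TinyClock:
    """Event list of [callback, remaining_calls] entries."""

    def __init__(self):
        self.events = []

    def schedule_once(self, cb):
        self.events.append([cb, 1])

    def schedule_interval(self, cb):
        self.events.append([cb, -1])   # -1 = repeat forever

    def tick(self):
        for ev in self.events[:]:
            cb, remaining = ev
            cb()
            if remaining == 1:         # one-shot: fire once, then drop
                self.events.remove(ev)

calls = []
clock = TinyClock()
clock.schedule_once(lambda: calls.append("once"))
clock.schedule_interval(lambda: calls.append("every"))
clock.tick()
clock.tick()
# calls == ["once", "every", "every"]
```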

UIX

Kivy's user interface is built with widgets and layouts.

  • Widgets are the UI elements that you add to your app to provide some kind of functionality. Examples of widgets include buttons, sliders, lists, and so on. Widgets receive MotionEvents.

  • Multiple widgets are arranged in suitable layouts. Kivy provides layout classes that cover the placement of widgets for every purpose − examples include GridLayout and BoxLayout. Layouts can also be nested.

Event Dispatch

The term "widget" is used for UI elements in almost all graphics toolkits. Any object that receives input events is a widget. One or more widgets are arranged in a tree structure.

The Kivy app window can hold only one root widget, but the root widget can include other widgets in a tree structure. As a result, there is a "parent-children-sibling" relationship amongst the widgets.

Whenever a new input event occurs, the root widget of the widget tree first receives the event. Depending on the state of the touch, the event is propagated down the widget tree.

Each widget in the tree can either process the event or pass it on to the next widget in the hierarchy. If a widget absorbs and processes the event, it should return True so that propagation down the tree stops and no further processing happens for that event.

def on_touch_down(self, touch):
    # Dispatch the touch to each child; stop as soon as one handles it.
    for child in self.children[:]:
        if child.dispatch('on_touch_down', touch):
            return True
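The propagation mechanism can be made concrete with a self-contained, plain-Python sketch (hypothetical class names, not Kivy's real widget classes): the event walks down the tree and stops at the first widget that consumes it.

```python
# Plain-Python sketch of top-down event propagation through a widget
# tree. MiniWidget is a hypothetical stand-in for a Kivy Widget.

class MiniWidget:
    def __init__(self, name, handles=False):
        self.name = name
        self.handles = handles      # does this widget consume the event?
        self.children = []

    def add_widget(self, w):
        self.children.append(w)

    def dispatch(self, touch, log):
        log.append(self.name)       # record the propagation order
        for child in self.children:
            if child.dispatch(touch, log):
                return True         # a child consumed it: stop here
        return self.handles         # otherwise, handle it ourselves?

root = MiniWidget("root")
panel = MiniWidget("panel")
button = MiniWidget("button", handles=True)
label = MiniWidget("label")
root.add_widget(panel)
panel.add_widget(button)
panel.add_widget(label)

order = []
root.dispatch("touch", order)
# order == ["root", "panel", "button"]: the button returned True,
# so the label never saw the event.
```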

Since the event propagates through the widget tree, it is often necessary to verify whether the event occurred within the area of the widget that is expected to handle it. The collide_point() method helps here: it checks whether the touch position falls within the 'watched area' of a widget, returning True if it does and False otherwise. By default, this is the rectangular region on the screen described by the widget's pos (position: x and y) and size (width and height).
