1. Before you begin
A stylus is a pen-shaped tool that helps users perform precise tasks. In this codelab, you learn how to implement organic stylus experiences with the android.os and androidx libraries. You also learn how to use the MotionEvent class to support pressure, tilt, and orientation, and how to implement palm rejection to prevent unwanted touches. In addition, you learn how to reduce stylus latency with motion prediction and with low-latency graphics based on OpenGL and the SurfaceView class.
Prerequisites
- Experience with Kotlin and lambdas.
- Basic knowledge of how to use Android Studio.
- Basic knowledge of Jetpack Compose.
- Basic understanding of OpenGL for low-latency graphics.
What you'll learn
- How to use the MotionEvent class for stylus input.
- How to implement stylus capabilities, including support for pressure, tilt, and orientation.
- How to draw on the Canvas Composable function.
- How to implement motion prediction.
- How to render low-latency graphics with OpenGL and the SurfaceView class.
What you'll need
- The latest version of Android Studio.
- Experience with Kotlin syntax, including lambdas.
- Basic experience with Compose. If you're unfamiliar with Compose, complete the Jetpack Compose basics codelab.
- A device with stylus support.
- An active stylus.
- Git.
2. Get the starter code
To get the code that contains the starter app's theming and basic setup, follow these steps:
- Clone this GitHub repository:
git clone https://github.com/android/large-screen-codelabs
- Open the advanced-stylus folder. The start folder contains the starter code and the end folder contains the solution code.
3. Implement a basic drawing app
First, you build the necessary layout for a basic drawing app that lets users draw and shows stylus attributes on the screen with the Canvas Composable function. It looks like the following image:
The upper part is a Canvas Composable function where you draw the stylus visualization and show the different attributes of the stylus, such as orientation, tilt, and pressure. The lower part is another Canvas Composable function that receives stylus input and draws simple strokes.
To implement the basic layout of the drawing app, follow these steps:
- In Android Studio, open the cloned repository.
- Click app > java > com.example.stylus, and then double-click MainActivity. The MainActivity.kt file opens.
- In the MainActivity class, notice the StylusVisualization and DrawArea Composable functions. You focus on the DrawArea Composable function in this section.
Create a StylusState class
- In the ui directory, click File > New > Kotlin Class/File.
- In the text box, replace the Name placeholder with StylusState.kt, and then press Enter (or Return on macOS).
- In the StylusState.kt file, create the StylusState data class, and then add the variables from the following table:
| Variable | Type | Default value | Description |
| --- | --- | --- | --- |
| pressure | Float | 0F | A value that ranges from 0 to 1.0. |
| orientation | Float | 0F | A radian value that ranges from -pi to pi. |
| tilt | Float | 0F | A radian value that ranges from 0 to pi/2. |
| path | Path | Path() | Stores the lines rendered by the Canvas Composable function. |
StylusState.kt
package com.example.stylus.ui
import androidx.compose.ui.graphics.Path
data class StylusState(
var pressure: Float = 0F,
var orientation: Float = 0F,
var tilt: Float = 0F,
var path: Path = Path(),
)
- In the MainActivity.kt file, find the MainActivity class, and then add the stylus state with the mutableStateOf() function:
MainActivity.kt
import androidx.compose.runtime.setValue
import androidx.compose.runtime.getValue
import androidx.compose.runtime.mutableStateOf
import com.example.stylus.ui.StylusState
class MainActivity : ComponentActivity() {
private var stylusState: StylusState by mutableStateOf(StylusState())
The DrawPoint class
The DrawPoint class stores data about each point drawn on the screen; when you link these points, you create lines. It mimics how the Path object works.
The DrawPoint class extends the PointF class. It contains the following data:

| Parameters | Type | Description |
| --- | --- | --- |
| x | Float | The x coordinate |
| y | Float | The y coordinate |
| type | DrawPointType | The type of point |
There are two types of DrawPoint objects, which are described by the DrawPointType enum (a sketch of the enum follows the table):

| Type | Description |
| --- | --- |
| START | Moves the start of a line to a position. |
| LINE | Traces a line from the previous point. |
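The DrawPointType enum itself isn't listed in this codelab, but the starter project already defines it. The following is only a minimal sketch, assuming the com.example.stylus.data package that the other data classes use:

DrawPointType.kt (illustrative sketch)
package com.example.stylus.data

// Sketch of the DrawPointType enum described in the table above; the starter
// project's version is the source of truth.
enum class DrawPointType {
    START, // Moves the start of a line to a position.
    LINE   // Traces a line from the previous point.
}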
DrawPoint.kt
import android.graphics.PointF
class DrawPoint(x: Float, y: Float, val type: DrawPointType): PointF(x, y)
Render the data points into a path
For this app, the StylusViewModel class holds the line data, prepares the data for rendering, and performs some operations on the Path object for palm rejection.
- To hold the lines' data, in the StylusViewModel class, create a mutable list of DrawPoint objects:
StylusViewModel.kt
import androidx.lifecycle.ViewModel
import com.example.stylus.data.DrawPoint
class StylusViewModel : ViewModel() {
    private var currentPath = mutableListOf<DrawPoint>()
To render the data points into a path, follow these steps:
- In the StylusViewModel.kt file's StylusViewModel class, add a createPath function.
- Create a path variable of type Path with the Path() constructor.
- Create a for loop that iterates through each data point in the currentPath variable.
- If the data point is of type START, call the moveTo method to start a line at the specified x and y coordinates.
- Otherwise, call the lineTo method with the x and y coordinates of the data point to link it to the previous point.
- Return the path object.
StylusViewModel.kt
import androidx.compose.ui.graphics.Path
import com.example.stylus.data.DrawPoint
import com.example.stylus.data.DrawPointType
class StylusViewModel : ViewModel() {
private var currentPath = mutableListOf<DrawPoint>()
private fun createPath(): Path {
val path = Path()
for (point in currentPath) {
if (point.type == DrawPointType.START) {
path.moveTo(point.x, point.y)
} else {
path.lineTo(point.x, point.y)
}
}
return path
}
private fun cancelLastStroke() {
}
Process MotionEvent objects
Stylus events come through MotionEvent objects, which provide information about the action performed and the data associated with it, such as the position of the pointer and the pressure. The following table contains some of the MotionEvent object's constants and their data, which you can use to identify what the user does on the screen:
| Constant | Data |
| --- | --- |
| ACTION_DOWN | The pointer touches the screen. It's the start of a line at the position reported by the MotionEvent object. |
| ACTION_MOVE | The pointer moves on the screen. It's the line that's drawn. |
| ACTION_UP | The pointer stops touching the screen. It's the end of the line. |
| ACTION_CANCEL | Unwanted touch detected. Cancels the last stroke. |
When the app receives a new MotionEvent object, the screen should render to reflect the new user input.
- To process MotionEvent objects in the StylusViewModel class, create a function that gathers the line coordinates:
StylusViewModel.kt
import android.view.MotionEvent
class StylusViewModel : ViewModel() {
private var currentPath = mutableListOf<DrawPoint>()
...
fun processMotionEvent(motionEvent: MotionEvent): Boolean {
when (motionEvent.actionMasked) {
MotionEvent.ACTION_DOWN -> {
currentPath.add(
DrawPoint(motionEvent.x, motionEvent.y, DrawPointType.START)
)
}
MotionEvent.ACTION_MOVE -> {
currentPath.add(DrawPoint(motionEvent.x, motionEvent.y, DrawPointType.LINE))
}
MotionEvent.ACTION_UP -> {
currentPath.add(DrawPoint(motionEvent.x, motionEvent.y, DrawPointType.LINE))
}
MotionEvent.ACTION_CANCEL -> {
// Unwanted touch detected.
cancelLastStroke()
}
else -> return false
}
return true
}
Send data to the UI
To update the StylusViewModel class so that the UI can collect changes in the StylusState data class, follow these steps:
- In the StylusViewModel class, create a _stylusState variable of type MutableStateFlow<StylusState> and a stylusState variable of type StateFlow<StylusState>. The _stylusState variable is modified whenever the stylus state changes in the StylusViewModel class, and the stylusState variable is consumed by the UI in the MainActivity class.
StylusViewModel.kt
import com.example.stylus.ui.StylusState
import kotlinx.coroutines.flow.MutableStateFlow
import kotlinx.coroutines.flow.StateFlow
class StylusViewModel : ViewModel() {
private var _stylusState = MutableStateFlow(StylusState())
val stylusState: StateFlow<StylusState> = _stylusState
- Create a requestRendering function that accepts a StylusState object parameter:
StylusViewModel.kt
import kotlinx.coroutines.flow.update
...
class StylusViewModel : ViewModel() {
private var _stylusState = MutableStateFlow(StylusState())
val stylusState: StateFlow<StylusState> = _stylusState
...
private fun requestRendering(stylusState: StylusState) {
// Updates the stylusState, which triggers a flow.
_stylusState.update {
return@update stylusState
}
}
- At the end of the processMotionEvent function, add a requestRendering function call that takes a StylusState parameter.
- In the StylusState parameter, retrieve the tilt, pressure, and orientation values from the motionEvent variable, and then create the path with the createPath() function. This triggers a flow event, which you connect to the UI later.
StylusViewModel.kt
...
class StylusViewModel : ViewModel() {
...
fun processMotionEvent(motionEvent: MotionEvent): Boolean {
...
else -> return false
}
requestRendering(
StylusState(
tilt = motionEvent.getAxisValue(MotionEvent.AXIS_TILT),
pressure = motionEvent.pressure,
orientation = motionEvent.orientation,
path = createPath()
)
)
Link the UI with the StylusViewModel class
- In the MainActivity class, find the onCreate function's super.onCreate call, and then add the state collection. To learn more about state collection, see Collecting flows in a lifecycle-aware manner.
MainActivity.kt
import androidx.lifecycle.lifecycleScope
import kotlinx.coroutines.launch
import androidx.lifecycle.repeatOnLifecycle
import kotlinx.coroutines.flow.onEach
import androidx.lifecycle.Lifecycle
import kotlinx.coroutines.flow.collect
...
class MainActivity : ComponentActivity() {
override fun onCreate(savedInstanceState: Bundle?) {
super.onCreate(savedInstanceState)
lifecycleScope.launch {
lifecycle.repeatOnLifecycle(Lifecycle.State.STARTED) {
viewModel.stylusState
.onEach {
stylusState = it
}
.collect()
}
}
Now, whenever the StylusViewModel class posts a new StylusState state, the activity receives it, and the new StylusState object updates the local stylusState variable in the MainActivity class.
- In the body of the DrawArea Composable function, add the pointerInteropFilter modifier to the Canvas Composable function to provide MotionEvent objects.
- Send the MotionEvent object to the StylusViewModel class's processMotionEvent function for processing:
MainActivity.kt
import androidx.compose.ui.ExperimentalComposeUiApi
import androidx.compose.ui.input.pointer.pointerInteropFilter
...
class MainActivity : ComponentActivity() {
...
@Composable
@OptIn(ExperimentalComposeUiApi::class)
fun DrawArea(modifier: Modifier = Modifier) {
Canvas(modifier = modifier
.clipToBounds()
.pointerInteropFilter {
viewModel.processMotionEvent(it)
}
) {
}
}
- Call the drawPath function with the stylusState object's path attribute, and then provide a color and stroke style (a hedged sketch of a possible strokeStyle definition follows the snippet):
MainActivity.kt
class MainActivity : ComponentActivity() {
...
@Composable
@OptIn(ExperimentalComposeUiApi::class)
fun DrawArea(modifier: Modifier = Modifier) {
Canvas(modifier = modifier
.clipToBounds()
.pointerInteropFilter {
viewModel.processMotionEvent(it)
}
) {
with(stylusState) {
drawPath(
path = this.path,
color = Color.Gray,
style = strokeStyle
)
}
}
}
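The strokeStyle value referenced in the snippet is defined by the starter project. If you want to experiment with your own brush, the following is a minimal sketch of a stroke style; the width, cap, and join values are assumptions, not the codelab's actual settings:

MainActivity.kt (illustrative sketch)
import androidx.compose.ui.graphics.StrokeCap
import androidx.compose.ui.graphics.StrokeJoin
import androidx.compose.ui.graphics.drawscope.Stroke

// Illustrative stroke style for the drawPath call; tweak width, cap, and join
// to change how the drawn line looks.
val strokeStyle = Stroke(
    width = 3f,
    cap = StrokeCap.Round,
    join = StrokeJoin.Round
)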
- Run the app, and then notice that you can draw on the screen.
4. Implement support for pressure, orientation, and tilt
In the previous section, you saw how to retrieve stylus information, such as pressure, orientation, and tilt, from MotionEvent objects.
StylusViewModel.kt
tilt = motionEvent.getAxisValue(MotionEvent.AXIS_TILT),
pressure = motionEvent.pressure,
orientation = motionEvent.orientation,
However, this shortcut only works for the first pointer. When multi-touch is detected, multiple pointers are on the screen, and this shortcut only returns values for the first one. To request data about a specific pointer, use the pointerIndex parameter:
StylusViewModel.kt
tilt = motionEvent.getAxisValue(MotionEvent.AXIS_TILT, pointerIndex),
pressure = motionEvent.getPressure(pointerIndex),
orientation = motionEvent.getOrientation(pointerIndex)
To learn more about pointers and multi-touch, see Handle multi-touch gestures.
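A MotionEvent object doesn't label which pointer is the stylus, so you typically look it up by tool type. The following helper is hypothetical (not part of the codelab) and only illustrates one way to find the stylus pointer index:

StylusViewModel.kt (illustrative sketch)
import android.view.MotionEvent

// Hypothetical helper: returns the index of the first stylus pointer in a
// MotionEvent, or 0 if no stylus pointer is present.
fun findStylusPointerIndex(motionEvent: MotionEvent): Int {
    for (i in 0 until motionEvent.pointerCount) {
        if (motionEvent.getToolType(i) == MotionEvent.TOOL_TYPE_STYLUS) {
            return i
        }
    }
    return 0
}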
Add visualization for pressure, orientation, and tilt
- In the MainActivity.kt file, find the StylusVisualization Composable function, and then use the information from the StylusState flow object to render the visualization:
MainActivity.kt
import StylusVisualization.drawOrientation
import StylusVisualization.drawPressure
import StylusVisualization.drawTilt
...
class MainActivity : ComponentActivity() {
...
@Composable
fun StylusVisualization(modifier: Modifier = Modifier) {
Canvas(
modifier = modifier
) {
with(stylusState) {
drawOrientation(this.orientation)
drawTilt(this.tilt)
drawPressure(this.pressure)
}
}
}
- Run the app. You see three indicators at the top of the screen that show orientation, pressure, and tilt.
- Scribble on the screen with your stylus, and then observe how each visualization reacts to your input.
- Inspect the StylusVisualization.kt file to understand how each visualization is constructed. A hypothetical sketch of one such helper follows this list.
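To give a sense of what those helpers might look like, here is a hypothetical sketch of a pressure indicator drawn with a DrawScope extension; the actual implementation in the repository's StylusVisualization.kt may differ:

StylusVisualization.kt (illustrative sketch)
import androidx.compose.ui.geometry.Offset
import androidx.compose.ui.graphics.Color
import androidx.compose.ui.graphics.drawscope.DrawScope

// Hypothetical pressure gauge: a background track plus a filled portion
// proportional to the pressure value, which ranges from 0 to 1.
fun DrawScope.drawPressureGauge(pressure: Float) {
    val trackLength = size.width / 3
    val y = size.height / 2
    // Background track.
    drawLine(
        color = Color.LightGray,
        start = Offset(0f, y),
        end = Offset(trackLength, y),
        strokeWidth = 8f
    )
    // Filled portion proportional to pressure.
    drawLine(
        color = Color.DarkGray,
        start = Offset(0f, y),
        end = Offset(trackLength * pressure.coerceIn(0f, 1f), y),
        strokeWidth = 8f
    )
}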
5. Implement palm rejection
The screen can register unwanted touches, for example, when a user naturally rests their hand on the screen for support while handwriting.
Palm rejection is a mechanism that detects this behavior and notifies the developer so that the last set of MotionEvent objects can be canceled. A set of MotionEvent objects starts with the ACTION_DOWN constant.
This means that you must maintain a history of the inputs so that you can remove unwanted touches from the screen and re-render the legitimate user inputs. Thankfully, you already store this history in the currentPath variable of the StylusViewModel class.
Android provides the ACTION_CANCEL constant from the MotionEvent object to inform the developer about an unwanted touch. As of Android 13, the MotionEvent object also provides the FLAG_CANCELED flag, which you should check when the ACTION_POINTER_UP action occurs.
Implement the cancelLastStroke function
- To remove the data points back to the last START data point, go back to the StylusViewModel class, and then create a cancelLastStroke function that finds the index of the last START data point and keeps only the data from the first data point until that index minus one (see the note about findLastIndex after the snippet):
StylusViewModel.kt
...
class StylusViewModel : ViewModel() {
...
private fun cancelLastStroke() {
// Find the last START event.
val lastIndex = currentPath.findLastIndex {
it.type == DrawPointType.START
}
// If found, keep the element from 0 until the very last event before the last MOVE event.
if (lastIndex > 0) {
currentPath = currentPath.subList(0, lastIndex - 1)
}
}
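Note that findLastIndex isn't part of the Kotlin standard library; the project is assumed to provide it as a small extension. If it's missing in your copy, an equivalent sketch built on the standard indexOfLast function looks like this:

// Assumed project extension: returns the index of the last element matching
// the predicate, or -1 if none matches (same behavior as indexOfLast).
fun <T> List<T>.findLastIndex(predicate: (T) -> Boolean): Int = indexOfLast(predicate)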
Add the ACTION_CANCEL and FLAG_CANCELED constants
- In the StylusViewModel.kt file, find the processMotionEvent function.
- In the ACTION_UP case, create a canceled variable that checks whether the current SDK version is Android 13 or higher and whether the FLAG_CANCELED flag is set.
- On the next line, create a conditional that checks whether the canceled variable is true. If so, call the cancelLastStroke function to remove the last set of MotionEvent objects. If not, call the currentPath.add method to add the last point of the line.
StylusViewModel.kt
import android.os.Build
...
class StylusViewModel : ViewModel() {
...
fun processMotionEvent(motionEvent: MotionEvent): Boolean {
...
MotionEvent.ACTION_POINTER_UP,
MotionEvent.ACTION_UP -> {
val canceled = Build.VERSION.SDK_INT >= Build.VERSION_CODES.TIRAMISU &&
(motionEvent.flags and MotionEvent.FLAG_CANCELED) == MotionEvent.FLAG_CANCELED
if (canceled) {
cancelLastStroke()
} else {
currentPath.add(DrawPoint(motionEvent.x, motionEvent.y, DrawPointType.LINE))
}
}
- In the ACTION_CANCEL case, notice the cancelLastStroke function call:
StylusViewModel.kt
...
class StylusViewModel : ViewModel() {
...
fun processMotionEvent(motionEvent: MotionEvent): Boolean {
...
MotionEvent.ACTION_CANCEL -> {
// unwanted touch detected
cancelLastStroke()
}
Palm rejection is implemented! You can find the working code in the palm-rejection folder.
6. Implement low latency
In this section, you reduce the latency between user input and screen rendering to improve performance. Latency has multiple causes, and one of them is the length of the graphics pipeline. You shorten the graphics pipeline with front-buffered rendering, which gives developers direct access to the screen buffer and provides great results for handwriting and sketching.
The GLFrontBufferedRenderer class provided by the androidx.graphics library takes care of front-buffered and double-buffered rendering. It optimizes a SurfaceView object for fast rendering with the onDrawFrontBufferedLayer callback function and for normal rendering with the onDrawDoubleBufferedLayer callback function. The GLFrontBufferedRenderer class and the GLFrontBufferedRenderer.Callback interface work with a user-provided data type. In this codelab, you use the Segment class.
To get started, follow these steps:
- In Android Studio, open the low-latency folder so that you get all the required files.
- Notice the following new files in the project:
  - In the build.gradle file, the androidx.graphics library has been imported with the implementation "androidx.graphics:graphics-core:1.0.0-alpha03" declaration.
  - The LowLatencySurfaceView class extends the SurfaceView class to render OpenGL code on the screen.
  - The LineRenderer class holds the OpenGL code to render a line on the screen.
  - The FastRenderer class allows fast rendering and implements the GLFrontBufferedRenderer.Callback interface. It also intercepts MotionEvent objects.
  - The StylusViewModel class holds the data points with a LineManager interface.
  - The Segment class defines a segment with x1, y1 (the coordinates of the first point) and x2, y2 (the coordinates of the second point). A minimal sketch of this class follows the list.
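For reference, here is a minimal sketch of the Segment data type described in the last item; the low-latency project ships its own version in the com.example.stylus.data package, so treat this as illustrative only:

Segment.kt (illustrative sketch)
package com.example.stylus.data

// A short line segment: the front-buffered renderer draws one of these per
// ACTION_MOVE event.
data class Segment(
    val x1: Float, // x coordinate of the first point
    val y1: Float, // y coordinate of the first point
    val x2: Float, // x coordinate of the second point
    val y2: Float  // y coordinate of the second point
)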
The following image shows how the data moves between each class:
Create a low-latency surface and layout
- In the MainActivity.kt file, find the MainActivity class's onCreate function.
- In the body of the onCreate function, create a FastRenderer object, and then pass in a viewModel object:
MainActivity.kt
class MainActivity : ComponentActivity() {
...
override fun onCreate(savedInstanceState: Bundle?) {
super.onCreate(savedInstanceState)
fastRendering = FastRenderer(viewModel)
lifecycleScope.launch {
...
- In the same file, create a DrawAreaLowLatency Composable function.
- In the function's body, use the AndroidView API to wrap the LowLatencySurfaceView view, and then provide the fastRendering object:
MainActivity.kt
import androidx.compose.ui.viewinterop.AndroidView
import com.example.stylus.gl.LowLatencySurfaceView
class MainActivity : ComponentActivity() {
...
@Composable
fun DrawAreaLowLatency(modifier: Modifier = Modifier) {
AndroidView(factory = { context ->
LowLatencySurfaceView(context, fastRenderer = fastRendering)
}, modifier = modifier)
}
- In the onCreate function, after the Divider Composable function, add the DrawAreaLowLatency Composable function to the layout:
MainActivity.kt
class MainActivity : ComponentActivity() {
...
override fun onCreate(savedInstanceState: Bundle?) {
...
Surface(
modifier = Modifier
.fillMaxSize(),
color = MaterialTheme.colorScheme.background
) {
Column {
StylusVisualization(
modifier = Modifier
.fillMaxWidth()
.height(100.dp)
)
Divider(
thickness = 1.dp,
color = Color.Black,
)
DrawAreaLowLatency()
}
}
- In the gl directory, open the LowLatencySurfaceView.kt file, and then notice the following in the LowLatencySurfaceView class:
  - The LowLatencySurfaceView class extends the SurfaceView class. It uses the fastRenderer object's onTouchListener.
  - The fastRenderer object, which implements the GLFrontBufferedRenderer.Callback interface, needs to be attached to the SurfaceView object when the onAttachedToWindow function is called so that the callbacks can render to the SurfaceView view.
  - The fastRenderer object needs to be released when the onDetachedFromWindow function is called.
LowLatencySurfaceView.kt
class LowLatencySurfaceView(context: Context, private val fastRenderer: FastRenderer) :
SurfaceView(context) {
init {
setOnTouchListener(fastRenderer.onTouchListener)
}
override fun onAttachedToWindow() {
super.onAttachedToWindow()
fastRenderer.attachSurfaceView(this)
}
override fun onDetachedFromWindow() {
fastRenderer.release()
super.onDetachedFromWindow()
}
}
Handle MotionEvent objects with the onTouchListener interface
To handle MotionEvent objects when the ACTION_DOWN constant is detected, follow these steps:
- In the gl directory, open the FastRenderer.kt file.
- In the ACTION_DOWN case, create a currentX variable that stores the MotionEvent object's x coordinate and a currentY variable that stores its y coordinate.
- Create a segment variable that stores a Segment object constructed with currentX and currentY for both points because it's the start of the line.
- Call the renderFrontBufferedLayer method with the segment parameter to trigger a callback to the onDrawFrontBufferedLayer function.
FastRenderer.kt
class FastRenderer ( ... ) {
...
val onTouchListener = View.OnTouchListener { view, event ->
...
MotionEvent.ACTION_DOWN -> {
// Ask that the input system not batch MotionEvent objects,
// but instead deliver them as soon as they're available.
view.requestUnbufferedDispatch(event)
currentX = event.x
currentY = event.y
// Create a single point.
val segment = Segment(currentX, currentY, currentX, currentY)
frontBufferRenderer?.renderFrontBufferedLayer(segment)
}
To handle MotionEvent objects when the ACTION_MOVE constant is detected, follow these steps:
- In the ACTION_MOVE case, create a previousX variable that stores the currentX variable and a previousY variable that stores the currentY variable.
- Update the currentX variable with the MotionEvent object's current x coordinate and the currentY variable with its current y coordinate.
- Create a segment variable that stores a Segment object constructed with the previousX, previousY, currentX, and currentY values.
- Call the renderFrontBufferedLayer method with the segment parameter to trigger a callback to the onDrawFrontBufferedLayer function and execute OpenGL code.
FastRenderer.kt
class FastRenderer ( ... ) {
...
val onTouchListener = View.OnTouchListener { view, event ->
...
MotionEvent.ACTION_MOVE -> {
previousX = currentX
previousY = currentY
currentX = event.x
currentY = event.y
val segment = Segment(previousX, previousY, currentX, currentY)
// Send the short line to front buffered layer: fast rendering
frontBufferRenderer?.renderFrontBufferedLayer(segment)
}
- To handle MotionEvent objects when the ACTION_UP constant is detected, call the commit method to trigger a call to the onDrawDoubleBufferedLayer function and execute OpenGL code:
FastRenderer.kt
class FastRenderer ( ... ) {
...
val onTouchListener = View.OnTouchListener { view, event ->
...
MotionEvent.ACTION_UP -> {
frontBufferRenderer?.commit()
}
Implement the GLFrontBufferedRenderer callback functions
In the FastRenderer.kt file, the onDrawFrontBufferedLayer and onDrawDoubleBufferedLayer callback functions execute OpenGL code. At the beginning of each callback function, the following OpenGL functions map Android data to the OpenGL workspace:
- The GLES20.glViewport function defines the size of the rectangle in which you render the scene.
- The Matrix.orthoM function computes the ModelViewProjection matrix.
- The Matrix.multiplyMM function performs matrix multiplication to transform the Android data to the OpenGL reference, and provides the setup for the projection matrix.
FastRenderer.kt
class FastRenderer( ... ) {
...
override fun onDraw[Front/Double]BufferedLayer(
eglManager: EGLManager,
bufferInfo: BufferInfo,
transform: FloatArray,
params: Collection<Segment>
) {
val bufferWidth = bufferInfo.width
val bufferHeight = bufferInfo.height
GLES20.glViewport(0, 0, bufferWidth, bufferHeight)
// Map Android coordinates to OpenGL coordinates.
Matrix.orthoM(
mvpMatrix,
0,
0f,
bufferWidth.toFloat(),
0f,
bufferHeight.toFloat(),
-1f,
1f
)
Matrix.multiplyMM(projection, 0, mvpMatrix, 0, transform, 0)
With that part of the code set up for you, you can focus on the code that does the actual rendering. The onDrawFrontBufferedLayer callback function renders a small area of the screen. It provides a param value of type Segment so that you can render a single segment fast. The LineRenderer class is an OpenGL renderer for the brush that applies the color and size of the line.
To implement the onDrawFrontBufferedLayer callback function, follow these steps:
- In the FastRenderer.kt file, find the onDrawFrontBufferedLayer callback function.
- In the onDrawFrontBufferedLayer callback function's body, call the obtainRenderer function to get the LineRenderer instance.
- Call the LineRenderer object's drawLine method with the following parameters:
  - The projection matrix calculated previously.
  - A list of Segment objects, which is a single segment in this case.
  - The color of the line.
FastRenderer.kt
import android.graphics.Color
import androidx.core.graphics.toColor
class FastRenderer( ... ) {
...
override fun onDrawFrontBufferedLayer(
eglManager: EGLManager,
bufferInfo: BufferInfo,
transform: FloatArray,
param: Segment
) {
...
Matrix.multiplyMM(projection, 0, mvpMatrix, 0, transform, 0)
obtainRenderer().drawLine(projection, listOf(param), Color.GRAY.toColor())
}
- Run the app, and then notice that you can draw on the screen with minimal latency. However, the app doesn't persist the line yet because you still need to implement the onDrawDoubleBufferedLayer callback function.
The onDrawDoubleBufferedLayer callback function is called after the commit function to allow persistence of the line. The callback provides a params value, which contains a collection of Segment objects. All the segments on the front buffer are replayed in the double buffer for persistence.
To implement the onDrawDoubleBufferedLayer callback function, follow these steps:
- In the StylusViewModel.kt file, find the StylusViewModel class, and then create an openGlLines variable that stores a mutable list of lines, where each line is a list of Segment objects:
StylusViewModel.kt
import com.example.stylus.data.Segment
class StylusViewModel : ViewModel() {
private var _stylusState = MutableStateFlow(StylusState())
val stylusState: StateFlow<StylusState> = _stylusState
val openGlLines = mutableListOf<List<Segment>>()
private fun requestRendering(stylusState: StylusState) {
- In the FastRenderer.kt file, find the FastRenderer class's onDrawDoubleBufferedLayer callback function.
- In the body of the onDrawDoubleBufferedLayer callback function, clear the screen with the GLES20.glClearColor and GLES20.glClear methods so that the scene can be rendered from scratch, and then add the lines to the viewModel object to persist them:
FastRenderer.kt
class FastRenderer( ... ) {
...
override fun onDrawDoubleBufferedLayer(
eglManager: EGLManager,
bufferInfo: BufferInfo,
transform: FloatArray,
params: Collection<Segment>
) {
...
// Clear the screen with black.
GLES20.glClearColor(0.0f, 0.0f, 0.0f, 1.0f)
GLES20.glClear(GLES20.GL_COLOR_BUFFER_BIT)
viewModel.openGlLines.add(params.toList())
- Create a for loop that iterates through and renders each line from the viewModel object:
FastRenderer.kt
class FastRenderer( ... ) {
...
override fun onDrawDoubleBufferedLayer(
eglManager: EGLManager,
bufferInfo: BufferInfo,
transform: FloatArray,
params: Collection<Segment>
) {
...
// Clear the screen with black.
GLES20.glClearColor(0.0f, 0.0f, 0.0f, 1.0f)
GLES20.glClear(GLES20.GL_COLOR_BUFFER_BIT)
viewModel.openGlLines.add(params.toList())
// Render the entire scene (all lines).
for (line in viewModel.openGlLines) {
obtainRenderer().drawLine(projection, line, Color.GRAY.toColor())
}
}
- Run the app, and then notice that you can draw on the screen and that the line is preserved after the ACTION_UP constant is triggered.
7. Implement motion prediction
You can further improve latency with the androidx.input library, which analyzes the course of the stylus, predicts the location of the next point, and inserts it for rendering.
To set up motion prediction, follow these steps:
- In the app/build.gradle file, import the library in the dependencies section:
app/build.gradle
...
dependencies {
...
implementation"androidx.input:input-motionprediction:1.0.0-beta01"
- Click File > Sync project with Gradle files.
- In the FastRenderer.kt file's FastRenderer class, declare the motionEventPredictor object as an attribute:
FastRenderer.kt
import androidx.input.motionprediction.MotionEventPredictor
class FastRenderer( ... ) {
...
private var frontBufferRenderer: GLFrontBufferedRenderer<Segment>? = null
private var motionEventPredictor: MotionEventPredictor? = null
- In the attachSurfaceView function, initialize the motionEventPredictor variable:
FastRenderer.kt
class FastRenderer( ... ) {
...
fun attachSurfaceView(surfaceView: SurfaceView) {
frontBufferRenderer = GLFrontBufferedRenderer(surfaceView, this)
motionEventPredictor = MotionEventPredictor.newInstance(surfaceView)
}
- In the onTouchListener variable, call the motionEventPredictor?.record method so that the motionEventPredictor object gets motion data:
FastRenderer.kt
class FastRenderer( ... ) {
...
val onTouchListener = View.OnTouchListener { view, event ->
motionEventPredictor?.record(event)
...
when (event?.action) {
The next step is to predict a MotionEvent object with the predict function. We recommend predicting when an ACTION_MOVE constant is received and after the MotionEvent object is recorded. In other words, you should predict when a stroke is underway.
- Predict an artificial MotionEvent object with the predict method.
- Create a Segment object that uses the current and predicted x and y coordinates.
- Request fast rendering of the predicted segment with the frontBufferRenderer?.renderFrontBufferedLayer(predictedSegment) method:
FastRenderer.kt
class FastRenderer( ... ) {
...
val onTouchListener = View.OnTouchListener { view, event ->
motionEventPredictor?.record(event)
...
when (event?.action) {
...
MotionEvent.ACTION_MOVE -> {
...
frontBufferRenderer?.renderFrontBufferedLayer(segment)
val motionEventPredicted = motionEventPredictor?.predict()
if (motionEventPredicted != null) {
val predictedSegment = Segment(currentX, currentY,
motionEventPredicted.x, motionEventPredicted.y)
frontBufferRenderer?.renderFrontBufferedLayer(predictedSegment)
}
}
...
}
Predicted events are inserted into the rendering, which improves perceived latency.
- Run the app, and then notice the improved latency.
Reducing latency gives users a more natural stylus experience.
8. Congratulations
Congratulations! You know how to handle stylus input like a pro!
You learned how to process MotionEvent objects to extract information about pressure, orientation, and tilt. You also learned how to reduce latency by implementing both the androidx.graphics and androidx.input libraries. Implemented together, these enhancements offer a more organic stylus experience.