mediapipe in Snap!
About mediapipe
"MediaPipe Solutions provides a suite of libraries and tools for you to quickly apply artificial intelligence (AI) and machine learning (ML) techniques in your applications." – GitHub/MediaPipe
Introducing mediapipe into Snap!
Goals
We want all the work of using mediapipe to be done entirely inside the Snap! IDE!
This has the following benefits:
There is no need to update the Snap! platform and no developer intervention is required; all the work is done in the user environment (just a Snap! project), which means ordinary users can continue to extend these capabilities. (This is an example of end-user programming.)
We can fully utilize the liveness of Snap! and enjoy an efficient and pleasant development experience.
Dynamic import
mediapipe provides a JavaScript API, so it seems we can bring mediapipe into Snap! through a JavaScript function.
import() can asynchronously and dynamically load ECMAScript modules, which is exactly what we need.
Let’s try to import the mediapipe library. mediapipe offers many features; the example we intend to build is Gesture Recognition (click to try it). I referred to the mediapipe documentation and its example.
The key parts are introduced below.
First, use import() to dynamically import the module:
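A minimal sketch of what this might look like (the CDN URL and version are illustrative, not taken from the original project):

```js
// Dynamically load mediapipe's tasks-vision module from a CDN.
// The URL and version are illustrative; adjust them to the release you use.
const vision = await import(
  "https://cdn.jsdelivr.net/npm/@mediapipe/tasks-vision@0.10.3/+esm"
);
const { FilesetResolver, GestureRecognizer } = vision;
```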
After importing, you can use the functions of the module:
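Roughly following the official Gesture Recognizer web example, creating the recognizer could look like this (the model path and options below come from that example and may differ from the Snap! project):

```js
// Resolve the WASM assets that back the vision tasks.
const filesetResolver = await FilesetResolver.forVisionTasks(
  "https://cdn.jsdelivr.net/npm/@mediapipe/tasks-vision@0.10.3/wasm"
);

// Create a GestureRecognizer from the pre-trained gesture task model.
const gestureRecognizer = await GestureRecognizer.createFromOptions(filesetResolver, {
  baseOptions: {
    modelAssetPath:
      "https://storage.googleapis.com/mediapipe-models/gesture_recognizer/gesture_recognizer/float16/1/gesture_recognizer.task",
    delegate: "GPU"
  },
  runningMode: "VIDEO",
  numHands: 2
});
```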
After that, you can use gestureRecognizer to recognize gestures in an image or a video frame.
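For a video stream, per-frame recognition might look like this sketch (assuming runningMode is "VIDEO" and video is an HTML video element showing the camera):

```js
// Recognize the current video frame; the timestamp must increase monotonically.
const results = gestureRecognizer.recognizeForVideo(video, performance.now());

// results.gestures holds the recognized gesture categories,
// results.landmarks holds the normalized hand landmark coordinates.
console.log(results.gestures, results.landmarks);
```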
Drawing hand landmarks
The example not only provides gesture results but also draws landmarks:
Details of landmarks:
From the example code, it can be seen that the drawing work uses drawingUtils. Instead of reading the drawingUtils code (which is usually very tedious), I prefer to draw in Snap! based on the raw data. This way, we can fully enjoy the benefits of Snap!’s liveness. Snap!’s pen feature is very easy to use and powerful, and we can use it to draw:
Specifically, use the pen of Snap! to draw the coordinate information contained in results.landmarks. In this process, it is necessary to convert mediapipe's coordinate system (origin at the top-left corner, y-axis pointing down) into the Snap! stage coordinate system (origin at the center, y-axis pointing up).
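As a sketch of that conversion (assuming the default 480 × 360 Snap! stage; mediapipe landmarks are normalized to 0–1):

```js
// Convert one normalized mediapipe landmark to Snap! stage coordinates
// (stage origin at the center, y-axis pointing up).
function toStageCoords(landmark, stageWidth = 480, stageHeight = 360) {
  return {
    x: (landmark.x - 0.5) * stageWidth,
    y: (0.5 - landmark.y) * stageHeight
  };
}
```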
During drawing, Snap!, as a product of the interactive personal computing vision, offers features (such as liveness) that are particularly helpful for exploring data, letting users observe the data interactively:
When you are interested in the current frame, you can pause the program and freeze the system at that moment:
As if time had stopped; after exploring the data of the current frame, you can resume the process.
In this project, we only plotted the data of one hand; results.landmarks may contain the data of both hands.
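If you want both hands, you could iterate over every entry, for example:

```js
// results.landmarks has one entry per detected hand;
// each entry is a list of 21 normalized landmark points.
for (const hand of results.landmarks) {
  for (const point of hand) {
    const { x, y } = toStageCoords(point); // conversion sketch from above
    // move the pen to (x, y) and draw
  }
}
```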
Interoperability with MicroBlocks
mediapipe can make full use of the computing power of edge devices (mobile phones/tablets) to provide AI features. Edge devices keep getting more powerful: Apple's recently launched M4 chip has the fastest Neural Engine in Apple's history, capable of up to 38 trillion operations per second. Apple's artificial intelligence strategy appears to be inference/learning on edge devices.
With the standardization of WebGPU, Snap! will be able to make full use of these capabilities. Since Snap! can also connect to MicroBlocks devices via BLE, it seems we could use a mobile phone as the computing power for a microcontroller. With frameworks like mediapipe, it seems quite easy to build a wireless AI camera out of a mobile phone (similar to the HuskyLens camera).
References
Author: 种瓜
Last updated: 2024-05-09