Player plugin

This document explains how to create a player plugin for the Quick Brick framework.

Create your plugin project

In order to have a working environment for your plugin, you will need several things:

  • a repository for your plugin code
  • tooling to create the Quick Brick react-native entry point, so you can start the react-native server locally
  • the native source code for the app you are working with

Specific requirements

Because the Quick Brick app is built with react-native, and react-native is still implemented in Objective-C on iOS, it is important that the main player view class and all the properties declared below are annotated with @objc and declared as public.
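
For instance, here is a minimal sketch of the expected shape (the class name and props are placeholders, detailed in the examples below):

import React
import UIKit

// both the class and every bridged property must be public and annotated with @objc,
// so they are visible to the Objective-C runtime used by react-native
@objc public class MyPlayerView: UIView {
    @objc public var paused: Bool = false
    @objc public var onVideoEnd: RCTBubblingEventBlock?
}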

Anatomy of a QuickBrick player plugin

Simply put, the main export of your player plugin is a view, extending UIView on iOS, and FrameLayout on Android.

Your view must implement the properties listed below. Every time the react-native code sets or changes a property on the react-native component, the setter of the corresponding native property is invoked. Depending on the value of that property, you can trigger whatever action is required on the native side (play / pause, mute / unmute, seek, etc.).
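
For example, here is a hedged sketch of how the paused prop could drive play / pause on iOS (the player property is an assumed AVPlayer instance, for illustration only):

import AVFoundation
import UIKit

@objc public class PausablePlayerViewSketch: UIView {
    // `player` is an assumed backing AVPlayer, not part of the required API
    var player: AVPlayer?

    @objc public var paused: Bool = false {
        didSet {
            // invoked every time the JS side sets or changes the `paused` prop
            if paused {
                player?.pause()
            } else {
                player?.play()
            }
        }
    }
}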

Typically, the javascript front end will look something like this (there are more props and component methods, but this is just an example):

import React from "react";
import { requireNativeComponent } from "react-native";

// we get a reference to the underlying native component.
// You don't have to create a bridge: it's built-in, and will instantiate
// your player view by reflection
const NativeVideoComponent = requireNativeComponent("NativePlayerManager");

class PlayerComponent extends React.Component {
  onVideoLoadStart = ({ uri, isNetwork, type }) => {
    // will be called when loading starts
  };

  onVideoEnd = () => {
    // this is called when the video ends
  };

  render() {
    const { item } = this.props;
    return (
      <NativeVideoComponent
        playableItem={item}
        onVideoLoadStart={this.onVideoLoadStart}
        onVideoEnd={this.onVideoEnd}
      />
    );
  }
}

The properties typed below as functions are callbacks, which need to be invoked with the described payload when the relevant events are triggered.

On iOS & tvOS, things are pretty straightforward: the functions from the javascript side are received as RCTBubblingEventBlock properties, which can be invoked as regular functions, with a dictionary as argument:

import React
import UIKit

@objc public class MyPlayerView: UIView {
    // these callbacks can simply be invoked by passing a dictionary with the
    // properties described below
    @objc public var onVideoLoadStart: RCTBubblingEventBlock?
    @objc public var onVideoEnd: RCTBubblingEventBlock?

    @objc public var playableItem: NSDictionary? {
        didSet {
            // when the react component mounts, the `playableItem` prop is assigned,
            // which causes this setter to be called. if the prop changes,
            // it will invoke this setter again
            stopCurrentPlaybackIfNeeded()
            if let item = playableItem as? [String: Any],
               let content = item["content"] as? [String: Any],
               let sourceUrl = content["src"] as? String {
                preparePlayerWithContent(sourceUrl: sourceUrl)
            }
        }
    }

    func preparePlayerWithContent(sourceUrl: String) {
        // initialize the player to load the new content.
        // we also need to tell the js side that video loading started
        if let onVideoLoadStart = onVideoLoadStart {
            onVideoLoadStart([
                "uri": sourceUrl,
                "isNetwork": true,
                "type": "video/hls",
            ])
        }
    }

    func stopCurrentPlaybackIfNeeded() {
        // ... clean up current playback if any and needed ...
    }

    func videoEnds() {
        // clean up when the video ends, and invoke the js function
        if let onVideoEnd = onVideoEnd {
            onVideoEnd([:])
        }
    }
}

On Android, things are a bit more complicated: in practice, you fire events, and the corresponding javascript functions are invoked when those events are received. The payload sent with the event is a WritableMap from the React library, and is received as a plain javascript object on the javascript side. Fortunately, we've created an interface which lets you directly invoke these javascript functions from the native code. You simply need to implement the QuickBrickPlayer interface on your player view class, and fire the events when relevant, using the functions exposed by the interface:

class MyPlayerView(context: Context) : FrameLayout(context), QuickBrickPlayer {

    override fun setPlayableItem(playableItem: ReadableMap) {
        // these methods are called when the prop - in this case `playableItem` - is set on the js side.
        // you can implement them freely, depending on what you want to do when these properties are set.
        // in this example, you're likely to want to prepare the media for playback
        val uri = getUriStringFromPlayableItem(playableItem)
        preparePlayerWithContent(uri)
    }

    fun preparePlayerWithContent(uri: String) {
        // start loading the player - you need to call the function implemented in the interface to
        // fire the event, in this case `onVideoLoadStart`.
        // the argument is a `WritableMap` and can be constructed with the provided helper class
        val arguments = ReactArgumentsBuilder()
            .putString("uri", uri)
            .putBoolean("isNetwork", true)
            .putString("type", "video/hls")
            .build()
        // in the interface, this is calling sendEvent("onVideoLoadStart", arguments)
        onVideoLoadStart(arguments)
    }

    fun videoPlaybackEnd() {
        // same with all lifecycle events: simply invoke the method provided in the interface,
        // it will call the proper function prop in javascript
        onVideoEnd()
    }
}

Properties

Required:

  • src: { uri: string }

The source URL of the content to play.

  • entry: {}

The full entry to be played. Contains the source URL above, but also all the media item metadata (see the feed API to learn more).

  • paused: Boolean

    Will be set to true or false when content needs to be paused or unpaused.

  • muted: Boolean

    Will be set to true or false when content needs to be muted or unmuted.

  • controls: Boolean

    If set to false, the native controls shouldn't be presented at all. This prop won't change during playback of the item, and shouldn't be used to detect when the controls need to be dismissed after a few seconds. It is meant to provide an option, in certain situations, to override the native controls with custom js-based player controls.

  • volume: Float

    Sets the volume, from 0 to 1.

  • rate: Float

    Sets the rate of the playback.

  • seek: { time: Float, tolerance: Float }

    Will move the playback to a specific time in the video, with the provided tolerance. Once this is done, the onVideoSeek function needs to be called (see the sketch at the end of this section).

  • currentTime: Float

    Sets the current time of the playback. Should behave like setting the seek property to that time.

  • fullScreen: Boolean

    Allows the player to toggle fullscreen mode. The fullscreen event functions need to be called appropriately during the operation (see below onVideoFullscreenPlayerWillPresent, onVideoFullscreenPlayerDidPresent, onVideoFullscreenPlayerWillDismiss, onVideoFullscreenPlayerDidDismiss).

  • onVideoLoadStart: (onLoadStartEvent) => void

    Should be invoked when the loading of the video starts, with the following payload:

    onLoadStartEvent: {
      isNetwork: Boolean,
      type: string,
      uri: string,
    }

  • onVideoLoad: (onLoadEvent) => void

    Should be invoked when the video is loaded and playback can start, with the following payload:

    onLoadEvent: {
      currentPosition: Float,
      duration: Float,
      naturalSize: { width: Float, height: Float, orientation: "portrait" | "landscape" },
      audioTracks: [{ index: Int, title: String, language: String, type: String }],
      textTracks: [{ index: Int, title: String, language: String, type: String }],
    }

    The language prop is a 2-letter ISO 639-1 code; the type prop is the MIME type.

  • onVideoBuffer: () => void

Should be invoked when the video starts buffering.

  • onVideoError: (error) => void

Should be invoked when playback throws an error. This will trigger the display of an error screen in the app.

  • onVideoProgress: (onVideoProgressEvent) => void

    Should be invoked every second to provide the player's heartbeat, along with the following payload (see the sketch at the end of this section):

    onVideoProgressEvent: {
      currentTime: Float,
      playableDuration: Float,
      seekableDuration: Float,
    }

  • onVideoSeek: (onSeekEvent) => void

    Should be invoked when the seek operation completes, with the following payload:

    onSeekEvent: {
      currentTime: Float,
      seekTime: Float,
    }
  • onVideoEnd: () => void

    Should be called when the playback has reached the end of the video.

  • onVideoFullscreenPlayerWillPresent: () => void

    Should be invoked before setting the player to fullscreen.

  • onVideoFullscreenPlayerDidPresent: () => void

    Should be invoked after setting the player to fullscreen.

  • onVideoFullscreenPlayerWillDismiss: () => void

    Should be invoked before dismissing fullscreen.

  • onVideoFullscreenPlayerDidDismiss: () => void

    Should be invoked after dismissing fullscreen.

  • onReadyForDisplay: () => void

    Should be invoked when the content is ready to be played.

  • onPlaybackStalled: () => void

    Should be invoked when the playback of the video stalls.

  • onPlaybackResume: () => void

    Should be invoked when the playback of the video resumes after being paused, stalled or buffered.

  • onPlaybackRateChange: ({ playbackRate: Float }) => void

    Should be invoked when the playback rate of the video changes.

Optional:

  • resizeMode: String

    Provides the resize mode to use for the video.

  • allowsExternalPlayback: Boolean

    Set to true when the player should allow playback on external outputs (AirPlay, Chromecast...).

  • playInBackground: Boolean

    Set to true when the player should allow playback in the background.

  • onVideoExternalPlaybackChange: ({ isExternalPlaybackActive: Boolean }) => void

    Should be invoked when the playback is switched to an external output (AirPlay, Chromecast...).
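
To make the prop / callback contract concrete, here is a minimal sketch, assuming an AVPlayer-backed view, of how the seek prop and the onVideoSeek / onVideoProgress callbacks could be wired on iOS. The player and progressTimer properties are illustrative assumptions, not part of the required API:

import AVFoundation
import React
import UIKit

@objc public class SeekableViewSketch: UIView {
    // assumed backing player and timer, for illustration only
    var player: AVPlayer?
    var progressTimer: Timer?

    @objc public var onVideoSeek: RCTBubblingEventBlock?
    @objc public var onVideoProgress: RCTBubblingEventBlock?

    @objc public var seek: NSDictionary? {
        didSet {
            guard let seek = seek as? [String: Any],
                  let time = seek["time"] as? Double else { return }
            let tolerance = CMTime(seconds: seek["tolerance"] as? Double ?? 0, preferredTimescale: 600)
            player?.seek(to: CMTime(seconds: time, preferredTimescale: 600),
                         toleranceBefore: tolerance,
                         toleranceAfter: tolerance) { [weak self] _ in
                // once the seek operation completes, report it back to the js side
                self?.onVideoSeek?([
                    "currentTime": self?.player?.currentTime().seconds ?? 0,
                    "seekTime": time,
                ])
            }
        }
    }

    func startProgressHeartbeat() {
        // fire onVideoProgress every second while content is playing.
        // durations are simplified here; a real player should guard against
        // indefinite (NaN) durations and report the actual buffered range
        progressTimer = Timer.scheduledTimer(withTimeInterval: 1, repeats: true) { [weak self] _ in
            guard let self = self, let item = self.player?.currentItem else { return }
            self.onVideoProgress?([
                "currentTime": item.currentTime().seconds,
                "playableDuration": item.duration.seconds,
                "seekableDuration": item.duration.seconds,
            ])
        }
    }
}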

Requirements

In order to create your Player plugin, you will need an understanding of the following tools and frameworks:

  • node.js / npm
  • yarn & yarn workspaces
  • React & React-Native

Anatomy of a QuickBrick player plugin

Quick Brick plugins are React-Native modules, published as npm packages. They should contain all the relevant code for the feature, both the javascript front-end and the backing native layer.

When building the app, we first install the npm package for the plugin. The native layer then pulls the plugin's native code from the node_modules folder, where the npm package is installed. The manifests for each platform provide the information the native SDKs need to find the proper path to the ios podspec and the android gradle file.

In the end, your plugin source code will look like this:

|-- ios
| |-- MyPlayerPlugin.podspec
| |-- MyPlayerPlugin/... // native code
|-- android
| |-- build.gradle
| |-- com/package/src/main/java/... // native code
|-- src
| |-- index.js
| |-- ...
|-- manifests
| |-- tvos.json
| |-- android.json
| |-- samsung_tv.json
| |-- ...
|-- package.json
|-- .gitignore
|-- .npmignore
|-- README.md

And the manifests will include the following information:

// for all platforms, you need to provide the identifier, name & version of the npm package:
{
  "name": "My Player Plugin",
  "identifier": "my-player-plugin",
  "dependency_name": "@my-org/my-player-plugin", // this is the name of the npm package
  "dependency_version": "1.0.0", // current version on npm
  "manifest_version": "1.0.0", // version of the plugin in Zapp*
  "react_native": true,
  "type": "player",
  "targets": ["tv"],
  // + other available manifest properties...
}

// Then you need to provide information so the native layer can
// - make sure the npm package is installed
// - retrieve the native code inside node_modules

// tvos
{
  "platform": "tvos",
  "api": { "class_name": "<native class name>" },
  // this will tell the native layer to install this npm dependency
  "npm_dependencies": ["@my-org/my-player-plugin@1.0.0"],
  "extra_dependencies": [
    // This will be used by the native app to know where to pull the native layer for that plugin,
    // by adding this line in the app's main Podfile
    { "MyPlayerPlugin": ":path => './node_modules/@my-org/my-player-plugin/ios'" }
  ]
}

// android tv
{
  "platform": "android",
  "api": {
    "class_name": "com.applicaster.reactnative.plugins.APReactNativeAdapter",
    "react_packages": [
      // should point to the class that registers your native module to the React Native manager
      "com.my-org.my-player-plugin.MyPlayerPluginReactPackage"
    ]
  },
  // this will tell the native layer to install this npm dependency
  "npm_dependencies": ["@my-org/my-player-plugin@1.0.0"],
  "project_dependencies": [
    {
      // this will tell gradle to configure a project entry in settings.gradle, pointing
      // to the provided path, and add a compile project statement in the app's main gradle file
      "my-player-plugin": "node_modules/@my-org/my-player-plugin/android"
    }
  ]
}

// * although this is not mandatory, we encourage keeping the plugin manifest version in sync with the npm package version

Javascript front-end API

Connecting to Native

React Native provides an extensive API to allow communication between the javascript app and the native layer. Refer to the React Native documentation on native modules (for iOS and Android) for more details.

tvos / ios

On Apple platforms, you need 3 pieces to connect the native code to the javascript front-end:

  1. An extension of the RCTViewManager class:

import AVFoundation
import React

@objc(MyPlayerPlugin)
public class MyPlayerPlugin: RCTViewManager {
    // you need to define a name to require the native module on the JS side.
    // this must match the string in the @objc annotation above, and must be returned
    // by the moduleName method
    static let nativeModuleName = "MyPlayerPlugin"

    override public static func moduleName() -> String? {
        return MyPlayerPlugin.nativeModuleName
    }

    override public class func requiresMainQueueSetup() -> Bool {
        return true
    }

    override open var methodQueue: DispatchQueue {
        return bridge.uiManager.methodQueue
    }

    override public func view() -> UIView? {
        guard let eventDispatcher = bridge?.eventDispatcher() else { return nil }
        // here you simply return an instance of your player's native view.
        // it will be initialized with the eventDispatcher, which you can use
        // to send events from native to JS
        return MyPluginPlayerView(eventDispatcher: eventDispatcher)
    }

    @objc public override func constantsToExport() -> [AnyHashable: Any]! {
        return [
            "ScaleNone": AVLayerVideoGravity.resizeAspect as Any,
            "ScaleToFill": AVLayerVideoGravity.resize as Any,
            "ScaleAspectFit": AVLayerVideoGravity.resizeAspect as Any,
            "ScaleAspectFill": AVLayerVideoGravity.resizeAspectFill as Any
        ]
    }
}
  2. A declaration of the props available on the native component, using React Native's macros:
@import React;
#import <AVFoundation/AVFoundation.h>
#import <React/RCTViewManager.h>
@interface RCT_EXTERN_MODULE(MyPlayerPlugin, RCTViewManager)
RCT_EXPORT_VIEW_PROPERTY(src, NSDictionary);
RCT_EXPORT_VIEW_PROPERTY(entry, NSDictionary);
RCT_EXPORT_VIEW_PROPERTY(adTagUri, NSString);
RCT_EXPORT_VIEW_PROPERTY(resizeMode, NSString);
RCT_EXPORT_VIEW_PROPERTY(repeatVideo, BOOL);
RCT_EXPORT_VIEW_PROPERTY(allowsExternalPlayback, BOOL);
RCT_EXPORT_VIEW_PROPERTY(paused, BOOL);
RCT_EXPORT_VIEW_PROPERTY(muted, BOOL);
RCT_EXPORT_VIEW_PROPERTY(controls, BOOL);
RCT_EXPORT_VIEW_PROPERTY(volume, float);
RCT_EXPORT_VIEW_PROPERTY(playInBackground, BOOL);
RCT_EXPORT_VIEW_PROPERTY(playWhenInactive, BOOL);
RCT_EXPORT_VIEW_PROPERTY(ignoreSilentSwitch, NSString);
RCT_EXPORT_VIEW_PROPERTY(rate, float);
RCT_EXPORT_VIEW_PROPERTY(seek, NSDictionary);
RCT_EXPORT_VIEW_PROPERTY(currentTime, float);
RCT_EXPORT_VIEW_PROPERTY(fullScreen, BOOL);
RCT_EXPORT_VIEW_PROPERTY(filter, NSString);
RCT_EXPORT_VIEW_PROPERTY(progressUpdateInterval, float);
RCT_EXPORT_VIEW_PROPERTY(onVideoLoadStart, RCTBubblingEventBlock);
RCT_EXPORT_VIEW_PROPERTY(onVideoLoad, RCTBubblingEventBlock);
RCT_EXPORT_VIEW_PROPERTY(onVideoBuffer, RCTBubblingEventBlock);
RCT_EXPORT_VIEW_PROPERTY(onVideoError, RCTBubblingEventBlock);
RCT_EXPORT_VIEW_PROPERTY(onVideoProgress, RCTBubblingEventBlock);
RCT_EXPORT_VIEW_PROPERTY(onVideoSeek, RCTBubblingEventBlock);
RCT_EXPORT_VIEW_PROPERTY(onVideoEnd, RCTBubblingEventBlock);
RCT_EXPORT_VIEW_PROPERTY(onTimedMetadata, RCTBubblingEventBlock);
RCT_EXPORT_VIEW_PROPERTY(onVideoAudioBecomingNoisy, RCTBubblingEventBlock);
RCT_EXPORT_VIEW_PROPERTY(onVideoFullscreenPlayerWillPresent, RCTBubblingEventBlock);
RCT_EXPORT_VIEW_PROPERTY(onVideoFullscreenPlayerDidPresent, RCTBubblingEventBlock);
RCT_EXPORT_VIEW_PROPERTY(onVideoFullscreenPlayerWillDismiss, RCTBubblingEventBlock);
RCT_EXPORT_VIEW_PROPERTY(onVideoFullscreenPlayerDidDismiss, RCTBubblingEventBlock);
RCT_EXPORT_VIEW_PROPERTY(onReadyForDisplay, RCTBubblingEventBlock);
RCT_EXPORT_VIEW_PROPERTY(onPlaybackStalled, RCTBubblingEventBlock);
RCT_EXPORT_VIEW_PROPERTY(onPlaybackResume, RCTBubblingEventBlock);
RCT_EXPORT_VIEW_PROPERTY(onPlaybackRateChange, RCTBubblingEventBlock);
RCT_EXPORT_VIEW_PROPERTY(onVideoExternalPlaybackChange, RCTBubblingEventBlock);
RCT_EXPORT_VIEW_PROPERTY(onVideoSaved, RCTBubblingEventBlock);
RCT_EXPORT_VIEW_PROPERTY(onAdChangedState, RCTBubblingEventBlock);
@end
  3. An extension of UIView to implement your player:
import AVKit
import Foundation
import React

@objc public class MyPluginPlayerView: UIView {
    // all variables of type RCTBubblingEventBlock are javascript
    // functions passed as props to the native component.
    // you can easily invoke these javascript functions
    // from the native side by calling them as functions, passing
    // a dictionary of arguments
    @objc public var onVideoLoadStart: RCTBubblingEventBlock?
    @objc public var onVideoLoad: RCTBubblingEventBlock?
    @objc public var onVideoBuffer: RCTBubblingEventBlock?
    @objc public var onVideoError: RCTBubblingEventBlock?
    @objc public var onVideoProgress: RCTBubblingEventBlock?
    @objc public var onVideoSeek: RCTBubblingEventBlock?
    @objc public var onVideoEnd: RCTBubblingEventBlock?
    @objc public var onTimedMetadata: RCTBubblingEventBlock?
    @objc public var onVideoAudioBecomingNoisy: RCTBubblingEventBlock?
    @objc public var onVideoFullscreenPlayerWillPresent: RCTBubblingEventBlock?
    @objc public var onVideoFullscreenPlayerDidPresent: RCTBubblingEventBlock?
    @objc public var onVideoFullscreenPlayerWillDismiss: RCTBubblingEventBlock?
    @objc public var onVideoFullscreenPlayerDidDismiss: RCTBubblingEventBlock?
    @objc public var onReadyForDisplay: RCTBubblingEventBlock?
    @objc public var onPlaybackStalled: RCTBubblingEventBlock?
    @objc public var onPlaybackResume: RCTBubblingEventBlock?
    @objc public var onPlaybackRateChange: RCTBubblingEventBlock?
    @objc public var onVideoExternalPlaybackChange: RCTBubblingEventBlock?
    @objc public var onAdChangedState: RCTBubblingEventBlock?

    // other props are mapped to types automatically. When a prop is set on
    // the javascript side, `didSet` is invoked
    @objc public var entry: [String: Any]?
    @objc public var src: NSDictionary? {
        didSet {
            // what to do when the `src` prop is set or changed on the javascript side
        }
    }

    /* Required to publish events */
    var eventDispatcher: RCTEventDispatcher?

    public init(eventDispatcher: RCTEventDispatcher) {
        super.init(frame: .zero)
        self.eventDispatcher = eventDispatcher
    }

    public required init?(coder aDecoder: NSCoder) {
        return nil
    }

    deinit {
        // clean up when the player is unmounted from the javascript side
    }
}

Each of these props is a property declared on your native view, annotated with @objc.