To create your project and initialize the player instance, see Creating the player.
Creating a player
There are several ways to create a player instance. You can:
Create a player from a URL.
Create a player from a player item initialized from a URL.
Create a player from a player item initialized from an OTVAVURLAsset.
Create a player and set the player item.
Creating a player from a URL
CODE
let assetURL = URL(string: "https://d3bqrzf9w11pn3.cloudfront.net/basic_hls_bbb_clear/index.m3u8")!
let player = OTVAVPlayer(url: assetURL)
Creating a player from a player item initialized from a URL
CODE
let assetURL = URL(string: "https://d3bqrzf9w11pn3.cloudfront.net/basic_hls_bbb_clear/index.m3u8")!
let playerItem = OTVAVPlayerItem(url: assetURL)
let player = OTVAVPlayer(playerItem: playerItem)
Creating a player from a player item initialized from an OTVAVURLAsset
CODE
let assetURL = URL(string: "https://d3bqrzf9w11pn3.cloudfront.net/basic_hls_bbb_clear/index.m3u8")!
let asset = OTVAVURLAsset(url: assetURL)
let playerItem = OTVAVPlayerItem(asset: asset)
let player = OTVAVPlayer(playerItem: playerItem)
Creating a player and setting the player item
CODE
let assetURL = URL(string: "https://d3bqrzf9w11pn3.cloudfront.net/basic_hls_bbb_clear/index.m3u8")!
let asset = OTVAVURLAsset(url: assetURL)
let playerItem = OTVAVPlayerItem(asset: asset)
let player = OTVAVPlayer()
player.replaceCurrentItem(with: playerItem)
Attaching a player for audio/video rendering
Attaching a player to UIView
Before playing a stream, the player must be attached to a UIView so that the audio and video can be rendered.
Creating a view backed by an AVPlayerLayer to attach to a player
CODE
import UIKit
import AVFoundation

class PlayerView: UIView {
    // The player whose output is rendered by this view's layer.
    var player: AVPlayer? {
        get { return playerLayer.player }
        set { playerLayer.player = newValue }
    }

    var playerLayer: AVPlayerLayer {
        return layer as! AVPlayerLayer
    }

    // Back the view with an AVPlayerLayer instead of a plain CALayer.
    override class var layerClass: AnyClass {
        return AVPlayerLayer.self
    }
}
Setting the player on the view
CODE
@IBOutlet weak var playerView: PlayerView!
playerView.player = player
The following code can be added to control how the layer displays the video content within its bounds.
Setting the video rendering mode in the view
CODE
playerView.playerLayer.videoGravity = .resizeAspectFill
For more information, see the Apple document AVFoundation/AVLayerVideoGravity.
| AVLayerVideoGravity | Description |
| --- | --- |
| resizeAspect | The player should preserve the video’s aspect ratio and fit the video within the layer’s bounds. |
| resizeAspectFill | The player should preserve the video’s aspect ratio and fill the layer’s bounds. |
| resize | The video should be stretched to fill the layer’s bounds. |
Attaching a player to a SwiftUI view
The player must be attached to a SwiftUI VideoPlayer view so that the audio and video can be rendered during playback.
Attaching an OTVAVPlayer to a VideoPlayer view
CODE
VideoPlayer(player: otvAVPlayer)
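For context, a complete SwiftUI view hosting the player could look like the following sketch. The view name PlayerContentView, the way the player is created, and the call to play() on appear are illustrative assumptions rather than SDK requirements; VideoPlayer itself comes from AVKit.
CODE
import SwiftUI
import AVKit

struct PlayerContentView: View {
    // Assumed for illustration: an OTVAVPlayer created with one of the
    // initializers shown earlier (the SDK module import is not shown here).
    let otvAVPlayer = OTVAVPlayer(url: URL(string: "https://d3bqrzf9w11pn3.cloudfront.net/basic_hls_bbb_clear/index.m3u8")!)

    var body: some View {
        VideoPlayer(player: otvAVPlayer)
            .onAppear {
                // Start playback once the view appears.
                otvAVPlayer.play()
            }
    }
}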
Playing a stream
Play a stream that has been set by calling:
CODE
player.play() /* https://developer.apple.com/documentation/avfoundation/avplayer/1386726-play */
or
CODE
player.rate = 1.0 /* https://developer.apple.com/documentation/avfoundation/avplayer/1388846-rate */
or
CODE
player.playImmediately(atRate: 1.0) /* https://developer.apple.com/documentation/avfoundation/avplayer/1643480-playimmediately */
Pausing playback
Pause the playback by calling:
CODE
player.pause() /* https://developer.apple.com/documentation/avfoundation/avplayer/1387895-pause */
or
CODE
player.rate = 0.0 /* https://developer.apple.com/documentation/avfoundation/avplayer/1388846-rate */
Seeking through a playback stream
You can seek through a playback stream.
CODE
// Sets the current playback time to the specified time and executes the specified block when the seek operation completes or is interrupted.
func seek(to: CMTime, completionHandler: ((Bool) -> Void)?) /* https://developer.apple.com/documentation/avfoundation/avplayeritem/1387418-seek */
// Sets the current playback time within a specified time bound and invokes the specified block when the seek operation completes or is interrupted.
func seek(to: CMTime, toleranceBefore: CMTime, toleranceAfter: CMTime, completionHandler: ((Bool) -> Void)?) /* https://developer.apple.com/documentation/avfoundation/avplayeritem/1387753-seek */
// Sets the current playback time to the time specified by the date object.
func seek(to: Date, completionHandler: ((Bool) -> Void)?) -> Bool /* https://developer.apple.com/documentation/avfoundation/avplayeritem/1389877-seek */
// Cancels any pending seek requests and invokes the corresponding completion handlers if present.
func cancelPendingSeeks() /* https://developer.apple.com/documentation/avfoundation/avplayeritem/1388316-cancelpendingseeks */
CODE
// Returns the current time of the item.
func currentTime() -> CMTime /* https://developer.apple.com/documentation/avfoundation/avplayeritem/1387230-currenttime */
// Returns the current time of the item as an NSDate object.
func currentDate() -> Date? /* https://developer.apple.com/documentation/avfoundation/avplayeritem/1386188-currentdate */
// The duration of the item.
var duration: CMTime /* https://developer.apple.com/documentation/avfoundation/avplayeritem/1389386-duration */
// The timebase information for the item.
var timebase: CMTimebase? /* https://developer.apple.com/documentation/avfoundation/avplayeritem/1387605-timebase */
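As an illustration only (this helper is not part of the SDK), the values above can be combined to compute a simple progress fraction for a VOD stream:
CODE
import AVFoundation

// Minimal sketch for VOD content: playback progress as a fraction of the total duration.
// Returns nil while the duration is not yet known.
func playbackProgress(of item: AVPlayerItem) -> Double? {
    let duration = item.duration
    guard duration.isNumeric, duration.seconds > 0 else { return nil }
    return item.currentTime().seconds / duration.seconds
}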
CODE
// An array of time ranges indicating media data that is readily available.
var loadedTimeRanges: [NSValue] /* https://developer.apple.com/documentation/avfoundation/avplayeritem/1389953-loadedtimeranges */
// An array of time ranges within which it is possible to seek.
var seekableTimeRanges: [NSValue] /* https://developer.apple.com/documentation/avfoundation/avplayeritem/1386155-seekabletimeranges */
| API | VOD | LIVE |
| --- | --- | --- |
| currentTime() -> CMTime | The time of the current playback position. The value starts from 0 and increases up to the duration. | The time of the current playback position. The value starts near the end of the seekable window (three segments before the last one in the manifest) and keeps increasing. It should fall within the seekable range window. |
| currentDate() -> Date? | The date and time of the current playback position, taken from the program date time in the manifest, e.g. #EXT-X-PROGRAM-DATE-TIME:2022-02-19T14:54:23.031+08:00 | The date and time of the current playback position, taken from the program date time in the manifest, e.g. #EXT-X-PROGRAM-DATE-TIME:2022-02-19T14:54:23.031+08:00 |
| duration: CMTime | The duration of the content. | The duration of all the segments in the manifest. |
| seekableTimeRanges: [NSValue] | The time range of the static seekable window, [0, duration]. | The time range of the dynamic seekable window, [startTime, startTime + duration]. The startTime is the current time of the first segment in the manifest, and the value keeps updating. |
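To tie these calls together, the following sketch shows a relative seek clamped to the seekable window, which matters for live streams. The helper name, the 30-second offset, and the zero tolerances are arbitrary choices for illustration, not SDK APIs.
CODE
import AVFoundation

// Illustrative helper: seek forward by a fixed offset, clamped to the
// item's seekable range so the target remains valid for live streams.
func seekForward(in player: AVPlayer, by seconds: Double = 30) {
    guard let item = player.currentItem,
          let range = item.seekableTimeRanges.last?.timeRangeValue else { return }

    let target = item.currentTime() + CMTime(seconds: seconds, preferredTimescale: 600)
    let clamped = CMTimeClampToRange(target, range: range)

    item.seek(to: clamped, toleranceBefore: .zero, toleranceAfter: .zero) { finished in
        // finished is false if this seek was interrupted by another seek request.
        print("Seek finished: \(finished)")
    }
}
Passing zero tolerances requests an exact position, which can make the seek slower; for coarse scrubbing, seek(to:completionHandler:) without tolerances is usually sufficient.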
For more information, see the Apple document Seeking Through Media.
Observing the playback time
You can use the following code to observe changes to the current playback time and update the UI in the closure.
CODE
// You need to retain the returned timeObserverToken so that the closure keeps being called.
// The half-second interval below is an example value.
let interval = CMTime(seconds: 0.5, preferredTimescale: CMTimeScale(NSEC_PER_SEC))
timeObserverToken = player.addPeriodicTimeObserver(forInterval: interval, queue: .main) { [weak self] time in
    // Update the player UI for the new playback time.
}

// You also need to remove the observer when it is no longer needed:
if let token = timeObserverToken {
    player.removeTimeObserver(token)
    timeObserverToken = nil
}
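The Apple document referenced below also covers boundary time observers, which fire when playback crosses specific times instead of at a fixed interval. As a rough sketch (not taken from the SDK documentation, with an arbitrary 10-second boundary):
CODE
import AVFoundation

var boundaryObserverToken: Any?

func observeBoundary(on player: AVPlayer) {
    // Arbitrary example: get a callback when playback crosses the 10-second mark.
    let boundary = NSValue(time: CMTime(seconds: 10, preferredTimescale: 600))
    boundaryObserverToken = player.addBoundaryTimeObserver(forTimes: [boundary], queue: .main) {
        // Update the UI when the boundary time is reached.
        print("Reached the 10-second mark")
    }
}
As with the periodic observer, retain the returned token and pass it to removeTimeObserver(_:) when it is no longer needed.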
For more information, see the Apple document Observing the Playback Time.
Changing streams
The playback stream can be changed by calling:
CODE
let playerItem = OTVAVPlayerItem(url: assetURL)
player.replaceCurrentItem(with: playerItem)
Stopping playback
Playback can be stopped by calling:
CODE
player.replaceCurrentItem(with: nil)