Unfortunately, Progressive Download is the only ubiquitously supported option
Different browsers support different video codecs - H.264 - WebM (VP8/VP9) - etc.
Safari (iOS and macOS only) natively supports HLS
Media Source Extensions (MSE) released in Chrome, IE11, and Safari - Firefox support in beta
[NB] For the video to stream smoothly without buffering, the video on the server has to be stored in segments/chunks. For example, the first segment would be an initialization segment (called Head.m4s here), and the following segments can vary in size. YouTube actually stores video and audio segments separately for fast transport over HTTP. The data is received as bytes in the form of an ArrayBuffer, which is cast to a typed view with Uint8Array(response); those bytes are then fed into a SourceBuffer by appending them to the stream.
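The byte handling described above can be sketched as follows. `fetch`, `MediaSource`, and `SourceBuffer` are browser-only APIs, so the append step is shown in comments; the typed-array cast itself runs anywhere, and the byte values are illustrative:

```javascript
// Sketch: how segment bytes travel from the network into an MSE SourceBuffer.

// Simulate a downloaded segment: the HTTP response body arrives as an ArrayBuffer.
const response = new ArrayBuffer(4);

// Cast the raw buffer to a typed byte view, as the notes describe.
const bytes = new Uint8Array(response);
bytes.set([0x00, 0x00, 0x00, 0x18]); // illustrative bytes of an ISO-BMFF box header

// In a browser you would then append the bytes to a SourceBuffer:
//   sourceBuffer.appendBuffer(bytes);
// and wait for its 'updateend' event before appending the next segment.

console.log(bytes.length); // 4
```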
The files have a manifest that tells the player where the video segments are located
What is MPEG-DASH
DASH - Dynamic Adaptive Streaming over HTTP
An international open standard developed and published by ISO
Addresses both simple and advanced use cases
Enables highest-quality multi-screen distribution and efficient dynamic adaptive switching
Enables reuse of existing content, devices, and infrastructure
Attempts to unify HTTP streaming under a single standard
[NB] It is the only open standard for HTTP streaming today. HLS is owned by Apple, Smooth Streaming by Microsoft, and HDS by Adobe; these and others are proprietary protocols owned by their respective companies. The DASH specification was created more recently with input from these three companies and builds on those proprietary protocols.
Advantages of Using Dash
Provides a clean separation between audio and video files (for multi-language video, it becomes easy to swap out audio)
The DASH specification is codec agnostic
Any existing or future codec can work with DASH
Allows a single manifest to describe several different versions in different codecs (it doesn't care whether you are dealing with WebM or MP4, etc.; it simply describes what the content is and lets the player decide what it knows how to play)
Disadvantages
Understanding what "now" is (the live edge) for live videos is difficult
Building a DASH player
A DASH player is available as open source for different platforms: HTML5, Flash, and Android
DASH.js is open source
DASH.js is the reference player for the DASH industry Forum (dashif.org)
DASH.js includes the bitrate (ABR) algorithms and buffering logic
How to play a DASH Stream
Download Manifest
Parse Manifest
Determine optimal bandwidth for the client
Initialize for bandwidth
Download Segment
Hand segment to MSE
Check bandwidth to determine if a quality change is necessary
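The bandwidth-check step above can be sketched as a simple selection rule. The representation data and the safety factor below are hypothetical, not the actual DASH.js ABR logic:

```javascript
// Sketch: pick the highest-bandwidth representation the measured
// throughput can sustain, with a safety margin.

const representations = [
  { id: 'low',  bandwidth: 500000 },  // bits per second, from the manifest
  { id: 'mid',  bandwidth: 1500000 },
  { id: 'high', bandwidth: 4000000 },
];

function pickRepresentation(reps, measuredBps, safetyFactor = 0.8) {
  // Sort ascending by bandwidth, then keep the best one that fits
  // within a safety margin of the measured throughput.
  const sorted = [...reps].sort((a, b) => a.bandwidth - b.bandwidth);
  let chosen = sorted[0]; // never go below the lowest quality
  for (const rep of sorted) {
    if (rep.bandwidth <= measuredBps * safetyFactor) chosen = rep;
  }
  return chosen;
}

console.log(pickRepresentation(representations, 2000000).id); // 'mid'
```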
DASH Manifest
You have a Period section; this can be used for advertising (ad insertion)
An AdaptationSet section can be used for audio and video (it is more like a web.config file for videos). The adaptive bitrate algorithm determines which Representation within the AdaptationSet it wants to render.
When the client first downloads the manifest file, the DASH player measures how fast the file was downloaded and uses that to estimate the client's bandwidth.
Understand DASH Structure
Three types of files:
- Manifest (.mpd) - XML file describing the segments
- Initialization file - contains headers needed to decode bytes in segments
- Segment files - contain playable media, including 0 to many video tracks and 0 to many audio tracks
[Manifest File]
- SegmentBase - describes a stream with only a single segment per bitrate; can be used for byte-range requests
- SegmentList - a list of SegmentURLs, individual HTTP requests for media data; can be used for byte-range requests
- SegmentTemplate - defines a known URL for the fragment with wildcards resolved at run-time to request segments; alternatively, can specify a list of segments based on duration
Simple SegmentList - each Representation carries attributes such as id, mimeType, codecs, width, height, startWithSAP='1', bandwidth, and so on
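A minimal MPD using a SegmentList might look like the following; all values and file names are made up for the sketch:

```xml
<!-- Illustrative MPD sketch, not a production manifest -->
<MPD xmlns="urn:mpeg:dash:schema:mpd:2011" type="static"
     mediaPresentationDuration="PT30S" minBufferTime="PT2S"
     profiles="urn:mpeg:dash:profile:isoff-main:2011">
  <Period>
    <AdaptationSet mimeType="video/mp4" codecs="avc1.4d401f">
      <Representation id="720p" width="1280" height="720"
                      startWithSAP="1" bandwidth="1500000">
        <SegmentList duration="4">
          <Initialization sourceURL="init-720p.mp4"/>
          <SegmentURL media="seg-720p-1.m4s"/>
          <SegmentURL media="seg-720p-2.m4s"/>
        </SegmentList>
      </Representation>
    </AdaptationSet>
  </Period>
</MPD>
```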
SegmentTemplate with a fixed segment duration - AdaptationSet ContentComponent id='1'. The fixed segment duration defines how long before another segment starts loading; looking ahead and loading the next segment before the current one ends keeps the video from buffering.
Class Structure -The player is divided into two main packages -streaming - contains the classes responsible for creating and populating the MediaSource buffers. These classes are intended to be abstract enough for use with any segmented stream (such as DASH, HLS, HDS, and MSS) -dash - contains the classes responsible for making decisions specifically related to DASH.
Dash API (mediaPlayer.js) -Exposes the top-level functions and properties to the developer (play, autoPlay, isLive, ABR quality, and metrics) -The manifest URL and the HTML video object are passed to the MediaPlayer
Dependency Injection -Uses a class called Context from Context.js -The dependency mapping for the stream package. -The context is passed into the MediaPlayer object allowing for different MediaPlayer instances to use different mappings.
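The context idea can be sketched in a few lines. The names below are hypothetical; the real Dash.di.DashContext mapping is more involved:

```javascript
// Sketch of dependency injection via a context object: a context maps names
// to factories, and consumers resolve through it, so swapping the context
// swaps the implementations.
function createContext(mappings) {
  return { resolve: (name) => mappings[name]() };
}

// Two contexts with different debug implementations.
const quietContext = createContext({ debug: () => ({ log: () => {} }) });
const loudContext  = createContext({ debug: () => ({ log: (m) => console.log(m) }) });

// A "player" only sees whatever the injected context provides.
function createPlayer(context) {
  return { debug: context.resolve('debug') };
}

createPlayer(loudContext).debug.log('using the loud context');
```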
Stream.js -Loads/refreshes the manifest -Creates SourceBuffers from the MediaSource -Responds to events from the HTML video object -For a live stream, the live edge is calculated and passed to the BufferController instances.
Debug.js (class) -Convenience class for logging methods -A default implementation is to just use console.log() -Extension point for tapping into logging messages
BufferController.js -Responsible for loading fragments and pushing the bytes into the SourceBuffer. -Once play() has been called a timer is started to check the status of the bytes in the buffer. -If the amount of time left to play is less than Manifest.minBufferTime the next fragment is loaded -Records metrics related to playback.
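The timer check described above reduces to a small predicate. Names are hypothetical, not the BufferController source:

```javascript
// Sketch: decide whether the next fragment should be loaded, based on how
// much playable time remains in the buffer versus the manifest's minBufferTime.
function shouldLoadNextFragment(bufferedEndSec, currentTimeSec, minBufferTimeSec) {
  const remaining = bufferedEndSec - currentTimeSec; // seconds left to play
  return remaining < minBufferTimeSec;
}

console.log(shouldLoadNextFragment(12, 10, 4)); // true  (2s left < 4s minimum)
console.log(shouldLoadNextFragment(20, 10, 4)); // false (10s left is plenty)
```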
ManifestLoader.js -Responsible for loading manifest files -Returns the parsed manifest object
FragmentLoader.js -Responsible for loading fragments -Loads requests sequentially
AbrController.js -Responsible for deciding if the current quality should be changed -The stream metrics are passed to a set of 'rules'. -Methods: getPlaybackQuality(type, data): type = the type of data, audio or video
Rules: -DownloadRatioRule.js -Validates that fragments are being downloaded in a timely manner. -Compares the time it takes to download a fragment to how long it takes to play out a fragment -If the download time is considered a bottleneck, the quality will be lowered.
-InsufficientBufferRule.js -Validates that the buffer doesn't run dry during playback -If the buffer repeatedly runs dry, it likely means the player has a processing bottleneck (video decode time is longer than playback time)
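The download-ratio idea can be sketched as follows; the function, thresholds, and quality indices are assumptions for illustration, not the actual dash.js rule code:

```javascript
// Sketch: compare how long a fragment took to download with how long it
// plays for, and step the quality index down (bottleneck) or up (headroom).
function nextQualityIndex(current, downloadSec, fragmentDurationSec, maxIndex) {
  const ratio = fragmentDurationSec / downloadSec; // > 1 means we downloaded faster than real time
  if (ratio < 1 && current > 0) return current - 1;        // download is a bottleneck: lower quality
  if (ratio > 2 && current < maxIndex) return current + 1; // plenty of headroom: raise quality
  return current;                                          // otherwise hold steady
}

console.log(nextQualityIndex(2, 5, 4, 4)); // 1 (4s of media took 5s to fetch)
console.log(nextQualityIndex(1, 1, 4, 4)); // 2 (4s of media took only 1s)
```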
DASH packages -DashContext.js - Defines dependency mapping specific to the dash package. -Parser, Index Handler and Manifest Extensions
DashParser.js -Converts the manifest to a JSON object -Converts duration and DateTime strings into number/date objects -Manages inheritance fields. - many fields are inherited from parent to child nodes in DASH
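The duration conversion mentioned above can be sketched with a hand-rolled parser for the simple `PT…H…M…S` form used in MPD files; this is an illustration, not the dash.js parser:

```javascript
// Sketch: convert a simple ISO 8601 duration string (e.g. "PT1H2M3.5S",
// as used for MPD attributes like mediaPresentationDuration) into seconds.
function parseIsoDuration(str) {
  const m = str.match(/^PT(?:(\d+(?:\.\d+)?)H)?(?:(\d+(?:\.\d+)?)M)?(?:(\d+(?:\.\d+)?)S)?$/);
  if (!m) throw new Error('unsupported duration: ' + str);
  const [, h = 0, min = 0, s = 0] = m; // missing components default to 0
  return Number(h) * 3600 + Number(min) * 60 + Number(s);
}

console.log(parseIsoDuration('PT1H2M3.5S')); // 3723.5
console.log(parseIsoDuration('PT30S'));      // 30
```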
DashHandler.js -Responsible for deciding which fragment URL should be loaded -Methods: getInitRequest(quality)
Flow
-Create the Context and MediaPlayer instances:
  var context = new Dash.di.DashContext();
  player = new MediaPlayer(context);
-Initialize the MediaPlayer and set the manifest URL:
  player.startup();
  player.setIsLive(false);
  player.attachSource(manifest_url);
-Attach the HTML video element:
  video = document.querySelector('.dash-video-player');
  player.autoPlay = true;
  player.attachView(video);
If you are working in .NET Core 2.2 and you have a separate video folder inside your wwwroot assets folder, you will have to add some code to your Startup class in order to serve the video folder through the ASP.NET Core 2.2 static asset pipeline.