Unity: Live Video Streaming

I ran your code and it worked sometimes and failed sometimes (about 90% of the time). It ran at about 5 FPS on my computer. This will not play well on a mobile device, and I am sure you are targeting the iPad.

There are a few problems in your code, but they are very serious ones.


1. Your image is not completely received before you load it.

This is why your images look so weird.

The biggest mistake people make when working with sockets is assuming that everything you send will be received at once. That is not true, and that's the way your client is coded. Please read this.

This is the method I used in my answer:

A. Get the Texture2D byte array.

B. Send the byte array length. Not the byte array, but the length.

C. The client will read the length first.

D. The client will use that length to read the whole texture data/pixels until completion.

E. Convert the received bytes to an array.

You can look at the private int readImageByteSize(int size) and the private void readFrameByteArray(int size) functions to see how to read all the bytes.
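The "read until complete" loop from steps C and D can be sketched in Python for illustration (a hypothetical recv_exact helper over any file-like stream; the C# functions named above do the same thing with NetworkStream.Read):

```python
import io

def recv_exact(stream, size):
    """Read exactly `size` bytes from a stream, looping until done.

    A single read() may return fewer bytes than requested, so we keep
    reading and appending until the full payload has arrived.
    """
    chunks = bytearray()
    while len(chunks) < size:
        chunk = stream.read(size - len(chunks))
        if not chunk:  # stream closed before all bytes arrived
            raise ConnectionError("disconnected mid-frame")
        chunks.extend(chunk)
    return bytes(chunks)

# Simulate a stream carrying one frame's payload
payload = b"PNG-image-bytes-go-here"
assert recv_exact(io.BytesIO(payload), len(payload)) == payload
```

The key point is the loop condition: you never assume a single read delivered everything.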

Of course, you must also know the length of the length header that is sent first. The length is stored in an int data type.

The max int value is 2,147,483,647, which is 10 digits long. So, as a protocol, I made the length of the array that is sent first 15 bytes. This is a rule that must be obeyed on the client side too.

This is how it works now:

Read the byte array from the Texture2D, read the length of that array, and send it to the client. The client follows a rule that the first 15 bytes are simply the length. The client then reads those 15 bytes, converts them back into a length, and uses that length in a loop to read the complete Texture2D data from the server.

The length conversion is done with the void byteLengthToFrameByteArray(int byteLength, byte[] fullBytes) and int frameByteArrayToByteLength(byte[] frameBytesLength) functions. Take a look at those to understand them.
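The same 15-byte header idea can be sketched in Python for illustration (assuming a 4-byte little-endian int zero-padded to the fixed header size, which is what BitConverter.GetBytes produces on typical little-endian hardware; names here are hypothetical):

```python
import struct

HEADER_SIZE = 15  # fixed header length agreed on by server and client

def length_to_header(byte_length):
    # 4-byte little-endian int, zero-padded to the fixed header size
    return struct.pack("<i", byte_length).ljust(HEADER_SIZE, b"\x00")

def header_to_length(header):
    # Only the first 4 bytes carry the value; the rest is padding
    return struct.unpack("<i", header[:4])[0]

header = length_to_header(1048576)
assert len(header) == HEADER_SIZE
assert header_to_length(header) == 1048576
```

Because the header is a fixed size, the client always knows exactly how many bytes to read before the payload starts.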


2. Performing socket operations in the main Thread.

This is why the FPS is 5 on my computer.

Don't do this, as it will keep your frame rate low, just like it already is. I have answered many questions like this but won't go deep into it, because it looks like you know what you are doing: you tried to use a Thread but did it wrong.

A. You were reading from the main Thread when you called serverStream.Read(readBuffer, 0, readBuffer.Length); in the Update function.

You should have done that inside:

Loom.RunAsync(() =>
{
    //your read code
});

B. You made the same mistake in the SendPng function when you were sending data with stream.Write(pngBytes, 0, pngBytes.Length); inside

Loom.QueueOnMainThread(() =>
{
});

Anything you do inside Loom.QueueOnMainThread is done in the main Thread.

You are supposed to do the sending in another Thread: Loom.RunAsync(() => { });
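The division of labor can be sketched in Python for illustration (hypothetical names): blocking socket work runs on a worker thread, and results are handed to the main loop through a queue, which is the role Loom.QueueOnMainThread plays in Unity:

```python
import queue
import threading

frames = queue.Queue()  # hand-off from worker thread to "main thread"

def network_worker():
    # Stand-in for the blocking stream.Read / stream.Write calls:
    # this work must NOT run on the main (render) thread.
    for i in range(3):
        frames.put(f"frame-{i}")

worker = threading.Thread(target=network_worker)
worker.start()
worker.join()

# "Main thread": consume results without ever blocking on the socket
received = []
while not frames.empty():
    received.append(frames.get())

assert received == ["frame-0", "frame-1", "frame-2"]
```

The main loop only ever touches the queue, so it never stalls waiting on the network, which is exactly why moving Read/Write off the main Thread recovers the frame rate.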


Finally, listner = new TcpListener(port); is obsolete. This did not cause any problem, but use listner = new TcpListener(IPAddress.Any, port); in your server code, which will listen on any IP.

The final FPS is over 50 on my computer after making all these fixes. The code below can be improved a lot; I will leave that to you.

You can use an online code-compare tool to see what changed in each class.

SERVER:

using UnityEngine;
using System.Collections;
using System.IO;
using UnityEngine.UI;
using System;
using System.Text;
using System.Net;
using System.Net.Sockets;
using System.Threading;
using System.Collections.Generic;

public class Connecting : MonoBehaviour
{
    WebCamTexture webCam;
    public RawImage myImage;
    public bool enableLog = false;

    Texture2D currentTexture;

    private TcpListener listner;
    private const int port = 8010;
    private bool stop = false;

    private List<TcpClient> clients = new List<TcpClient>();

    //This must be the same as SEND_COUNT on the client
    const int SEND_RECEIVE_COUNT = 15;

    private void Start()
    {
        Application.runInBackground = true;

        //Start WebCam coroutine
        StartCoroutine(initAndWaitForWebCamTexture());
    }

    //Converts the data size to a byte array and puts the result into the fullBytes array
    void byteLengthToFrameByteArray(int byteLength, byte[] fullBytes)
    {
        //Clear old data
        Array.Clear(fullBytes, 0, fullBytes.Length);
        //Convert int to bytes
        byte[] bytesToSendCount = BitConverter.GetBytes(byteLength);
        //Copy result to fullBytes
        bytesToSendCount.CopyTo(fullBytes, 0);
    }

    //Converts the byte array to the data size and returns the result
    int frameByteArrayToByteLength(byte[] frameBytesLength)
    {
        int byteLength = BitConverter.ToInt32(frameBytesLength, 0);
        return byteLength;
    }

    IEnumerator initAndWaitForWebCamTexture()
    {
        // Open the Camera on the desired device, in my case the iPad Pro
        webCam = new WebCamTexture();
        // Get all devices, front and back camera
        webCam.deviceName = WebCamTexture.devices[WebCamTexture.devices.Length - 1].name;

        // Request the lowest width and height possible
        webCam.requestedHeight = 10;
        webCam.requestedWidth = 10;

        myImage.texture = webCam;

        webCam.Play();

        currentTexture = new Texture2D(webCam.width, webCam.height);

        // Start listening for clients
        listner = new TcpListener(IPAddress.Any, port);

        listner.Start();

        while (webCam.width < 100)
        {
            yield return null;
        }

        //Start sending coroutine
        StartCoroutine(senderCOR());
    }

    WaitForEndOfFrame endOfFrame = new WaitForEndOfFrame();

    IEnumerator senderCOR()
    {
        bool isConnected = false;
        TcpClient client = null;
        NetworkStream stream = null;

        // Wait for a client to connect in another Thread
        Loom.RunAsync(() =>
        {
            while (!stop)
            {
                // Wait for client connection
                client = listner.AcceptTcpClient();
                // We are connected
                clients.Add(client);

                isConnected = true;
                stream = client.GetStream();
            }
        });

        //Wait until a client has connected
        while (!isConnected)
        {
            yield return null;
        }

        LOG("Connected!");

        bool readyToGetFrame = true;

        byte[] frameBytesLength = new byte[SEND_RECEIVE_COUNT];

        while (!stop)
        {
            //Wait for End of frame
            yield return endOfFrame;

            currentTexture.SetPixels(webCam.GetPixels());
            byte[] pngBytes = currentTexture.EncodeToPNG();
            //Fill total byte length to send. Result is stored in frameBytesLength
            byteLengthToFrameByteArray(pngBytes.Length, frameBytesLength);

            //Set readyToGetFrame false
            readyToGetFrame = false;

            Loom.RunAsync(() =>
            {
                //Send total byte count first
                stream.Write(frameBytesLength, 0, frameBytesLength.Length);
                LOG("Sent Image byte Length: " + frameBytesLength.Length);

                //Send the image bytes
                stream.Write(pngBytes, 0, pngBytes.Length);
                LOG("Sending Image byte array data : " + pngBytes.Length);

                //Sent. Set readyToGetFrame true
                readyToGetFrame = true;
            });

            //Wait until we are ready to get a new frame (until we are done sending data)
            while (!readyToGetFrame)
            {
                LOG("Waiting To get new frame");
                yield return null;
            }
        }
    }

    void LOG(string message)
    {
        if (enableLog)
            Debug.Log(message);
    }

    private void Update()
    {
        myImage.texture = webCam;
    }

    // Stop everything
    private void OnApplicationQuit()
    {
        if (webCam != null && webCam.isPlaying)
        {
            webCam.Stop();
            stop = true;
        }

        if (listner != null)
        {
            listner.Stop();
        }

        foreach (TcpClient c in clients)
            c.Close();
    }
}

CLIENT:

using UnityEngine;
using System.Collections;
using UnityEngine.UI;
using System.Net.Sockets;
using System.Net;
using System.IO;
using System;

public class reciver : MonoBehaviour
{
    public RawImage image;
    public bool enableLog = false;

    const int port = 8010;
    public string IP = "192.168.1.165";
    TcpClient client;

    Texture2D tex;

    private bool stop = false;

    //This must be the same as SEND_COUNT on the server
    const int SEND_RECEIVE_COUNT = 15;

    // Use this for initialization
    void Start()
    {
        Application.runInBackground = true;

        tex = new Texture2D(0, 0);
        client = new TcpClient();

        //Connect to the server from another Thread
        Loom.RunAsync(() =>
        {
            LOGWARNING("Connecting to server...");
            // if on desktop
            client.Connect(IPAddress.Loopback, port);

            // if using the iPad
            //client.Connect(IPAddress.Parse(IP), port);
            LOGWARNING("Connected!");

            imageReceiver();
        });
    }

    void imageReceiver()
    {
        //A while loop in another Thread is fine so we don't block the main Unity Thread
        Loom.RunAsync(() =>
        {
            while (!stop)
            {
                //Read Image Count
                int imageSize = readImageByteSize(SEND_RECEIVE_COUNT);
                LOGWARNING("Received Image byte Length: " + imageSize);

                //Read Image Bytes and Display it
                readFrameByteArray(imageSize);
            }
        });
    }

    //Converts the data size to a byte array and puts the result into the fullBytes array
    void byteLengthToFrameByteArray(int byteLength, byte[] fullBytes)
    {
        //Clear old data
        Array.Clear(fullBytes, 0, fullBytes.Length);
        //Convert int to bytes
        byte[] bytesToSendCount = BitConverter.GetBytes(byteLength);
        //Copy result to fullBytes
        bytesToSendCount.CopyTo(fullBytes, 0);
    }

    //Converts the byte array to the data size and returns the result
    int frameByteArrayToByteLength(byte[] frameBytesLength)
    {
        int byteLength = BitConverter.ToInt32(frameBytesLength, 0);
        return byteLength;
    }

    /////////////////////////////////////////////////////Read Image SIZE from Server///////////////////////////////////////////////////
    private int readImageByteSize(int size)
    {
        bool disconnected = false;

        NetworkStream serverStream = client.GetStream();
        byte[] imageBytesCount = new byte[size];
        var total = 0;
        do
        {
            var read = serverStream.Read(imageBytesCount, total, size - total);
            //Debug.LogFormat("Client received {0} bytes", total);
            if (read == 0)
            {
                disconnected = true;
                break;
            }
            total += read;
        } while (total != size);

        int byteLength;

        if (disconnected)
        {
            byteLength = -1;
        }
        else
        {
            byteLength = frameByteArrayToByteLength(imageBytesCount);
        }
        return byteLength;
    }

    /////////////////////////////////////////////////////Read Image Data Byte Array from Server///////////////////////////////////////////////////
    private void readFrameByteArray(int size)
    {
        bool disconnected = false;

        NetworkStream serverStream = client.GetStream();
        byte[] imageBytes = new byte[size];
        var total = 0;
        do
        {
            var read = serverStream.Read(imageBytes, total, size - total);
            //Debug.LogFormat("Client received {0} bytes", total);
            if (read == 0)
            {
                disconnected = true;
                break;
            }
            total += read;
        } while (total != size);

        bool readyToReadAgain = false;

        //Display Image
        if (!disconnected)
        {
            //Display Image on the main Thread
            Loom.QueueOnMainThread(() =>
            {
                displayReceivedImage(imageBytes);
                readyToReadAgain = true;
            });
        }

        //Wait until the old Image is displayed
        while (!readyToReadAgain)
        {
            System.Threading.Thread.Sleep(1);
        }
    }

    void displayReceivedImage(byte[] receivedImageBytes)
    {
        tex.LoadImage(receivedImageBytes);
        image.texture = tex;
    }

    // Update is called once per frame
    void Update()
    {

    }

    void LOG(string message)
    {
        if (enableLog)
            Debug.Log(message);
    }

    void LOGWARNING(string message)
    {
        if (enableLog)
            Debug.LogWarning(message);
    }

    void OnApplicationQuit()
    {
        LOGWARNING("OnApplicationQuit");
        stop = true;

        if (client != null)
        {
            client.Close();
        }
    }
}

I want to play a live stream in Unity after extracting the video from the m3u file

I found a solution after a long search. Loading .ts files and converting them to a format readable by Unity is difficult, but the UMP package helped me read this format. That's why I'm using it right now.

Note: if the package does not work after downloading and trying it, don't lose hope; try again inside the GameObjectsExample scene.

How to load YouTube live video streaming in Unity?

The asset store has plugins for YouTube videos:
https://www.assetstore.unity3d.com/en/?stay#!/search/page=1/sortby=relevance/query=youtube

I'm pretty sure the Unity video player plays YouTube videos as well, if you pass it the actual mp4 YouTube video URL. (You need to extract it with those plugins, use some website API to extract it, or make your own.)

Livestream playback on HoloLens 2

The problem was copying from CPU to GPU; the SharpDX library allowed copying frames directly to an IDirect3DSurface. I'm attaching the code; maybe it will be useful.
The Direct3D11 helper wrapper classes are available in the Microsoft documentation:
https://learn.microsoft.com/en-us/windows/uwp/audio-video-camera/screen-capture-video#helper-wrapper-classes

private UnityEngine.GameObject MainCamera;
private UnityEngine.GameObject Display; // declaration was missing; it is assigned below
private UnityEngine.Texture2D targetTexture;

private IDirect3DSurface surface;
private SharpDX.Direct3D11.Device dstDevice;

private void AppCallbacks_Initialized()
{
    SharpDX.Direct3D11.Device srcDevice = new SharpDX.Direct3D11.Device(SharpDX.Direct3D.DriverType.Hardware);

    UnityEngine.WSA.Application.InvokeOnAppThread(() =>
    {
        Display = UnityEngine.GameObject.Find("Display");
        targetTexture = null;
        //Create a texture to get the Device and Device context from Unity
        UnityEngine.Texture2D deviceTexture = new UnityEngine.Texture2D(2048, 2048, UnityEngine.TextureFormat.RGBA32, false);
        IntPtr txPtr = deviceTexture.GetNativeTexturePtr();
        SharpDX.Direct3D11.Texture2D dstTexture = new SharpDX.Direct3D11.Texture2D(txPtr);
        dstDevice = dstTexture.Device;

        //Create the shared resource
        SharpDX.Direct3D11.Texture2DDescription sharedTextureDesc = dstTexture.Description;
        sharedTextureDesc.OptionFlags = SharpDX.Direct3D11.ResourceOptionFlags.Shared;
        SharpDX.Direct3D11.Texture2D m_DstTexture = new SharpDX.Direct3D11.Texture2D(dstDevice, sharedTextureDesc);

        SharpDX.Direct3D11.ShaderResourceViewDescription rvdesc = new SharpDX.Direct3D11.ShaderResourceViewDescription
        {
            Format = sharedTextureDesc.Format,
            Dimension = SharpDX.Direct3D.ShaderResourceViewDimension.Texture2D
        };
        rvdesc.Texture2D.MostDetailedMip = 0;
        rvdesc.Texture2D.MipLevels = 1;
        SharpDX.Direct3D11.ShaderResourceView rvptr = new SharpDX.Direct3D11.ShaderResourceView(dstDevice, m_DstTexture, rvdesc);

        targetTexture = UnityEngine.Texture2D.CreateExternalTexture(sharedTextureDesc.Width, sharedTextureDesc.Height, UnityEngine.TextureFormat.BGRA32, false, false, rvptr.NativePointer);
        MainCamera = UnityEngine.GameObject.Find("Main Camera");
        Display.GetComponent<UnityEngine.UI.RawImage>().texture = targetTexture;

        var sharedResourceDst = m_DstTexture.QueryInterface<SharpDX.DXGI.Resource>();
        var sharedTexDst = srcDevice.OpenSharedResource<SharpDX.Direct3D11.Texture2D>(sharedResourceDst.SharedHandle);
        surface = Direct3D11Helper.CreateDirect3DSurfaceFromSharpDXTexture(sharedTexDst);
        sharedResourceDst.Dispose();
        sharedTexDst.Dispose();
        dstTexture.Dispose();
        m_DstTexture.Dispose();
    }, false);

    InitializeMediaPlayer();
}

private void MediaPlayer_VideoFrameAvailable(Windows.Media.Playback.MediaPlayer sender, object args)
{
    Debug.WriteLine("frameAvail");
    sender.CopyFrameToVideoSurface(surface);
}



