Hello everyone.

We have decided to phase out the Mixed Reality Forums over the next few months in favor of other ways to connect with us.

The first way we want to connect with you is our mixed reality developer program, which you can sign up for at https://aka.ms/IWantMR.

The plan between now and the beginning of May is to clean up old, unanswered questions that are no longer relevant. The forums will remain open and usable.

On May 1st we will lock the forums to new posts and replies. They will remain searchable for another three months, and then they will be closed altogether on August 1st.

So, where does that leave our awesome community to ask questions? Well, there are a few places we want to engage with you. For technical questions, please use Stack Overflow, and tag your questions using either hololens or windows-mixed-reality. If you want to join in discussions, please do so in the HoloDevelopers Slack, which you can join by going to https://aka.ms/holodevelopers. And always feel free to hit us up on Twitter @MxdRealityDev.

Networking with the HoloLens


I hope this question is in the right category. I'm currently working on a client-server app, where the client runs on the HoloLens in Unity and the server runs on Linux. Because my latencies were very unstable, I gathered some data and made these graphs:

Without any other traffic
Sending a packet from the HoloLens to the server and back (TCP)

Sending a packet from the HoloLens to the server and back (UDP)

With traffic (around 300 * 62 bytes per second)
Sending a packet from the HoloLens to the server and back (TCP)

I'm using System.Net.Sockets.Socket on the HoloLens, with SendAsync() for TCP and SendToAsync() for UDP.
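For anyone who wants to reproduce the measurement outside Unity, here is a minimal sketch of the round-trip timing (a hypothetical Python echo server and timed client on loopback, not the actual HoloLens/.NET code; function names and the 62-byte payload size are assumptions based on the traffic described above):

```python
import socket
import threading
import time

def echo_one_datagram(sock):
    """Echo a single datagram back to its sender."""
    data, addr = sock.recvfrom(2048)
    sock.sendto(data, addr)

def measure_udp_rtt(payload=b"x" * 62):
    """Round-trip one datagram through a local echo socket and
    return the latency in milliseconds plus the echoed bytes."""
    srv = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    srv.bind(("127.0.0.1", 0))  # OS picks a free port
    addr = srv.getsockname()
    threading.Thread(target=echo_one_datagram, args=(srv,), daemon=True).start()

    cli = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    cli.settimeout(2.0)
    start = time.perf_counter()
    cli.sendto(payload, addr)
    echoed, _ = cli.recvfrom(2048)
    rtt_ms = (time.perf_counter() - start) * 1000
    cli.close()
    srv.close()
    return rtt_ms, echoed

if __name__ == "__main__":
    rtt, data = measure_udp_rtt()
    print(f"UDP round trip: {rtt:.3f} ms ({len(data)} bytes)")
```

Running the same timed client against the Linux server from a desktop machine on the same Wi-Fi would also help separate HoloLens-specific spikes from network-wide ones.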

I know that I should not use TCP for so many small packets, which is why I switched to UDP. But I still keep getting very large latency peaks. The server is connected to the HoloLens over the local Wi-Fi network, so no latency above 10 ms should occur.
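One possible contributor to the TCP peaks (an assumption, not something the graphs confirm) is Nagle's algorithm holding small writes back while waiting for ACKs. On a .NET Socket it can be disabled with the `NoDelay` property; the sketch below shows the same idea in Python with `TCP_NODELAY` set on a loopback round trip (hypothetical helper names, not the poster's code):

```python
import socket
import threading

def tcp_echo_once(srv):
    """Accept one connection and echo one message back."""
    conn, _ = srv.accept()
    conn.sendall(conn.recv(2048))
    conn.close()

def tcp_round_trip(payload=b"x" * 62):
    """Round-trip a small message over loopback TCP with Nagle's
    algorithm disabled on the client (TCP_NODELAY)."""
    srv = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    srv.bind(("127.0.0.1", 0))
    srv.listen(1)
    threading.Thread(target=tcp_echo_once, args=(srv,), daemon=True).start()

    cli = socket.create_connection(srv.getsockname(), timeout=2.0)
    # Without this, Nagle can batch small writes, which shows up as
    # latency spikes for tiny, frequent packets.
    cli.setsockopt(socket.IPPROTO_TCP, socket.TCP_NODELAY, 1)
    cli.sendall(payload)
    echoed = cli.recv(2048)
    cli.close()
    srv.close()
    return echoed
```

On the HoloLens side, the .NET equivalent would be setting `socket.NoDelay = true;` before sending, though whether that removes the peaks here is untested.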

Thanks for your help,



