NDI plugin experiments?

Hello!

I'm currently testing everything I can with a Theta X, and I'm wondering why there are no NDI capabilities in any community plugins.

I will try the RTMP/RTSP live streaming plugin, but from what I can understand of the source code, it is basically an Android app that uses an all-in-one dependency (PedroLibrary) for encoding and streaming.

It seems like solutions exist for Android apps to implement NDI, like GitHub - WalkerKnapp/devolay: A Java binding for the NewTek NDI(tm) SDK, but I have no experience (yet!) in Android development (nor with the Theta).
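From my reading of the Devolay examples, the send side looks roughly like this. This is only a sketch: the package and method names are from memory of the Devolay README (check the repo for the current API), and the sender name, resolution, and pixel buffer are placeholders. On the Theta, the real frames would have to come from the camera pipeline:

```java
import me.walkerknapp.devolay.Devolay;
import me.walkerknapp.devolay.DevolayFrameFourCCType;
import me.walkerknapp.devolay.DevolaySender;
import me.walkerknapp.devolay.DevolayVideoFrame;

import java.nio.ByteBuffer;

public class NdiSenderSketch {
    public static void main(String[] args) {
        // Load the bundled NDI native libraries before any other call.
        Devolay.loadLibraries();

        // The sender name is what OBS/vMix will show when discovering sources.
        try (DevolaySender sender = new DevolaySender("THETA 360")) {
            DevolayVideoFrame frame = new DevolayVideoFrame();
            frame.setResolution(1920, 960);                  // 2:1 equirectangular
            frame.setFourCCType(DevolayFrameFourCCType.BGRA);
            frame.setFrameRate(30, 1);                       // 30/1 = 30 fps

            // Placeholder pixels (BGRA, 4 bytes per pixel); on the camera this
            // buffer would be filled from the preview/encoder pipeline.
            ByteBuffer pixels = ByteBuffer.allocateDirect(1920 * 960 * 4);
            frame.setData(pixels);

            // Send a few test frames; a real plugin would send one per camera frame.
            for (int i = 0; i < 300; i++) {
                sender.sendVideoFrame(frame);
            }
        }
    }
}
```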

Could someone help me experiment with it?
Has anyone already tried?

Thanks

I have heard of no attempts to use NDI.

What is the advantage over other approaches such as SRT or even RTMP?

What are your requirements?

Have you looked at this plug-in?

Hello craig (thanks a lot for your reply; you are the reason I bought a Theta X).

From what I've tested in the past, NDI gave me the best results with iPhones and Android phones as remote cameras, with both good resolution and low latency.

My requirement is to keep latency as low as possible (ideally <1 s), and I never managed to reach that kind of result with RTSP/RTMP (I tried running a local nginx RTMP server, but latency was huge).

I don’t know about SRT though.

I did some tests, and I'm hesitating between risking losing time getting NDI to work, or returning the Theta X, switching to a Theta Z1, and only using live USB mode at 4K.

The initial project is streaming for 2+ hours, 4K@30fps minimum, with low latency, no battery, and Ethernet.
Currently I can get 15 minutes @ 4K30fps with low latency before the camera gets too hot, because I can't remove the battery (I opened another topic about that), and the USB-C cable is a bit short.

Also, an issue with plugins is that I have to use the phone app to launch them, and I need Wi-Fi for other usage on my phone (I could use a spare phone if there is no alternative).

I tested sub-second latency with @biviel on his system using SRT. It may be possible for you to test it on his system if you contact him.

The THETA Z1 has better heat management for live streaming compared to the X. If streaming is your focus, you may want to start with the Z1. The plug-in from FlowTours (biviel) only works with the Z1.

Instead of focusing on removing the battery, you may want to assess the Z1 first, if you can get a full refund on the X.

What are the first 4 digits of your camera serial number? Does it start with one of these: YR11, YR12, YR13, YR14, or YR15?

I have a YR10. Why, do you know if some serials have a heat issue that others don't?

I'll dig into all solutions (notably SRT) before trying to return it, but the Z1 is not cheap. If I cannot get a better resolution than 4K@30fps, that would indeed be my best solution.

Hmm, my first test was 30 minutes in live mode.
Battery removed, ambient temperature 26°C, no fans, 4K30fps, connected to a MacBook Pro M1 into OBS, firmware 2.10.1.

I suppose the YR10 still has this heat issue? I'm kind of disappointed that they announced 24 hrs and that Amazon still ships the affected series.

In this notice, it seems like the most recent firmware upgrade improved heat management only for certain serial numbers:

If it were my money and time, I would check into returning the X (if you can get a full refund) and getting a Z1 (maybe you can even find a used one).

Also, maybe wait a few days and see if Laszlo responds. He had some additional heat management issues with the X when streaming using WiFi.

NOTE: I do not work for RICOH. This is just my personal opinion based on public information and my own personal tests.

@Remi_Jarasson, for me, live streaming wirelessly via a local WiFi network and through a cellular network, stabilizing the stream, optimizing heat, and minimizing latency were my top priorities over the last 2 years.

I guess you would like to live stream to OBS or other software on the local network (through WiFi or Ethernet) and then present it on a big wall with some effects, through a projector or another solution?

With my plugin on the Z1, I can stream directly to OBS, vMix, etc., or anything else that can ingest the SRT protocol and use it as a source.

There is one problem: the Z1 can stream only 24-25 FPS; 30 FPS is currently too high and not stable enough when rendering the equirectangular format. If dual fisheye is good enough for you, it can do 30 FPS without issues. Operation time varies depending on the ambient temperature. It can stream for hours…
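If you want to try ingesting it, a hypothetical setup (the host and port here are placeholders): in OBS, add a Media Source with the SRT URL, or sanity-check the stream first with ffplay if your ffmpeg build includes libsrt:

```
# The desktop side listens; the camera connects as the caller.
# The same URL works in an OBS Media Source input field.
ffplay -fflags nodelay "srt://192.168.1.50:9000?mode=listener"
```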

Thank you @craig!
I watched a lot of your videos before deciding to test the Theta X; I only saw today (in YouTube comments) your advice that the Z1 would be better for streaming.

Plus, I suppose any plugin I might implement on the Theta X (NDI) would also be interesting to have on a Z1, so that could still be OK.
I am a bit disappointed that USB is limited to 4K and that I cannot find a proper solution for 8K, so there is no point in having the X right now.
RAW stills could also be of interest to me (I do a bit of 3D, so having my own HDRIs would be cool).

I think I am going to return it to Amazon ASAP.

Oooh, thank you for your feedback @biviel!

I don't plan to project it anywhere; I'm more interested in providing an original way of streaming and filming music sessions.

25fps seems fine, but I may be wrong.
Can you provide more details?
Like, is 25fps in equirectangular a bit unstable through USB, or through your plugin?
Is it dropping frames at 25fps, or only at 30?
Concerning dual fisheye, what is it? Two unsynced webcam streams, or front and back in a single frame with seams (no stitching)?

My plan is to have an equirectangular video stream.
NDI seems to be the best solution I know of, but SRT could be something I try if the latency is OK (it's hard to delay audio or other cameras by more than a second, and it can make live video directing a pain).
But I like the "webcam"/MJPEG solution, because I tried the Theta X in Chrome with Three.JS, and it could really be a good directing tool.
Like… you have a 360 view you can control with a PS4 joystick, and you stream the show with 3D overlays, directing only with the joystick.

Is this a potential type of ongoing business where you get money from music groups and you could potentially pay FlowTours for a streaming service on the backend if it worked? I’m only asking because Laszlo is a great engineer, but he’s been too busy to put together a good marketing or reseller channel for his service.

I’m trying to help him find use cases for his FlowTours system because he put so much time into developing the technology.

At the current time, his plug-in is free and it can stream to YouTube at no cost. However, I believe he’s building a commercial streaming system with more features such as low latency.

It could be an opportunity to collaborate with the developer of a streaming system and explore business opportunities together.

@craig currently it’s just exploratory work.

But I'm hoping to host some events with friends on Twitch or YouTube, maybe provide a local service for musicians and hobbyists (like DnD players, cooking streams, and such), and maybe provide my own set of tools for using 360 cameras with the solutions that I find workable.
I have no plan to pay for a streaming service, nor to build one.
But I would be glad to somehow share the solution I find to achieve my project :slight_smile:

If you stream on YouTube or Twitch, would NDI improve the latency? I don’t know how NDI works. I believe a lot of the latency is due to the YouTube servers.

@craig NDI is a network protocol for transmitting audio & video locally; it's not used to stream directly to a streaming service.
It's compatible with some professional gear, and there are plugins for video software like OBS, vMix, and such.
One of the advantages is that each source announces itself by name (via Bonjour, I think), and it works on the LAN: you can connect your camera wirelessly to OBS, and connect/disconnect is handled automatically. This is the lowest-latency protocol I have tested with smartphones. Elgato also has an app called EpocCam, which uses a proprietary protocol and works well, but currently NDI is the best solution for LAN use (for example, sending an OBS video to another OBS in a dual-computer streaming setup with NDI is pretty common).
NDI is licensed and a registered trademark, but in an open source project, I think you just have to mention the trademarks correctly according to their license.

i.e., NDI is basically SDI, but over the network.
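To illustrate the automatic discovery part: with the Devolay binding mentioned above, finding sources on the LAN looks roughly like this. Again a sketch from memory of the Devolay README; the exact class and method names may differ in the current version:

```java
import me.walkerknapp.devolay.Devolay;
import me.walkerknapp.devolay.DevolayFinder;
import me.walkerknapp.devolay.DevolaySource;

public class NdiDiscoverySketch {
    public static void main(String[] args) {
        Devolay.loadLibraries();

        // The finder listens for sources announced over mDNS on the LAN;
        // no IP addresses or manual configuration are needed.
        try (DevolayFinder finder = new DevolayFinder()) {
            DevolaySource[] sources;
            // Poll until at least one source appears, waiting up to 5 s each time.
            while ((sources = finder.getCurrentSources()).length == 0) {
                finder.waitForSources(5000);
            }
            for (DevolaySource source : sources) {
                System.out.println("Found NDI source: " + source.getSourceName());
            }
        }
    }
}
```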

Thanks for the explanation. I quickly looked at the devices here:

It may be possible to implement NDI on the Android OS in the Z1 or the X. The Z1 and the X run a version of the Snapdragon chipset from Qualcomm, so to the OS, the camera looks like a mobile phone. However, as the camera has two lenses that need to be operated simultaneously, applications need to be modified.

If people want to attempt the plug-in, the THETA must be unlocked. It is free to join the partner program, but unlocking may void the warranty of the camera.

https://www8.webcas.net/db/pub/ricoh/thetaplugin/create/input

One note of caution: while THETA cameras can stream, they are not specifically designed for long-term unattended operation. Thus, you can't expect the same level of stability in the workflow as with the cameras listed on the NDI.tv site. For example, it is not easy to turn on the Z1 over the USB cable if it is in a powered-off state. Most people manually press the power button to turn on the Z1.

I did some research on NDI.

options to get signal into computer

  • USB cables have a max 15′ limit before a USB extender is needed
  • USB to SDI
  • NDI: video over IP, often using Power over Ethernet

using NDI

  • OBS requires the NDI plug-in to use an NDI camera as a source

hi!

welcome!

24 FPS is stable for sure; with a good network, I don't think any frame drops occurred for me during tests.
At 25 FPS, I think I noticed audio/video sync/delay issues, but it seems you mainly want to consume only the video and mix it in OBS with other sources, right?

Dual fisheye is just the two lenses projected into the video frame; search for it on Google and you will see what it looks like.

I'm working on an upgrade to my plugin so that the Z1 will be able to stream wirelessly, directly to any desktop or to any remote server over the internet via SRT. OBS and vMix can ingest it as a video and audio source, like a webcam. Latency on a local network is, I think, less than 1 second.

I'm not sure what you mean by "directing tool" here; could you please explain what you did with the Theta X?

So you want to ingest video from the Theta Z1 into OBS, or stream from your desktop while you are watching the 360 view in a browser and moving the actual view with a joystick? Am I describing it correctly? You do not want to stream the full 360 view, but only to cut out a regular 2D part of the video and stream it on to the cloud, like YouTube or elsewhere? Is this what you mean by "directing"?

Thanks @biviel.
I see what dual fisheye means now; I'm not sure it would be useful (I think it's not a common projection for mapping onto a sphere in 3D software).

Anyway, the Z1 is on the way, so I'll be able to test it.

Yes, I'm mostly interested in video; I have plenty of ways of capturing audio.
But what I fear is that the sync issue could accumulate.
When publishing or streaming video, unsynced audio & video will result in a massive fail (no matter the quality of either). People can excuse a few frame drops, but not 300 ms of delay; particularly for a live music stream, this is unbearable. So it will require some compensating delay to make it work, but I know for sure that RTSP's latency is completely unpredictable.

Directing, like a TV director choosing camera angles and scenes, but with a single 360 camera. I may not have the right terminology here.

I like the idea that the camera is not moving like a PTZ would; only a part of the view is projected to the video output (a stream, a video, anything) - or maybe both: 360 on YouTube with spatial audio, and a 2D stream on Twitch.
And using a WebView is one way I found to do it.
I did not test it directly in OBS as a Browser Source; I had issues in the past with the CEF used by OBS, which does not have access to some media devices (maybe I won't be able to access the webcam or the joystick). But yes, basically, I can window-capture a Chrome tab (or a standalone application) and use it as the camera projection, like you would do in Unity or Unreal, but simply with WebGL.
And the joystick would be a way to control the framing: FOV, pan/tilt/zoom, and to add 3D overlays, key points, 3D effects or shaders… I don't know yet.
I tried it quickly, to get a "lensball" effect, and I liked the result.
Plus, with a WebView, I can connect to OBS over WebSocket, so I can make keyframes in sync with OBS scenes, etc…

But I think this is a bit off topic. My main issue is that I cannot do a live stream if there is a 5 s delay, and it would be horrible if I had other cameras with a different latency.
I may experiment with NDI.

The FlowTours system by @biviel can do 1 s latency or less; I have only used it in a test environment. It may be in production now.
