Navigate to the section:
- Introduction
- Setting up the video grabber
- Configuring USB adalight LED strip on the primary instance
- Configuring smoothing and color calibration
- Creating an additional instance for another light source
- Setting up the Philips Hue device
- Configuring and enabling sound visualization effects
- Remotely controlling HDR tone mapping
- Logs are your best friend
Introduction
For the tutorial in this chapter I connect from Apple macOS to HyperHDR installed on a Raspberry Pi 4, but the steps are almost the same on other operating systems. From version 16 you can alternatively install HyperHDR on macOS itself if you are interested. An Ezcap 269 is used as the grabber.
To make this tutorial a bit more challenging I will configure two LED instances: a classic SK6812 LED strip on the back of the TV and a remote instance of Philips Hue lamps. Both will be used at the same time for deeper ambient-lighting immersion. Of course, you may use just one instance with an LED strip and skip the part about the second instance.
OK, let's start. Connect to HyperHDR at:
http://IP_OF_HYPERHDR:8090
For the configuration phase please DO NOT USE the HTTPS instance available at port 8092, because some web wizards won't work (the browser's security protection for the HTTPS channel blocks non-encrypted communication with other devices). You can use HTTPS later when everything is set up.
As you can see, there is a warning about setting a new password, so it's good to do it now or later.
Your grabber should be available now with default settings.
Click the TV icon in the upper-right corner to turn on the video preview.
Now turn on the video stream by clicking the 'Live video' button.
Success! We've captured the video stream from the grabber. But something is wrong... Have you noticed that the colors are washed out and the overall luminance is low? That's because we have captured an HDR10 video stream, and almost no USB grabber can process HDR metadata, so an important part of the information about the image is lost. But we can do something about it later. Go to the 'Capturing hardware' tab.
First select a video device from the list. Then you can set up the resolution and refresh rate. I think the maximum useful resolution is 720p; for most setups 480p is sufficient (but only when no decimation/image size reduction is applied).
You need a decent LED strip controller to make any use of 60 frames per second from the grabber, such as a direct SPI/PWM connection from the RPi or a USB driver like HyperSerialWLED or HyperSerialEsp8266. If you plan to use it only with a network/WiFi controller instance or with an Arduino driving over 150 LEDs in the strip, then let 60 FPS go: it won't work well.
Let's fix our broken HDR video stream now. Click the 'HDR to SDR tone mapping' button and save. We have our colors back :-)
The other most important options are:
- 'Quarter of frame mode': scales the video stream down to (1/2 width x 1/2 height). Useful on slower devices and for grabbers that don't allow changing the resolution, like the Rullz Navy U3, which sticks to the 1080p capture resolution (which we don't need) most of the time.
- 'Software frame skipping': if the grabber's minimum of 30 FPS is too much for you or your hardware, you can reduce the refresh rate in software. If you set it to n, then only every n-th frame of the stream will be processed. For example, set it to 5 and the final refresh rate (for a 30 FPS grabber) will be 30/5 = 6 FPS.
- 'Force encoding format on the grabber': most grabbers, like the popular MS2109, can provide at least two video encodings. The MS2109 provides MJPEG and YUV for lower resolutions and refresh rates, and only MJPEG for higher resolutions like 1080p. This setting lets you force the desired format on the grabber. MJPEG has a higher CPU demand, so you should avoid it, especially on a Raspberry Pi 1 or Zero. YUV has a greater demand for USB bandwidth, so higher resolutions will work only on grabbers with a real USB 3.0/3.1 port and hardware interface (not just connectors painted blue).
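To get a feel for how the first two options interact, here is a small sketch of the arithmetic they perform. The numbers and the helper function are illustrative only, not tied to any particular grabber or to HyperHDR internals:

```python
# Sketch: how "Quarter of frame mode" and "Software frame skipping"
# combine to reduce the processing load. Illustrative numbers only.

def effective_load(width, height, fps, quarter_frame=False, skip_n=1):
    """Return (effective_fps, pixels_per_second) after both reductions."""
    if quarter_frame:
        # 1/2 width x 1/2 height, i.e. a quarter of the pixels
        width, height = width // 2, height // 2
    effective_fps = fps / skip_n      # only every n-th frame is processed
    return effective_fps, int(width * height * effective_fps)

# A grabber stuck at 1080p/30FPS, quarter-frame on, every 5th frame kept:
fps, pixels = effective_load(1920, 1080, 30, quarter_frame=True, skip_n=5)
# fps == 6.0 (30/5), pixels == 960 * 540 * 6
```

Both reductions together cut the pixel throughput by a factor of 20 in this example, which is exactly why they matter on a Raspberry Pi 1 or Zero.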
Configuring USB adalight LED strip on the primary instance
Now it's time to set up our primary instance for our LED strip behind the TV.
The LED strip is controlled over USB by the high-speed HyperSerialWLED/HyperSerialEsp8266 driver. So go to LED devices, select the 'LED hardware' tab, and then choose 'adalight' from the list.
For HyperSerialWLED/HyperSerialEsp8266, enable the AWA protocol and set the speed to 2000000.
If you are using a classic USB Adalight driver like the legacy Arduino sketch (not recommended!), turn the AWA protocol OFF; the speed can be 500000 at most, depending on the Arduino sketch and its capability. Above that speed you will run into serious trouble from time to time.
Proceed to the LED strip geometry definition. Switch to the 'LED Layout' tab in the same window.
Let's assume that we have 50 LEDs on the top, 20 LEDs on each side and 40 LEDs on the bottom, because there is a 10-LED gap for the TV stand, as in the following picture.
So set it up in the following dialog. Even if you have fewer than 50 LEDs in the bottom segment, set it to 50: the gap accounts for the missing rest.
We have set up our LED layout, but something is still wrong, as the screenshot above shows. The input position in HyperHDR doesn't match the real one at the bottom, and the gap is in the wrong place. You must also set the gap & input position to make it right. It equals top + left (or right) side + segment A (see the diagram), so: 50 + 20 + 20 = 90.
That's better. If your strip runs counter-clockwise, enable the 'Reversed direction' option.
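The gap/input arithmetic above can be written down explicitly. The variable names below follow this example layout only, not any HyperHDR setting names:

```python
# Sketch of the example layout: 50 LEDs on top, 20 per side, and a bottom
# row of 40 physical LEDs split evenly around a 10-LED gap for the TV stand.
top, side = 50, 20
bottom_physical = 40
segment_a = bottom_physical // 2   # LEDs between the corner and the gap

# Counted along the strip from the first LED: top + one side + segment A
gap_input_position = top + side + segment_a
# gap_input_position == 90, the value entered in the dialog above
```

If your gap is off-center, segment A is simply the actual number of bottom LEDs between the corner and the gap, so count it rather than halving.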
Configuring smoothing and color calibration
One of the most important options in HyperHDR is smoothing, because almost no one likes sudden bursts of light from the LED strip. It's available in the 'Image processing' tab.
The default options should be fine for most users. But if you are using a slow, not-recommended LED strip driver like an Arduino, then the 80Hz refresh rate from the screen above is probably way too much. For example, a typical 60 LED/meter strip for a 55" TV driven by an Arduino reaches a pathetic refresh rate of only just over 20Hz.
WLED 0.12 (and HyperSerialWLED/HyperSerialEsp8266) has a built-in benchmark, so you can test the real performance yourself: first enable 'Continuous output' and set a high refresh rate to measure it on the device. For everyday usage you should disable 'Continuous output'.
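Before benchmarking, you can estimate the ceiling that the serial link itself imposes. This is only a back-of-the-envelope sketch: it assumes 3 color bytes per LED, a small header whose exact size I'm guessing at, and ~10 bits per byte on the wire (8N1). The real device rate will be lower, since the microcontroller also needs time to drive the strip, which is why an Arduino ends up near 20Hz despite a much higher link ceiling:

```python
def serial_fps_ceiling(num_leds, baud, header_bytes=6):
    """Upper bound on frames/sec imposed by an Adalight-style serial link.

    Assumes 3 colour bytes per LED plus a small header (the header size
    here is an assumption), and 10 bits per byte on the wire (8N1).
    """
    bits_per_frame = (num_leds * 3 + header_bytes) * 10
    return baud / bits_per_frame

# ~230 LEDs is roughly a 55" TV perimeter at 60 LEDs/meter:
classic = serial_fps_ceiling(230, 500_000)    # legacy Arduino speed, ~70 FPS
awa = serial_fps_ceiling(230, 2_000_000)      # AWA protocol at 2 Mbaud, ~290 FPS
```

The gap between the ~70 FPS link ceiling and the ~20Hz measured on an Arduino shows the bottleneck is the microcontroller, not the wire, which is exactly what the built-in benchmark lets you verify.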
Since version 16 there is an important anti-flickering filter. Without it you may experience disturbing, subtle flickering in dark scenes. It's caused by aggressive dithering from the video source or video player, combined with a high refresh rate on the LED strip. It's described in detail on the configuration page.
You can see how the problem manifests in the following clip, and how it is resolved in the second one, where the anti-flickering filter is enabled with relaxed settings: threshold = 32 and minimal step = 2 (you may need to increase these on your system). Capturing video in such conditions is a difficult task for a phone camera, so the colors suffer, but at least the luminance changes are visible in these samples. The new smoothing algorithm 'Alternative Linear' was used; it was introduced in v16 and is similar to the standard 'Linear', but the decay effect at the end of the transition is reduced.
And here the flickering issue is resolved on the same video source with the new anti-flickering filter:
Because the Streamable service can reduce quality during processing, here are direct download links for the videos:
Download flickering clip at ufile.io
Download antiflickering clip at ufile.io
In the same section there is a 'Color calibration' panel:
The most important options are the three gammas, 'Backlight threshold' and 'Colored backlight'.
- 'Red/green/blue gamma' controls the intensity/gain of each color channel.
- 'Backlight threshold' is the minimal gray backlight of your LED strip, so it won't turn OFF and cause flickering.
- 'Colored backlight' tries to preserve part of the original color for the backlight instead of gray.
You probably want the backlight enabled, as the smoothing/anti-flickering filter works best with 'Colored backlight' enabled and 'Backlight threshold' set to 2-3, depending on your LED setup and needs.
Creating an additional instance for other light source
We have finished configuring our primary instance. Now we'll create a secondary instance for our other light source. Go to the 'General' tab, provide a 'New instance name' and click the button to create the new instance.
New instance for our Philips Hue lamps is created.
Now you have to start it.
This is a very important step:
you must switch to the new instance in the upper-right corner before proceeding further. Otherwise you will just overwrite your primary instance with the new settings.
Setting up the Philips Hue device
Go to 'LED hardware' and select 'philipshue' from the list.
Before proceeding you must create an 'Entertainment group' for your lamps in the Philips Android or iOS mobile application. Don't even try without doing it first. Your Hue Bridge should also have been running for at least 2-3 minutes to avoid connection problems during configuration at startup. First try to send a color from the Philips mobile app to make sure everything works and your 'Entertainment group' is created.
Then start a wizard in HyperHDR.
HyperHDR should find the Philips Hue bridge in the same sub-network. Custom router firewall rules or enabled WiFi isolation can cause the process to fail.
Click the 'Create new user and clientkey' button.
Now it's time to press the large hardware button on your Hue bridge; it authorizes HyperHDR's access. If everything goes OK, the Username and Clientkey should be filled in automatically. The identifier of the 'Entertainment group' should also be found automatically.
Proceed by clicking the 'use group...' button.
Now you can assign an area of the TV to each selected lamp in the entertainment group. Because I have two lamps on the floor (one just to the right of the TV and the second to the left), I selected the following options.
Configuring and enabling sound visualization effects
One of HyperHDR's unique features is sound visualization in your ambient ecosystem. Digital capture devices are preferred, as no analog sound filter is available at the input in the current version. To make it work you must make sure that the grabber receives an audio stream. Be aware that some amplifiers, like Denon, can block it in a typical configuration (using ARC/eARC is then necessary).
Let's configure the hardware first in the 'Effect' tab. On the following screen I selected the Ezcap 269 device. Do not confuse it with other system devices.
Next we need to enable the 'Activate' option and save our settings.
Navigate to the 'Remote control' tab and turn on the video preview to verify the result.
To begin with, use 'Equalizer' from the list to test whether everything works OK.
If you activated and set up your grabber correctly and it receives an audio stream, you should see jumping equalizer bars.
If they are flat, something is wrong: in 99% of cases your grabber doesn't receive any sound. Navigate to the 'Logs' tab to confirm it. Otherwise, you can have fun with the other music effects. Watch them in the video visualizer to see how they work.
Let's see the logs...
It's the final confirmation: you have enabled some audio device and it works, but it provides no sound... only silence. Maybe you've selected the wrong device, but more probably there is simply no sound at the grabber's input.
Remotely controlling HDR tone mapping
You can remotely control the HDR tone mapping state using, for example, a home automation system. If you are using Home Assistant and a Denon amplifier, it's possible to switch it automatically depending on the actual video stream format (example).
Version 16 allows using simple GET requests.
Turning HDR tone mapping OFF:
Turning HDR tone mapping ON:
Getting HDR tone mapping state (search for videomodehdr property):
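If you want to script these requests, the general pattern looks like the sketch below. It assumes the JSON API answers GET requests at /json-rpc on the same port 8090 used for the web interface, and that the reply to the serverinfo command contains the videomodehdr property mentioned above; the helper names are mine, and you should copy the exact on/off request strings from the examples above rather than from this sketch.

```python
import json
import urllib.parse

# Assumed endpoint: the JSON API of the HyperHDR instance configured above.
BASE = "http://IP_OF_HYPERHDR:8090/json-rpc"

def build_url(request):
    """URL-encode a JSON API command as a single GET request."""
    return BASE + "?request=" + urllib.parse.quote(json.dumps(request))

def find_videomodehdr(obj):
    """Recursively search a decoded serverinfo reply for 'videomodehdr'."""
    if isinstance(obj, dict):
        if "videomodehdr" in obj:
            return obj["videomodehdr"]
        children = list(obj.values())
    elif isinstance(obj, list):
        children = obj
    else:
        return None
    for child in children:
        found = find_videomodehdr(child)
        if found is not None:
            return found
    return None

# Reading the state: fetch build_url({"command": "serverinfo"}) with any
# HTTP client and pass the decoded JSON reply to find_videomodehdr().
```

The same build_url() helper works for the on/off requests once you substitute the real request strings shown above.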
Logs are your best friend
In case of any trouble (not necessarily related to sound capturing), first refer to the log.
For example, on the following screenshot you can find information about a misconfiguration in the adalight device section (serious errors are mostly in red) and a video performance stat in the last line. Performance statistics are refreshed every minute and provide important information about how your system performs and whether it's necessary to reduce the video capturing resolution.
Comments
2021-05-13T20:22:24.664Z [V4L2:/DEV/VIDEO0] (DEBUG) (V4L2Grabber.cpp:99:GetSharedLut()) LUT folder location: '/usr/share/hyperhdr/lut'
2021-05-13T20:22:24.664Z [V4L2:/DEV/VIDEO0] (DEBUG) (V4L2Grabber.cpp:135:loadLutFile()) LUT file found: /home/pi/.hyperhdr/lut_lin_tables.3d
2021-05-13T20:22:24.665Z [V4L2:/DEV/VIDEO0] (DEBUG) (V4L2Grabber.cpp:144:loadLutFile()) Index 1 for HDR YUV
2021-05-13T20:22:24.822Z [V4L2:/DEV/VIDEO0] (DEBUG) (V4L2Grabber.cpp:166:loadLutFile()) LUT file has been loaded
2021-05-13T20:22:24.822Z [V4L2:/DEV/VIDEO0] (INFO) Pixel format: NV12
2021-05-13T20:22:24.844Z [V4L2:/DEV/VIDEO0] (INFO) Started
2021-05-13T20:22:24.844Z [V4L2:/DEV/VIDEO0] (DEBUG) (V4L2Grabber.cpp:1522:setBrightnessContrastSaturationHue()) setBrightnessContrastSaturationHue nothing changed
2021-05-13T20:22:24.844Z [V4L2:/DEV/VIDEO0] (DEBUG) (V4L2Grabber.cpp:211:setHdrToneMappingEnabled()) setHdrToneMappingMode nothing changed: Fullscreen
2021-05-13T20:22:24.844Z [V4L2:/DEV/VIDEO0] (DEBUG) (V4L2Grabber.cpp:211:setHdrToneMappingEnabled()) setHdrToneMappingMode nothing changed: Fullscreen
2021-05-13T20:22:24.844Z [V4L2:/DEV/VIDEO0] (DEBUG) (V4L2Grabber.cpp:181:setFpsSoftwareDecimation()) setFpsSoftwareDecimation to: 1
2021-05-13T20:22:24.845Z [V4L2:/DEV/VIDEO0] (INFO) Signal detection area set to: 0.250000,0.250000 x 0.750000,0.750000
2021-05-13T20:22:24.845Z [V4L2:/DEV/VIDEO0] (INFO) Signal threshold set to: {38, 38, 38} and frames: 200
2021-05-13T20:22:24.845Z [V4L2:/DEV/VIDEO0] (DEBUG) (V4L2Grabber.cpp:1487:setEncoding()) Force encoding (setEncoding): nv12 (nv12)
2021-05-13T20:22:24.845Z [V4L2:/DEV/VIDEO0] (INFO) setQFrameDecimation is now: enabled
2021-05-13T20:22:24.830Z [WEBSOCKET] (DEBUG) (JsonAPI.cpp:1099:handleLoggingCommand()) log streaming deactivated for client ::ffff:192.168.2.226
2021-05-13T20:22:24.938Z [HYPERHDR] (DEBUG) (PriorityMuxer.cpp:252:setInputImage()) Priority 240 is now active
2021-05-13T20:22:24.938Z [HYPERHDR] (DEBUG) (PriorityMuxer.cpp:378:setCurrentTime()) Set visible priority to 240
2021-05-13T20:22:24.941Z [HYPERHDR] (DEBUG) (HyperHdrInstance.cpp:550:handlePriorityChangedLedDevice()) priority[240], previousPriority[255]
2021-05-13T20:22:24.941Z [HYPERHDR] (DEBUG) (HyperHdrInstance.cpp:560:handlePriorityChangedLedDevice()) new source available -> switch LED-Device on
2021-05-13T20:22:24.942Z [IMAGETOLED] (DEBUG) (ImageProcessor.cpp:180:setHardLedMappingType()) set hard led mapping to multicolor_mean
2021-05-13T20:22:26.291Z [V4L2:/DEV/VIDEO0] (ERROR) Frame too small: 0 != 3110400
2021-05-13T20:22:26.293Z [V4L2:/DEV/VIDEO0] (ERROR) Frame too small: 0 != 3110400
2021-05-13T20:22:26.293Z [V4L2:/DEV/VIDEO0] (ERROR) Frame too small: 0 != 3110400
2021-05-13T20:22:28.192Z [HYPERHDR] (DEBUG) (PriorityMuxer.cpp:252:setInputImage()) Priority 240 is now inactive
2021-05-13T20:22:28.193Z [HYPERHDR] (DEBUG) (PriorityMuxer.cpp:378:setCurrentTime()) Set visible priority to 255
2021-05-13T20:22:28.193Z [HYPERHDR] (DEBUG) (HyperHdrInstance.cpp:550:handlePriorityChangedLedDevice()) priority[255], previousPriority[240]
2021-05-13T20:22:28.193Z [HYPERHDR] (ERROR) No source left -> switch LED-Device off
2021-05-13T20:22:28.194Z [IMAGETOLED] (DEBUG) (ImageProcessor.cpp:180:setHardLedMappingType()) set hard led mapping to multicolor_mean
2021-05-13T20:22:34.024Z [WEBSOCKET] (DEBUG) (JsonAPI.cpp:1090:handleLoggingCommand()) log streaming activated for client ::ffff:192.168.2.226
This indicates a problem with the Linux v4l2 driver or a device that is sending empty frames to HyperHDR before stopping altogether. Sometimes it happens at initialization, but there should be no more than one such message.
I've got both the Ezcap 320 & 321 too. Some questions:
Did you update their firmwares?
Which system do you use: current v16 HyperHDR SD image or some other OS?
What is in the dmesg when that happens?
2021-05-14T11:33:38.316Z [V4L2:AUTO] (INFO) configured v4l device: /dev/video0
2021-05-14T11:33:38.318Z [V4L2:AUTO] (INFO) Set resolution to: 1280 x 720
2021-05-14T11:33:38.319Z [V4L2:AUTO] (INFO) Set framerate to 60 FPS
2021-05-14T11:33:38.319Z [V4L2:AUTO] (DEBUG) (V4L2Grabber.cpp:99:GetSharedLut()) LUT folder location: '/usr/share/hyperhdr/lut'
2021-05-14T11:33:38.319Z [V4L2:AUTO] (DEBUG) (V4L2Grabber.cpp:121:loadLutFile()) LUT table: trying distro file location
2021-05-14T11:33:38.319Z [V4L2:AUTO] (DEBUG) (V4L2Grabber.cpp:135:loadLutFile()) LUT file found: /usr/share/hyperhdr/lut/lut_lin_tables.3d
2021-05-14T11:33:38.320Z [V4L2:AUTO] (DEBUG) (V4L2Grabber.cpp:144:loadLutFile()) Index 1 for HDR YUV
2021-05-14T11:33:38.428Z [V4L2:AUTO] (DEBUG) (V4L2Grabber.cpp:166:loadLutFile()) LUT file has been loaded
2021-05-14T11:33:38.428Z [V4L2:AUTO] (INFO) Pixel format: NV12
2021-05-14T11:33:38.438Z [V4L2:AUTO] (INFO) Started
2021-05-14T11:33:38.438Z [V4L2:AUTO] (DEBUG) (V4L2Grabber.cpp:1522:setBrightnessContrastSaturationHue()) setBrightnessContrastSaturationHue nothing changed
2021-05-14T11:33:38.438Z [V4L2:AUTO] (DEBUG) (V4L2Grabber.cpp:211:setHdrToneMappingEnabled()) setHdrToneMappingMode nothing changed: Fullscreen
2021-05-14T11:33:38.438Z [V4L2:AUTO] (DEBUG) (V4L2Grabber.cpp:211:setHdrToneMappingEnabled()) setHdrToneMappingMode nothing changed: Fullscreen
2021-05-14T11:33:38.439Z [V4L2:AUTO] (DEBUG) (V4L2Grabber.cpp:181:setFpsSoftwareDecimation()) setFpsSoftwareDecimation to: 1
2021-05-14T11:33:38.439Z [V4L2:AUTO] (INFO) Signal detection area set to: 0.250000,0.250000 x 0.750000,0.750000
2021-05-14T11:33:38.439Z [V4L2:AUTO] (INFO) Signal threshold set to: {38, 38, 38} and frames: 200
2021-05-14T11:33:38.439Z [V4L2:AUTO] (DEBUG) (V4L2Grabber.cpp:1487:setEncoding()) Force encoding (setEncoding): nv12 (nv12)
2021-05-14T11:33:38.439Z [V4L2:AUTO] (INFO) setQFrameDecimation is now: enabled
2021-05-14T11:33:38.380Z [WEBSOCKET] (DEBUG) (JsonAPI.cpp:1099:handleLoggingCommand()) log streaming deactivated for client ::ffff:192.168.2.226
2021-05-14T11:33:38.482Z [HYPERHDR] (DEBUG) (PriorityMuxer.cpp:252:setInputImage()) Priority 240 is now active
2021-05-14T11:33:38.482Z [HYPERHDR] (DEBUG) (PriorityMuxer.cpp:378:setCurrentTime()) Set visible priority to 240
2021-05-14T11:33:38.485Z [IMAGETOLED] (INFO) Total index number for instance: 0 is: 60160. Sparse processing: disabled, image size: 640 x 360, area number: 224
2021-05-14T11:33:38.487Z [HYPERHDR] (DEBUG) (HyperHdrInstance.cpp:550:handlePriorityChangedLedDevice()) priority[240], previousPriority[255]
2021-05-14T11:33:38.487Z [HYPERHDR] (DEBUG) (HyperHdrInstance.cpp:560:handlePriorityChangedLedDevice()) new source available -> switch LED-Device on
2021-05-14T11:33:38.488Z [IMAGETOLED] (DEBUG) (ImageProcessor.cpp:180:setHardLedMappingType()) set hard led mapping to multicolor_mean
2021-05-14T11:33:39.887Z [IMAGETOLED] (INFO) Total index number for instance: 0 is: 58400. Sparse processing: disabled, image size: 640 x 360, area number: 224
2021-05-14T11:33:43.662Z [V4L2:AUTO] (ERROR) Frame too small: 0 != 1382400
Why is it 1920x1080? On the first run I got the input as Camera1; after a restart there is no input in the log.
And the dmesg log https://pastebin.com/dl/nzLuMYzt
v4l2-ctl still returns the same result on the v16 OS. The Ezcap 320 was of course connected to the RPi4's USB 3.0 port.
Why is it 1920x1080? On the first run I got the input as Camera1; after a restart there is no input in the log.
There is a problem with your device's stability: it probably disappeared from the system, and the automatic initialization used the only available internal RPi devices, which won't work (/dev/video14 or /dev/video15). There is an error that should not be present in the first log: Throws error nr: Cannot open '/dev/video0' error code 19, No such device.
I've got the latest firmware for the 320 from the Ezcap homepage, but let's leave that subject for now.
1. Before doing anything else, please execute v4l2-ctl --list-formats-ext and v4l2-ctl --all first.
2. As a workaround you can move the Ezcap 320 to a USB 2.0 port.
3. Whether you use it on USB 3.0 or USB 2.0, don't go above 30 FPS. Don't use automatic; force it to 30 FPS. And set the resolution to 720x576 for testing (with yuyv selected as the forced codec).
When I change the resolution to 720x576 30fps, I get a restart and everything changes back to automatic. I will try this with the Ezcap 321. The firmware for the Ezcap 320 and 321 is the same as on their site. If that doesn't work I will use a cheap grabber.
Your result is completely different from mine for the Ezcap 320. I thought there was some problem with USB3 and my RPi... I even used the USB cable provided by the manufacturer and it changed nothing. But when I now connected my test Ezcap 321 to the RPi4, the results were the same as yours, so maybe my RPi4 is OK. I purchased the Ezcap 320 from their official shop on AliExpress soon after release; maybe it's an engineering sample, or it's broken. It doesn't even have a serial number. I flashed it again with the official firmware but it didn't change anything.
I think it's completely unnecessary to buy other equipment at the moment ;) Just use a USB 2.0 port on your RPi4 and you will probably have a stable connection, but at 1280x720/30FPS max (which is enough).
If you want to use your current setup on USB 3.0, give this a try: 1280x720/60FPS/YUYV with Software Frame Decimation set to 2 (it should reduce the FPS to 30). Then check the benchmarks in the logs after a few minutes to verify the delay and the real FPS.
https://pastebin.com/dl/QFEqZPwP
On USB 2.0 with 1280x720 30fps NV12 it's working. I used your LUT table for the Ezcap 320. It looks good, not great, but for now it's good. Thank you for your help and time.
Thanks for your work on this program. I am keen to test it out.
One question: does HyperHDR have the ability to capture via software, so I could skip the requirement of using a hardware grabber?
Thanks
Nice to hear that you're interested. The next beta version of v17 will include some software screen capture options. Still, USB grabbers are the better and recommended solution.
Regards
Awawa
Thanks
- the full log after you start HyperHDR (copy the content of your logs from the www tab and paste it to a pastebin service; just provide us with a link to it)
- a screenshot from the HyperHDR LED driver page with your WS2812B configuration
- a description of your setup: how it's all connected, whether the Raspberry has its own power supply, how long the cable from the RPi to the LED strip is (and which pin you use for that purpose), whether you use a level shifter, etc.
The only major thing that has changed in smoothing in v17 is the new anti-flickering timeout, and it should not affect the basic smoothing operation: at least it doesn't for me and for other users so far. It's hard to guess more about your setup from the description, but a 3-4 second delay is definitely abnormal (is the WiFi connection to WLED stable?) and I think it is unrelated to smoothing, unless something else in the configuration breaks it. You must analyze your log file; maybe there is an answer there.
After restarting the Pi 4B, the start effect works for the set time, but then the first 5 LEDs of the strip flash, and the rest of the 139 in total keep the color of the start effect. By switching Max LEDs from 139 to 140 and back to 139 on the web server, everything works again. What can I do?
Regards
I've never experienced such behavior before, so I'm not sure what causes it. If this flashing happens and you set a static color (effects), what is the result? Does disabling smoothing improve anything? You can also try the latest version of v18beta1 (installers are in the latest Actions tab on GitHub; you need a GitHub account to access them) and report whether it changes anything, since that version has a lot of improvements for LED devices.
regards
HyperHDR is not compatible with that app. You can use the web interface from your phone, or use the JSON API playground to prepare browser shortcuts/links that start selected effects.
I have a problem with the LED layout. I've set my No. 1 LED to the correct position, but the LEDs are 90° out, i.e. bottom right is top left on the TV, bottom middle is top middle. Otherwise it seems to display correctly. If I use the Knight Rider effect, instead of cycling left to right on the top and bottom, it goes bottom to top. Any thoughts? I'm using a Pi 3 and a USB capture device with HDMI in/out and USB out to the Pi.