
Introduction

In this tutorial, I will configure two instances of light sources: a classic LED strip on the back of the TV (high-speed AWA Adalight USB) and a remote instance of Philips Hue lamps. Both will be used at the same time for deeper immersion in the ambient lighting. Of course, you can use only one instance with the LED strip and skip the part about the second instance. HyperHDR is installed on Apple macOS, but the steps are pretty much the same for other operating systems, as the LED drivers used are independent of the operating system. The Ezcap 321 USB grabber serves as the video source. The tutorial is updated for HyperHDR v20.

  1. Must read
  2. HyperHDR basics
  3. How to create and enable your first LED strip (instance)
  4. Manual LED layout editing and testing
  5. Smoothing and anti-flickering filter
  6. Add your second instance e.g. Philips Hue lamps
  7. Configuring and testing your USB video grabber
  8. Synchronization between different HyperHDR devices (e.g. Windows PC and RPi)
  9. How to read HyperHDR statistics
  10. JsonAPI for remotely controlling HyperHDR
  11. Configuring and testing audio effects

Performance pro-tips:

Although WiFi LED drivers are tempting because of their simplicity, in practice, for the entire duration of a session that requires constant data transmission, you are at the mercy of the quality of the radio connection, the reliability and compatibility of your router with your ESP8266/ESP32 model, the quality of the board itself, and the reliability of the particular Espressif SDK version used to build the firmware. In addition, LED control can often conflict with WiFi communication (especially the "IRQ thunderstorm" when the popular RMT method is used). You have been warned ⚠️

Note

  • Use the highest framerate for your USB grabber. GENERAL RULE: HIGHEST FRAMERATE, LOWEST RESOLUTION.
  • Choose a YUV/NV12/RGB codec for your grabber instead of MJPEG.
    Note: some low-quality USB 3.0 OS drivers (Rpi4 is not affected) can cause higher CPU usage for YUV/NV12/RGB.
    In that case try switching to MJPEG and monitor the performance.
  • Do not set high capturing resolutions like 1080p: they have little effect on your ambient lighting but they drain resources. Avoid grabbers without a scaler that offer only very high capturing resolutions or an MJPEG-only codec.
  • Check your video processing logs, updated every minute: if the video processing latency exceeds ~10-20ms, you probably want to lower it. Reduce your capturing resolution or use 'Quarter of frame mode'.
  • If your CPU usage is near 100% for a single-core unit (or 400% for a 4-core Rpi 2/3/4), consider reducing the frame rate. If the grabber doesn't offer a lower rate, use the 'Software frame skipping' option.
  • Keep your 'Smoothing' processing settings realistic. Don't use an 80Hz Smoothing refresh rate if your old Arduino LED driver can output only 20Hz (see the quick estimate after this list).
  • If you are unsure about your Smoothing configuration and experience LED lag, disable this module. If that helps, turn it on again and start carefully experimenting with your settings, beginning with a delay of around 50ms and a refresh rate of ~20-50Hz.
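To judge what refresh rate a serial (Adalight-style) LED driver can realistically sustain, a rough estimate is enough: every frame carries about 3 bytes per LED plus a small header, and a UART transfers roughly one byte per 10 bits at the configured baud rate. The sketch below is only an illustration of that back-of-the-envelope estimate, not a measurement of any particular firmware:

```python
# Rough upper bound for the refresh rate of a serial Adalight-style LED driver.
# Assumptions: 3 payload bytes (RGB) per LED, ~6 header bytes per frame,
# and ~10 bits on the wire per byte (8 data bits + start/stop bits).

def max_refresh_hz(baud_rate: int, led_count: int, header_bytes: int = 6) -> float:
    bytes_per_second = baud_rate / 10          # effective UART throughput
    bytes_per_frame = header_bytes + 3 * led_count
    return bytes_per_second / bytes_per_frame

for baud in (115_200, 2_000_000):
    for leds in (100, 300):
        print(f"{baud:>9} baud, {leds:>3} LEDs -> ~{max_refresh_hz(baud, leds):5.0f} Hz max")
```

In practice the firmware, the LED chipset timing and USB latency lower these numbers further, so keep the Smoothing refresh rate comfortably below the theoretical maximum.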

single_scheme


First steps

OK, let's get started. Connect to HyperHDR using the address: http://IP_OF_HYPERHDR:8090

Screen Shot 2021-09-12 at 3 25 16 PM

The main menu is located on the left side of the page:

menu

The instance switch button is disabled for now because there is only one instance at the beginning. There is a warning about setting a new password, so it's a good idea to do it now. Follow the link or go to the "Advanced" tab to change the password:

obraz

You may also be greeted with an error warning. Don't panic and go to the logs. In the example below, I made a mistake when entering the IP address of my Philips Hue Bridge:

obraz


Configuring the LED strip

First, give our instance a friendly name:

Screen Shot 2021-09-12 at 3 25 52 PM

The LED strip is controlled over USB by the high-speed HyperSerialWLED/HyperSerial8266/HyperSerialESP32 firmware. So go to LED devices, select the 'LED hardware' tab and then choose adalight from the list.

Screen Shot 2021-09-12 at 3 26 33 PM

On a Raspberry Pi the serial port list can look different. Remember that Rpi 3 & 4 have a Bluetooth device that is enabled by default. Its exact path (e.g. ttyAMA0 here) may vary.

adalight

Because we use the non-standard high-speed AWA protocol at 2Mb baud, we need to configure two more things:

Screen Shot 2021-09-12 at 3 26 55 PM

Now we need to set the geometry of the LED strip. For example, let's assume we have: the input in the middle of the bottom edge, 50 LEDs at the top, and 25 LEDs on each of the left/right sides. At the bottom we have 40 LEDs and a gap of 10 LEDs, so you must enter 50 LEDs (40 + 10) as the bottom and set the gap to 10.

Screen Shot 2021-09-12 at 3 30 12 PM

There is still a problem with the input position. Set the gap & input position to 95 (top 50 + right 25 + half of the physical bottom, 20).
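The same numbers can be double-checked with simple arithmetic. The following sketch only restates the example layout above (the exact counting direction depends on your wizard settings, so treat it as an illustration, not as something HyperHDR runs):

```python
# Example layout from this tutorial: 50 LEDs on top, 25 on each side,
# 40 physical LEDs plus a 10-LED gap at the bottom, input in the middle of the bottom.

top, left, right = 50, 25, 25
bottom_physical, bottom_gap = 40, 10

bottom_field = bottom_physical + bottom_gap            # value to enter as "bottom": 50
total_physical = top + left + right + bottom_physical  # LEDs actually on the strip: 140

# Gap & input position: LEDs counted from the layout start to the input point,
# i.e. the whole top edge, one side and half of the physical bottom edge.
input_position = top + right + bottom_physical // 2    # 50 + 25 + 20 = 95

print(f"bottom field: {bottom_field}, gap: {bottom_gap}, "
      f"input position: {input_position}, physical LEDs: {total_physical}")
```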

Screen Shot 2021-09-12 at 3 30 55 PM


Manual LED layout and testing

If you are not satisfied with the automatically generated layout, you can correct it manually. Right-click on the selected LED and a context menu will be displayed. You can manually adjust its position (Move) or size (Properties), disable it (it will never be lit) or remove it (Delete) from the system altogether.

Use 'Identify' to test a particular diode or check its physical location. This will cause it to blink for a few seconds.

obraz


Smoothing and anti-flickering filter

Without it you may experience disturbing, subtle flickering in dark scenes. It's caused by aggressive dithering of the video source or video player combined with a high refresh rate of the LED strip. It's described in detail on the configuration page (Image processing tab).

obraz

You can see how the problem manifests in the first clip and how it is resolved in the second one, where the anti-flickering filter is enabled with relaxed settings: threshold = 32 and minimal step = 2 (you may need to increase these on your system). Capturing video in such conditions makes it difficult for the phone camera to maintain colors, but at least the luminance changes are visible in these samples. The new smoothing algorithm 'Alternative Linear' was used. It was introduced in v16 and is similar to the standard 'Linear', but at the end of the transition the decay effect is reduced.

Live example:

Flickering appears at 00:07 and at 00:18 in the first clip: flickering without the filter. The issue is resolved on the same video source with the new anti-flickering filter: no flickering with the filter.


Second instance (Philips Hue lamps)

First, create a new instance for our Philips Hue lamps:

Screen Shot 2021-09-12 at 3 50 33 PM

Now switch to the new instance:

Screen Shot 2021-09-12 at 3 50 47 PM

Before proceeding you must create an 'Entertainment group' for your lamps in the Philips Android or iOS mobile application. Don't even try without doing it first. Your Hue Bridge should also have been running for at least 2-3 minutes to avoid connection problems during configuration at startup. First try to send a color from the Philips mobile app to make sure it's working and your 'Entertainment group' is created.

Then add Philips Hue light source and run the wizard:

Screen Shot 2021-09-12 at 3 51 14 PM

HyperHDR should find the Philips Hue Bridge in the same sub-network. Custom router firewall rules or enabled WiFi isolation can cause the process to fail.

Screen Shot 2021-09-12 at 3 51 26 PM

Click the 'Create new user and clientkey' button. Now it's time to press the large hardware button located on your Hue Bridge. It authorizes HyperHDR's access. If everything goes OK, the Username and Clientkey should be filled in automatically. The identifier of the 'Entertainment group' should also be found automatically.
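For the curious, the wizard essentially performs the standard Hue bridge registration call for you. A minimal sketch of that call, assuming the classic /api registration endpoint (the bridge IP and application name below are placeholders, not values from HyperHDR):

```python
# Minimal sketch of the Hue bridge pairing step the wizard automates.
# Assumption: the classic v1 registration endpoint; press the bridge's
# hardware button right before running this, otherwise error 101 is returned.
import json
import urllib.request

BRIDGE_IP = "192.168.1.2"  # placeholder: your Hue Bridge address

payload = json.dumps({
    "devicetype": "hyperhdr_tutorial#my_host",  # arbitrary app#device label
    "generateclientkey": True,                  # needed for the Entertainment API
}).encode()

req = urllib.request.Request(f"http://{BRIDGE_IP}/api", data=payload, method="POST")
with urllib.request.urlopen(req, timeout=5) as resp:
    # On success: [{"success": {"username": "...", "clientkey": "..."}}]
    print(json.load(resp))
```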

Screen Shot 2021-09-12 at 3 51 32 PM

Proceed with clicking the 'use group...' button.

Screen Shot 2021-09-12 at 3 52 37 PM

Now you can assign an area of the TV to each selected lamp in the entertainment group. Because I have 2 lamps on the floor (one just to the right of the TV and the second to the left), I selected the following options.

Screen Shot 2021-09-12 at 3 52 58 PM Screen Shot 2021-09-12 at 3 53 07 PM


Configuring video & system grabber

Now we configure the video source for HyperHDR: it can be a USB grabber or system screen capture. Go to the 'Video capturing' tab and select your USB grabber.

Screen Shot 2021-09-12 at 3 45 36 PM

The yellow 'Info' button will appear after you select the grabber. Click it to open the dialog. The Ezcap 321 offers very high capturing modes; I chose the lowest NV12 one. You should not go above 720p, because it drains CPU resources, introduces video processing lag, and contributes very little to our ambient effect.

Screen Shot 2021-09-12 at 3 45 46 PM

Next I check the 'Quarter of frame mode' option. It will reduce the frame dimensions (to 50%) and size (to 25%), but will not reduce the color data thanks to the NV12 codec properties.
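To put the savings in perspective: NV12 stores 12 bits (1.5 bytes) per pixel, with a full-resolution luma plane and chroma already stored at half resolution, so halving both frame dimensions cuts the raw frame size to a quarter. The quick calculation below is purely illustrative:

```python
# Raw NV12 frame size: 12 bits (1.5 bytes) per pixel
# (full-resolution Y plane + 2x2 subsampled interleaved UV plane).

def nv12_frame_bytes(width: int, height: int) -> int:
    return int(width * height * 1.5)

full = nv12_frame_bytes(1280, 720)      # 720p frame
quarter = nv12_frame_bytes(640, 360)    # 'Quarter of frame mode' output

print(f"720p NV12 frame:    {full / 1024:7.0f} KiB")
print(f"quarter-size frame: {quarter / 1024:7.0f} KiB ({quarter / full:.0%} of the original)")
```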

Screen Shot 2021-09-12 at 3 45 54 PM

Click the 'Video preview & LED visualization' button in the upper left corner:

Screen Shot 2021-09-12 at 3 45 21 PM

The video stream colors are washed out and the overall luminance is low. That's because we have captured an HDR10 video stream and almost no USB grabbers can process HDR metadata, so an important part of the information about the image is lost. Let's go back to the Capturing hardware tab and enable 'HDR to SDR tone mapping'.

Screen Shot 2021-09-12 at 3 48 54 PM


Synchronization between different HyperHDR devices

There are cases where we want HyperHDR to work with instances on other devices. For example, one instance of HyperHDR on a Raspberry Pi working with a USB grabber and LEDs is sometimes also intended to display colors based on a Windows PC screen (no grabber and no LEDs attached, only the built-in DirectX software capture as a video source for the RPi instance).

We connect a HyperHDR instance on Windows (the remote video/color source) to another HyperHDR instance on the RPi (HyperHDR actually acts similarly to WLED here, as an LED driver) using the UDP raw sender/receiver mechanism.

This makes synchronization even faster, because we do not send entire images over the network (as is done when flatbuffers or protobuffers are used), only the colors of the RGB LEDs, and therefore much smaller packets (see the sketch after the configuration steps below).

expla

Both instances must have the same number of LEDs and the same geometry. An example path for our configuration:

  • you need to export/backup the HyperHDR configuration from RPi (interface: General tab)
  • import the configuration backup on Windows (interface: General tab)
  • on Windows change the light source to 'udpraw' (interface: 'LED hardware' tab) and set the target IP to the Raspberry Pi address
  • on Windows disable USB grabber and enable DirectX grabber (interface: Video capturing tab)
  • on Windows disable 'Smoothing' (interface: Image Processing tab). Otherwise, it will be processed twice: once on Windows and once on the RPi
  • on Windows disable 'Image processing' settings that may affect LED colors. For example, for the red/green/blue gamma settings, change 1.5 to the neutral 1 (interface: Image Processing tab). Otherwise, they will be processed twice: once on Windows and once on the RPi
  • on Raspberry Pi, you need to enable the UDP server (interface: Advanced->Network Services tab, screenshot above)
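For illustration, a raw RGB stream like this is about as small as it gets: one UDP datagram carrying 3 bytes (R, G, B) per LED, in LED order. The sketch below is a hypothetical stand-alone sender, assuming that raw RGB payload layout and a placeholder address/port that would have to match the UDP server settings of the receiving HyperHDR instance; it is not part of HyperHDR itself:

```python
# Hypothetical sender illustrating why UDP raw sync is lightweight:
# the payload is just 3 bytes (R, G, B) per LED, nothing else.
# Assumptions: raw RGB payload layout; address and port are placeholders that
# must match the UDP server settings of the receiving HyperHDR instance.
import socket

RPI_ADDRESS = ("192.168.1.50", 5568)   # placeholder: RPi IP and configured UDP port
LED_COUNT = 140                        # must match the LED layout on both instances

# Example frame: all LEDs set to a dim warm white.
frame = bytes([64, 48, 32] * LED_COUNT)        # 140 LEDs * 3 bytes = 420 bytes

with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as sock:
    sock.sendto(frame, RPI_ADDRESS)

print(f"sent {len(frame)} bytes for {LED_COUNT} LEDs "
      f"(compare with over a megabyte for a raw 720p frame)")
```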

How to read HyperHDR statistics

The statistics are available on the main HyperHDR application page (the 'Overview' tab)

stats

  1. Overall CPU usage with per core visualization (by all running applications, not only HyperHDR)
  2. Available system memory
  3. Temperature reported by your hardware.
  4. Whether the Raspberry Pi's built-in hardware sensor has reported an under-voltage event to the system logs. Note that over-voltage events are not reported, but they often accompany under-voltage events, and in the extreme case only the fuse remains to prevent damage to the Raspberry Pi (the fuse does not protect you if you are powering the Raspberry Pi via GPIO). Cheap power supplies, when unable to provide enough power, tend to spike the voltage too high in response. Problems caused by an insufficient or unstable power supply are very difficult to diagnose. Most likely your USB grabber will fail first, but this is not always the case: Raspberry Pi modules, such as the embedded USB controller or the network card, may also stop working properly.
  5. USB grabber performance: FPS and the average time to decode a frame. You usually want the FPS to be as high as possible and the resolution as low as possible at the same time. There are settings in the USB grabber configuration that can help reduce the resolution if the grabber doesn't have a hardware scaler. Keep the decoding time below 20ms.
  6. Each user-configured light source is listed here. Each instance takes input from e.g. your USB grabber or a flatbuffers source and forwards it further: directly to the LED device if smoothing is off, or to the smoothing sampler unit otherwise.
  7. If smoothing processing is enabled, it can increase or decrease the frame rate of the stream delivered by the instance. The result is then routed to the final LED device driver.
  8. Light source/LED strip driver. Here you have information on how many frames were received directly from the smoothing unit or from the instance, and how many actually went to the device. Note that some protocols do not guarantee that the frames actually arrived on the device, e.g. UDP used by WLED or the Philips Hue Entertainment API (and no error will be detected or reported).

Remotely controlling HyperHDR (JsonAPI)

HyperHDR has a built-in tool thanks to which you can quickly generate the necessary commands with a few clicks of the mouse. With it, you can remotely control some HyperHDR functions, turn devices such as USB grabbers or LEDs on or off, etc.

obraz

For example:

You can remotely control the HDR tone mapping state using, for example, a home automation system. If you are using Home Assistant and a Denon amplifier, it's possible to switch it automatically depending on the actual video stream format (example).

Turning HDR tone mapping OFF: http://IP_OF_HYPERHDR:8090/json-rpc?request={%22command%22%3A%22videomodehdr%22%2C%22HDR%22%3A0}

Turning HDR tone mapping ON: http://IP_OF_HYPERHDR:8090/json-rpc?request={%22command%22%3A%22videomodehdr%22%2C%22HDR%22%3A1}

Getting HDR tone mapping state (search for videomodehdr property): http://IP_OF_HYPERHDR:8090/json-rpc?request={%22command%22%3A%22serverinfo%22}
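The same calls can of course be scripted. Below is a minimal sketch using only the json-rpc URLs shown above; the HyperHDR address is a placeholder:

```python
# Minimal sketch of toggling HDR tone mapping through HyperHDR's JsonAPI,
# using the json-rpc URLs shown above. Only HYPERHDR_HOST is a placeholder.
import json
import urllib.parse
import urllib.request

HYPERHDR_HOST = "192.168.1.50"   # placeholder: your HyperHDR address
BASE_URL = f"http://{HYPERHDR_HOST}:8090/json-rpc"

def json_rpc(command: dict) -> dict:
    url = BASE_URL + "?request=" + urllib.parse.quote(json.dumps(command))
    with urllib.request.urlopen(url, timeout=5) as resp:
        return json.load(resp)

# Turn HDR tone mapping off, then on again.
print(json_rpc({"command": "videomodehdr", "HDR": 0}))
print(json_rpc({"command": "videomodehdr", "HDR": 1}))

# Read the current state back (search the response for the videomodehdr property).
print(json_rpc({"command": "serverinfo"}))
```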


Configuring and enabling sound visualization effects

One of HyperHDR's unique features is sound visualization on your ambient lighting ecosystem. Digital capture devices are preferred, as no analog sound filter is available at the input in the current version. To make it work, you must make sure that the grabber receives an audio stream. Be aware that some amplifiers, like Denon, can block it in a typical configuration (using ARC/eARC is then necessary).
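If you want to verify from the operating system side that the grabber exposes an audio capture device at all, a quick check outside HyperHDR is possible. This sketch assumes the third-party sounddevice package (pip install sounddevice) and simply lists capture-capable devices; your grabber should appear among them:

```python
# Quick OS-level check (independent of HyperHDR) that the USB grabber
# exposes an audio capture device at all.
# Assumption: the third-party 'sounddevice' package is installed.
import sounddevice as sd

for index, device in enumerate(sd.query_devices()):
    if device["max_input_channels"] > 0:          # capture-capable devices only
        print(f"{index:2d}: {device['name']} "
              f"({device['max_input_channels']} in, {int(device['default_samplerate'])} Hz)")
```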

Let's configure the hardware first in the 'Effect' tab. On the following screen I selected the Ezcap 269 device. Do not confuse it with other system devices. obraz

Next we need to enable the 'Activate' option and save our settings. obraz

Navigate to the 'Remote control' tab and turn on the video preview to verify the result. obraz

To begin with, use 'Equalizer' from the list to test if everything works OK. obraz

If you have activated and set up your grabber correctly and it receives an audio stream, you should see the equalizer's bars jumping. obraz

If they are flat, something is wrong. In 99% of cases your grabber doesn't receive any sound. Navigate to the 'Logs' tab to confirm it. Otherwise, you can have fun with the other music effects. See them in the video visualizer to learn how they work.

Let's see the logs...

obraz

This is the final confirmation: you have enabled an audio device and it works, but it provides no sound... only silence. Maybe you've selected the wrong device, but more probably there is no sound at the grabber's input.