Comment by ashirviskas a day ago

Why are dummy plugs a thing? What can you do with them that you cannot do in software? (asking as a person who had no issues with having 18 virtual displays and no dummies).

ndiddy 21 hours ago

One example: I use software called Looking Glass on my PC for interacting with a Windows virtual machine. I have two GPUs in my computer, an AMD one for the Linux host and an Nvidia one that gets passed through to the Windows guest. Looking Glass then captures the Nvidia GPU's output and displays it in a window on my desktop. This allows me to use Windows software in the VM and get acceptable performance (Windows versions after 7 basically require graphics acceleration to run acceptably). The problem is that the Nvidia GPU will not do anything without a display connected. Nvidia Quadro GPUs support dumping a monitor's EDID and then mapping that file to an output (so the GPU always thinks that monitor is connected to that output), but their consumer-grade GPUs don't support this. That's where the dummy plug comes in.
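
For reference, the EDID-override idea described here looks roughly like this on a Linux host with the proprietary driver (it doesn't help inside a Windows guest, where only Quadro-class tooling exposes it, but it shows the shape of the feature); the output name and file paths are placeholders:

```sh
# Sketch: tell the Nvidia X driver that output DFP-0 is always connected,
# using a previously dumped EDID file. ConnectedMonitor and CustomEDID are
# documented xorg.conf options for the proprietary driver.
sudo tee /etc/X11/xorg.conf.d/10-fake-display.conf <<'EOF'
Section "Device"
    Identifier "NvidiaGPU"
    Driver     "nvidia"
    Option     "ConnectedMonitor" "DFP-0"
    Option     "CustomEDID"       "DFP-0:/etc/X11/edid.bin"
EndSection
EOF
```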

antgiant 17 hours ago

They make it super simple for someone on the move to do a Zoom or Teams call with one screen and still have access to PowerPoint’s presenter view.

Basically, use the dummy plug’s screen for PowerPoint’s output and the laptop screen for the presenter notes. Then share the dummy plug’s screen.

Might not be the best answer for the citizens of Hacker News but so, so easy for teachers and salespeople.

SXX a day ago

A lot of OS / GPU / driver combinations don't actually let you set up virtual displays with arbitrary settings. And you might want that for setting up streaming with OBS, or game streaming via Steam / Parsec / etc.

Some years ago it kind of worked for me on Linux with Xorg and open-source drivers, and on Windows with Nvidia, but when it comes to macOS, or Windows with an AMD or Intel GPU, it simply doesn't work that well.
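
For the Xorg case that did work, the usual recipe is a few xrandr calls; a minimal sketch, assuming the driver exposes a `VIRTUAL1` output and that `eDP-1` is an existing display (names vary, and some setups need a dummy video driver instead):

```sh
# Sketch: attach an arbitrary 1920x1080@60 mode to a virtual output.
cvt 1920 1080 60   # prints a CVT modeline for the resolution you want
xrandr --newmode "1920x1080_60" 173.00 1920 2048 2248 2576 1080 1083 1088 1120 -hsync +vsync
xrandr --addmode VIRTUAL1 "1920x1080_60"
xrandr --output VIRTUAL1 --mode "1920x1080_60" --right-of eDP-1
```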

detaro 11 hours ago

In addition to what's already been mentioned, I remember there being issues with Macs not unlocking the full abilities of the GPU if there was no display present. Maybe there is some software workaround, but an HDMI dummy is cheap and quick and won't disable itself on updates etc.

leonheld 20 hours ago

We use them for testing binary embedded Linux distros, where tricking the OS into thinking there's a display connected introduces a new variable that is not present in the user's deployment - and it's a cheap hardware solution. Buying and installing them is probably more cost-effective than having an engineer write the `echo on > /sys/whatever` and the logic around it.
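
For context, the software route being weighed there is the kernel's DRM connector-force interface; a sketch, with a hypothetical connector name (real names are listed under `/sys/class/drm/`):

```sh
# Sketch: force a DRM connector to report "connected" even with nothing
# plugged in; "detect" restores normal hotplug detection. Needs root.
echo on > /sys/class/drm/card0-HDMI-A-1/status
echo detect > /sys/class/drm/card0-HDMI-A-1/status
```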

pfych a day ago

Dummy plugs are a lot easier for most people. I added a fake 4K monitor to my desktop via software for remote game streaming, and it was a lot more complicated than I expected[^1].

[^1]: https://pfy.ch/programming/4k-sunshine.html

  • lyu07282 21 hours ago

    What GPU/driver were you using?

    • pfych 19 hours ago

      I was using an AMD 5700 XT at the time, with Mesa drivers.

dd_xplore a day ago

I have a modded Chromebox (booting Windows and Linux) which refuses to boot without a video device attached to the HDMI port, so I had to use a dummy plug.

RunSet 2 hours ago

> What can you do with them that you cannot do in software?

It lets a MacBook operate with the lid closed.

TheJoeMan a day ago

Raspberry Pis with remote desktop won’t render the desktop unless a monitor is physically plugged in… easiest solution for, say, a PhD student.
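
On legacy (pre-KMS) Raspberry Pi OS, the usual software workaround is forcing the firmware to pretend an HDMI display is attached; a sketch using the documented `config.txt` option (newer Wayland/KMS images handle headless setups differently):

```sh
# Sketch: make the firmware act as if an HDMI monitor is always plugged in.
echo 'hdmi_force_hotplug=1' | sudo tee -a /boot/config.txt
sudo reboot
```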

immibis an hour ago

Convince a locked-down Fuck You Inc (I mean Apple) device to use a specific display resolution over VNC.

tshaddox a day ago

Presumably it’s for devices which do not have easily modifiable software.

TiredOfLife 11 hours ago

It seems that Linux doesn't support virtual displays. On Windows you can either install a dummy display or have Apollo do it automatically. No such thing on Linux.