I’m a robotics researcher. My interests include cybersecurity, repeatable and reproducible research, open source robotics, and Rust programming.
I fell for it. It took me a minute of game time to figure out what was up and double-check today’s date.
I’m using a recent 42" LG OLED TV as a large, affordable PC monitor in order to get 4K@120Hz with 10-bit HDR, which is great for gaming or content creation that can appreciate the screen real estate. Anything in the proper PC monitor market of similar or even slightly smaller size costs far more for the same screen area and feature parity.
Unfortunately such TVs rarely include anything other than HDMI for digital video input, despite the growing trend of connecting gaming PCs in the living room, e.g. with fiber optic HDMI cables. I actually went with a GPU with more than one HDMI output so I could display to both TVs in the house simultaneously.
Also, having an API as well as a remote to control my monitor is kind of nice. Enough folks are using LG TVs as monitors in this midsize range that there are even open source projects to entirely mimic conventional display behaviors:
I also kind of like using the TV as a simple KVM with fewer cables. For example with audio, I can independently control volume and mux output to either speakers or multiple Bluetooth devices from the TV, without having to fiddle around with re-pairing Bluetooth peripherals to each PC or gaming console. That’s particularly nice when swapping from playing games on the PC to watching movies on a Chromecast with a friend over two pairs of headphones, while still keeping the house quiet for the family. That kind of KVM functionality and connectivity is still kind of a premium feature on modestly priced PC monitors. Of course others find their own use cases for hacking the TV remote APIs:
Looks like Moonlight does have their app up on the Apple App Store for iOS, and Sunshine has binaries for most operating systems. Personally, instead of Sunshine’s server, I still use Nvidia’s GeForce Experience software to stream games, as it takes less effort to configure. Of course, Nvidia’s option may not be applicable if you’re using integrated or AMD graphics instead.
Although, with Nvidia recently deprecating support for its Shield devices, Sunshine provides support for the same protocol that Moonlight was originally developed against, and it’s also open source. I’ve not used multi-monitor streaming with GeForce Experience, something Sunshine would be much more flexible in configuring.
As for connectivity, I’m unsure if iOS supports the same USB network tethering feature that Android has. I’d imagine at least the iPhone would, as that’s a core feature for mobile hotspot connectivity, but maybe that’s nixed from iPadOS? Alternatively you could get yourself a USB-C hub or dock with an Ethernet adapter and power delivery passthrough, so you can connect your iPad to a wired network and charge it simultaneously.
Or you could just use Wi-Fi, but with wireless networks dropping and retrying packets, that’ll impact latency or bitrate quality when casting displays. Although for something mostly static like Discord windows, that’s probably less of an issue. Windows 11, and maybe 10, also has a hotspot mode, where you can share your wired network via your PC’s wireless radio and an ad hoc Wi-Fi SSID. That could reduce latency and improve signal reception, but you’d have to start the hotspot setting every session, or whenever the device disconnects from Windows’ hotspot for more than 15 minutes or so.
You could try other remote display streaming software as well, like Parsec. However, they have an online account login requirement with their freemium model, so I prefer the open source client Moonlight instead. That said, Parsec is a lot easier to use when streaming from outside your home, or when remotely single-screen co-oping with friends, without having to configure firewalls or domain names.
If you already have a similarly sized tablet, you could just buy a dummy HDMI plug for a few dollars to add a second virtual desktop, and then simply cast that screen to the mobile device.
There are pretty nice Android tablets now with 2.5K 120 Hz HDR OLED screens. You can just connect one directly to the computer via USB, enable USB network tethering, then use something like the Moonlight client app with the Sunshine screen casting server. With the wired connection and a high bitrate such as 150 Mbps, you can get single-digit millisecond latency and hardly tell the difference from a native HDMI display.
Tablets like those might be on the high end, but at least you’d have a nice secondary display that’s a bit more multifunctional. Or just go with a cheaper LCD-based tablet or an old iPad, if color accuracy, refresh rate, or resolution isn’t a priority.
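As a rough back-of-the-envelope check on why the wired setup above can feel near-native (my illustrative numbers, not measurements from any spec):

```python
# Back-of-the-envelope: per-frame wire time for a 150 Mbps Moonlight/Sunshine
# stream over a USB-tethered link. All figures below are assumptions for
# illustration: the stream bitrate, refresh rate, and usable USB throughput.
bitrate_bps = 150e6   # assumed encoded stream bitrate (150 Mbps)
fps = 120             # assumed display refresh rate
usb_link_bps = 1e9    # assumed usable throughput of a USB 3.x tethered link

bits_per_frame = bitrate_bps / fps                 # average encoded frame size
transmit_ms = bits_per_frame / usb_link_bps * 1e3  # time on the wire per frame

print(f"avg encoded frame: {bits_per_frame / 1e6:.2f} Mbit")
print(f"wire time per frame: {transmit_ms:.2f} ms")
```

With those assumptions each encoded frame averages about 1.25 Mbit and spends on the order of a millisecond on the wire, so even after encode and decode overhead, single-digit-millisecond latency is plausible.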
Private Eye - essential for staying online 24/7
What was that device, an early cellular modem or an 802.11 wireless bridge? The thing on top of the briefcase looks like a head visor with an antenna. Google search keywords are just noise.
Scrum’s a thing that can’t get no love from me
Whoops, yep.
Pro tip: If you check the canonical URL (youtube.com) first, the Lemmy web UI will help catch reposts before they’re accidentally submitted.
I switched from using the short (e.g. youtu.be) or external URL mirrors for that same reason, and just let the bots comment with privacy mirrors for those who prefer them. Using the canonical URL, aside from enabling cross-post detection, also ensures the thumbnail image and preview text get cached consistently.
Windows 11 (and 10?) supports multiple desktops, though only in the shape of a 1×N grid, like a 1D array. Still handy for multitasking:
Have you tried multiple virtual desktops, or do you prefer a single alt-tab stack of windows? I love using a 3x3 grid of desktops, as the spatial zoning helps me organize, separate, and spread out my multiple work tasks.
What about a semi-transparent terminal window? When I started out learning Linux command line interfaces, it helped having the docs just behind my shell session for reference when all I had was a tiny old laptop. But now I don’t bother ricing up my DE anymore. I just want some default window tiling keybindings that work out of the box, and I’m good to go.
Can you imagine the eye strain one would get programming on a translucent screen every day? One where you’re always having to keep your eyes focused on semi-transparent text and graphical interfaces in the foreground, and not the distracting, ever-changing background, continuously shifting in parallax as you adjust your head and viewing angle. Not having my display buttressed up against a wall, having to deal with glare and screen reflections, or even low-contrast monitors in general are all things I find infuriating already.
But I guess the Sci-Fi future of ergonomics is holograms. *You must have your migraines, and you must enjoy them.*
Could go the other way though. Ask them nicely if they’d be willing to free up their heap of inventory, and if they return you a cart overflow, you know you’ve stumbled upon the ultimate zero day coupon.
Just need to put a JIT compiled language logo inside the blue car and caption it as “Containerise once, ship anywhere”.
Image Transcription: Meme
A photo of an opened semi-trailer unloading a cargo van, with the cargo van’s rear door open revealing an even smaller blue Smart car inside, each vehicle captioned “macOS”, “Linux VM”, and “Docker” respectively in decreasing font size. Onlookers in the foreground of the photo gawk as a worker opens each vehicle door, revealing a scene like that of Russian dolls.
I’m a human volunteer content transcriber and you could be too!
Out in the wild? Perhaps quite a few. For example, for teleoperated robotic thoracic surgeries, I imagine medical-grade HID requirements should mandate safety-certified hardware that doesn’t rely on electrically noisy mechanical potentiometers, which are subject to dead-zone drift, or exhibit non-deterministic dead-man behavior under failure modes. Although I’m certain there are various reasons not to use Hall effect sensing devices even within the same facility as MRI machines.
Textual transcription of meme: (two panels)
First panel: laughing group of people gathered around a phone, captioned:
“The Internet laughing at the Titan submersible using a Logitech controller”
Second panel: the Awkward Look Monkey Puppet meme, captioned:
“The robotics community”
Related: [META] We should help transcribe our memes for RBlind!
I suspect this comment was posted to spell out the meme for those unfamiliar, but I wanted to thank you for transcribing it into text for those who may be blind or visually impaired. With the loss of r/TranscribersOfReddit, I salute your contribution! Please keep at it!
https://www.theverge.com/2023/6/23/23771396/reddit-subreddit-community-transcribers-accessibility
Yep, I’ve seen reporting of navies using them for controlling periscopes on submarines (now that most are drive-by-wire), air forces using them for piloting drones, and their use in teleoperated robotic thoracic surgeries.
The widespread user familiarity and transferable hand coordination skills that come with the economics of common gaming HIDs are hard to refute. Although, I’m guessing the market for safety-certified joysticks will see an uptick.
Tagging an image simply associates a string value, a human-readable identifier, with an image pushed to a container registry. Unlike an image ID or image digest SHA, a tag is only loosely associated, and can later be remapped to another image in the same registry repo, e.g. `latest`. Untagging simply removes the tag from the registry, but not necessarily the associated image itself.
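As a toy illustration of those semantics (not a real registry client; a real registry exposes this over the OCI distribution HTTP API), tags behave like mutable dictionary keys pointing at immutable digests:

```python
# Toy model of tag semantics in a single container registry repo.
# The class, method names, and digests are all hypothetical, for illustration.
class RepoTags:
    def __init__(self):
        self.tags = {}       # tag name -> digest: loose, remappable pointers
        self.images = set()  # stored image digests: immutable content

    def push(self, digest, tag=None):
        self.images.add(digest)
        if tag:
            self.tags[tag] = digest  # remap: "latest" can move between images

    def untag(self, tag):
        # Removes only the pointer, not the associated image itself.
        self.tags.pop(tag, None)

repo = RepoTags()
repo.push("sha256:aaa", tag="latest")
repo.push("sha256:bbb", tag="latest")  # "latest" now points at the new image
repo.untag("latest")                   # tag gone; both images remain stored
```

After the last line the repo holds both digests but no tags, which is why untagged images still need a separate garbage-collection or delete step to actually free space.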