DuckDNS's latency is high enough (over 2000 ms) that Google Assistant can't connect. I moved to FreeDDNS and my Home Assistant issues went away.
ShortFuse@lemmy.world to Linux Gaming@lemmy.world • Hollow Knight: Silksong Patch 4 is out now with lots of bug fixes and a major controller input change (English)
2 · 5 months ago

Silksong might be one of the "easiest" ones if I ever did a RenoVK. Basically, you check the swapchain size, and any 8-bit texture the game tries to build that matches that resolution gets upgraded to 16-bit. And done. That alone will get the SDR layers to stop banding. (We actually use 16-bit float because we want above-SDR brightness, but 16-bit uint would be a perfect, less banding-prone fix.)
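The upgrade rule itself is simple enough to sketch. This is illustrative JS, not the real mod (the actual thing hooks the graphics API's texture-creation call); the names and format strings are assumptions:

```javascript
// Sketch of the upgrade rule described above: any 8-bit texture the game
// creates at the swapchain's exact resolution gets promoted to 16-bit float
// so the SDR layers stop banding. Everything else is left untouched.
function upgradeFormat(desc, swapchain) {
  const matchesSwapchain =
    desc.width === swapchain.width && desc.height === swapchain.height;
  if (matchesSwapchain && desc.format === 'R8G8B8A8_UNORM') {
    // Promote to 16-bit float so values above SDR range survive.
    return { ...desc, format: 'R16G16B16A16_FLOAT' };
  }
  return desc; // not a swapchain-sized 8-bit texture; leave it alone
}
```

In the real mod this check runs inside the intercepted texture-creation path, before the texture is actually allocated.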
I might look at vkBasalt. That's basically how ReShade ended up building an addon system: you have to be able to inject shaders, create textures, and monitor the backbuffer to do postprocessing. Instead of just doing it at the end, it lets us listen for render events and act accordingly. That's the basis for most of our mods. Every game uses DX/GL/VK commands, so it's much easier to tap into those than into compiled CPU code.
ShortFuse@lemmy.world to Linux Gaming@lemmy.world • Hollow Knight: Silksong Patch 4 is out now with lots of bug fixes and a major controller input change (English)
4 · 5 months ago

I wrote the RenoDX mod, if you're talking about that. I don't think there's anything like ReShade's addon system for Linux games. We've done OpenGL and Vulkan mods, but those still rely on intercepting the Windows implementation. Silksong primarily needs a 16-bit float render target to solve most of its banding, but I'm not sure how you'd do the same on Linux.
We intentionally avoid per-game executable patching, but it sounds like that would be the best choice here. Getting the render target to 16-bit would solve most of the banding, but you'd still need to replace shaders if your goal were HDR (or fake it as a postprocess with something like vkBasalt).
The point is to show it's uncapped, since SDR only goes up to 200 nits. It's not tonemapped in the image.
But, please, continue to argue in bad faith and complete ignorance.
This is a trash take.
I just wrote the ability to take a DX9 game, stealthily convert it to DX9Ex, remap all the incompatible commands so it works, proxy the swapchain texture, set up a shared handle for that proxy texture, create a DX11 swapchain, read that proxy into DX11, and output it in true, native HDR.
All with the assistance of Copilot chat to help make sense of the documentation, and Copilot generation and autocomplete to help set up the code.
All in one day.

Definitely not. NoJS is not better for accessibility. It’s worse.
You need to set the ARIA states via JS. Believe me, I've written an entire component library with this in mind. I thought NoJS would be better: an HTML and CSS core with JS added on. Then for my second rewrite, I made it JS-first, and it's all-around better for accessibility. Without JS you'd be leaning on a slew of hacks that just make accessibility suffer. It's neat to make those NoJS components, but you have to hijack checkboxes or radio buttons in ways they weren't intended to work.
The needs of those with disabilities far outweigh the needs of those who want a no script environment.
> While with WAI ARIA you can just quickly assert that the page is compliant with a checker before pushing it to live.
Also no. You cannot check accessibility with HTML tags alone. Full stop. You need to check the ARIA attributes manually. You need to ensure states are updated. You need to add custom JS to handle key events so your components work as suggested by the ARIA Practices page. Relying on native components is not enough. They get you part of the way there, but you'll also run into incomplete native components that don't work as expected (eg: Safari and touch events don't work the same as in Chrome and Firefox).
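Roughly what "setting ARIA states via JS" looks like in practice, using the WAI-ARIA switch pattern as an example. This is an illustrative sketch, not code from my library; the function names are made up:

```javascript
// Keep role, tabindex, and aria-checked in sync from JS, per the
// WAI-ARIA Authoring Practices "switch" pattern.
function setSwitchState(el, checked) {
  el.setAttribute('role', 'switch');
  el.setAttribute('tabindex', '0'); // focusable, since it's not a native input
  el.setAttribute('aria-checked', String(checked));
}

// Custom key handling: Space and Enter toggle the switch.
// Returns true when handled so the caller knows to preventDefault().
function onSwitchKeydown(el, event) {
  if (event.key === ' ' || event.key === 'Enter') {
    const next = el.getAttribute('aria-checked') !== 'true';
    setSwitchState(el, next);
    return true;
  }
  return false;
}
```

None of that state maintenance is possible without JS, which is the whole point: the accessibility tree only reflects what you actively keep updated.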
The sad thing is that accessibility testing is still rather poor. Chrome has the best way to automate testing against the accessibility tree, but it’s still hit or miss at times. It’s worse with Firefox and Safari. You need to doubly confirm with manual testing to ensure the ARIA states are reported correctly. Even with attributes set correctly there’s no guarantee it’ll be handled properly by browsers.
I keep a list of bugs browsers still haven't fixed, but at least I've written workarounds for them; those workarounds require JS to work as expected and provide proper accessibility.
Good news: we were able to stop the Playwright devs from adopting this poor approach of relying on HTML alone for ARIA testing, and we can now take accessibility-tree snapshots based on real-time JS values.
Came here for this. Pickles are like a researcher’s??? Progeny? A researcher’s what? I must know.
Also something about Harry Potter in the first lines.
Edit: It's grumpy! But I don't know what!
ShortFuse@lemmy.world to World News@lemmy.world • Dreams 'crushed' as Ticketmaster cancels fans' Oasis '25 tickets (English)
4 · 1 year ago

Exactly. Is it really loyalty when you only have one option?
ShortFuse@lemmy.world to Technology@lemmy.world • Bill proposed to outlaw downloading Chinese AI models. (English)
14 · 1 year ago

RIP iPhone.
ShortFuse@lemmy.world to Technology@lemmy.world • Developer Creates Infinite Maze That Traps AI Training Bots (English)
13 · 1 year ago

scrape.maxDepth = 5
Imagine the world we would have lived in if there were oil rigs there instead.
ShortFuse@lemmy.world to Technology@lemmy.world • Sysadmins slam Apple's SSL/TLS cert lifespan cuts (English)
1 · 2 years ago

I have just dumped code into a Chrome console and saved a cert while in a pinch. It's not best practice, of course, but when you need something fast for one-time use, it's nice to have something immediately available.
You could make your own webpage that works entirely in the browser (no backend) and generates a cert. I haven't published anything publicly because you really shouldn't dump private keys into unknown websites, but nothing is stopping you from making your own.
ShortFuse@lemmy.world to Technology@lemmy.world • Sysadmins slam Apple's SSL/TLS cert lifespan cuts (English)
1 · 2 years ago

That's what NodeJS and Deno are.

The point of browser support is that it runs on modern Web technologies and doesn't need external binaries (eg: OpenSSL). It can literally run on any JS runtime, even a browser.
ShortFuse@lemmy.world to Technology@lemmy.world • Sysadmins slam Apple's SSL/TLS cert lifespan cuts (English)
11 · 2 years ago

Just going to mention my zero-dependency ACME (Let's Encrypt) library: https://github.com/clshortfuse/acmejs

It runs on Chrome, Safari, Firefox, Deno, and NodeJS.

I use it to spin up my wildcard and HTTP certificates. I've automated it by uploading the certificates to S3 buckets and AWS Certificate Manager. I wrote a helper for Name.com for DNS validation. For HTTP validation, I use HTTP PUT.
ShortFuse@lemmy.world to linuxmemes@lemmy.world • type the distro you use and is and let your keyboard finish it
4 · 2 years ago

Windows 10 and it's not a good idea
ShortFuse@lemmy.world to Programming@programming.dev • What is your preferred API error response and why?
14 · 2 years ago

Don't use JSON for the response unless you include the response header specifying it's application/json. You're better off with regular plain text unless the request's Accept header asked for JSON and you respond with the matching header. That also means you can send a response based on what the request asked for.

403 Forbidden (not Unauthorized) is usually enough. Most of those errors aren't meant for consumption by an application, because it's rare for 4xx codes to have a contract. They tend to go to a log and output for human readers later, so I'd lean on text as the default.
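A minimal sketch of that negotiation (the function name and response shape are illustrative, not from any framework):

```javascript
// Pick the error body format from the request's Accept header, and
// always set Content-Type to match what we actually send back.
function errorResponse(acceptHeader, status, message) {
  const wantsJson = (acceptHeader ?? '').includes('application/json');
  if (wantsJson) {
    return {
      status,
      headers: { 'Content-Type': 'application/json' },
      body: JSON.stringify({ error: message }),
    };
  }
  // Default to plain text: these errors usually end up in logs
  // read by humans, not parsed by applications.
  return {
    status,
    headers: { 'Content-Type': 'text/plain; charset=utf-8' },
    body: message,
  };
}
```

The key property is that the body format and the Content-Type header can never disagree, because they're decided in the same place.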
ShortFuse@lemmy.world to Selfhosted@lemmy.world • ICANN approves use of .internal domain for your network (English)
3 · 2 years ago

I've also used .local, but .local could imply a local neighborhood. The word itself is based on "location." Maybe a campus could be .local, while the smaller networks would be .internal.

Or maybe they want to avoid confusion with link-local or unique local addresses. Though maybe all .internal networks should be using local (private) addresses anyway?
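For reference, the private IPv4 ranges a .internal name would typically resolve to are the RFC 1918 blocks; a quick (hypothetical) helper to check:

```javascript
// True if an IPv4 address falls in one of the RFC 1918 private ranges:
// 10.0.0.0/8, 172.16.0.0/12, or 192.168.0.0/16.
function isPrivateIPv4(ip) {
  const parts = ip.split('.').map(Number);
  if (parts.length !== 4 ||
      parts.some((n) => !Number.isInteger(n) || n < 0 || n > 255)) {
    return false; // not a well-formed dotted quad
  }
  const [a, b] = parts;
  return a === 10
    || (a === 172 && b >= 16 && b <= 31)
    || (a === 192 && b === 168);
}
```

(Link-local 169.254.0.0/16 and IPv6 unique local fc00::/7 are separate mechanisms, which is presumably why ICANN wanted a name that doesn't collide with either concept.)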
ShortFuse@lemmy.world to Technology@lemmy.world • uBlock Origin developer recommends switching to Firefox as Chrome flags the extension (English)
4 · 2 years ago

I've been using uBOLite for about a year and I'm pretty happy with it. You don't have to give the extension access to the content of the page, and all the filtering happens in the browser engine, not in JavaScript.
ShortFuse@lemmy.world to Selfhosted@lemmy.world • What self hosting feels like (It's painful, please help 🥲) (English)
3 · 2 years ago

I just recently started working with ImGui. Rewrite compiled game engines to add HDR support to games that never had it? Sure, easy. I can mod most games in an hour, if not minutes.

Make the UI respond like any modern flexible-width UI from the past 15 years? That's still taking me days. All of the ImGui documentation is hidden behind closed GitHub issues. The expected user experience is to bash your head against something for hours, then submit your very specific issue and wait for the author to tell you what to do if you're lucky, or link you to another issue that vaguely resembles yours.
I know some projects, WhatWG for one, follow the convention that if something is unclear in the documentation, the issue doesn't get closed until that documentation is updated, so there's no longer any ambiguity or lack of clarity.
If you're talking browsers, support is poor. But HDR on displays is very much figured out, with none of the randomness you get in SDR from user-varied gamma, colorspace, and brightness. (That doesn't stop manufacturers from borking things with Vivid Mode, though.)
You can pack HDR into JPG/PNG/WebP or anything that supports an ICC profile, and Chrome will display it. The formats that support HDR directly are PNG (with cICP), AVIF, and JPEG XL.

Your best bet is to use avifenc to convert your HDR file. But note that servers may take your image and break it when rescaling.
Best single source for this info is probably: https://gregbenzphotography.com/hdr/