Hacker News | shrinks99's comments

What a bummer. It seems like what they're asking for here (a written agreement that users will be able to access 3rd party app stores) would be a win win win for Core Devices, Rebble, and users. Core Devices gets to look like a super good guy (ideally driving interest in the product), Rebble gets to look like a huge winner maintaining something for the community (as they are), and users get an open ecosystem.

There's still a chance for a win here, but looks like the door is closing.


Pixar also ships Renderman support for Blender https://rmanwiki-26.pixar.com/space/RFB26


You can see everything in your field of vision, but the area DIRECTLY in the centre has the highest level of detail. This image has high-frequency animated details that are not perceived equally across your entire FOV. The animated bit right in the middle at any given time is where your brain processes the most detail, and it's also where you're looking.


I had to think about it, but are you saying all the stars are animated to rotate, but the amount they move between frames is too small for you to see unless it's in your fovea?


They're just so small that you only see a shapeless blur outside your fovea. If you applied an artificial blur filter to the whole screen, you'd also no longer see any movement, because all high-resolution detail is removed. A 3x3 box blur will erase the differences between

  X X        X
   X   and  X X
  X X        X
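You can check this numerically (my own sketch, not anything from the thread): apply a 3x3 box blur to both patterns above and compare. Before blurring, every pixel differs between the two frames; after blurring, only the centre pixel differs, and only by 1/9.

```python
def box_blur3(img):
    """3x3 box blur with zero padding: each output pixel is the
    mean of its 3x3 neighbourhood."""
    h, w = len(img), len(img[0])
    out = [[0.0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            total = 0.0
            for dy in (-1, 0, 1):
                for dx in (-1, 0, 1):
                    ny, nx = y + dy, x + dx
                    if 0 <= ny < h and 0 <= nx < w:
                        total += img[ny][nx]
            out[y][x] = total / 9
    return out

A = [[1, 0, 1],
     [0, 1, 0],
     [1, 0, 1]]  # the "X" frame
B = [[0, 1, 0],
     [1, 0, 1],
     [0, 1, 0]]  # the "+" frame (the same star, "rotated")

diff_before = max(abs(a - b) for ra, rb in zip(A, B)
                  for a, b in zip(ra, rb))
diff_after = max(abs(a - b) for ra, rb in zip(box_blur3(A), box_blur3(B))
                 for a, b in zip(ra, rb))
print(diff_before, round(diff_after, 3))  # blur shrinks the frame-to-frame difference ~9x
```

After the blur, the two frames are identical everywhere except the centre pixel (5/9 vs 4/9), which is why the "rotation" becomes invisible once the detail is below your resolving power.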


They are tiny and the ones not on your fovea don't register enough "pixels" for your brain to recognise the rotation.


Oh cool so it’s about the frequency?


Spatial frequency, i.e. small detailed things, not temporal frequency (in this example).


Generally yes, but we're still working on it all these years later! This article by Chris Brejon offers a very in-depth look into the differences brought about by different display transforms: https://chrisbrejon.com/articles/ocio-display-transforms-and...

The "best" right now, in my opinion, is AgX, which at this point has various "flavours" that operate slightly differently. You can find a nice comparison of OCIO configs here: https://liamcollod.xyz/picture-lab-lxm/CAlc-D8T-dragon


Wow, those links are a goldmine, thanks!

I went down the tonemapping rabbit hole for a hobby game engine project a while ago and was surprised at how complex the state-of-the-art is.
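For anyone else heading down that rabbit hole, the classic starting point is a simple global operator like Reinhard, which compresses unbounded scene-referred luminance into [0, 1). This is a generic textbook sketch, not AgX or any of the OCIO display transforms linked above:

```python
def reinhard(x):
    """Map scene-referred luminance [0, inf) into display range [0, 1)."""
    return x / (1.0 + x)

def reinhard_extended(x, white_point):
    """Variant that maps `white_point` exactly to 1.0 instead of
    only approaching 1.0 asymptotically."""
    return x * (1.0 + x / (white_point ** 2)) / (1.0 + x)

# Highlights get compressed far more aggressively than shadows:
for lum in (0.18, 1.0, 4.0, 16.0):
    print(lum, round(reinhard(lum), 3))
```

The state-of-the-art transforms (AgX included) go well beyond this: they operate per-channel in carefully chosen colour spaces and handle hue skews near the display gamut boundary, which a scalar curve like this can't do.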


Most of Blender's icons are actually made in Penpot which is also what the Blender foundation uses for UI prototyping. The brush icons are made in Blender though!

https://penpot.app/penpothub/libraries-templates/blender-con...

https://code.blender.org/2024/11/new-brush-thumbnails/


Serif (I guess Canva now) maintains their own, which uses the Lensfun database.


Yeah, but Lensfun (the library they use for this) doesn't have anywhere to donate.


That does make things a bit more complicated.


I'd buy some of these explanations, except the depth estimation, colorization, and super-resolution ML models they use in the app DO run locally and are still subscription-gated.

Apple has been doing on-device machine learning for portrait blurs and depth estimation for years now, though based on the UI, this might use cloud inference as well.

Granted, these aren't the super heavy ones like generative fill / editing, and I understand that cloud inference isn't cheap. A subscription for cloud-based ML features is something I'd find acceptable, and today that's what has launched... The real question is what they plan to do with this in 2-5 years. Will more non-"AI" features make their way into the pro tier? Only time will tell!


Cole (the author of didder) also has a GUI version called Dithertime: https://makew0rld.itch.io/dithertime


A friend of mine wrote one for Go with the goal of creating the best and most complete dithering library out there, and I think he did a decent job. Worth a look if you're after reference implementations!

Here's his: https://github.com/makew0rld/dither
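To give a flavour of what a library like that implements, here's a generic ordered (Bayer 4x4) dither. This is my own minimal sketch, not code from makew0rld/dither:

```python
# Classic 4x4 Bayer threshold matrix (values 0..15).
BAYER4 = [[ 0,  8,  2, 10],
          [12,  4, 14,  6],
          [ 3, 11,  1,  9],
          [15,  7, 13,  5]]

def ordered_dither(gray, w, h):
    """Quantise a row-major grayscale image (floats in [0, 1]) to 0/1.

    Each pixel is compared against a position-dependent threshold, so
    flat midtones come out as an evenly distributed dot pattern instead
    of banding."""
    out = []
    for y in range(h):
        for x in range(w):
            threshold = (BAYER4[y % 4][x % 4] + 0.5) / 16.0
            out.append(1 if gray[y * w + x] > threshold else 0)
    return out

# A flat 50% gray 4x4 tile turns on exactly half its pixels:
tile = ordered_dither([0.5] * 16, 4, 4)
print(sum(tile))  # 8 of 16 pixels lit
```

Ordered dithering is the cheap, parallelisable branch of the family; error-diffusion methods like Floyd-Steinberg (which that library also covers) trade the fixed pattern for propagated quantisation error.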

