
Some screens will scan out top to bottom at 60Hz and mostly avoid that skew. If you took an OLED that does that and added another mechanism to black out each line 3ms after it's drawn, you'd have a pretty good match to the timing of the incoming signal.
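For concreteness, here's a rough sketch of that rolling scanout-plus-blackout timing. The 1080-line panel and the per-line math are my assumptions; only the 60Hz refresh and the 3ms blackout come from the description above:

    # Rolling scanout with a trailing per-line blackout, CRT-phosphor style.
    # Assumed: 1080 visible lines, linear scan timing, no blanking interval.

    REFRESH_HZ = 60
    LINES = 1080                      # assumed vertical resolution
    FRAME_MS = 1000 / REFRESH_HZ      # ~16.67 ms to scan out one frame
    PERSISTENCE_MS = 3.0              # black each line out 3 ms after it is drawn

    def line_window(line: int) -> tuple[float, float]:
        """Return (lit_at_ms, blanked_at_ms) within the frame for one line."""
        lit = line * FRAME_MS / LINES     # rolling scanout, top to bottom
        return lit, lit + PERSISTENCE_MS  # rolling blackout trails 3 ms behind

    for line in (0, 540, 1079):
        lit, blanked = line_window(line)
        print(f"line {line:4d}: lit at {lit:6.2f} ms, blanked at {blanked:6.2f} ms")

So every line is lit for the same 3ms window, just at a different point in the frame, which is what makes it a good match for a signal that was itself captured top to bottom.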


I don't want that kind of flicker, though. I just want the image to look good:

> I want a deinterlacer that can figure out how to make the (cough) best image possible so deinterlacing artifacts aren't noticeable.

> I want some kind of machine-learning algorithm that can handle the fact that the top of the picture was captured slightly before the bottom of the picture; then generate a 120p or a 240p video.
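To make that "top captured before bottom" point concrete, here's a toy model of per-scanline capture times for 480i material. The NTSC field rate is standard; treating line timing as linear and ignoring vertical blanking are simplifying assumptions on my part:

    # Approximate capture time of each scanline in an interlaced field,
    # assuming 480i at the NTSC field rate and ignoring vertical blanking.
    # This intra-field skew is what a deinterlacer targeting 120 or 240 fps
    # progressive output would have to model.

    FIELD_HZ = 59.94                  # NTSC field rate
    FIELD_MS = 1000 / FIELD_HZ        # ~16.68 ms per field
    ACTIVE_LINES = 240                # active lines per field in 480i

    def capture_time_ms(field_index: int, line_in_field: int) -> float:
        """Approximate capture time of one scanline, assuming a linear scan."""
        return (field_index + line_in_field / ACTIVE_LINES) * FIELD_MS

    print(capture_time_ms(0, 0))      # top of field 0:    0.00 ms
    print(capture_time_ms(0, 239))    # bottom of field 0: ~16.61 ms

The top and bottom of the same field are captured roughly 16ms apart, so a moving object is genuinely in a different place by the time the bottom lines are sampled; that's the skew the algorithm would have to undo.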

---

If we want to black out lines after 3ms, we might as well bring these back: https://www.youtube.com/watch?v=ms8uu0zeU88 ("Video projectors used to be ridiculously cool" by Technology Connections).


Okay, so you're not interested in the flicker versus persistence tradeoffs.

In that case you just need a 60Hz-synced scanout, and you can get screens that do that right now. That will beat any machine learning stabilizer.



