Hacker News | alexbock's comments

I'm guessing you're referring to https://news.ycombinator.com/item?id=39309409


Thanks for this! My memory isn't that good, and as you can see, I had commented on the previous post as well :-)


If you want to play with any of these lens descriptions (or look at code for simulating them), I made a free and open source visual web UI for lens design. The default project when you visit it is a double gauss lens similar to the one shown in the article.

https://alexbock.github.io/open-optical-designer/


Is there a framework or template base for this kind of (usually scientific) demonstration app? It's a common design language of inputs and outputs that I've seen on many pages, often self-explanatory. I like it.


Thanks. I did not use any frameworks/libraries/dependencies for this project. It's vanilla JavaScript/HTML/CSS from scratch. The general concept of a spreadsheet-like data editor next to a visual view is a standard paradigm in commercial lens design software like Quadoa/OSLO/CODE V.


I absolutely love this, and the development philosophy. Nice work!


Took a look and I'm impressed how easy it is to use. Thanks for sharing this.


amazing tool - thanks for sharing


> (This is mathematically worse for a reason I don't remember; it causes any picture taken at a concert with blacklights or blue spotlights to look bad.)

The iPhone camera sensor is prone to saturating and clipping the blue channel when strong light from a blue LED is in the frame. Once the blue channel clips at the maximum value, a typical HDR gain map won't do anything to restore more nuance to it because they're not designed to add high-frequency detail to a blob of clipped pixels with identical values in the base image.
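A toy numeric sketch of why a gain map can't help here. The pixel values and clipping threshold below are invented for illustration (real camera pipelines are far more complex), but the core point holds: a multiplicative gain map scales what the base image recorded, and identical clipped values scale to identical values.

```python
# Hypothetical blue-channel values for five adjacent pixels under a
# bright blue LED (true scene radiance, arbitrary linear units).
true_blue = [1.8, 2.3, 3.1, 2.7, 4.0]

# The sensor clips everything above 1.0 (normalized full scale), so the
# gradations that distinguished neighboring pixels are lost at capture.
captured = [min(v, 1.0) for v in true_blue]
print(captured)  # every pixel clips to the same value: [1.0, 1.0, 1.0, 1.0, 1.0]

# A gain map multiplies the base image; scaling identical values yields
# identical values, so the clipped blob stays featureless, just brighter.
gain = 2.0
restored = [gain * v for v in captured]
print(restored)  # [2.0, 2.0, 2.0, 2.0, 2.0] -- no detail recovered
```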


Tesla vehicles display error descriptions prominently whenever an error code is presented, and detailed error diagnostics are available for anyone to browse in the service mode menu on the touchscreen. (Service mode is publicly accessible but does require looking up online how to open it.)


I've always been fascinated by this topic as well. As a further experiment, you may be interested to know that these IR lights can pass straight through red wine that looks totally dark and opaque to the human eye. I took some photos to demonstrate this with a DSLR with the IR filter removed here [1], but you can test this yourself by using a smartphone to look at the IR light of a TV remote with a glass of red wine in between them.

[1] https://alexbock.github.io/blog/nir-water-red-wine-compariso...


I've been pondering some sort of "night vision"[1] system for my car after noticing how excellent the low-light performance is on my VIOFO dash cam.

The dash cam has a video-out port, but unfortunately it appears to be NTSC resolution. I'd love some sort of setup that outputs to a >=8" 1080p display attached to my dash. It would help so much in my rural area, both with wildlife in the road and with the occasional pedestrian walking along an unlit rural highway in dark clothing.

Ideally, if I could get great quality, low noise low light video like the VIOFO, I could then start playing with object identification with OpenCV.

1. I worked on such spectral systems in a past military life but don't want to attach something big like a Cadillac FLIR unit, or something expensive, like nearly every viable consumer FLIR option. All the "affordable" consumer FLIR options suffer from low resolution and/or low response time.


> All the "affordable" consumer FLIR options suffer from low resolution and/or low response time.

When I experimented with connecting two thermal cameras to a VR headset for stereo thermal vision, I used two Seek CompactPRO FastFrame units. They're 320x240@15Hz for $400, which is far more usable than the typical 80x60@9Hz consumer thermal camera, and it's easy to integrate the Android model into custom applications. They also have a 320x240@25Hz model for $1000.

I'm still impatiently waiting for affordable 640x480 thermal cameras, but in my opinion 320x240 at moderate frame rate is past the good-enough threshold to be legitimately useful for high contrast situations like identifying warm-blooded life on the side of a rural road.
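As a rough sense of scale, here's a back-of-envelope pixels-on-target estimate. The ~32 degree horizontal FOV and the 0.5 m target width are assumptions chosen for illustration, not quoted specs:

```python
import math

def pixels_on_target(sensor_px, fov_deg, target_m, range_m):
    """Horizontal pixels a target of width target_m subtends at range_m,
    assuming a simple pinhole model with the given horizontal FOV."""
    scene_width = 2 * range_m * math.tan(math.radians(fov_deg) / 2)
    return sensor_px * target_m / scene_width

# A ~0.5 m wide warm-blooded animal at 50 m, assuming ~32 deg horizontal FOV:
print(round(pixels_on_target(320, 32.0, 0.5, 50.0), 1))  # ~5.6 px wide
print(round(pixels_on_target(80, 32.0, 0.5, 50.0), 1))   # ~1.4 px wide
```

A handful of hot pixels against a cool background is detectable; a single pixel mostly isn't, which is roughly where the 320x240-vs-80x60 threshold comes from.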

> I'd love some sort of setup that outputs to >=8" 1080p display attached to my dash.

The Tesla Cybertruck has an option to display the view from the front bumper camera on the 18.5" main screen, but front camera display is unfortunately not available in any of Tesla's other models. With the proliferation of large touchscreens and camera arrays, more vehicles may support this from the factory soon.


I learned of the interesting "spite clause" described in the will section of this article while reading about the common law rule against perpetuities, which limits how far into the future wills can operate.

[The word "bizarre" in the title quote has been replaced with "unusual" as HN appears to automatically delete the former word from titles.]


It sounds like you're thinking of separating coins using eddy current braking [1]. This works even for non-magnetic coins because the effect is a function of the metal's electrical conductivity.

If you have a silver coin or a small piece of copper pipe and a large, strong neodymium magnet, then you can easily observe this effect at home by putting the metal sample on a table and quickly waving the magnet past it as close as you can without touching it. The metal will slide across the table following the magnet, despite the metal itself not being magnetic, because the moving magnet induces eddy currents which temporarily create a magnetic field like an electromagnet. Other metals besides silver and copper exhibit weaker responses due to higher electrical resistivity.

[1] https://en.wikipedia.org/wiki/Eddy_current_brake
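The conductivity dependence mentioned above can be sketched with handbook resistivity values. This is a crude model (it ignores geometry, speed, and skin effect) that only captures the relative ordering of the response:

```python
# Room-temperature resistivities in ohm-meters (standard handbook values;
# steel varies widely with alloy, so that entry is especially rough).
resistivity = {
    "silver":    1.59e-8,
    "copper":    1.68e-8,
    "aluminium": 2.65e-8,
    "zinc":      5.90e-8,
    "steel":     1.43e-7,
}

# For a fixed magnet, geometry, and speed, the induced eddy-current
# response scales roughly with conductivity (1 / resistivity).
silver_conductivity = 1 / resistivity["silver"]
for metal, rho in resistivity.items():
    print(f"{metal:9s} relative response: {(1 / rho) / silver_conductivity:.2f}")
# silver 1.00, copper 0.95, aluminium 0.60, zinc 0.27, steel 0.11
```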


Thanks for explaining. Wondered how non-ferrous coins were handled.

TD Bank in Canada had coin counting machines for customers for a while. I dumped a few hundred $ in and was very happy when it kicked back some silver coins to me.

How nice of them!


I've always been surprised that Apple added this functionality to the iOS home screen without having a solution to the reorder vs nesting UI problem. Trying to move an app into a folder often results in the folder deciding to fly out of the way and let the item take its place when you're really trying to drop something on the folder to insert it.


Android has the same problem. Sometimes it lets you drop inside, sometimes it flies away.


On iOS, I'm pretty sure you cannot add an app that currently sits on the far right of the screen to a folder anywhere beneath it.


Just tried this, and you can. But it definitely takes more finesse than any other column.


The single most important and most revealing aspect of this announcement is Apple's framing of the promotional images showing the face display on the front of the headset.

I think the large R&D investments tech companies are making into VR/AR headsets are ultimately centered on the idea that years in the future, an affordable, comfortable, and socially acceptable headset for all-day wear in daily life could hypothetically replace smartphones, tablets, laptops, and screens.

If a future daily-wear headset as a phone/computer replacement reached a critical mass of adoption, then control of the platform would provide the owner with a huge profit source in selling virtual goods that can be "worn" or "placed" in the real world and are seen equally by any other person wearing a headset. I won't speculate on exactly what would be popular, but it is conceivable that this could include things like: buying virtual wall posters of licensed characters, virtual landscaping objects placed outside a house, filters that virtually "repaint" the exterior or interior of a house, or personal adornments like virtual clothing or appearance modifiers. That is to say, a digital layer of adjustment on top of the real world where everyone wearing a headset is automatically shown the virtual objects or adjustments that anyone else has made (within the scope of the latter's own appearance and owned spaces - random members of the public could not place publicly visible digital objects in the middle of a NYC street). Note that this virtual economy is a profit motive for a company to build AR, but not the selling point for a headset adopter.

Apple is not trying to sell this headset to consumers for $3500. They're showing off future hardware that they believe represents the bare minimum for what average people might be willing to wear regularly, with the expectation that they will be able to produce essentially this same unit and sell it for perhaps a third of the current price in several years. The way it's presented is also an early form of reputation management for the product space trying to influence public perception of how someone wearing a headset is viewed by others around them.

Standard see-through AR headset designs face fundamental implementation limits with the display technology that generally result in accepting one of two unacceptable limitations: a display that projects an image over the real world but cannot render black or otherwise draw anything darker than the scene behind it, or a liquid crystal light modulator with a polarizer that permanently makes the glass tinted dark like sunglasses even indoors. Apple is instead making a VR headset that is completely enclosed, displaying the world through pass-through cameras and drawing the wearer's eyes on a front display for everyone else to see.

The front lenticular OLED shows how Apple is approaching the social aspect of trying to market the acceptability of wearing a headset in the company of other people. In the long term, establishing a virtual economy for digital world overlays is fundamentally dependent on the social acceptance of wearing an AR device regularly. This announcement seems to be trying to thread that needle in advance of when this technology could eventually be priced and sold as a consumer product. That is, to establish an image of an Apple headset free of the kind of negative associations Google Glass quickly garnered.

I have no idea whether AR will succeed in replacing smartphones years from now or fade into obscurity, but what I find interesting about this announcement is that it makes a timeline where AR does take off at least appear conceivable. They've only taken a first and early step into trying to make it happen, but they haven't made a fatal mistake yet.


I enjoy monochrome infrared photography at wavelengths longer than 1000 nm using a Sigma sd Quattro (no Bayer filter and the sensor IR filter is reversibly user-removable). To reduce exposure times as much as possible, I've been testing an AR-coated near infrared achromatic doublet (AC254-050-B-ML) with a 3D-printed housing to attach it to the camera mount with a manual focusing barrel. The next step is to design a fast double gauss lens that can similarly be assembled from readily available NIR-coated stock lens elements.
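For context on how fast that doublet is: if I'm reading the Thorlabs-style part code correctly, AC254-050 encodes a 25.4 mm diameter and a 50 mm focal length (the clear aperture is slightly smaller, so treat this as approximate), which puts the lens at roughly f/2 wide open:

```python
# Back-of-envelope f-number for the AC254-050-B-ML doublet,
# assuming the full 25.4 mm diameter acts as the entrance pupil.
diameter_mm = 25.4
focal_length_mm = 50.0
f_number = focal_length_mm / diameter_mm
print(f"f/{f_number:.2f}")  # roughly f/2 wide open
```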


I've wondered about doing this a lot! How is the image quality with the doublet? I love the Quattro series of cameras but hate the software personally. I may someday pick up a sd Quattro H...


The center of the image it produces is perfectly sharp, but using a single achromatic doublet as a photographic lens at f/2 necessarily produces significant eighteenth-century softness as you move out to the edges. This image [1] taken to demonstrate that red wine (left) is clear and nearly as transparent as water (right) in the near infrared provides a decent example of how the images it produces wide open soften outside the center much more quickly than modern commercial camera lenses which typically don't lose sharpness at maximum aperture until the corners.

However, the sd Quattro is sensitive enough from 1000 nm - 1100 nm that I can take handheld shots outdoors on a sunny day while stopped down to f/5.6, and the smaller aperture gives more consistent sharpness across the frame. It also takes only a few seconds' exposure on a tripod to capture astrophotography of red giant stars that emit significant infrared, like Betelgeuse.

Incidentally, the original reason I wanted an infrared-sensitive camera and a 1000 nm long pass filter was to photograph stars in the sky during the middle of the day, taking advantage of the quartic dependence on wavelength in Rayleigh scattering to remove the overpowering brightness of the sky.

[1] https://alexbock.github.io/blog/nir-examples/near-infrared-8... (note: this image used an 850 nm long pass filter rather than 1000 nm but was taken with the same doublet described before)
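The quartic dependence works out to a large sky-brightness suppression in the near infrared. A quick sketch, taking 450 nm as a representative blue-sky wavelength (a simplification: real daytime sky brightness also involves aerosol scattering and the solar spectrum):

```python
def relative_sky_brightness(lam_nm, ref_nm=450.0):
    """Rayleigh-scattered sky intensity at lam_nm relative to ref_nm.
    Rayleigh scattering scales as 1 / wavelength**4."""
    return (ref_nm / lam_nm) ** 4

r = relative_sky_brightness(1000.0)
print(f"{r:.3f}")       # ~0.041: sky glow at 1000 nm is ~4% of blue
print(f"{1 / r:.1f}x")  # ~24x suppression of the overpowering sky
```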

