
(Goodcover dev here)

It could be, along with, say, family size. But the UX cost of asking such questions is real, especially on an insurance application form. People get worried about why we want to know such things so early on.

Today we're able to provide a pretty good estimate with no personal details, nothing but a zip code in fact. We might work the zip code into the temp housing default at some point, but it's not without issues. Changes to the property <> temp housing link built into our rating require solid data and regulatory approvals.


Sounds to me like an A/B test is warranted.


We do plan to do some tests, to the extent that we're able. We can A/B test much of the UI, but we can't A/B test the insurance terms that we give people because that would require different pricing and thus different rating, which needs to be the same for everyone legally (also, any pricing changes require approvals regardless of whether we're A/B testing).

Don't take this too literally as I'm just a developer, not a licensed insurance agent. But this kind of thing is harder in a heavily regulated business such as ours.


Scala.js too! Been a great experience overall. (I work at Goodcover)


Because virtual DOM is not the only way to achieve efficient DOM updates, and even within the concept of virtual DOM there could be many different ways to achieve efficient diffing (e.g. React vs Snabbdom). There is nothing special about a particular virtual DOM spec to deserve a place in web standards.


Web standards could specify an API (which is very similar across all virtual DOM implementations) and let browsers choose their own implementations and optimisations.

Since the browser has full access to the entire object model, and memory layouts, and dom optimisations, and..., and..., it’s a shame that we have to write code that is the prerogative of the browser.


There is enough differentiation in virtual DOM designs[1] that I don't think implementing a lowest common denominator that is flexible enough to support competing strategies would be useful. Being a browser API it would need to be a rather timeless, low level design. Essentially, we already have that – the imperative DOM API.

With WASM the performance gap between libraries and native browser code will be reduced even further. Focusing on that has the benefit of lifting all boats, not just a particular DOM building paradigm that has been popular recently.

[1] Just some examples that come to mind:

- logic that decides when to update an element vs when to create a new one (including but not limited to having the concepts of components, thunks, deciding to look at class names, ability to move nodes, etc.)

- design of lifecycle methods and hooks, as well as any performance-oriented optimizations such as asynchronous / delayed rendering, short circuiting logic, etc.

- handling of props vs attributes vs reflected attributes
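
To make the third bullet concrete, here is a quick illustration using plain DOM calls (Scala.js syntax, since that's what I use; the same two calls exist in plain JS). A virtual DOM library has to decide, for every key you give it, whether to call setAttribute or assign a property, and the two can disagree:

    import org.scalajs.dom

    val input = dom.document.createElement("input").asInstanceOf[dom.html.Input]

    input.setAttribute("value", "initial") // the `value` attribute: the default value
    input.value = "typed by user"          // the `value` property: the live value

    println(input.getAttribute("value")) // prints "initial"
    println(input.value)                 // prints "typed by user"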


> Essentially, we already have that – the imperative DOM API. With WASM the performance gap between libraries and native browser code will be reduced even further.

Why waste time implementing and “reducing it even further” and not just implement it in the browser?

> Just some examples that come to mind

The first bullet point is mostly relevant to internals, with little bearing on the actual API.

The second bullet point is nearly identical across virtual DOM implementations, and similar lifecycle hooks exist in Web Components (which squandered the opportunity to introduce a declarative API).

The third bullet point is valid, but the main problem isn't the difference between library APIs. The main problem is that there is no browser API, so everyone has to reinvent the wheel. A simple {attrs: {}, props: {}} would render the differences moot (or you would have very thin wrappers on top for your favorite syntax).
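
To illustrate (purely hypothetical, not an existing or proposed spec; Scala syntax, but the shape is the point): even a tiny shared node description that keeps attrs and props separate would settle most of the differences.

    // Hypothetical node shape, for illustration only.
    final case class NodeSpec(
      tag: String,
      attrs: Map[String, String] = Map.empty, // applied via setAttribute
      props: Map[String, Any]    = Map.empty, // assigned as element properties
      children: Seq[NodeSpec]    = Nil
    )

    // e.g. a checkbox whose checked state is a property, not an attribute
    val checkbox = NodeSpec(
      tag   = "input",
      attrs = Map("type" -> "checkbox"),
      props = Map("checked" -> true)
    )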


> Why waste time implementing and “reducing it even further” and not just implement it in the browser?

Because the browser should not offer inflexible implementations of complex, highly opinionated, and currently overhyped paradigms, and because those standardized APIs will stay with us for much longer than they will be useful, wasting everyone's time.

We're only talking about virtual dom here because it's a popular concept with good library implementations. We don't need to reimplement it in all major browsers because we already have it working well.

Even aside from that, it is easier to build one library than implement the same spec in all major browsers. Moreover, library designs compete with each other, and can be improved faster than APIs baked into a browser which will have to be maintained in a backwards compatible manner for more than a decade.

> The first bullet point is mostly relevant to internals, with little bearing on the actual API.

The concepts of Components, State, Context, Thunks, Fibers, Plugins, etc. are very important differentiators between various virtual DOM APIs. The presence or absence of those concepts, let alone their specific design, strongly affects the API surface, what users can do with it, and how. Don't mistake React's API for some kind of standard.

Once the hype inevitably moves on from virtual DOM to whatever the next declarative UI paradigm will be (e.g. FRP with precision DOM updates) this whole standardization and reimplementation exercise will be rendered a giant waste of time.


> because those standardized APIs will stay with us for much longer than they will be useful, wasting everyone's time.

Oh wow. You just described the existing imperative DOM APIs, didn't you? Inflexible implementation wasting everyone's time. When is the last time you used actual DOM APIs? The story of web development has been: "JFC, these things are impossible to work with, let's waste time creating actual useful abstractions on top of them".

> it's a popular concept with good library implementations. We don't need to reimplement it in all major browsers because we already have it working well.

You know what one of the goals of jQuery was? To become a disappearing library. As browser APIs got better and better, the need for jQuery would diminish and it would disappear, becoming just a browser API.

Why would browsers implement querySelector and querySelectorAll? We already had it in jQuery and it was working well.

Why would browsers implement fetch? We already had jQuery.ajax, axios, superfetch and dozens of others, and they were working well.

Why would browsers implement <insert any improvement>? We already have <insert dozens of libraries> and they are working well.

> it is easier to build one library than implement the same spec in all major browsers.

It doesn't mean we should freeze the spec in the same state it was in 1998.

> The concepts of Components, State, Context, Thunks, Fibers, Plugins, etc. are very important differentiators between various virtual DOM APIs.

None of those refer to actual APIs. Those are parts of internal implementations, or additions/APIs layered on top of the virtual DOM.

We are talking about one thing specifically: we need a browser-native declarative DOM API with browser-native DOM diffing that wouldn't require us to implement it in userland.

The rest (thunks, state management, plugins, whatever) can be provided by actual libraries on top of an actual built-in, high-performance virtual DOM API.

Because the browser knows infinitely more about what's happening to the DOM than userland libraries and has access to infinitely more optimisations. All userland code needs to do is to tell the browser: this and that changed.

Funnily enough, browser implementors are now spending considerable amounts of time implementing CSS Containment [1] (emphasis mine):

--- quote ---

Browser engines can use that information to implement optimizations and avoid doing extra work when they know which subtrees are independent of the rest of the page.

Imagine that you have a big HTML page which generates a complex DOM tree, but you know that some parts of that page are totally independent of the rest of the page and the content in those parts is modified at some point.

Browser engines usually try to avoid doing more work than needed and use some heuristics to avoid spending more time than required. However there are lots of corner cases and complex situations in which the browser needs to actually recompute the whole webpage.

--- end quote ---

Wow. Browsers (and browser implementors) actually want developers to tell them exactly what changes on the page so that they don't do extra work. And wow, you can actually implement the same spec in all major browsers (eventually).

So why not virtual DOM?

> this whole standardization and reimplementation exercise will be rendered a giant waste of time.

So what you're saying is essentially this: new things are hype, DOM APIs should never get updated because who cares about the needs of developers.

[1] https://blogs.igalia.com/mrego/2019/01/11/an-introduction-to...


> Oh wow. You just described the existing imperative DOM APIs, didn't you? Inflexible implementation wasting everyone's time.

Existing DOM APIs are very simple, low level and unopinionated. A pleasure to build libraries on. That's what durable platform APIs should look like.

It is very easy to build virtual DOM, and importantly other DOM management paradigms, on top of those low-level APIs.

The same cannot be said about virtual DOM - it's a very opinionated, very rigid paradigm. I know because I built FRP UI libraries based on Snabbdom and on native DOM APIs. The latter is much simpler to deal with and more performant. Virtual DOM only works well if that's exactly what you want. It has no place among browser APIs, at least not in any recognizable shape or form.

Regarding performance, the whole point of the virtual DOM is that the diffing engine does not know which elements changed and which didn't. It gets a new virtual subtree and has to diff it with the previous one. The browser would be doing all the same diffing work, just closer to the metal. But we will soon be able to do the same with just WASM.
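
For concreteness, here is a toy sketch of that diffing work (Scala.js, written from memory: tags and text only, index-based child matching, no keys, no attribute handling, no components, no removals). Real libraries differ precisely in how they improve on each of these shortcuts.

    import org.scalajs.dom

    final case class VNode(tag: String, text: String = "", children: List[VNode] = Nil)

    def create(v: VNode): dom.Element = {
      val el = dom.document.createElement(v.tag)
      if (v.text.nonEmpty) el.textContent = v.text
      v.children.foreach(child => el.appendChild(create(child)))
      el
    }

    def patch(el: dom.Element, oldV: VNode, newV: VNode): dom.Element =
      if (oldV.tag != newV.tag) {
        // Different tag: give up and rebuild the whole subtree.
        val fresh = create(newV)
        el.parentNode.replaceChild(fresh, el)
        fresh
      } else {
        // Update text only when there are no element children to preserve.
        if (newV.children.isEmpty && oldV.text != newV.text) el.textContent = newV.text
        // Naive index-based child reconciliation; removals and moves are omitted.
        newV.children.zipWithIndex.foreach { case (newChild, i) =>
          if (i < oldV.children.length) patch(el.children(i), oldV.children(i), newChild)
          else el.appendChild(create(newChild))
        }
        el
      }

Note how patch has to walk the entire new subtree even when nothing changed; that walk is the work a diffing engine always does, regardless of where it runs.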


> I know because I built FRP UI libraries based on Snabbdom and on native DOM APIs. The latter is much simpler to deal with and more performant.

I built something on native APIs and on userland APIs. Native APIs are more performant.

Really? That surprises you?

> The browser would be doing all the same diffing work, just closer to the metal.

Exactly my point


What a VDOM gets you is a simpler interface than the browser DOM, with an often faster comparison than the browser actually provides. There have been significant real browser DOM improvements since React came out, but I'm pretty sure an optimized VDOM in WASM could be faster because of issues of interaction with the real/full browser DOM. There are also side effects wrt the full/real DOM in practice.

I agree though, nothing that requires a place in web standards at all.


Well... faster than what exactly? You have to do the virtual DOM pattern (generating new DOM state and then diffing with old state) with virtual elements. You can't compare a proper virtual DOM to using real DOM elements instead of virtual ones in a virtual DOM pattern; it wouldn't make any sense.

But there are other non-virtual-DOM ways to manage DOM state efficiently and in a maintainable manner. For example, my own library uses Observables to drive precise DOM updates and works with trees holding real (not virtual) DOM elements, so it doesn't need to do any diffing at all: https://github.com/raquo/Laminar

I don't think it's a given which of these techniques would be faster, it depends heavily on the particular use case and the implementation of diffing (for virtual DOM) and Observables (for my pattern). If both are well optimized I'd expect virtual DOM to lose in a lot of cases.
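
To make the Observable-driven approach concrete, here is a minimal counter in the style of Laminar (written from memory; exact imports and operator signatures may differ between versions, and the "app" container element is assumed to exist). Each <-- binding subscribes one specific node or property to a stream, so updates are written directly to that node and nothing is diffed.

    import com.raquo.laminar.api.L._
    import org.scalajs.dom

    val count = Var(0)

    val app = div(
      button("Increment", onClick --> (_ => count.update(_ + 1))),
      // This text node, and only this text node, is updated when `count` changes.
      span(child.text <-- count.signal.map(n => s"Count: $n"))
    )

    render(dom.document.getElementById("app"), app)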


Like I said, it could; it probably depends on actual use... DOM navigation for reads or updates can be optimized, but depending on how it is done it may not work as well. React itself is moving towards diffing against the browser's real DOM iirc. Browsers have gotten a lot better than in the past. That said, actually comparing each node for updates against large trees may be more costly than updating and diffing against a partial abstraction.


What I'm saying is that outside of the virtual DOM paradigm you might not need to diff any elements at all, real or virtual, and so you wouldn't care about the performance of DOM reads, as you're not doing them.

Then it becomes a matter of DOM write performance, but that is the same for everyone assuming the native DOM API commands issued by the libraries are the same, which is a more or less reasonable assumption for well optimized libraries even if they use different paradigms to calculate what those commands should be.


Templates are faster than vdom and are a browser feature now.


ScalaTags is indeed not relevant here, but Scala.js has its fair share of reactive UI libraries such as my own Laminar https://github.com/raquo/Laminar


Your DOM builder library is also great for lower-level DOM manipulation. Thanks for your work, by the way.

https://github.com/raquo/scala-dom-builder/


Spoiler alert: the best setup is to hole up in a nice town in Canada working remotely for a client in the US.

Vancouver numbers I know from friends, mostly from a couple years ago, web application dev:

- Starting salary around 60K CAD

- Mid range 70-90K

- Senior 90-130K

Companies like Amazon pay ~25% more than this, but are also more demanding of time and sanity

Go browse AngelList, lots of offers with salaries there from smaller companies.


Wow, those are really low numbers. I'm pretty confident you can fetch very similar salaries in Montreal, and housing is 3-5 times cheaper? Or even more?

I (and about 50 of my developer coworkers) make those sort of salaries here in small-town Quebec, with significantly cheaper housing than even Montreal.

I'm sure Vancouver is a beautiful city, but comparing those salary numbers against the housing over there makes it extremely unattractive.


Yup, I've seen more than one friend leave Vancouver for Ontario and Quebec. But hey if you work remotely, you're in the same timezone as SF!


I’ve found AngelList and the Who’s Hiring threads here on HN to be fairly accurate.


> Spoiler alert: the best setup is to hole up in a nice town in Canada working remotely for a client in the US.

Do you have experience with this? :)


Yes, I worked remotely for both US and Canadian clients. Dealing with Quebec clients is a pain (need to file their sales tax, _in French_), otherwise it's been smooth sailing. Just remember that you're a contractor, not an employee.


I'm from Quebec as well. Do you work directly for a client, or through a consultancy?


Directly. One of my friends is holed up in Yukon working via a website similar to Upwork. It seems to work for him, although these kinds of sites generally get a bad rep on HN.


I tried Dvorak for a month a few years ago. For me it was just useless agony, not only with the key positions but also with app/OS shortcuts that are very much designed for the qwerty layout.

I have since switched to a split ergonomic keyboard (Diverge 3) and adjusted the layout somewhat (kept qwerty, but put all the modifier keys in custom locations). That took me about a month to exceed my previous speed. Typing on standard keyboards when needed is not a problem either.

I never cared to measure my typing speed though.


It would be interesting to design a keyboard whose key arrangement is weighted toward standard shortcut combinations. Maybe put them mostly on the left hand side to allow (right-handed) mousing without looking at the keyboard.


Keyboards like Diverge and Ergodox are fully programmable so nothing's stopping you from doing that today!

Also, I found that putting a small touchpad^ in a thumb-accessible area (near where a laptop touchpad would be) is a good enough replacement for a mouse.

^ http://www.ergonomictouchpad.com/


Well yeah, but what about the rest of the keys? It's one thing to put all the shortcut keys on the left or right side, but quite another to arrange the keys so that typing is...possible.


You don't have to change from the qwerty layout for general typing.

For example, you can program the keyboard in such a way that pressing X+A sends Ctrl+C keys to the OS, where X is a custom modifier button (Not Ctrl but a separate button that you assign for this use case) and A is, well, whatever other button on the keyboard you want to use for this, not necessarily "C".

Or you can have a single button programmed to send Alt+Tab, or... whatever, really, the firmware is very flexible.


> pressing X+A sends Ctrl+C

Yeah, I know about remapping and that's not going to happen. What a nightmare! But the more I think about my idea the worse it sounds. What are the 12 most common shortcut keys? It's hard to narrow them down to that, and I'm not sure everybody would agree. Still, ergonomically it would be an interesting challenge.


> What are the 12 most common shortcut keys?

FWIW the WhatPulse app can tell you that (for your own typing) e.g. https://i.imgur.com/SuPLw9h.png

I used its data to drive my custom layout decisions.


I always put another Control on the CapsLock key; just that change makes it far easier to use shortcuts with the left hand only.


It would be great if you had a public preview of the content there. I've seen it done before, so there must be some kind of Slack app that lets you do that.

As a remote contractor I sometimes feel that it would be nice to have a chat like that, but a few startup-themed slack channels that I tried out turned out to be no good for me.


Thought about this, and agreed that it’s a good idea. I know @levelsio had one for Nomad List’s Slack, but I don’t think he open-sourced the code. If someone knows of a lib to do that, let me know!


Take a look at http://slackarchive.io if you haven't already


Earlier this year I bought a Developer Edition (Ubuntu) of Dell Precision 5520, which is the business version of XPS 15 9560. Same chassis and hardware, better factory QA and different GPU options.

I very much regret this decision. I spent ~100 hours trying to get it to work properly and to configure Ubuntu (including supposedly simple things like switching Alt and Ctrl keys). At the hourly rate I'm charging, this is more than enough to buy ANY laptop. Next time I'll swallow my pride and just buy a top of the line Macbook Pro.

Out of the box Ubuntu works great, but it is very fragile when it comes to updates. For example, updating BIOS to a version that fixes an important CPU bug causes the computer to freeze and shutdown every few minutes at random.

Updating the OS itself caused weird bugs such as being unable to click on app menu items with a USB touchpad. Ditching the default Unity for Gnome3 fixed this particular bug.

The recovery image that Dell provides for Ubuntu simply does not work a few months after release – it craps out when it's unable to either fetch or install some package. So I don't even have a way of getting a working version of Ubuntu on this laptop if anything happens to my hard drive.

Oh and the touchpad configs are appallingly bad out of the box. Impossible to use. I fixed 98% of palm interference issues with a few lines of xinput config. I don't know why Dell didn't do that themselves.

And of course, Dell provides only 7 days of Ubuntu support. After that you're on your own.

Now I'm trying to sell this laptop, but no one on craigslist wants it even at 35% ($900 CAD) off the original price. A few-months-old laptop in perfect condition that is still under warranty.

Speaking of Ubuntu itself, it has been a profound disappointment as well. I will not be trying Dell or linux on desktop for another 7 years at least. My 9 year old Macbook Pro is far superior to this mess.

PS I also tried hackintoshing this laptop, and it worked 95% of the way, but it requires the latest BIOS to work, and that causes random shutdowns regardless of the OS.


I've been using the XPS 9560 full-time since February, and have experienced precisely 0 of the issues you describe whilst running a number of Linux variants, including several Ubuntu variants, Fedora and Solus.

> So I don't even have a way of getting a working version of Ubuntu on this laptop if anything happens to my hard drive.

You could simply download Ubuntu directly, or any other distribution. Dell isn't doing anything special to the version they distribute.

> For example, updating BIOS to a version that fixes an important CPU bug causes the computer to freeze and shutdown every few minutes at random.

Have you tried talking to Dell about this issue? Their support, particularly around BIOS issues on their XPS and Precision range has been fantastic in my experience, and they often supply pre-release BIOS versions if they believe it will help, otherwise they tend to replace the laptop.


> Dell isn't doing anything special to the version they distribute.

From what I've read on their website, they actually do bundle it with custom software like drivers. But really my bigger point is that a lot of things do not work the way they're supposed to if you deviate from the happy path just a tiny bit. In my experience.

> Their support, particularly around BIOS issues on their XPS and Precision range has been fantastic in my experience

Hm, I'll give it a shot, thanks for the heads up.


Try Fedora on it. I switched from Ubuntu to Fedora a while back and couldn't be happier. Not saying you will have a perfect experience but it is worth a shot if you at least like the hardware?


I don't think so. A lot of my issues are with BIOS and various linux apps or the window manager, not technically the distribution flavour itself. Also, I'm done spending time on this :)


Totally understand. Just thought I would do my bit to promote Fedora when I can ;)

Any idea what you will move onto?


> For example, updating BIOS to a version that fixes an important CPU bug causes the computer to freeze and shutdown every few minutes at random.

I'd return it at this point.

It's defective by design.


I was told they would send me a replacement if they determine it to be a hardware problem. Return wouldn't be an option (I'm past the first 30 days). I'll try to negotiate with them anyway.


Scala.js has been an amazing tool for my personal projects. I literally hit euphoria sometimes writing stuff in it. I have yet to truly grasp the benefits of full stack Scala (shared models, Autowire, etc.), but even for purely front-end development it's great; it blows TypeScript and Flow out of the water in features, soundness and stability.

I wish I could use it at work since we (Hootsuite) already use Scala heavily on the backend, but I am reluctant in part because Scala.js does not quite have the financial support of Lightbend. Or so it seems; it's a bit hard to tell where Lightbend ends and the non-profit Scala Center begins. The latter did pay to get some features implemented, but reading their advisory board minutes, I'm not sure if they would have enough funding to pay for the majority of Scala.js development, which if I understand correctly happens for free as part of a PhD right now (note: my information might be wrong/outdated!)

So, if anyone involved with Scala.js is reading this and has better insights on the situation, it would be nice to know.

But I will keep using it regardless. It's marvellous.


You are mostly accurate wrt. the "financial" situation of Scala.js. Except that the Scala Center has an explicit Recommendation to "ensure the continuity of Scala.js" [1]. At least that means that, should I (the Ph.D. student doing Scala.js right now) stop working on it, the Scala Center has a mission to ensure someone keeps working on it. As with all Scala Center Recommendations, this is not a guarantee. But given the popularity of Scala.js within the Scala ecosystem, I am confident that solutions will be found! Read more about the Scala Center and how it works at [2].

[1] https://github.com/scalacenter/advisoryboard/blob/master/min... [2] https://scala.epfl.ch/


> the Scala Center has a mission to ensure someone keeps working on it.

Heh, I hope it's you! :)

Seriously, scala.js has been amazing for us! Thank you for all that hard work!

EDIT: Of course you might still be beaten out by scala.native -> WebAssembly -> Browser... nah, not really. :)


(Too late to edit my comment.)

Were you at ScalaDays 2017 CPH? I don't think I saw you there, but you deserve a round of applause, you Magnificent Bastard, you!


Thank you. No I wasn't there this year. I was furiously preparing this release :-p

See you probably in London for Scala eXchange ;)


Thanks! You know, if it comes to that, the Scala.js community is a passionate bunch, we could probably crowdfund it :)


It would be shocking if they hired anyone but the creator of Scala.js -- hopefully after you finish your doctorate this winter Lightbend and/or Scala Center will step up and make it happen.


What if he wants to quit scala.js and go code trading bots to get filthy rich :)


Unlikely. I love compilers too much.


I work in a trading shop (not filthy rich yet, though). Do you know that Scala to FPGA compilers based on Chisel are all in vogue these days? ;)


Ah ah!


Are there any public Scala to FPGA compilers?



> I am reluctant in part because Scala.js does not quite have the financial support of Lightbend. Or so it seems; it's a bit hard to tell where Lightbend ends and the non-profit Scala Center begins.

As the Scala team lead at Lightbend, I'd love to have a few members of my team focus on scala.js and scala-native. We do financially support their development (most recently, as part of our funding of the Scala Center).

As a business, it's a bit of a chicken-and-egg problem: our customers usually indicate they are hesitant to switch from JS for their frontend work. It would be great to have more customers provide feedback like yours (via our customer surveys)!


If I were Lightbend (whose work I use/enjoy/am thankful for), I think I would put Scala.js inside the Play Framework (as opposed to, say, CoffeeScript).



I am in exactly the same situation and have had the same experiences with Scala.js as you... my #1 choice atm.


I built http://hnapp.com specifically to generate custom RSS & JSON feeds for HN. It's open source on GitHub too.

For example: http://hnapp.com/?q=haskell%20type%3Astory%20score%3E20


Thanks, that is what I was really looking for. I will now use it.

