This looks incredibly cool; I'm wowed by the fact that the model has learnt to negate words in if-else statements, though I struggle to think of a case where that particular completion would have been useful.
At the same time, I'm less excited about the fact that the model is cloud-only, both for security/privacy reasons and because I spend a not-insignificant amount of my time on limited-bandwidth/high-latency internet connections.
I'm also curious as to why the survey didn't ask about GPU specifications. Most of the time I use my laptop to code whilst plugged in, and I'd happily use only LSP completions when on battery, so power consumption wouldn't be an issue (though fan noise might). Allegedly my GPU (a GTX 1050) can pull off almost 2 TFLOPs, which is well over the "10 billion floating point operations" mentioned in the post.
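Back-of-the-envelope, taking both figures at face value (advertised peak TFLOPs are rarely reached in practice):

    # rough sketch; the two figures are the post's "10 billion floating point
    # operations" per completion and the GTX 1050's advertised ~2 TFLOPs peak
    flops_per_completion = 10e9
    gpu_peak_flops = 2e12
    print(flops_per_completion / gpu_peak_flops)  # 0.005 -> ~5 ms per completion

Even at a tenth of peak that's ~50 ms, which still seems usable for completions.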
> I'm wowed by the fact that the model has learnt to negate words in if-else statements
I know it learned natural language from being based on GPT-2, but I'm surprised it didn't get "confused", since words are used in such a different way in programming.
For example, strong appears as the HTML tag <strong> with no corresponding <weak> tag, and weak appears in weak_ptr in C++, where there's no such thing as a strong_ptr.
An interpreter directive is far more common than a piece of configuration specific to an editor. It also serves a practical purpose on *nix-style systems, whereas editor configuration is a matter of preference.
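To make the contrast concrete, here's a minimal sketch (the file contents are made up, and the modeline settings are arbitrary):

    #!/usr/bin/env python3
    # vim: set tabstop=4 shiftwidth=4 expandtab:
    #
    # The shebang on the first line is an interpreter directive: on *nix,
    # executing ./script.py makes the kernel hand the file to python3.
    # The vim modeline below it is pure editor configuration and has no
    # effect at runtime.
    print("hello")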
Sorry, brain fart – you have to upgrade between non-LTS releases sequentially; as far as I know, there's no supported way to upgrade to 19.04 from anything but 18.10. (In general, to get to release n, you must either be on release n-1, or n must be an LTS and you must be on the previous LTS.)
Sorry, totally missed that. Looks like some other people have had similar issues when upgrading between previous releases: https://github.com/Microsoft/WSL/issues/3489 (though the issue is for (I think) 16.04->18.04, someone mentions that it also affects 18.04->18.10)
It's strange that that's all the output you get, though. Apparently it puts logs in /var/log/dist-upgrade/; is there anything more there?
Ahh, thank you! /var/log/dist-upgrade/main.log explains why! It's indeed another instance of a bug I already saw when upgrading another system [1] and couldn't figure out: there's a missing dependency in some Ubuntu package.
Which is, IMHO, a ridiculously short-sighted approach that ignores the difference between theory and practice.
If there is a vuln in (or before) the GPG signature check, using HTTPS has a good chance of making it a lot harder to exploit (because the attacker will likely need to get into a trusted position instead of MitMing any HTTP connection).
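As a minimal sketch of that argument (the repo URL and file names are hypothetical, and gpg --verify stands in for whatever check the package manager actually does):

    # hypothetical repo layout; gpg --verify is the standard detached-signature check
    import subprocess
    import urllib.request

    base = "https://example.org/repo"
    for name in ("pkg.tar.gz", "pkg.tar.gz.sig"):
        urllib.request.urlretrieve(f"{base}/{name}", name)

    # If there's a vuln in (or before) this check, HTTPS shrinks the set of
    # attackers who can feed it crafted input from "any MitM" to roughly
    # "the server operator, or someone who breaks the TLS/CA layer".
    subprocess.run(["gpg", "--verify", "pkg.tar.gz.sig", "pkg.tar.gz"], check=True)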
One difference between ctrl-tab and alt-~ (or alt-`, depending on your keyboard layout) is that tabs within a window are very close to each other, and obviously distinct from windows in their own right, whereas it's frequently not immediately obvious whether two windows belong to the same application.
> installing and managing apps through it is also easy
But updates are not automatic (unless you root).
I don't dispute that F-Droid is not terribly difficult to use, but the original statement that "It's easier to use than the Play Store" is obviously false as soon as you take into account the mechanics of getting it installed.
That's from the "tabs" permission; I'm not sure why the extension requests it, as it doesn't appear to be required by the code (the chrome.tabs API itself does not require it; AFAIK you only need the permission if you're reading the URL/title/favicon of existing tabs).
Because it's not true! Threads in Python can muck around with the state of any other thread, at any time – it would be a language change to assume otherwise.
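A minimal sketch of why the runtime can't assume isolation (the names are made up):

    import threading

    state = {"x": 0}  # shared, mutable, visible to every thread

    def worker():
        for _ in range(100_000):
            state["x"] += 1  # interleaves with whatever other threads do

    t = threading.Thread(target=worker)
    t.start()
    state["x"] = -1  # the main thread can stomp on the worker's state at any time
    t.join()
    print(state["x"])  # some timing-dependent value; no isolation is guaranteed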