I think GP was talking about the general tendency of “previously assumed to be unbreakable” methods being broken. I'm not sure if he was implying you should also use a checksum for encryption.
What do you mean by "previously assumed to be unbreakable" ? SHA-1 has been known to be unsafe for a dozen years, we just went from "assumed to be breakable" to "yep, definitely breakable, here's how one exact attack will work".
I can see why backups might be needed for a dozen years, and I can see why encrypted backups might be needed, but outside plainly fake requirements like those of "national security" why would encrypted backups be needed for a dozen years? Aren't we throwing everything sensitive away after seven years? After that isn't it mostly about preserving history? Even things like balance sheets that might be sensitive today will be too out-of-date to be sensitive a dozen years from now.
The obvious counter-example is my library: however old my photos, music, or videos are, I'd like to keep them for as long as possible, and because they're private I'd like to keep them in encrypted form.
If you use SHA-256 to encrypt your backup, then I just need to steal your backup and wait 20 years, until that is cracked, and then I can decrypt your backup, even though today you’re using the “correct” encryption.
To be excessively pedantic, you can encrypt securely (though slowly, for the SHA series) with a hash function of sufficient capacity by running the hash function in CTR mode: you turn it into a stream cipher. Ideally you also MAC the ciphertext, nonce, and other associated data. That's pretty easy with such a hash function (either use HMAC or a MAC mode of the hash function if supported).
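A minimal sketch of the idea in Python (toy code, not production crypto; the function names are mine, and a real implementation would also MAC the ciphertext as described above):

```python
import hashlib

def sha256_ctr_keystream(key: bytes, nonce: bytes, length: int) -> bytes:
    """Build a keystream by hashing key || nonce || counter for
    successive counter values, i.e. the hash run in CTR mode."""
    out = b""
    counter = 0
    while len(out) < length:
        out += hashlib.sha256(key + nonce + counter.to_bytes(8, "big")).digest()
        counter += 1
    return out[:length]

def encrypt(key: bytes, nonce: bytes, plaintext: bytes) -> bytes:
    # XOR with the keystream; decryption is the identical operation.
    stream = sha256_ctr_keystream(key, nonce, len(plaintext))
    return bytes(p ^ s for p, s in zip(plaintext, stream))
```

Because encryption is just XOR with the keystream, running `encrypt` over the ciphertext with the same key and nonce recovers the plaintext. The nonce must never be reused with the same key.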
Salsa20 & ChaCha20 cores are hash functions (though not collision-resistant compression functions since it's not needed for their design goal and would be slower) run in CTR mode.[1]
Can you explain the relevance? If I put N items randomly into >> N buckets the chance of there being a second item in a particular bucket is small (as opposed to there merely being a bucket with two items, as in the birthday "paradox").
From my numerical experiments (I hope I didn't mess up...) using the random oracle model, the probability that a given key is collision-free is 99.6% if the input is one byte shorter than hash, 1/e if input is same size as hash and 6.6e-112 if the input is one byte longer than hash.
And this holds basically irrespective of key size.
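Those three numbers fall out of a one-line approximation: with N possible inputs mapped onto M possible outputs, the chance that no *other* input shares a given output is about (1 - 1/M)^(N-1) ≈ exp(-N/M). A quick sketch (function name is mine):

```python
import math

def p_collision_free(input_bytes_delta: int) -> float:
    # With inputs delta bytes longer than an n-byte hash, the ratio of
    # inputs to outputs is 2**(8*delta); under the random oracle model,
    # P(no other input shares my output) ~ exp(-inputs/outputs).
    return math.exp(-2.0 ** (8 * input_bytes_delta))

print(p_collision_free(-1))  # ≈ 0.9961   (input one byte shorter than hash)
print(p_collision_free(0))   # ≈ 0.3679   (1/e, input same size as hash)
print(p_collision_free(1))   # ≈ 6.6e-112 (input one byte longer than hash)
```

This matches the numbers above, and since only the ratio N/M appears, it also explains why the result is basically independent of the absolute key size.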
If you're planning to brute-force count through 2^(128x8) possible bit inputs, it will be quite a few decades indeed. And you'll need a few spare solar systems to annihilate to get enough energy to drive your counting engine through that many states.
Hashing is a separate problem from encryption. There is no proof that one way functions (the idea behind hashing) even exist (by proving this, you would actually prove P!=NP, IIRC).
Encryption has a slightly better track record of not being broken. AES still holds its promise and is also secure against quantum computing (you might want longer keys, but that's it).
And if you want really, provably unbreakable encryption, there is still the OTP. But then you need a key that is as long as the data you want to encrypt.
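The one-time pad is almost trivially short in code; the hard part is the key handling, not the math. A sketch (function names are mine):

```python
import secrets

def otp_encrypt(plaintext: bytes) -> tuple[bytes, bytes]:
    # The pad must be truly random, as long as the message, and never reused.
    key = secrets.token_bytes(len(plaintext))
    ciphertext = bytes(p ^ k for p, k in zip(plaintext, key))
    return key, ciphertext

def otp_decrypt(key: bytes, ciphertext: bytes) -> bytes:
    # XOR with the same pad undoes the encryption.
    return bytes(c ^ k for c, k in zip(ciphertext, key))
```

The catch is exactly the one stated above: you now have to store and transport a key as large as the data itself, through a channel at least as secure as the one you were trying to avoid.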
The best known attack against AES reduces attack complexity by only about two bits over brute force. Given the history of block ciphers, the idea that AES might not be broken in this lifetime is not uncommon.
I wonder what you mean by "not useful"? They just have to participate in the reputation system, and that's an issue only while the certificate is young.
Here's an excerpt from MSDN:
> Detractors may claim that SmartScreen is “forcing” developers to spend money on
> certificates. It should be stressed that EV code signing certificates are not required
> to build or maintain reputation with SmartScreen. Files signed with standard code
> signing certificates and even unsigned files continue to build reputation as they
> have since Application Reputation was introduced in IE9 last year. However, the
> presence of an EV code signing certificate is a strong indicator that the file was
> signed by an entity that has passed a rigorous validation process and was signed
> with hardware which allows our systems to establish reputation for that entity more
> quickly than unsigned or non-EV code signed programs.
I didn’t say “not useful”. Clearly they’re useful. I said non-EV certs “aren’t as useful”. Which is just a fact (as evidenced by the Smart Screen “reputation boost” that EV certs get).
I already read that blog post. I’m the person who linked to it in the forum post.
However, isn’t getting an EV certificate impossible for a natural person? You’d have to be some sort of legally recognized organization. Not exactly suitable for small-scale Open Source development.
KeePass: This isn’t an EV certificate (has only OID 2.23.140.1.4). Certum also clearly states, topmost on the description of how to get an EV Code Signing certificate:
> We do not issue EV Code Signing certificates to natural persons!
Yarn: Not an EV certificate either: "Organizationally validated certificates used to sign standard objects." (2.16.840.1.114412.3.1 in addition to 2.23.140.1.4.1).
I don't think you actually understand cracking if you're claiming your protection is uncrackable. You're certainly not the first licensing company to sell that lie, if that is what you're claiming. I can explain why what you just said is easily crackable if you'd like.
But I'll just give you the benefit of the doubt and say you didn't actually understand the question.
(Also, I'm certain I'll be downvoted for commenting on a competitor's product, but licensing companies that lie to customers are a particular pet peeve of mine.)
I didn't mean to claim that the product is uncrackable; I only meant that the API is secure and produces cryptographically sound tokens and license keys. Keygen does nothing to prevent users from modifying a product's source code. It is only an API, not a way to obfuscate an app; that's up to the discretion of the company/person developing the app.
There will always be ways to bypass licensing, especially for apps built on web tech, e.g. web apps, Electron apps, NW.js apps, etc. There are ways around it, sure. But that part isn't what Keygen is for. Keygen uses a combination of serial keys for licensing, as well as hardware-locked licensing by tracking machine fingerprints. It's up to the developer to enforce these, however.
Also, Keygen solves a very different problem that Nalpeiron, Lime LM, Agilis, Cryptlex, etc. do not solve: easy licensing for web-based apps. All of the solutions I've seen are cumbersome, unintuitive, and of course primarily designed for compiled apps. All of that has led me (and others) to develop licensing systems in-house that behave more or less identically.
What Ubisoft did a few years ago with Settlers VII was to put required pieces of code behind the DRM: without an internet connection, the software would not function at all. It took over a year and a lot of hard work before crackers found a way to write their own server to serve up the required bits, and it was just for that game, not a general solution.
True :) It was also very badly programmed, making your PC die in agony when you'd play it, even if you had the highest end stuff on the market. Still a very fun game though.
A previous company I worked for spent many pound coins on using metafortress to make our software exceptionally difficult to crack.
It went from 5 minutes in a hex editor to being a rather involved job. So it stopped it for a while.
Then people realised that instead of cracking the licensed program, it was much simpler to crack the license server (this also made detection much harder). It also had the advantage of allowing rafts of other software not made by us to work as well.
Hey! We used limelm in our (now defunct) product. Everything was a pleasure to use, both the client libs and the web frontend! That said, we obviously had to camouflage calls to the client lib and apply a few other tricks to make mocking out the calls harder. Great product!
Based on reading his posts here (the site doesn't work), his product seems to be something you'd run on the server rather than on the customer's side. Sounds like it does licensing for web applications, not for software that you download to your computer.
Your explanations are very nice. There's just one thing I'm wondering: what if a customer uses something like VMWare to activate the product and then distributes a VMWare image within the company? Can your hardware-based licensing scheme handle that?
Partly because a bad workman chooses poor tools to begin with. Tools do matter. It's easier to find a mistake in say, an SQL query or some R code than it is to find a mistake in an Excel spreadsheet, where you are trying to catch the difference between SUM(A3:A12) and SUM(A3:A10) in a thousand different cells.
> where you are trying to catch the difference between SUM(A3:A12) and SUM(A3:A10) in a thousand different cells.
Excel does a pretty good job in highlighting which cells are selected when you edit a formula, and it does a pretty good job of maintaining the meaning of the formula under sheet transformations. For example, if you inserted a row between rows 8 and 9 then the two formulae would be =SUM(A3:A13) and =SUM(A3:A11) respectively.
This may have been a genuine error, but the two researchers definitely started the process with a goal in mind, and when the results agreed with their goals they didn't bother to check.
> This may have been a genuine error, but the two researchers definitely started the process with a goal in mind...
What would that goal be? According to Megan McArdle, Rogoff was a mild proponent of stimulus. For example, he said this in 2012:
"Back in 2008-9, there was a reasonable chance, maybe 20% that we’d end up in another Great Depression. Spending a trillion dollars is nothing to knock that off the table."
Yes, it does highlight the cells. But if a mistake does somehow make it in, it is still exceptionally tedious to find the error. It is easy enough to expand a formula for additional cells but accidentally miss a cell, leaving it with the old formula. If you were using SQL, R, or Python, this class of error would never happen.
I guess it is just that to do a proper audit you need to have your logic on one page. Otherwise it is extremely hard to follow the code. In Excel, every formula is essentially sitting on its own page. It is easy to code this way but hard to audit.
Recent versions of Excel do warn you if formulas omit adjacent cells. I agree with your sentiment to some extent, but with Excel 2011/2013 you have to actively suppress these warnings.
I'm not sure I understand why it is easier with SQL than Excel. Both seem to offer similar roadblocks and both seem to offer similar solutions.
I think we're assuming there is a technological solution to a human problem, and personally I don't think there is or could ever be one.
This is a process problem. And by process I mean the process of building complex datasets, validating them for "correctness," and judging the quality/correctness of different data sets.
Unit tests are a massive asset to programmers. I wonder if unit tests (i.e. "sanity checks") would also help in this situation? I mean you would have to force people to write them and monitor them, but once they've been created they pay for themselves by picking up unexpected errors.
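As a toy illustration of what such "sanity checks" could look like for a dataset (the column names and plausibility bounds here are made up for the example, not taken from any real study):

```python
# Each row is one observation; names and bounds are illustrative only.
rows = [
    {"country": "A", "debt_to_gdp": 0.45, "growth": 0.021},
    {"country": "B", "debt_to_gdp": 0.92, "growth": -0.003},
]

def check(rows):
    # No country should appear twice in the dataset.
    countries = [r["country"] for r in rows]
    assert len(countries) == len(set(countries)), "duplicate country"
    for r in rows:
        # Flag values far outside the plausible range, which often
        # indicates a bad cell reference or a unit mix-up upstream.
        assert 0 <= r["debt_to_gdp"] < 10, f"implausible debt ratio: {r}"
        assert -0.5 < r["growth"] < 0.5, f"implausible growth rate: {r}"
    return True

check(rows)
```

Checks like these wouldn't catch a subtly wrong range, but they would catch the common failure modes (dropped rows, duplicated entries, wildly out-of-range values) automatically instead of relying on someone eyeballing the sheet.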
One of the big problems with science is the proliferation of Excel "databases". It is very easy to look at a lot of numbers and make quick calculations with them. However, when you start getting into extremely large datasets, your propensity to make mistakes increases. A2:A10 here, B3:B11 there, etc... This is one reason why recent versions of Excel warn you when your formulas aren't in sync.
However, all of this is fixed when you are using SQL to properly query a database. Why? Because you are forced to write a SQL statement that details exactly what you want done. With Excel it can all be hidden away behind the cells. With SQL, it's out in front, so it's easier to check.
People like to use Excel because they can get an answer quickly without all that "programming". The problems start to arise when you need better tools, but only know Excel. So, in this case, it's not a matter of a craftsman blaming their tools, it's closer to an amateur trying to pretend to be a professional.
Excel is a wonderful spreadsheet. It is a horrible database.
Unless you write SQL queries that use stored procedures or pre-calculated results tables, and then you just wind up in the same situation as Excel.
I mean Excel is at its heart a query language. So your logic equally applies to Excel, why not write a massive query in Excel that does all the calculations in one go so you can see the inner workings?
All you're really doing is playing musical chairs with the data. SQL query language for Excel query language, and data moved from tables to worksheets.
As I said earlier, unless there are procedural changes upstream nothing will change. A tool is just a tool. You can use it in a way to minimise mistakes, or not. Humans are the weak point.
Now, there are tools that automatically identify common mistakes, but neither Excel nor SQL/relational database engines are in that class.
> I mean Excel is at its heart a query language. So your logic equally applies to Excel
The least readable query language ever. Instead of names, everything must be addressed as column,row. Imagine a C program that, instead of declaring variables with sane names as needed, simply created a large array of each type and used constant numeric indices. That's what Excel requires.
Excel cells have been nameable since forever... It also has supported types since forever. I'm literally talking Excel 2000 or older functionality[1][2].
With respect, do you even know how to use Excel? Because everything you just said is incorrect.
Only one of the tens of heavy Excel users I have worked with knows of this feature or others like it. I am not a heavy user of Excel myself, but my theory is that this is due to a failure of discoverability in the software. Excel does not by its design encourage good coding, and most Excel users do not have any programming background, so they do not realize these features should even exist and won't know to look them up in the manual.
And it's easy to write horrifically bad, but working, code in C or C++. How exactly do you propose Excel butt into your workflow to expose advanced features?
If you plan to use Excel to aid you in making serious money (as countless business do), can't you fork out a thousand extra for an employee who actually knows how to use Excel properly?
Naming a single cell doesn't solve the problem of accidentally using a different range of values. In a normal programming language you would define a variable such as GDP containing a list of values for the countries of interest, in Excel every formula has to restate the range, which is what caused the mistake.
I hate it when people generate these contrived examples where they intentionally stack the scenario to favour whichever way proves their point the most.
Which is easier to understand:
=SUM(profits01_2012:profits12_2012)
Or:
SELECT * FROM dassaddasads WHERE date > 1325394000 AND date < 1356930000
I have seen literally hundreds of Excel spreadsheets from banks, consultancies, and VC firms and I have never, ever seen anyone name individual cells the way you are proposing here. If I had to name individual cells in a monthly model spanning the next 10 years over 250 line items, I'd quit that job.
I'm not saying that Excel can't do it, but a) it sounds like it would take a huge amount of time, so b) it is not practically done in "the real world."
They aren't stereotypical of the kind of Excel that gets written by people who use it as a major part of their job. They're stereotypical of people writing poor Excel queries against unnamed cells via location reference.
This release causes Internet Explorer 6 to freeze just by including the script in the page. This is a non-starter. I'll submit a bug report about it later today, just thought you guys would like to know in the meantime.
EDIT: It looks like the freeze on IE6 might be caused by FancyBox (http://fancybox.net/). Though it didn't freeze with any previous version of jQuery.
A reduced bug report would be good. We haven't seen any reports of this elsewhere (and the jQuery test suite runs to completion in IE 6) so I'm not entirely sure what could be happening here. Any insight you can provide (and a bug report) would be appreciated.
Yes, quite a bit more than 50K/year, but we're only a web startup in the sense that we sell all our products via the web. Our 2 main products are an updater suite for Windows apps (http://wyday.com/wybuild/) and licensing & online activation for Windows, Linux, and Mac OS X apps (http://wyday.com/limelm/).