True. However I don't think the article really implies that P = NP (although the title certainly does). From what I can see the article only addresses the class of NP-complete problems. Factoring is not known or believed to be NP-complete so it wouldn't directly prove that factoring was in P. Please correct me if I'm wrong.
Edit: For some reason I can't reply to posts so I'll do it here.
RiderOfGiraffes> It would still leave in question those problems in NP, but not NPC. Currently it is generally believed, but not known, that factoring is such a problem.
This is exactly my point. P = NPC doesn't imply anything about factorisation. Since factorisation is in NP we haven't shown that P = NP
kd0amg> NP-complete problems ... are, in essence, the hardest problems in class NP.
From what I understand they are 'thought to be' the toughest but this isn't proven. If this article turned out to be true then they would be reduced to being as hard as P. So who is still standing as a contender for the hardest problem in NP? - factorization of course as it has never been demonstrated to be polynomially reducible to NPC.
Proving that any single NPC problem is in P will be enough to prove that every NP problem is in P, and not just the NPC ones. Suppose A is in NP, B is in NPC, and further suppose that solving B is polynomial. Reduce A to B (a polynomial operation because B is in NPC), solve B (a polynomial operation by assumption), and convert the solution back to a solution of A (a polynomial operation because B is in NPC), and that gives a polynomial solution of A.
Proving that any single NPC problem is not in P is enough to prove that all NPC problems are not in P. It would still leave in question those problems in NP, but not NPC. Currently it is generally believed, but not known, that factoring is such a problem.
Added in edit, to reply to your edit:
RoG> It would still leave in question those problems
RoG> in NP, but not NPC. Currently it is generally
RoG> believed, but not known, that factoring is such
RoG> a problem.
dejb> This is exactly my point. P = NPC doesn't imply
dejb> anything about factorisation. Since factorisation
dejb> is in NP we haven't shown that P = NP
Read my entire comment again. In particular, the second paragraph. In particular:
Proving that any single NPC problem is in P will be
enough to prove that every NP problem is in P.
More specifically, proving 3-SAT is in P will prove that every NP problem, not just the NPC problems, but every NP problem, is in P. Including factoring.
Let me add a little more.
1. Suppose 3-SAT is in P.
2. Let A be any problem in NP.
3. 3-SAT is in NPC.
4. Therefore A can be converted to A', an instance of 3-SAT.
5. By (1), A', as an instance of 3-SAT, can be solved in polynomial time.
6. Hence A has been solved in polynomial time.
Replace "A" with factoring, and thus we've shown that P=NPC implies that factorisation is in P.
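To make steps (1)-(6) concrete, here's a toy sketch in Python (my own illustration, not from the article): deciding 3-colorability of a graph, an NP problem, by a many-one reduction to SAT. The brute-force satisfiability check stands in for the hypothetical polynomial-time 3-SAT algorithm, so the whole thing is exponential in reality; the point is only the shape of the reduction.

```python
from itertools import product

def sat_brute_force(n_vars, clauses):
    """Brute-force SAT check (exponential; a real proof of P = NP would
    supply a polynomial-time solver here). A clause is a list of literals:
    literal i means variable i is true, -i means it is false (1-indexed)."""
    for assignment in product([False, True], repeat=n_vars):
        def lit_true(lit):
            value = assignment[abs(lit) - 1]
            return value if lit > 0 else not value
        if all(any(lit_true(l) for l in clause) for clause in clauses):
            return True
    return False

def three_colorable(vertices, edges):
    """Decide 3-colorability via reduction to SAT.
    Variable (v, c) means 'vertex v has color c'."""
    var = {(v, c): i + 1
           for i, (v, c) in enumerate((v, c) for v in vertices for c in range(3))}
    clauses = []
    for v in vertices:                       # every vertex gets some color
        clauses.append([var[(v, c)] for c in range(3)])
    for (u, w) in edges:                     # edge endpoints never share a color
        for c in range(3):
            clauses.append([-var[(u, c)], -var[(w, c)]])
    return sat_brute_force(len(var), clauses)

# A triangle is 3-colorable; the complete graph K4 is not.
print(three_colorable("abc", [("a", "b"), ("b", "c"), ("a", "c")]))   # True
print(three_colorable("abcd",
      [(u, w) for u in "abcd" for w in "abcd" if u < w]))             # False
```

The reduction itself (building the clause list) is polynomial in the size of the graph, which is all the argument above needs.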
The NP-complete problems are a subset of NP ("nondeterministic polynomial") problems, specifically those to which all NP problems are polynomial-time reducible. They are, in essence, the hardest problems in class NP. It's pretty trivial to show that integer factoring is in NP; what's not known is whether it's NP-complete. If any NP-complete problem is in P, then via polynomial-time reduction, all problems in NP are in P.
factorization of course as it has never been demonstrated to be polynomially reducible to NPC.
Every NP problem is polynomial-time reducible to SAT. I do not recall the title of the paper off the top of my head, but it has been proven (this is Cook's theorem) that the definition of an NP problem (a question that can be answered given some additional piece of information, a "certificate") is polynomial-time reducible to SAT.
The methodology was basically an algorithm taking as input a description of such a verification procedure, and producing a boolean expression over the certificate and some other values needed to maintain integrity. The expression is satisfiable if and only if the answer is affirmative.
The reason that factorization is not known to be NP-complete is that the reverse is not known. That is, no NP-complete problem is known to be polynomial-time reducible to factorization.
From what I understand they are 'thought to be' the toughest but this isn't proven.
Yes, it is. Any NP problem can be many-one reduced to SAT, which means that there's a function that transforms an instance of the NP problem to an instance of SAT, and the answer to whether that formula is satisfiable is the same as whether that instance is a member of the NP problem (in its decision version).
So who is still standing as a contender for the hardest problem in NP?
If P = NP, then all problems in NP are as easy as each other, so this question is meaningless.
To be pedantic, not all problems in P are equally hard. Sorting for example is harder than finding the median.
The provable lower bound for the worst-case performance of comparison-based sorting is Ω(n log n). Finding the median takes linear time, i.e. O(n).
Sorting and finding a median reduce to each other polynomially, of course.
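To illustrate the gap: here's a quick Python sketch of randomized quickselect, which finds the k-th smallest element in expected linear time (the deterministic median-of-medians variant achieves worst-case O(n)). This is my own sketch, not anything from the thread's sources.

```python
import random

def quickselect(xs, k):
    """Return the k-th smallest element (0-indexed) of xs.
    Expected O(n) time, versus the Omega(n log n) comparison
    lower bound for fully sorting the list."""
    xs = list(xs)
    while True:
        pivot = random.choice(xs)
        lo = [x for x in xs if x < pivot]
        eq = [x for x in xs if x == pivot]
        hi = [x for x in xs if x > pivot]
        if k < len(lo):
            xs = lo                      # answer is among the smaller elements
        elif k < len(lo) + len(eq):
            return pivot                 # pivot itself is the k-th smallest
        else:
            k -= len(lo) + len(eq)       # discard the smaller elements
            xs = hi

data = [7, 1, 5, 3, 9, 2, 8]
print(quickselect(data, len(data) // 2))  # 5, the median
```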
Interesting subsets of polynomial problems are e.g. linear problems, or highly parallelizable problems whose runtime tends to O(log n) as the number of processors tends to infinity. Adding a list of numbers is not only linear but also highly parallelizable. Proving that a problem is not parallelizable like that is also interesting.
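For instance, summing n numbers by pairwise combination has only O(log n) depth. A sequential Python sketch of the combining tree (my own illustration; each list comprehension corresponds to one level that could run entirely in parallel):

```python
def tree_sum(xs):
    """Pairwise (tree) reduction. With one processor per pair, every
    level runs in parallel, so the depth -- the ideal parallel time --
    is O(log n), even though total work stays O(n)."""
    xs = list(xs)
    while len(xs) > 1:
        if len(xs) % 2:
            xs.append(0)  # pad so elements pair up evenly
        # one "parallel" level: combine adjacent pairs
        xs = [xs[i] + xs[i + 1] for i in range(0, len(xs), 2)]
    return xs[0] if xs else 0

print(tree_sum(range(1, 9)))  # 36
```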
There's also the question whether randomness helps. It does in practice, but we don't know whether access to random bits helps in theory.
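A classic example of randomness helping in practice is Miller-Rabin primality testing: random bits buy a very simple, fast test (primality was later shown to be in P deterministically by AKS, but the randomized test is what's actually used). A minimal sketch, mine, not from the thread:

```python
import random

def is_probable_prime(n, rounds=20):
    """Miller-Rabin test. Each round wrongly passes a composite with
    probability at most 1/4, so 20 rounds push the error below 4**-20.
    Primes are never rejected."""
    if n < 2:
        return False
    for p in (2, 3, 5, 7, 11, 13):       # cheap trial division first
        if n % p == 0:
            return n == p
    d, s = n - 1, 0
    while d % 2 == 0:                    # write n - 1 as d * 2**s, d odd
        d, s = d // 2, s + 1
    for _ in range(rounds):
        a = random.randrange(2, n - 1)   # random base
        x = pow(a, d, n)
        if x in (1, n - 1):
            continue
        for _ in range(s - 1):
            x = pow(x, 2, n)
            if x == n - 1:
                break
        else:
            return False                 # a is a witness: n is composite
    return True

print(is_probable_prime(2**61 - 1))  # True (a Mersenne prime)
print(is_probable_prime(2**61 + 1))  # False (divisible by 3)
```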
Speaking pedantically, P is in NP also. I suspect that NP is being used as shorthand for NP-complete, not just NP. As for actual deterministic factorization Wikipedia says that
'It is suspected to be outside of all three of the complexity classes P, NP-complete, and co-NP-complete'
So even if this showed 'P = NP-complete' it may not imply factorization is in P. I'm at the limits of my rusty complexity knowledge here so if anyone knows better please correct me.
Actually, cperciva is right here. Factoring is in NP, therefore if P=NP then there is a polynomial-time algorithm for factoring. Many problems harder than NP are NP-complete, so saying factoring is in NP-complete would not imply the conclusion.
Also, there are problems in NP that are neither in P, nor are they NP- or co-NP-complete.
Cperciva is right, but I think you're mistaken when you say "many problems harder than NP are NP complete". NP-complete problems are both in NP and NP-hard. NP-hard is the class of problems such that every problem in NP can be reduced to any problem in it. Meaning if P = NP then every problem in NP (including the NP-complete problems) can be solved in polynomial time; all it takes is one algorithm that solves an NP-complete problem in polynomial time (note that this does not show that P = NP-hard, a much stronger claim).
No, there actually are problems in "NP-intermediate" class (if P!=NP) although they are artificial. :) And, of course you are right, for many other problems researchers suspect they might be in NPI. See http://cstheory.stackexchange.com/questions/79/problems-betw...
It's been proven that if P != NP, then there are problems in NP, which are neither in P nor NP-complete. We don't know what those problems are, because identifying such a problem would prove P != NP. There are candidates, such as factoring, for which there are neither polynomial algorithms nor proofs of NP-completeness, but definitively identifying them is at least as hard as proving P != NP, and possibly harder.
The point I was trying to make (badly) was that the actual claim doesn't imply P = NP, only that P = NP-complete. Given that factoring isn't known to be NP-complete (or co-NP-complete) this paper doesn't imply that factoring is in P.
If a single NP-Complete problem is in P, then P=NP. This is what NP-Complete means. In particular, factoring would be in P. Conversely, factoring is not NP-complete, so if factoring were in P, we could not draw any conclusions about P=?NP.
> The question is whether Apple can come up with the next ipod or iphone without him.
I think the question is more 'Will people BELIEVE they can come up with the next ipod or iphone without him?'. The confidence that Apple products will win out has helped drive their success and allowed them to popularise novel products like tablet computers where others could not hope to. Whether the confidence and the 'reality distortion field' will be retained without Jobs is what I see as in doubt. If they introduce another great new product - will they get the same rate of adoption?
It implies that the people of New Orleans were basically to blame for their problems, which some think is unfair.
> Australia just happens to resemble the best parts of America rather than the worst parts, in this regard.
Anyone who was sensing racial overtones in the first 2 sentences pretty much has their confirmation here. I don't know if the author intended this or not, but if he did have a point regarding race and/or culture to make then it should be made with more intellectual honesty and evidence/reasoning. Instead it comes off as snide at best.
I thought the outrage over New Orleans was mostly about the authorities' response to the floods. In Brisbane the response is seen as being pretty good.
The last flood in 1974 was slightly worse and there have been worse ones in 1893 and 1841. So I guess you could say we are a bit prone to flooding. A dam was built in 1984 to mitigate the effects but it was close to full due to exceptional rain leading up to the main deluge. The dam still helped considerably and allowed them to block some water flow at high tide, but there is some suggestion that more proactive emptying earlier would have helped.
The Queensland Police facebook page http://www.facebook.com/QueenslandPolice is pretty interesting. Nearly everyone I know subscribed. Did a great job keeping people up to date, stopping panic, dispelling rumors, organizing workers. Pretty much the first time in my life I've appreciated Web 2.0 :) (Well, until the mobile towers went down...)
One thing that the whole event showed was how the government websites don't stand up to high loads whereas the mega sites (google, facebook, twitter) have no problem with such a localised situation. To cope with emergencies like this, the relevant govt agencies will have to look at distributed architectures possibly incorporating offshore cloud based services.
I think the police usage of Facebook was inspired. It's free for the police, there's an existing distribution mechanism in place, it's highly scalable and it's canonical, which can quickly cut down rumors. Twitter became a festering swamp of rumors and BS, whereas being subscribed to the QPS facebook page meant high quality information.
I tried to get onto the Moreton shire council to find a road closures list, and was faced with a downloadable MS Word document. There's the two extremes for you.
One thing that governments everywhere should learn from this is that maintaining and using Twitter and Facebook from established, trusted accounts is a very good way of getting information out there. Even if only a tiny fraction of the population are subscribed, they can quickly disseminate information to neighbours and family members.
The last flood in 1974 reached higher levels in Brisbane, but I haven't seen anything explaining to what extent (if any) the flood levels were lower this time because of the dam? Would it have been even worse without the dam?
At the very least it let them control the flow a little bit so we didn't just get hit by a massive wave all in one go.
I think the general consensus is that the flooding would have been worse than 1974 had the dam not mitigated what it did. Investigations and modelling will determine to what extent the releases from the dam increased the flood level.
I expect the outcome will be a new operating manual for the Dam, and more frequent minor flooding in keeping with an aggressive plan of keeping the dam at 100% (water storage) level, leaving the extra 100% of flood mitigation level free as soon as possible.
Another possible outcome may be more dams on some of the other river systems, such as the Bremer river.
The feminist script she describes is the 1990s notion that a couple can obtain a satisfactory sex life through a process in which the woman overcomes the man's negative sexist conditioning by using a verbal contract to stipulate her limits. There are some provisions made about people with no respect for women, like date rapists and people with the "frat boy" mentality.
The example she gives is of a man who is by external measures an enlightened guy - wealthy, polite, and educated. We are given no reason to believe he is disrespectful of women generally speaking. But when it comes to sex he is unable to get it up unless he is explicitly pushing a woman beyond what she feels comfortable doing. For emphasis: this is a part of his sexual experience and not a part of his general attitude to women.
The fact that this single event occurred is sufficient to demonstrate that the 'How to have a mutually enjoyable sex life' script does not work universally and so requires revision for a full explanation of the world. It proves the unpleasant fact that dominance beyond enthusiastic consent can form an integral part of a man's sexual identity.
In this section she establishes that a breach in this perception of sex has occurred. In the rest of her article she uses online pornography to argue the size of the breach.
> The fact that this single event occurred is sufficient to demonstrate that the 'How to have a mutually enjoyable sex life' script does not work universally
I'm sure there are wealthy, polite, and educated necrophiliacs too, and single instances of sexual fetishes and requirements so strange as to be unimaginable by most of us. Does this mean that we need to adjust our entire model of human sexuality for each of these? Maybe for people so limited in their thinking as to believe that any one model of sex could achieve 100% coverage, this could actually be useful information. But to automatically assume anything about 'ordinary' sexuality is ridiculous.
I can't believe this isn't the most popular comment. Kinda makes HN look a bit like a knowledge vacuum, with all these recent discussions of how to ban results when a Google service that's existed for nearly 5 years can do the job. The format of the results isn't quite as nice as the normal Google search though.
I'm on the "wrong" side of the Flash/HTML5 debate, so my average post value is really low. It wasn't a big deal until the HN algorithm got tweaked recently.
I'm pretty sure I remember hearing that microwaving removed more nutrients than steaming, but less than boiling. I suspect in each case it somewhat depends on how long and how intensely you heat it. Attempting to propagate vague notions I have about things :)