Wed, Nov. 18th, 2015, 09:34 am
P and NP
I don't know if anyone reads this blog anymore. I certainly don't expect anyone to read this post. But I'm going to write it anyway, because it's a serious personal pet peeve of mine.
You hear a lot of computer scientists (and in particular, computer theorists) talk about the famous "P vs NP" problem. It's become so well known that it's entered popular culture in a number of different places, including an entire indie film that used it as its main plot device. However, there are so many misconceptions about this problem out there, even among people with computer science degrees.
Most computer scientists and software engineers have a pretty good idea about what P means. It's the problems with a known polynomial time solution. In most cases (although not all) these are problems that run relatively efficiently.
The category NP throws people, though. An alarming number of people seem to use this term to mean "Not-Polynomial". But it couldn't be farther from the truth! NP actually refers to all the problems with a known "non-deterministic polynomial" time solution. It means these problems can be solved in polynomial time (that is, relatively efficiently) if you have access to something called a non-deterministic Turing machine.
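There's an equivalent way to characterize NP that's often easier to get a feel for: a problem is in NP exactly when a proposed solution (a "certificate") can be *checked* in polynomial time, even if *finding* one seems hard. Here's a minimal sketch of that idea in Python, using the subset-sum problem as the example (the function name and test values are just mine, for illustration):

```python
# Subset-sum: given a list of numbers and a target, is there a subset
# that adds up to the target? Finding such a subset may take exponential
# time, but checking a proposed subset is fast -- that fast check is
# exactly what membership in NP guarantees.

def verify_subset_sum(numbers, target, candidate_indices):
    """Polynomial-time check of a proposed certificate (a list of indices)."""
    # Indices must be distinct and in range.
    if len(set(candidate_indices)) != len(candidate_indices):
        return False
    if any(i < 0 or i >= len(numbers) for i in candidate_indices):
        return False
    # The selected numbers must add up to the target.
    return sum(numbers[i] for i in candidate_indices) == target

nums = [3, 34, 4, 12, 5, 2]
print(verify_subset_sum(nums, 9, [2, 4]))  # 4 + 5 == 9 -> True
print(verify_subset_sum(nums, 9, [0, 1]))  # 3 + 34 != 9 -> False
```

The verifier runs in time linear in the input, while no polynomial-time *search* algorithm is known — which is the whole tension behind P vs NP.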
By definition, all problems in P are also in NP. Sorting numbers? That's in NP. Finding the shortest path between two points? NP again. Calculating the area of a rectangle? NP, of course. All of the easiest problems are in NP, precisely because they are in P.
So at this point, if you've followed the P vs NP problem at all, you might be a little confused. "Isn't the point that we don't know if these things are the same? I thought NP meant really hard!!!" Ah yes, see, we know that all problems in P are in NP. What we don't know is whether or not all problems in NP are also in P! Because some things that are easy to do with a non-deterministic computer seem to be very, very hard to do on a real computer.
One famous example is the "traveling salesman" problem. What is the absolute shortest route for a person to visit every city in his state and return home? While it's possible to quickly come up with an answer that's close to the shortest, there is no known fast method for finding one that's certain to be the absolute shortest. The only way we really know how to do that is to basically try an exponential number of possible routes, which takes a very long time once the number of cities gets large at all.
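Just to make the brute-force idea concrete, here's a toy sketch in Python (illustrative only — real solvers use branch-and-bound, dynamic programming, or heuristics; the function name and distance matrix are made up for the example):

```python
from itertools import permutations

def tsp_brute_force(dist):
    """Try every possible route -- (n-1)! of them -- and keep the shortest.
    dist[i][j] is the distance from city i to city j."""
    n = len(dist)
    cities = range(1, n)  # fix city 0 as start/end to avoid counting rotations
    best_cost, best_route = float("inf"), None
    for perm in permutations(cities):
        route = (0,) + perm + (0,)
        cost = sum(dist[route[i]][route[i + 1]] for i in range(n))
        if cost < best_cost:
            best_cost, best_route = cost, route
    return best_cost, best_route

# A tiny 4-city example: only 3! = 6 routes to try here, but the count
# grows factorially -- 20 cities already means over 10^17 routes.
dist = [[0, 10, 15, 20],
        [10, 0, 35, 25],
        [15, 35, 0, 30],
        [20, 25, 30, 0]]
print(tsp_brute_force(dist))  # -> (80, (0, 1, 3, 2, 0))
```

Notice that checking any *single* route takes only linear time; it's the sheer number of routes that kills you.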
This is an example of what is known as an "NP-hard" problem. These are the problems that are notoriously hard for computers to solve. NP-hard problems are not necessarily in NP. NP-hard simply means it's at least as hard as the hardest problems in NP. It means that given a program that can solve an NP-hard problem, you can translate any other problem in NP, in polynomial time, into a form your NP-hard-solving program can handle. In other words, if you could solve the traveling salesman problem efficiently, you could solve all the other problems in NP relatively efficiently as well.
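A classic textbook example of such a polynomial-time translation is the reduction from the Hamiltonian cycle problem (does a graph have a tour visiting every vertex exactly once?) to traveling salesman. This is a standard reduction; the helper name below is just mine:

```python
# Reduction: Hamiltonian cycle -> traveling salesman.
# Build a complete graph where edges of the original graph cost 1 and all
# missing edges cost 2. The original graph has a Hamiltonian cycle exactly
# when the cheapest tour costs n (one unit per step, n steps in a tour).

def ham_cycle_to_tsp(n, edges):
    """edges: a set of frozenset({u, v}) pairs for an undirected graph on
    vertices 0..n-1. Returns (distance matrix, target tour cost)."""
    dist = [[0] * n for _ in range(n)]
    for u in range(n):
        for v in range(n):
            if u != v:
                dist[u][v] = 1 if frozenset({u, v}) in edges else 2
    return dist, n  # tour of cost exactly n <=> Hamiltonian cycle exists

# A 4-cycle (0-1-2-3-0) clearly has a Hamiltonian cycle, so the cheapest
# tour in the translated instance costs exactly 4.
square = {frozenset({0, 1}), frozenset({1, 2}),
          frozenset({2, 3}), frozenset({0, 3})}
dist, target = ham_cycle_to_tsp(4, square)
print(target)      # -> 4
print(dist[0][2])  # -> 2 (a "missing" edge got the penalty weight)
```

The translation itself is just an n×n table fill — polynomial time — which is exactly what the definition of NP-hardness demands.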
Traveling salesman is both NP-hard and in NP. I won't go into too many details about why this is, because oversimplifications of NP often lead to more misunderstandings. (I'll save that for another post, if anyone is interested.) Problems that are both NP-hard and in NP are known as "NP-complete". I believe when many people use the word "NP", what they really mean is "NP-complete". These are problems that (as far as we know) are very hard to solve on real computers, but are quite fast to solve on the theoretical device known as a non-deterministic Turing machine.
One common misconception out there is that a quantum computer is a non-deterministic Turing machine. This is not true. Quantum computers (if they ever really get built) are simply nowhere near that powerful. However, it is possible to use them to factor large numbers into primes quickly - something that could break the most popular form of encryption these days. Another common misconception is that prime factoring is NP-hard. It isn't. (If it were, quantum computers would effectively be non-deterministic Turing machines.) While we are pretty sure prime factoring is not as hard as the NP-hard problems, we also don't know a polynomial time algorithm for it yet on traditional computers. There is one for quantum computers, though. I think this is part of what makes some people mistakenly think quantum computers are non-deterministic Turing machines. Sorry, no such luck.
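To see why classical factoring is slow, consider the naive approach, trial division. It's fine for small numbers, but its running time grows with the square root of N, which is *exponential* in the number of digits of N — a 300-digit RSA modulus is hopeless this way. (This sketch is the textbook algorithm, not anyone's production code.)

```python
# Trial division: the naive classical way to factor a number. Its running
# time is roughly sqrt(N) steps, which is exponential in the number of
# digits of N. Shor's quantum algorithm, by contrast, runs in time
# polynomial in the number of digits.

def trial_division(n):
    """Return the prime factorization of n as a sorted list of primes."""
    factors = []
    d = 2
    while d * d <= n:
        while n % d == 0:
            factors.append(d)
            n //= d
        d += 1
    if n > 1:           # whatever remains is itself prime
        factors.append(n)
    return factors

print(trial_division(2015))  # -> [5, 13, 31]
```

Double each digit count and the work squares — that's the gap between "polynomial in N" and "polynomial in the size of N" that trips people up.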
So what's the big deal about the whole P vs NP problem? Well, after years and years of trying, computer scientists have been unable to accomplish either of two tasks. The first is to come up with a polynomial time solution on a standard computer for an NP-complete problem. If they could, it would mean P and NP are actually the same thing. All problems that can be solved on a non-deterministic computer in polynomial time could also be solved on a regular computer in polynomial time. This would be huge, as it would mean countless problems that are deemed too hard to solve today could potentially be solved relatively efficiently.
The second task is the reverse: most computer scientists think P = NP is impossible, for reasons I can go into in another post. However, despite all their trying, no one has been able to write up a bullet-proof math proof showing why this is impossible. Most of us are pretty sure it is, but there's a difference in mathematics between having good reasons to think something is impossible and actually proving it. In other sciences, they'd probably just say "evidence shows this isn't true, so unless something changes, we are going to say P != NP". However in mathematics and computer theory, this doesn't cut it. These people expect perfect proofs, not just good evidence. (This is one of the major differences between mathematics and empirical science.) That said, from a working computer scientist's perspective, it's generally just accepted that P != NP, although a very tiny minority still hold out hope.
And that, I believe, should help counter many of the major misconceptions about the famous P vs NP problem. P and NP are not opposites. The question is, are they exactly the same thing, or is NP everything in P, plus a lot more?
The 1980's - The Professional's Internet: The proto-Internet exists almost entirely for the purposes of researchers, academics, engineers and the like. This is the most serious the Internet will ever be.
The 1990's - The Nerd's Internet: The early Internet adopters were almost all heavy computer nerds, techies and others who were first intrigued by the power of the Internet to share ideas, data and technology. Even the pop culture online in the 1990's was very nerdy. The politics associated with the Internet were more libertarian than liberal.
The 2000's - The Mainstream Internet: The majority of homes in America begin to have some kind of Internet connection. The Internet begins to lose its association with being for geeks. The idea of buying something online isn't just for hardcore geeks any more, but is something anyone might do. Social networks start springing up that aren't just for nerds, but are intended for normal people. Even Grandma likely has an Internet connection (even if all she ever uses it for is forwarding you horrible e-mail rumors).
The 2010's - The Clickbait Internet: Virtually everyone is online, and almost everyone is heavily involved in some kind of online social networking. Internet marketing from even mainstream sources becomes more and more debased in an effort to drive people to their websites. The "culture" of the Internet is no longer associated with science and technology, but with pop culture of the lowest common denominator. The biggest users of the Internet are no longer the tech nerds, but those who are into socializing and spreading various pop culture "memes". Even sites like ThinkGeek now primarily sell items related to TV shows and movies.
The 2020's [Prediction] - The Post-Internet: With the Internet of things, streaming TV, smart watches, etc, what is and isn't the "Internet" (or a "computer" for that matter) becomes completely blurred. All technology is connected in some way or another. There is no longer an "Internet culture", but various cultures that are blurred across the various ways we communicate and entertain ourselves.
According to the latest polls, over half of all Republicans would prefer a candidate with zero experience in government, and almost a third of Democrats would prefer a candidate who literally considers himself a socialist. What in the world has happened to our country?

(USA Today 2016 poll tracker)
So I've really been enjoying the new New Order album lately ("Music Complete"). For those who haven't heard it yet, it's one of the best things they've done in ages. Ironically, it's also the first album they've done without their legendary bass player, Peter Hook, who left the band a number of years ago. For those unfamiliar with the band, Peter Hook had a very iconic style of bass playing that often supplied the melody as much as it did the bass line.
It got me thinking about Peter Hook's bass parts, and how pretty much all of his really classic bass parts came from either Joy Division or really early New Order songs. Ceremony, Transmission, Disorder, Leave Me Alone, Love Will Tear Us Apart, etc. They were all in the real early days of the band. As the band progressed, and became more electronic and production heavy, Hooky's bass lines became more just added flavor on top of the songs, instead of something integral to the music. The real bass would end up being supplied by synthesizers, and Hook's part was just something added on top. Even on the tracks that were more rock-guitar based, his part was not nearly as interesting and intertwined as it was on their early material.
I think part of it is the way they changed how they wrote music. In the early days they were rooted in the punk movement. Each song was crafted for the four of them to play live, probably largely written in collaboration in a room together. The bass, guitar and drums all work in sync, puzzle pieces that were designed to fit together. Even the early synthesizer lines were intertwined in this same way, written in a live setting to be played in a live setting.
As time went on, though, they began adopting the more modern, heavy production techniques that have dominated most mainstream music. With the exception of some indie type bands, songs are no longer crafted by a group of friends in a room together. Instead, they're often driven by layering different ideas on top of each other in a studio. The recording ends up being more complex, but the result ends up being harder to pull off live, and the individual elements are rarely anything special in their own right.
New Order's latest album is no exception to this. In fact, it's probably the heaviest production they've done yet. It's not necessarily a bad thing, it's just a different thing. But it also explains why Peter Hook's bass lines were no longer what they used to be. I think if Hooky wants to really recapture the classic Joy Division, early New Order sound with his new endeavors, he'd be wise to try and really strip down the production to old school punk levels again. Make the bass part the focus of a song, with some minimalistic guitar and drums intertwined. That was the formula that worked so great on all those early classic songs. It'd be interesting to see him try to recapture that again.
Yesterday was "Back to the Future Day". We had finally arrived at the point in time when Marty and Doc Brown had traveled to the "future" in Back to the Future II. Like a lot of people, I re-watched the movie, as well as many of the fun videos people have been creating to celebrate the moment. Something I noticed that's kind of interesting in the films is the relationship between Marty and Doc Brown. You have a young protagonist whose best friend and mentor is apparently an eccentric old genius who lives in the neighborhood.
While this seems like an unusual pairing in reality, it seems to be quite common in many popular adventure stories. I'm sure I'm not the first person to notice this (it's probably canonized in the monomyth somewhere) but there is definitely an archetypal character of the "Old Wizard Mentor" in a lot of these popular stories, who often befriends and mentors the young protagonist. Here's some examples:
Merlin and Arthur (perhaps the origin of the archetype?)
Gandalf and Bilbo
Gandalf and Frodo (Gandalf got to do this a lot, apparently)
Obi Wan Kenobi and Luke Skywalker
Yoda and Luke Skywalker
Doc Brown and Marty McFly
The Doctor and Companions (Doctor Who)
Dumbledore and Harry Potter
Professor Xavier and various X-Men
In many cases, it's a literal wizard who helps the protagonist. (Obi Wan Kenobi is literally called an "old wizard" in one line in Star Wars). In other cases, it's an eccentric older genius of some kind. In either case, the character interaction is very similar. Oftentimes they are represented as a "doctor" or professor of some kind. In the case of Dumbledore, he is both professor and wizard.
While I don't think this kind of pairing is as common in real life as it is in fantasy/sci-fi stories, I can attest to having one of these friendships when I first started working at the lab. An older scientist (who has since passed away) was a major mentor to me when I first began working here. He had that role for a number of young people, I think, and was even nicknamed something like "Mr. Wizard" by school children he occasionally taught science to. If you think about it, the original Mr. Wizard was himself something of this archetype as well.
What about you guys? Can you think of any more examples, either in real life or fiction, that I am missing? Would love to hear your thoughts.
Mon, Oct. 5th, 2015, 09:55 am
What I Do
I am such a weirdo when it comes to what I do in the overall computer science and software engineering world. I'm not an academic computer scientist, but I'm not a true software engineer either. As somebody whose personality is probably closest to the "ENFP/INFP/INTP" types -- I love discovering patterns and logically analyzing things, but I have an incredibly difficult time dealing with the tedious nature of both pure research and pure engineering. Luckily I have a career that lets me do an interesting mix of the two that I actually do have the attention span for.
See, pure research is tedious because it involves either a lot of experimentation, or a lot of rigorous mathematical proofs. Either one involves a serious amount of tedious, exacting work. Neither of which are things people with my personality traits are good at handling. I like discovering cool proofs when my intuition can figure them out, but working through a long, exacting, tedious set of mathematical deductions is much more difficult for me. I have trouble even reading them - not because I don't understand them, but because I don't have the attention span for it.
Likewise, I struggle with pure engineering for similar reasons. Pure engineering involves large amounts of tedious, exacting testing to make sure every minor piece of your project is completely foolproof. All aspects of your software should be well-documented, with a consistent set of programming standards and coding practices applied, coupled with large amounts of unit testing. While I can do all of these things fine, I find myself getting quite bored by these more routine aspects of software engineering if I end up doing them for long periods of time.
This is why the position I have at PNNL is so awesome. You might call what I do "research engineering" or "research development". Some people even call it "mathematical programming", among other things. It's not pure research - I don't always have to be publishing research papers on what I do (although I occasionally do). But it's not pure engineering either. The main point is to get the software working - not to have it foolproof against every possible issue a user could have with it. If it gets to the point where they want to turn the software into production code with a large user base, I almost always pass it on to someone else (although they might call me in to help with various parts of it).
It's great because this sweet-spot between engineering and research is the least tedious aspect of computer science and software engineering I know of. I get to think of interesting ideas, implement them fairly quickly, and see if my results show any interesting promise. If I really want to promote them in academic circles, I can work with more pure researchers to do that. If I need to turn it into more robust engineering, I can work with experts on that as well. And I get to utilize my own skillset to its best ability. It's a pretty great place for me to be in.
If you like computer science and software development, but find pure research or pure engineering too tedious, I'd recommend looking into positions like mine. You might find you really enjoy it after all.
For a while when I was a kid (around 3rd or 4th grade), I decided it might be cool to be a mathematician when I grew up. I remember wondering how feasible it would be to actually make money doing this, and I remember deciding that what a mathematician probably did was have their name in the phone book under "M" for mathematicians, and when someone was having a difficult mathematical problem which they needed help on, they would call him up to come to the house and help them figure it out. In my mind a mathematician was basically like a plumber, except for math problems.
Saw this on a forum, and thought it was about the best explanation of Portland you could do in three sentences:
* It's solid Portlandia east of the river out to about 82nd.
* West of the river is solid yuppie, something that rarely gets mentioned about Portland.
* East of 82nd it's solid Wal-Martian and Real America.
Only thing I would correct about this at all is that there is definitely part of Portlandia in the downtown area west of the river, but besides that part of downtown this is absolutely accurate.
Some of you may have seen my link on Facebook about the problems associated with holding a "belief in a just world". Psychological experiments have been performed that show a large percentage of people will tend to blame people for being in bad situations, and praise people for being in good ones, even if those situations have nothing at all to do with what the person themselves has done. The hypothesis is that we do this as a way to uphold our belief that the universe is ultimately a just place.
I think the concept of "cognitive dissonance" may also be at play here. If we ourselves can't help a person out, we want an explanation of why this is the case. (e.g., "I can't give money to the people with the cardboard signs, because they're all clearly conniving panhandlers looking to abuse people"). As someone who rarely hands out money or anything to panhandlers, I'll have to admit it's hard to avoid thinking things like this to justify yourself for not being more concerned for them.
All of this has got me thinking about what the word "justice" really means, anyway. As someone who's done a certain level of theological digging, I have to admit that I sometimes struggle with what "justice" really means, especially on a social level, aka, "social justice".
When it comes to some types of interactions, the idea of justice is pretty straightforward. If I make an agreement with someone and I don't uphold my end of the bargain, I've committed a form of injustice. But what about situations where the agreement is more nebulous, or other circumstances cause things to fall through? Who's to blame? If I sell someone my house, and it ends up being the site of an earthquake, I'd only be guilty of injustice if I kept some sort of seismological knowledge to myself. But the guy whose house is destroyed is just as bad off whether I was malicious or innocent. From his perspective, he probably feels like the victim of injustice either way.
While I don't really know much about the topic of "social justice", I feel like it must be filled with similar subtle issues in which there is no clear answer. For instance, what is exactly "justice" when it comes to income? Is the most just thing that we all earn exactly the same amount? But that hardly seems like justice for the person who works extremely hard and productively to make the same amount of money as someone who avoids working and is literally counter-productive. (The story of the Little Red Hen comes to mind here.)
So what if everyone were to be paid in a way that works out perfectly to their productivity and work ethic? Some people would clearly make more money than others. And their children would obviously have more income and possessions than those with less money. And so we again see some people have more money not because of what they've done but because of the situation they are born in. You could try and put a stop to that by controlling everything everyone does with their lives, but in the end you would have a tyrannical government that's even more unjust than the situation you began with.
This is one of the reasons the whole concept of "social justice" seems so weird to me. Maybe people need to stop worrying so much about this elusive "justice" concept, and start just trying to make everyone's lives a little bit better.
It all kind of reminds me of the parable of the prodigal son. The son in that story certainly deserved the consequences of his poor life choices, but his father didn't leave him to suffer those consequences, even though he deserved it. He was merciful and gave him back more than he deserved. The older son even complained that the father's mercy was actually a kind of injustice, not understanding that he didn't really deserve what he got either.
"Progressive Revelation" is one of those concepts I don't believe I ever heard until I became a member of a Reformed Church. However, I think most Christians probably believe it, to at least some extent, even if they don't ever use the term or talk about it. The concept of "progressive revelation" is that God has revealed more and more things over time, and revealed them more and more clearly. I think it sounds a bit "liberal" to some ears at first (what with the name "progressive" in the title and everything), but it's actually a very logical concept for all believers to hold to, once you start to think about it.
I would actually make the case that everyone who believes the traditional view of the Bible also believes in progressive revelation. And that is because the traditional view of the Bible is not that it fell out of heaven and landed on Adam's lap, but that it was in fact written by a number of different people over a very long period of time (thousands of years after Adam even existed). There was additional information revealed as time went on, as more people were called to write down God's word.
A non-controversial example would have to do with specifics revealed in the New Testament - like the fact that Jesus was a carpenter's son. While there are plenty of Messianic prophecies in the Old Testament, none to my knowledge deal with that fact. Likewise, what is prophesied in a later prophecy (for instance, the virgin birth in Isaiah) is not declared in earlier books (by Moses, for instance). So clearly, more was revealed to people as time went on than was revealed from the very beginning.
What I think sometimes trips people up is that this progressive nature of revelation goes beyond basic facts about the Messiah, to more broad theological concepts. One example would be the doctrine of the resurrection. This is one of the central tenets of Christianity, and yet throughout most of the Old Testament, this idea is very fuzzy at best. Only in a few places do we see what looks like a strong case for the future resurrection of our bodies, most particularly in Daniel 12. Even with a traditional dating of the books, the passages that talk most clearly about resurrection are from the era of Babylonian captivity or later.
Other concepts that are central to Christianity, but even more fuzzy in the Old Testament include salvation by faith, the new birth and the Trinity. But, at the same time, most traditional Christians teach that believers in the Old Testament will be saved the same way as us, by grace through faith in Christ. But how is that even possible given the clearly fuzzy understanding of all things theological they must have had?
I believe this is a clear example to us about what salvation by faith apparently doesn't mean. It doesn't mean having a systematic knowledge of the doctrines of salvation. It's something much more spiritual than that. The Bible says saving faith is a gift from God (Ephes 2:8), not something we get by memorizing facts (James 2:9). This also explains things like John the Baptist showing evidence of miraculous faith from the womb (Luke 1:39-45), and yet at the same time, wondering whether or not Jesus was the Messiah later in life (Luke 7:18-20)! And these passages are from the very same book!
As a Christian, I don't believe anyone "deserves to be saved". Salvation is by grace - by definition something better than what we deserve. The fact that people were saved with such fuzzy understandings (at best) about what that even meant, tells me that God's grace is more amazing than I think we often want to make it. It's not "ascribe to this list of theological doctrines and you will live forever". It's something a lot deeper, spiritual and more mysterious than that. Or, at least it seems to me. :-)