So I decided to try to implement the cookie thing I posted about a few days ago. I grabbed a git clone of the WebKit source and built it on my Mac. After poking around for a bit, I found the relevant cookie jar code for the Mac platform. Turns out the cookies are actually stored in a system-global singleton cookie jar on Mac OS X (Documentation link).
Well, that's kind of weird and unexpected. Does that mean every app running on my machine has access to my Safari cookies? Let's see... So I wrote a quick Objective-C test app to access the sharedHTTPCookieStorage, and sure enough, I could read out cookies that were set from Safari.
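In case anybody wants to reproduce this, the test really boils down to a few lines. My app was Objective-C, but here's a minimal Swift sketch of the same idea, using the same Foundation API:

```swift
// Minimal sketch: dump the system-global shared cookie jar.
// On Mac OS X this is the same singleton storage Safari uses,
// so cookies set while browsing show up here.
import Foundation

let jar = HTTPCookieStorage.shared // Swift's name for sharedHTTPCookieStorage

for cookie in jar.cookies ?? [] {
    print("\(cookie.domain)\t\(cookie.name) = \(cookie.value)")
}
```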
I'm not sure this is a security hole per se, since all apps running on my machine are supposed to be trusted; if there's anything malicious running, it can do far worse than steal cookies. But still, this seems like a rather odd design decision to me. I guess it makes sense if you want a more unified experience across all the apps on your platform. It's interesting to note, though, that on iOS the cookies are not shared across applications. Maybe Apple decided that the potential cost wasn't worth the benefit?
Anyway, since I can't trivially change the WebKit cookie jar code, at least on the Mac platform, I guess it's time to dive into Mozilla instead...
[ 0 Comments... ]
Instead of trying to sue P2P file sharers, the recording industry should involve them and make them a legitimate distribution channel. The way it would work is that each P2P user would be allowed to "sell" a music file to other users in exchange for an e-coin issued by a bank. (This is cryptographically possible; I've seen a number of papers on it.) The e-coins would be bought from a bank for real money, and would be redeemable for real money. The catch is that they are redeemable for less than they cost to buy; the difference is what the recording industry takes as its cut. Another way of looking at it is that each peer is allowed to sell a file and take a cut of the price. This profit incentive will drive them to sell rather than give away the files, and the recording industry and artists get to stay in business. Everybody's happy, except people who think this is a pyramid scheme (which it might be, I'm not sure of the formal definition).
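To make the economics concrete, here's a toy sketch with made-up numbers (the actual prices and the size of the cut would obviously be up to the bank and the industry):

```swift
// Toy model of the e-coin economics; all numbers are hypothetical.
let buyPrice = 1.00    // what a peer pays the bank for one e-coin
let redeemValue = 0.80 // what the bank pays out when a coin is redeemed

// A peer "sells" a music file to another peer for one e-coin,
// then redeems that coin at the bank.
let sellerProfit = redeemValue           // 0.80 goes to the selling peer
let industryCut = buyPrice - redeemValue // 0.20 goes to the industry/artists

print("seller keeps \(sellerProfit), industry takes \(industryCut)")
```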
[ 16 Comments... ]
Since I'm writing a Firefox plugin for a course on privacy-enhancing technologies that I'm taking right now, I found myself thinking of other Firefox extensions that might be useful from a privacy/security point of view. One that I would really like to see is a plugin that separates browser storage for webpages by the origin of the top-level window. For those who aren't familiar with the guts of browsers and the web, here's how that breaks down:
- "Browser storage" includes things like cookies, history information, etc..
- "Origin" refers to the combination of URI scheme, host, and port information. So if you went to http://google.com/ that would have an origin of {"http", "google.com", "80"} (80 being the default HTTP port). Any page under that domain would have the same origin, but https://google.com:80/ would be under a separate origin (since the scheme changed to "https"). The origin of a page is already used in browsers to prevent communication across domains - so for example if you pull in another page into yours using an iframe, you can't programatically read or modify that page if it comes from a different origin. Same thing applies to XMLHttpRequest - you can't make XHR calls across origins.
- "Top-level window" refers to the "main" page being loaded in the browser or the tab. Subwindows could be things like frames or object tags that point to other documents that are embedded into the main page.
So what my suggestion boils down to is stricter separation of pages in different origins. Right now there are two huge issues that I can think of off the top of my head that would be fixed with this.
The first is the ability for websites to track you across the web trivially. Right now, sites like Google and Facebook can easily generate a pretty comprehensive list of websites you visit, simply because they have a presence on all of those websites. Any site that uses Google Analytics (and that's a lot of sites) includes a script from Google's servers. When your browser fetches that script, the request includes the Google cookies stored in your browser. And just like that, Google knows you (as in, the user logged in to Google in your browser) visited that page. The same thing is true of the Facebook "like" button you see plastered all over the web. If you're logged in to Facebook in that browser, Facebook knows you were there. In both cases, even if you aren't logged in, they can record your IP address and link up the information when you do log in later. For anybody concerned about privacy, this is a huge gaping hole that has been known for a while and that nobody seems to want to do anything about.
The second problem is that of clickjacking. This is a vulnerability that hit the public eye in 2008 but that has been around for much longer. The gist of it is that a malicious webpage can load another site in an iframe, hide it, and move it around following your mouse cursor. That way, when you click on something on the malicious page, you're really clicking on the other site via the iframe. So even if the malicious site can't manipulate the other site's contents programmatically, it can still get you to do things that you don't want to do (e.g. click on a Paypal confirmation link to send them money). Again, this relies on the fact that you're logged in to the other site, so that when you do click, the other site accepts it as a valid action coming from you, the authenticated user. If you weren't logged in to the other site, the possible repercussions would be severely reduced.
And that's where the origin-based separation of cookies comes in. In both of these scenarios, there's one site that's the top-level window in your browser, and another site that you're logged in to that's being accessed from that top-level window. The cookies being sent to the logged-in site make both of these vulnerabilities possible. If the cookies don't get sent, then neither of these attacks works.
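For concreteness, here's a sketch of the data-structure change I have in mind inside the browser: cookies get stored and looked up under the pair (top-level origin, cookie origin) instead of just the cookie's own origin. (This is a hypothetical illustration, not any browser's actual code.)

```swift
import Foundation

// Hypothetical double-keyed cookie jar: cookies saved while browsing
// facebook.com directly are invisible when facebook.com content is
// embedded under some other top-level site.
struct JarKey: Hashable {
    let topLevelOrigin: String // origin of the page in the URL bar
    let cookieOrigin: String   // origin the cookie belongs to
}

var jar: [JarKey: [String: String]] = [:]

func setCookie(topLevel: String, origin: String, name: String, value: String) {
    jar[JarKey(topLevelOrigin: topLevel, cookieOrigin: origin), default: [:]][name] = value
}

func cookies(topLevel: String, origin: String) -> [String: String] {
    jar[JarKey(topLevelOrigin: topLevel, cookieOrigin: origin)] ?? [:]
}

// Log in to Facebook directly: stored under (facebook, facebook).
setCookie(topLevel: "https://facebook.com", origin: "https://facebook.com",
          name: "session", value: "abc123")

// A "like" button embedded on a blog asks for facebook cookies, but
// the top-level origin differs, so it gets nothing back.
print(cookies(topLevel: "https://someblog.com", origin: "https://facebook.com")) // [:]
```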
Now, the solution I'm proposing here is somewhat similar to #6 proposed by Collin Jackson here, except that his solution requires a new attribute to be added to the Set-Cookie header and adopted by websites, whereas mine can be done as a purely browser-side change. Unfortunately, I don't think it's doable with just a browser plugin for most browsers; it requires a deeper change than that.
Of course, there are a couple of problems with this solution. One is that it might break valid workflows currently in place on the web. If there is a website at origin A that redirects you to a website at origin B to log in, and then redirects you back to origin A, which now includes content from origin B, that won't work anymore. I can't really think of any websites I use on a regular basis that do this, though. The best way to find out is to just implement it and see what breaks. (Although I think site architectures like this are bad in general, and fixing them shouldn't really be that much work.)
The second problem is that this solution doesn't cover all the ways websites can track you. There are a lot of other identifiers - your IP address, User-Agent header, Flash cookies, etc. - that could be used to identify you on the web and track which websites you visit. But all of these are solvable as well: you could use Tor for anonymous IP routing, make minor modifications to your UA header per origin, enforce the same origin-based separation for Flash cookies, and so on.
Anyway, that's my two cents for the day. If I have more time later I'll try to implement this, but if anybody else (particularly current browser developers) wants to jump on it, go right ahead.
[ 6 Comments... ]
It strikes me that what Wikileaks has done by publishing all those government documents is the equivalent of a gross violation of governmental privacy. It's kind of ironic that the US government is so outraged, since that same government doesn't think twice before grossly violating its own citizens' privacy. Somehow I don't think this episode will make them value privacy more than they did before.
Then again, I guess the concept of privacy doesn't really apply to governments; in that context a violation of privacy is just considered increased transparency.
[ 3 Comments... ]
Not too long ago I read Surely You're Joking, Mr. Feynman!, the pseudo-autobiography of Richard Feynman (the Nobel-winning physicist). I say "pseudo" because really it's a bunch of pretty entertaining anecdotes from his life in more-or-less chronological order. I highly recommend everybody read it, not only because it's entertaining but also because it's sneakily thought-provoking.
One of the things it made me think about was how many famous scientists (and famous people in general) started off at a very early age. In the book, Feynman recounts how he did experiments to satisfy his curiosity as a kid - you can find similar stories in the lives of many famous people. I think this is generally true because it takes a lot of practice to build up the skills and intuition needed to be really good at something, and people who start early and love what they do tend to get that practice, while others don't.
Anyway, one of the things that struck me was how, while I also get curious about random things, I usually end up going to Wikipedia or such to find the answer rather than doing experiments to figure it out myself, like Feynman did. And that's probably true for a lot of people. That makes me wonder - how many children growing up now are going to be worse off with respect to experimentation and deduction skills? In order to practise experimenting when you're young, you need intriguing but simple problems - if the problem is not intriguing enough then there's no motivation, and if the problem is too hard then the experiment is unlikely to be successful. And I think that in that class of problems, a vast majority of the answers are within easy reach on the web. So budding scientists might end up being deprived of a certain amount of practice when it comes to these things.
Now there are definitely counterexamples to this - every now and then there are stories about high school kids building nuclear reactors, which shows that scientific creativity and experimentation in kids isn't dead yet. And if you look at the pattern, it seems like these outlier kids are tackling harder classes of problems than they used to. Building a fusion reactor in a garage isn't trivial, and certainly would have been impossible even 20 years ago.
As I see it, the main thing that is allowing these kids to work on harder problems is that technology is becoming more accessible to individuals. For the fusion reactor, all the parts needed were found on "eBay and the hardware store". These days you can buy all sorts of interesting stuff online and do who-knows-what with it.
However, I'm not sure if it's enough. Tackling harder problems also requires knowing more. While it's true that higher levels of education are being pushed down to younger kids, I wonder if at some point we'll reach a threshold where it's simply not possible to learn and acquire everything you need in order to perform meaningful experimentation and practise those skills. I guess time will tell.
[ 3 Comments... ]
My teeth have been sensitive to cold for a while. I started using Sensodyne a while back, which helped considerably, but my teeth are still somewhat sensitive. Today I went for my regularly scheduled dental cleaning, and all was going fine. Towards the end, after the cleaning, the dentist was rinsing out my teeth with water from their little water-dispensing tool, and it was really cold. Naturally I winced as my teeth started hurting.
The dentist noticed my discomfort, stopped rinsing, and let me rinse using a cup of the more adequately-temperatured tap water. She also mentioned that they had just received a batch of samples of a new Colgate toothpaste that is supposed to provide "immediate" relief from tooth sensitivity, and handed me one of the sample tubes along with the usual free toothbrush and floss as I was leaving.
Now this seems like a perfectly ordinary sequence of events, but as I was thinking about it, I realized that in the past whenever they used the rinse device I've never felt that much discomfort. Also I haven't really noticed my overall tooth sensitivity go up lately; I think it's been holding steady for a while. At this point my paranoia intervened and led me to a rather absurd-sounding conclusion. What do you think?
[ 4 Comments... ]
So a few months ago, Stephen Hawking said that we shouldn't try to make contact with aliens because they'd likely just wipe us out and move on. His argument? "We only have to look at ourselves to see how intelligent life might develop into something we wouldn't want to meet." While this is certainly a possibility, it reminded me of a realization I came to last year about human life on Earth. The following two paragraphs were extracted from an IM conversation I had wherein I described said realization (I just cleaned it up a bit, but didn't edit it much, so apologies if it seems a bit out-of-place in blog form).
The realization was that humans really suck at living on Earth, but that we will get better. I realized that people who are good at something usually do it in a highly efficient manner; from the outside, it can look almost as if they're not doing anything at all. On the other hand, somebody who's bad at something will make a huge mess and end up with imperfect results. For example, really good programmers can do more with one line of well-crafted code than crappy programmers can do with a pageful. Ju-jitsu masters can throw people across a room while hardly moving, whereas beginners have to heave and grunt. But the thing is, it's all part of the learning process. At the beginning you don't have a clear understanding of how it works in your mind, so you flail about and do some stuff. Then, over time, it makes more and more sense and you can strip out all the flailing and just keep that one essential flail that does the job. And that's what the experts do.
So then I realized that what humans have done on the planet so far has been the equivalent of flailing. Before we were "intelligent" we lived in harmony with nature and stuff; once we become experts we will be able to live in harmony with the environment again and have only a minimal footprint. Already we're starting to do that a bit with more and more miniaturization of tech and more efficient power sources and things like that. But once we really get the hang of it, it'll be not much different from the initial state.
My point is that I disagree with Stephen Hawking, to a certain extent. I think that the human race (and any alien civilization) will eventually either self-destruct (possibly taking the universe with it), or reach the point where it can peacefully co-exist with the rest of the universe. The only question is: will we (or the aliens) expand out into space before or after that outcome is decided? It seems to me that if we started expanding out to other planets right now, it would remove the pressure on us to reach the peaceful co-existence state. For instance, colonizing a planet with fossil fuels means we no longer have any need to seek out more efficient and renewable energy sources. Colonizing just about any other Earth-like planet means we don't need to worry about global warming. This means that we would treat any other planet the same way that we treat Earth now, which isn't all that great.
If, on the other hand, we were forced to stay here on Earth until we reached the peaceful co-existence state, and THEN we expanded to other planets, we would take that culture with us. We would inhabit new planets with a minimal footprint, co-existing with any life that we found there, and generally being good. In related news, Stephen Hawking also recently said that we should expand beyond Earth or face extinction. While I would like to see that happen, I also hope we don't do it too soon, and that we are forced to mend our ways first. Otherwise, we may well turn out to be the destructive, resource-hungry aliens that Hawking is afraid we might run into. I'd much rather see the human race go extinct than see it destroy half the universe.
[ 0 Comments... ]
There's a really comprehensive (and long) article in The New York Times about memories on the Internet. It highlights one of the things that's been stewing in the back of my mind for a few months now, ever since I decided to leave my job. The basic idea is that pretty much everything should have an end, and that people should know it.
The NYT article above talks about a lot of things relating to privacy on the Internet, and discusses the possibility of implementing expiration dates on data, after which the data would no longer be accessible (page 5 if you want to skip right to it). Given the way computers work now, and how easy it is to make digital copies, implementing this is practically impossible without a ground-up redesign of a lot of things. But it's still a good idea.
The specific expiration date that was relevant to me a couple of months ago was my last day at work. I wondered what it would be like if every employee, instead of having an open-ended full-time contract, had a limited-term contract that had to be renewed when it expired. This would allow companies to let go of employees who were under-performing, which would increase the incentive for employees to perform at their best. Of course, it works both ways - the employer too must do their best to retain good employees, since they would be free to go anywhere else once the contract expired. This would increase competition on both ends, presumably resulting in better conditions for everybody.
Note that all human societies that I'm aware of already use expiration dates in some important domains. As an example, consider how most elected officials are elected for a fixed term; without this sort of expiration date, democracies wouldn't work at all. Human life itself has an expiration date (although it's not known in advance, and is more variable), without which evolution (both physical and societal) would not be possible. In both cases, the presence of the expiry date allows for change and improvement at a much faster pace than would otherwise be possible.
With the concept of expiration, we also have to consider the concept of renewal. A lot of human-instituted expiration dates allow for renewal, where the expiration date is pushed back. For example, it's hard to imagine a fixed-term employment contract that doesn't allow for renewal. Why is it, then, that the president of the USA isn't allowed to serve for more than two terms?
I think that the more important the contract, the more important it is that it NOT be renewable. Death, after all, is not a renewable expiration date. In a sense, allowing renewal undermines the point of having an expiration date in the first place. If you know that the expiration date can be pushed back, then you're going to behave accordingly; without that "the end is near" feeling, there's no impetus to complete whatever task needs completion, and no impetus to build a legacy worth remembering. Allowing the possibility of renewal also means that there has to be an entity that decides whether or not the renewal is justified. That introduces a whole raft of problems around criteria for renewal and subjective evaluation; enough problems, in fact, that entire religions have been created on this topic. Literally.
Another area in which I think expiration dates are becoming increasingly important is laws. Things like the USA PATRIOT Act came with built-in expiration dates on some sections, so that powers granted to the FBI and other agencies would only threaten civil liberties until the terrorist threat was taken care of. Unfortunately, some of those expiration dates were renewed and others ignored, resulting in the occasional abuse of power. And of course, everybody's heard of some of the really outdated laws that are technically still in force. Not all of these problems would be solved by applying expiration dates more widely, but some certainly would.
There are a lot of other domains out there that could benefit from having expiration dates on contracts (using the term loosely), both of the renewable and the non-renewable kind. They just need to be selected and used judiciously, which won't happen unless people start thinking of them as a tool.
[ 5 Comments... ]
I was watching Barefoot Ted's talk at Google, and one of the phrases he used got stuck in my mind (see the video starting at 50:25): can you handle being "other"? ("Other" in this context, if you don't want to watch the video, refers to doing something outside socially-acceptable norms.) I was thinking about what it meant and what it implied, and to me personally, there's a lot of richness to that one question.
Most people, by definition, aren't "other". They're "normal": they live according to society's rules, and they go about their lives without really questioning most of what they do or understanding why they do it. People who are "other" usually fall into two categories: those who actually get a kick out of being labeled "other", and those who have consciously chosen to become "other" because of some significant benefit it provides. To put it in terms of the barefoot running example: there are people who will run barefoot just to freak out others by being "weird", and there are people (like Barefoot Ted) who run barefoot despite being labeled "weird" because it's actually good for you. It's important to distinguish between the two groups, and the stuff I'm talking about here refers to people in the second group.
So why is being "other" important? Well recently I've been finding out more and more that things I take for granted because they're "normal" are really incredibly bad. In fact, a staggering amount of the lifestyle choices we make (in "developed" countries) at least are staggeringly bad for our health and/or happiness. We walk wrong. We eat wrong. We over-sanitize. We even poop wrong! It's no wonder that half a million to a million Americans die every year due to "lifestyle disease".
And the thing is, it didn't use to be this way. A few hundred years ago, "normal" was good. People had healthy lifestyles, for the most part. Sure, they had shorter lives, but mostly because medicine back then wasn't as good as it is now. So how did this happen? There isn't really a single thing we can point to and say "Aha! That's where it all started going wrong!" If you've watched the Jamie Oliver TED talk, though, you can sort of see the progression. He claims that people aren't taught at home or in school how to cook, so they end up making bad food choices. The question that follows is: why aren't they taught how to cook? Because their parents and teachers didn't think it was worth teaching. Why? Because they themselves didn't realize the benefits of proper nutrition over the convenience of fast food. Why? Well, until we started doing things wrong, there was no reason to realize that we had been doing things right to begin with.
So there it is: the human race was going along, eating proper nutritional meals the same way it had for ages. Then somebody came along and invented more convenient but less healthy food. We switched over to it, not realizing just how bad it was for us. I'm sure it was generally understood that it wasn't quite as healthy, but the convenience factor overrode that by a wide margin. Then things started going really bad and everybody became obese. And now we scramble to try and fix it, having learnt from the mistake and with newfound knowledge about the importance of nutrition. Fair enough. It seems like a pretty natural cycle, and I'm sure human history and evolution are full of cycles like this, where we make a mistake and then add to our store of knowledge about why it was wrong.
There are two things that come out of this, though. The first is that there is a lesson to be learnt from the "other" category. The very existence of the "other" category means that there's something they've discovered that is not general knowledge. By being open-minded and thinking about what they are saying, it's possible that you too can learn about the mistakes we've made as a species and get back on the recovery curve faster. Once the "other" category spreads their knowledge back into the general population, they stop being "other". Being "other" is something like a transient state that exists only while something is wrong.
The second thing is that it's possible to notice the cycle happening early on and try to prevent it. I know I keep harping on the same topics over and over, but I believe that driving is in the "omg this is awesome, let's all do it" phase of this cycle. It's been shown that driving is a significant cause of stress and promotes lack of exercise (not to mention that driving-related accidents are a leading cause of death). It's just all-around bad for you. People already know this, but... boy, it sure is convenient! Starting to sound familiar yet?
My dad came to visit recently, and one of the things he kept telling me was that I should get a car and how it's "essential for day-to-day life". He's no different than those parents in Huntington, West Virginia, who sacrificed health for convenience without even realizing it, and are now pushing the same choice onto their children. In this case, I have no problem being "other" by ignoring my dad and choosing more healthy forms of locomotion.
Your mission, should you choose to accept it, is to find a way to be "other" if you're not already. The point is to explore the choices we make every day, and to realize that we could already be wrong in a lot of ways that we don't realize. If you're completely stuck on ideas, this is a good one to start with.
[ 7 Comments... ]
Idealism is the responsibility of the young.
[ 0 Comments... ]