
Infranet: Circumventing Web Censorship and Surveillance
by Nick Feamster, Hari Balakrishnan, Magdalena Balazinska, David Karger, Greg Harfst
Public comments
#1 posted on Apr 13 2010, 02:09 in collection UW-Madison CS 740: Advanced Computer Networking -- Spring 2012
What I see here is paper-making: clever but ad hoc hacking. Some random thoughts follow.

Papers begin with motivations, and this paper doesn't have a strong one. For HTTP, an obvious/simple/principled/well-established solution is SSL. But the paper rules it out based solely on the possibility that the censor could *presumably* block SSL outright. This not-so-convincing argument serves as the major justification for the necessity of this work.

One trick of paper engineering is to cook up a list of design "goals" that are perfectly suited to what's being proposed, e.g., 'plausible deniability'.

The 'sophisticated' system described in the paper might have made a great application (e.g., a browser plugin). Unfortunately, it not only requires a lot from the client but also demands too much effort from the server. It simply won't fly.

Censors block not only sensitive content but sometimes also certain sources (specific URLs or entire domains). URL blocking cannot be solved by SSL, but it is partially addressed in this paper. However, the server-enabled solution looks rather unwieldy. Instead of permuting a static set of URLs, why not let the client use a random salt and the server's public key to dynamically generate URLs? (See the sketch below.) BTW, with HTTPS the URL path is encrypted anyway; only the hostname is visible to the censor.
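
To make the suggestion concrete, here is a minimal sketch of what I have in mind, not anything from the paper. The client picks a random salt, ships it to the server encrypted under the server's RSA public key, and then both sides derive the same sequence of innocuous-looking URLs from the shared salt via an HMAC counter. All names (derive_url, SALT_LEN, the /images/ path scheme) are illustrative assumptions; it uses the pyca/cryptography package for RSA-OAEP.

# Hypothetical sketch: client-generated URLs from a salt shared via the
# server's public key, instead of a server-permuted static URL set.
import hashlib
import hmac
import os

from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import rsa, padding

SALT_LEN = 16  # bytes of client-chosen randomness (illustrative)

OAEP = padding.OAEP(
    mgf=padding.MGF1(algorithm=hashes.SHA256()),
    algorithm=hashes.SHA256(),
    label=None,
)

def encrypt_salt(server_public_key, salt: bytes) -> bytes:
    """Client side: hide the salt from the censor using the server's key."""
    return server_public_key.encrypt(salt, OAEP)

def derive_url(salt: bytes, counter: int) -> str:
    """Both sides: map (salt, counter) to an innocuous-looking URL path.
    A real system would map the digest onto paths that actually exist
    on the cover site; here we just fake image names."""
    digest = hmac.new(salt, counter.to_bytes(8, "big"), hashlib.sha256).hexdigest()
    return f"/images/{digest[:16]}.jpg"

if __name__ == "__main__":
    # Demo with a throwaway key pair; in practice the client would be
    # distributed with the server's public key.
    key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
    salt = os.urandom(SALT_LEN)
    blob = encrypt_salt(key.public_key(), salt)  # sent in the first request

    # Server recovers the salt; both sides now derive identical URLs.
    recovered = key.decrypt(blob, OAEP)
    assert recovered == salt
    print([derive_url(salt, i) for i in range(3)])

This keeps all the per-client state on the client (just the salt and a counter), so the server does not need to precompute or store per-client URL permutations, which is exactly the unwieldy part of the paper's scheme.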