
[tor-commits] [ooni-probe/master] Remove oonib from ooni-probe repo.



commit 5565ed2de4421a4e754d60a4d1f82e8b074be16e
Author: Arturo Filastò <art@xxxxxxxxx>
Date:   Wed May 8 18:27:07 2013 +0200

    Remove oonib from ooni-probe repo.
---
 oonib/INSTALL                     |    4 -
 oonib/README.md                   |  117 ----------------------
 oonib/__init__.py                 |   24 -----
 oonib/config.py                   |   58 -----------
 oonib/db/__init__.py              |   30 ------
 oonib/db/tables.py                |  123 -----------------------
 oonib/models.py                   |  129 -------------------------
 oonib/oonibackend.py              |   74 --------------
 oonib/report/__init__.py          |    5 -
 oonib/report/api.py               |  106 --------------------
 oonib/report/file_collector.py    |  193 -------------------------------------
 oonib/requirements.txt            |   24 -----
 oonib/runner.py                   |   81 ----------------
 oonib/testhelpers/__init__.py     |    5 -
 oonib/testhelpers/dns_helpers.py  |   16 ---
 oonib/testhelpers/http_helpers.py |  154 -----------------------------
 oonib/testhelpers/ssl_helpers.py  |    9 --
 oonib/testhelpers/tcp_helpers.py  |   72 --------------
 18 files changed, 0 insertions(+), 1224 deletions(-)

diff --git a/oonib/INSTALL b/oonib/INSTALL
deleted file mode 100644
index 1622707..0000000
--- a/oonib/INSTALL
+++ /dev/null
@@ -1,4 +0,0 @@
-BEWARE: This requires python 2.7.3
-storm (Storm ORM)
-transaction (zope transaction)
-
diff --git a/oonib/README.md b/oonib/README.md
deleted file mode 100644
index d11f876..0000000
--- a/oonib/README.md
+++ /dev/null
@@ -1,117 +0,0 @@
-# Dependencies
-
-The extra dependencies necessary to run OONIB are:
-
-* twisted-names
-* cyclone: https://github.com/fiorix/cyclone
-
-We recommend that you use a python virtualenv. See OONI's README.md.
-
-# Generate self-signed certs for OONIB
-
-If you want to use the HTTPS test helper, you will need to create a certificate:
-
-    openssl genrsa -des3 -out private.key 4096
-    openssl req -new -key private.key -out server.csr
-    cp private.key private.key.org
-    # Remove passphrase from key
-    openssl rsa -in private.key.org -out private.key
-    openssl x509 -req -days 365 -in server.csr -signkey private.key -out certificate.crt
-    rm private.key.org
-
-Don't forget to update the oonib/config.py options helpers.ssl.private_key and
-helpers.ssl.certificate.
-
-# Redirect low ports with iptables
-
-The following iptables commands will map connections on low ports to those
-bound by oonib:
-
-    # Map port 80 to config.helpers.http_return_request.port  (default: 57001)
-    iptables -t nat -A PREROUTING -p tcp -m tcp --dport 80 -j REDIRECT --to-ports 57001
-    # Map port 443 to config.helpers.ssl.port  (default: 57006)
-    iptables -t nat -A PREROUTING -p tcp -m tcp --dport 443 -j REDIRECT --to-ports 57006
-    # Map port 53 udp to config.helpers.dns.udp_port (default: 57004)
-    iptables -t nat -A PREROUTING -p udp -m udp --dport 53 -j REDIRECT --to-ports 57004
-    # Map port 53 tcp to config.helpers.dns.tcp_port (default: 57005)
-    iptables -t nat -A PREROUTING -p tcp -m tcp --dport 53 -j REDIRECT --to-ports 57005
-
-# Install Tor (Debian).
-
-You will need a Tor binary on your system. For complete instructions, see also:
-
-    https://www.torproject.org/docs/tor-doc-unix.html.en
-    https://www.torproject.org/docs/rpms.html.en
-
-Add this line to your /etc/apt/sources.list, replacing <DISTRIBUTION>
-where appropriate:
-
-    deb http://deb.torproject.org/torproject.org <DISTRIBUTION> main
-
-Add the Tor Project gpg key to apt:
-
-    gpg --keyserver keys.gnupg.net --recv 886DDD89
-    gpg --export A3C4F0F979CAA22CDBA8F512EE8CBC9E886DDD89 | sudo apt-key add -
-    # Update apt and install the torproject keyring, tor, and geoipdb
-    apt-get update
-    apt-get install deb.torproject.org-keyring tor tor-geoipdb
-
-# Update ooni-probe/oonib/config.py
-
-    Set config.main.tor_binary to your Tor path
-    Set config.main.tor2webmode = False
-
-# (For Experts Only) Tor2webmode:
-
-WARNING: provides no anonymity! Use only if you know what you are doing!
-Tor2webmode will improve the performance of the collector Hidden Service
-by discarding server-side anonymity.
-
-You will need to build Tor from source. At the time of writing, the latest stable Tor is tor-0.2.3.25. You should use the most recent stable Tor.
-
-Example:
-
-    git clone https://git.torproject.org/tor.git
-    cd tor
-    git checkout tor-0.2.3.25
-    git verify-tag -v tor-0.2.3.25
-
-You should see:
-
-    object 17c24b3118224d6536c41fa4e1493a831fb29f0a
-    type commit
-    tag tor-0.2.3.25
-    tagger Roger Dingledine <arma@xxxxxxxxxxxxxx> 1353399116 -0500
-    
-    tag 0.2.3.25
-    gpg: Signature made Tue 20 Nov 2012 08:11:59 AM UTC using RSA key ID 19F78451
-    gpg: Good signature from "Roger Dingledine <arma@xxxxxxx>"
-    gpg:                 aka "Roger Dingledine <arma@xxxxxxxxxxxxx>"
-    gpg:                 aka "Roger Dingledine <arma@xxxxxxxxxxxxxx>"
-
-It is always a good idea to verify the key fingerprint:
-
-    gpg --fingerprint 19F78451
-    pub   4096R/19F78451 2010-05-07
-          Key fingerprint = F65C E37F 04BA 5B36 0AE6  EE17 C218 5258 19F7 8451
-    uid                  Roger Dingledine <arma@xxxxxxx>
-    uid                  Roger Dingledine <arma@xxxxxxxxxxxxx>
-    uid                  Roger Dingledine <arma@xxxxxxxxxxxxxx>
-    sub   4096R/9B11185C 2012-05-02 [expires: 2013-05-02]
-
-Build Tor with --enable-tor2web-mode:
-
-    ./autogen.sh ; ./configure --enable-tor2web-mode ; make 
-    
-Copy the tor binary from src/or/tor somewhere and set the corresponding
-options in oonib/config.py
-
-# To launch oonib on system boot
-
-To launch oonib on startup, you may want to use supervisord (www.supervisord.org).
-The following supervisord config will use the virtual environment in
-/home/ooni/venv_oonib and start oonib on boot:
-
-    [program:oonib]
-    command=/home/ooni/venv_oonib/bin/python /home/ooni/ooni-probe/bin/oonib
-    autostart=true
-    user=oonib
-    directory=/home/oonib/
diff --git a/oonib/__init__.py b/oonib/__init__.py
deleted file mode 100644
index ab7419c..0000000
--- a/oonib/__init__.py
+++ /dev/null
@@ -1,24 +0,0 @@
-# -*- encoding: utf-8 -*-
-#
-# :authors: Arturo Filastò
-# :licence: see LICENSE
-"""
-Here we keep track of all variables and objects that should be instantiated
-only once and be shared across the oonib backend code.
-"""
-__all__ = ['database', 'db_threadpool']
-
-from twisted.python.threadpool import ThreadPool
-
-from storm.uri import URI
-from storm.twisted.transact import Transactor
-from storm.databases.sqlite import SQLite
-
-__version__ = '0.0.1'
-
-from oonib import config
-
-database = SQLite(URI(config.main.database_uri))
-db_threadpool = ThreadPool(0, config.main.db_threadpool_size)
-db_threadpool.start()
-transactor = Transactor(db_threadpool)
diff --git a/oonib/config.py b/oonib/config.py
deleted file mode 100644
index 8598e00..0000000
--- a/oonib/config.py
+++ /dev/null
@@ -1,58 +0,0 @@
-from ooni.utils import Storage
-import os
-
-def get_root_path():
-    this_directory = os.path.dirname(__file__)
-    root = os.path.join(this_directory, '..')
-    root = os.path.abspath(root)
-    return root
-
-backend_version = '0.0.1'
-
-# XXX convert this to something that is a proper config file
-main = Storage()
-
-# This is the location where submitted reports get stored
-main.report_dir = os.path.join(get_root_path(), 'oonib', 'reports')
-
-# This is where Tor will place its hidden service hostname and hidden
-# service private key
-main.tor_datadir = os.path.join(get_root_path(), 'oonib', 'data', 'tor')
-
-main.database_uri = "sqlite:" + os.path.join(get_root_path(), "oonib_test_db.db")
-main.db_threadpool_size = 10
-#main.tor_binary = '/usr/sbin/tor'
-main.tor_binary = '/usr/local/bin/tor'
-
-# This requires compiling Tor with tor2web mode enabled
-# BEWARE!! THIS PROVIDES NO ANONYMITY!!
-# ONLY DO IT IF YOU KNOW WHAT YOU ARE DOING!!
-# HOSTING A COLLECTOR WITH TOR2WEB MODE GIVES YOU NO ANONYMITY!!
-main.tor2webmode = True
-
-helpers = Storage()
-
-helpers.http_return_request = Storage()
-helpers.http_return_request.port = 57001
-# XXX this actually needs to be the advertised Server HTTP header of our web
-# server
-helpers.http_return_request.server_version = "Apache"
-
-helpers.tcp_echo = Storage()
-helpers.tcp_echo.port = 57002
-
-helpers.daphn3 = Storage()
-#helpers.daphn3.yaml_file = "/path/to/data/oonib/daphn3.yaml"
-#helpers.daphn3.pcap_file = "/path/to/data/server.pcap"
-helpers.daphn3.port = 57003
-
-helpers.dns = Storage()
-helpers.dns.udp_port = 57004
-helpers.dns.tcp_port = 57005
-
-helpers.ssl = Storage()
-#helpers.ssl.private_key = /path/to/data/private.key
-#helpers.ssl.certificate = /path/to/data/certificate.crt
-#helpers.ssl.port = 57006
-
-
diff --git a/oonib/db/__init__.py b/oonib/db/__init__.py
deleted file mode 100644
index 8ddaff0..0000000
--- a/oonib/db/__init__.py
+++ /dev/null
@@ -1,30 +0,0 @@
-__all__ = ['createTables', 'database', 'transactor']
-
-from twisted.python.threadpool import ThreadPool
-from twisted.internet.defer import inlineCallbacks, returnValue, Deferred
-
-from storm.locals import Store
-from storm.uri import URI
-from storm.databases.sqlite import SQLite
-
-from oonib import database, transactor
-from ooni.utils import log
-
-@inlineCallbacks
-def createTables():
-    """
-    XXX this is to be refactored and only exists for experimentation.
-    """
-    from oonib.db import models, tables
-    for model_name in models.__all__:
-        try:
-            model = getattr(models, model_name)
-        except Exception, e:
-            log.err("Error initializing database model %s" % model_name)
-            log.err(e)
-        try:
-            log.debug("Creating %s" % model)
-            yield tables.runCreateTable(model, transactor, database)
-        except Exception, e:
-            log.debug(str(e))
-
diff --git a/oonib/db/tables.py b/oonib/db/tables.py
deleted file mode 100644
index 908a295..0000000
--- a/oonib/db/tables.py
+++ /dev/null
@@ -1,123 +0,0 @@
-# -*- encoding: utf-8 -*-
-#
-# :authors: Arturo Filastò
-# :licence: see LICENSE
-
-from twisted.internet.defer import inlineCallbacks
-
-from storm.locals import Store
-from storm.properties import PropertyColumn
-from storm.exceptions import StormError
-
-from storm.variables import BoolVariable, DateTimeVariable, DateVariable
-from storm.variables import DecimalVariable, EnumVariable
-from storm.variables import FloatVariable, IntVariable, RawStrVariable
-from storm.variables import UnicodeVariable, JSONVariable, PickleVariable
-
-def variableToSQLite(var_type):
-    """
-    We take as input a storm.variable and we output the SQLite string it
-    represents.
-    """
-    sqlite_type = "VARCHAR"
-    if isinstance(var_type, BoolVariable):
-        sqlite_type = "INTEGER"
-    elif isinstance(var_type, DateTimeVariable):
-        pass
-    elif isinstance(var_type, DateVariable):
-        pass
-    elif isinstance(var_type, DecimalVariable):
-        pass
-    elif isinstance(var_type, EnumVariable):
-        sqlite_type = "BLOB"
-    elif isinstance(var_type, FloatVariable):
-        sqlite_type = "REAL"
-    elif isinstance(var_type, IntVariable):
-        sqlite_type = "INTEGER"
-    elif isinstance(var_type, RawStrVariable):
-        sqlite_type = "BLOB"
-    elif isinstance(var_type, UnicodeVariable):
-        pass
-    elif isinstance(var_type, JSONVariable):
-        sqlite_type = "BLOB"
-    elif isinstance(var_type, PickleVariable):
-        sqlite_type = "BLOB"
-    return "%s" % sqlite_type
-
-def varsToParametersSQLite(variables, primary_keys):
-    """
-    Takes as input a list of (name, type) variable tuples (already converted
-    to SQLite syntax) and a list of primary_keys.
-    Outputs these variables converted into parameter syntax for SQLite.
-
-    ex.
-        variables: [("var1", "INTEGER"), ("var2", "BOOL"), ("var3", "INTEGER")]
-        primary_keys: ["var1"]
-
-        output: "(var1 INTEGER, var2 BOOL, var3 INTEGER, PRIMARY KEY (var1))"
-    """
-    params = "("
-    for var in variables[:-1]:
-        params += "%s %s, " % var
-    if len(primary_keys) > 0:
-        params += "%s %s, " % variables[-1]
-        params += "PRIMARY KEY ("
-        for key in primary_keys[:-1]:
-            params += "%s, " % key
-        params += "%s))" % primary_keys[-1]
-    else:
-        params += "%s %s)" % variables[-1]
-    return params
-
-def generateCreateQuery(model):
-    """
-    This takes as input a Storm model and outputs the creation query for it.
-    """
-    query = "CREATE TABLE "+ model.__storm_table__ + " "
-
-    variables = []
-    primary_keys = []
-
-    for attr in dir(model):
-        a = getattr(model, attr)
-        if isinstance(a, PropertyColumn):
-            var_stype = a.variable_factory()
-            var_type = variableToSQLite(var_stype)
-            name = a.name
-            variables.append((name, var_type))
-            if a.primary:
-                primary_keys.append(name)
-
-    query += varsToParametersSQLite(variables, primary_keys)
-    return query
-
-def createTable(model, transactor, database):
-    """
-    Create the table for the specified model.
-    Specification of a transactor and database is useful in unittesting.
-    """
-    if not transactor:
-        from oonib.db import transactor
-    if not database:
-        from oonib.db import database
-    store = Store(database)
-    create_query = generateCreateQuery(model)
-    try:
-        store.execute(create_query)
-    # XXX trap the specific error that is raised when the table exists
-    except StormError, e:
-        print "Failed to create table!"
-        print e
-        store.close()
-    store.commit()
-    store.close()
-
-@inlineCallbacks
-def runCreateTable(model, transactor=None, database=None):
-    """
-    Runs the table creation query wrapped in a transaction.
-    Transactions run in a separate thread.
-    """
-    yield transactor.run(createTable, model, transactor, database)
-
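
As a point of reference, the query builder removed above turned a Storm model's
columns into a CREATE TABLE statement. A minimal stand-alone sketch of the
output format follows; the column list and table name are hypothetical, not
taken from a real Storm model:

    # Stand-alone rewrite of the varsToParametersSQLite() output format.
    # The column list is hypothetical; the real code derived it from a
    # Storm model's PropertyColumn attributes.
    variables = [("id", "INTEGER"), ("name", "VARCHAR"), ("created", "VARCHAR")]
    primary_keys = ["id"]

    params = "("
    for name, sql_type in variables[:-1]:
        params += "%s %s, " % (name, sql_type)
    params += "%s %s, " % variables[-1]
    params += "PRIMARY KEY (%s))" % ", ".join(primary_keys)

    print("CREATE TABLE example " + params)
    # CREATE TABLE example (id INTEGER, name VARCHAR, created VARCHAR, PRIMARY KEY (id))
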
diff --git a/oonib/models.py b/oonib/models.py
deleted file mode 100644
index 22567ad..0000000
--- a/oonib/models.py
+++ /dev/null
@@ -1,129 +0,0 @@
-__all__ = ['Report', 'TestHelperTMP']
-from storm.twisted.transact import transact
-from storm.locals import *
-
-from ooni.utils import randomStr
-from oonib import config, database, transactor
-
-def generateReportID():
-    """
-    Generates a report ID for usage in the database backed oonib collector.
-
-    XXX note how this function is different from the one in report/api.py
-    """
-    report_id = randomStr(100)
-    return report_id
-
-class OModel(object):
-
-    transactor = transactor
-
-    def getStore(self):
-        return Store(database)
-
-    @transact
-    def create(self, query):
-        store = Store(database)
-        store.execute(query)
-        store.commit()
-
-    @transact
-    def save(self):
-        store = self.getStore()
-        store.add(self)
-        store.commit()
-
-class Report(OModel):
-    """
-    This represents an OONI Report as stored in the database.
-
-    report_id: this is generated by the backend and is used by the client to
-               reference a previous report and append to it. It should be
-               treated as a shared secret between the probe and backend.
-
-    software_name: this indicates the name of the software performing the test
-                   (this will default to ooniprobe)
-
-    software_version: this is the version number of the software running the
-                      test.
-
-    test_name: the name of the test on which the report is being created.
-
-    test_version: indicates the version of the test
-
-    progress: what is the current progress of the report. This allows clients
-              to report event partial reports up to a certain percentage of
-              progress. Once the report is complete progress will be 100.
-
-    content: what is the content of the report. If the current progress is less
-             than 100 we should append to the YAML data structure that is
-             currently stored in such field.
-
-    XXX this is currently not used.
-    """
-    __storm_table__ = 'reports'
-
-    createQuery = "CREATE TABLE " + __storm_table__ +\
-                  "(id INTEGER PRIMARY KEY, report_id VARCHAR, software_name VARCHAR,"\
-                  "software_version VARCHAR, test_name VARCHAR, test_version VARCHAR,"\
-                  "progress VARCHAR, content VARCHAR)"
-
-
-    id = Int(primary=True)
-
-    report_id = Unicode()
-
-    software_name = Unicode()
-    software_version = Unicode()
-    test_name = Unicode()
-    test_version = Unicode()
-    progress = Int()
-
-    content = Unicode()
-
-    @transact
-    def new(self, report):
-        store = self.getStore()
-
-        print "Creating new report %s" % report
-
-        report_id = generateReportID()
-        new_report = Report()
-        new_report.report_id = unicode(report_id)
-
-        new_report.software_name = report['software_name']
-        new_report.software_version = report['software_version']
-        new_report.test_name = report['test_name']
-        new_report.test_version = report['test_version']
-        new_report.progress = report['progress']
-
-        if 'content' in report:
-            new_report.content = report['content']
-
-        print "Report: %s" % report
-
-        store.add(new_report)
-        try:
-            store.commit()
-        except:
-            store.close()
-
-        return {'backend_version': config.backend_version,
-                'report_id': report_id}
-
-
-class TestHelperTMP(OModel):
-    __storm_table__ = 'testhelpertmp'
-
-    createQuery = "CREATE TABLE " + __storm_table__ +\
-                  "(id INTEGER PRIMARY KEY, report_id VARCHAR, test_helper VARCHAR,"\
-                  " client_ip VARCHAR, creation_time VARCHAR)"
-
-    id = Int(primary=True)
-
-    report_id = Unicode()
-
-    test_helper = Unicode()
-    client_ip = Unicode()
-
-    creation_time = Date()
diff --git a/oonib/oonibackend.py b/oonib/oonibackend.py
deleted file mode 100644
index 868d7b7..0000000
--- a/oonib/oonibackend.py
+++ /dev/null
@@ -1,74 +0,0 @@
-# ooni backend
-# ************
-#
-# This is the backend system responsible for running certain services that are
-# useful for censorship detection.
-#
-# In here we start all the test helpers that are required by ooniprobe and
-# start the report collector
-
-from twisted.application import internet, service
-from twisted.internet import reactor
-from twisted.application.service import Application
-from twisted.names import dns
-
-from cyclone import web
-
-import txtorcon
-
-from oonib.testhelpers import dns_helpers, ssl_helpers
-from oonib.testhelpers import http_helpers, tcp_helpers
-
-from ooni.utils import log
-
-from oonib import db_threadpool
-from oonib import config
-
-application = service.Application('oonibackend')
-serviceCollection = service.IServiceCollection(application)
-
-if config.helpers.ssl.port:
-    print "Starting SSL helper on %s" % config.helpers.ssl.port
-    ssl_helper = internet.SSLServer(int(config.helpers.ssl.port),
-                   http_helpers.HTTPReturnJSONHeadersHelper(),
-                   ssl_helpers.SSLContext(config))
-    ssl_helper.setServiceParent(serviceCollection)
-
-# Start the DNS Server related services
-if config.helpers.dns.tcp_port:
-    print "Starting TCP DNS Helper on %s" % config.helpers.dns.tcp_port
-    tcp_dns_helper = internet.TCPServer(int(config.helpers.dns.tcp_port),
-                       dns_helpers.DNSTestHelper())
-    tcp_dns_helper.setServiceParent(serviceCollection)
-
-if config.helpers.dns.udp_port:
-    print "Starting UDP DNS Helper on %s" % config.helpers.dns.udp_port
-    udp_dns_factory = dns.DNSDatagramProtocol(dns_helpers.DNSTestHelper())
-    udp_dns_helper = internet.UDPServer(int(config.helpers.dns.udp_port),
-                       udp_dns_factory)
-    udp_dns_helper.setServiceParent(serviceCollection)
-
-# XXX this needs to be ported
-# Start the OONI daphn3 backend
-if config.helpers.daphn3.port:
-    print "Starting Daphn3 helper on %s" % config.helpers.daphn3.port
-    daphn3_helper = internet.TCPServer(int(config.helpers.daphn3.port),
-                            tcp_helpers.Daphn3Server())
-    daphn3_helper.setServiceParent(serviceCollection)
-
-
-if config.helpers.tcp_echo.port:
-    print "Starting TCP echo helper on %s" % config.helpers.tcp_echo.port
-    tcp_echo_helper = internet.TCPServer(int(config.helpers.tcp_echo.port),
-                        tcp_helpers.TCPEchoHelper())
-    tcp_echo_helper.setServiceParent(serviceCollection)
-
-if config.helpers.http_return_request.port:
-    print "Starting HTTP return request helper on %s" % config.helpers.http_return_request.port
-    http_return_request_helper = internet.TCPServer(
-            int(config.helpers.http_return_request.port),
-            http_helpers.HTTPReturnJSONHeadersHelper())
-    http_return_request_helper.setServiceParent(serviceCollection)
-
-reactor.addSystemEventTrigger('after', 'shutdown', db_threadpool.stop)
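
The service wiring removed above follows the standard twisted.application
pattern: each helper is wrapped in an internet.*Server and attached to a single
Application object that twistd runs. A minimal self-contained sketch of that
pattern, using a hypothetical echo service on port 8007:

    # Minimal twisted.application skeleton in the same style as oonibackend.py.
    # Run with: twistd -ny thisfile.py
    from twisted.application import internet, service
    from twisted.internet.protocol import Factory, Protocol

    class Echo(Protocol):
        def dataReceived(self, data):
            # Echo whatever the client sends straight back.
            self.transport.write(data)

    factory = Factory()
    factory.protocol = Echo

    application = service.Application("demo")
    serviceCollection = service.IServiceCollection(application)
    internet.TCPServer(8007, factory).setServiceParent(serviceCollection)
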
diff --git a/oonib/report/__init__.py b/oonib/report/__init__.py
deleted file mode 100644
index fcbf220..0000000
--- a/oonib/report/__init__.py
+++ /dev/null
@@ -1,5 +0,0 @@
-class MissingField(Exception):
-    pass
-
-class InvalidRequestField(Exception):
-    pass
diff --git a/oonib/report/api.py b/oonib/report/api.py
deleted file mode 100644
index b3d529d..0000000
--- a/oonib/report/api.py
+++ /dev/null
@@ -1,106 +0,0 @@
-"""
-/report
-
-/pcap
-
-This is the async pcap reporting system. It requires the client to have created
-a report already, but can work independently from test progress.
-
-"""
-import random
-import string
-import json
-import re
-import os
-
-from twisted.internet import reactor, defer
-
-from cyclone import web
-
-from ooni import otime
-from ooni.utils import randomStr, log
-
-from oonib import models, config
-from oonib.report import file_collector
-from oonib.report import MissingField, InvalidRequestField
-
-def parseUpdateReportRequest(request):
-    #db_report_id_regexp = re.compile("[a-zA-Z0-9]+$")
-
-    # this is the regexp for the reports that include the timestamp
-    report_id_regexp = re.compile("[a-zA-Z0-9_-]+$")
-
-    # XXX here we are actually parsing a json object that could be quite big.
-    # If we want this to scale properly we only want to look at the test_id
-    # field.
-    # We are also keeping in memory multiple copies of the same object. A lot
-    # of optimization can be done.
-    parsed_request = json.loads(request)
-    try:
-        report_id = parsed_request['report_id']
-    except KeyError:
-        raise MissingField('report_id')
-
-    if not re.match(report_id_regexp, report_id):
-        raise InvalidRequestField('report_id')
-
-    return parsed_request
-
-
-class NewReportHandlerDB(web.RequestHandler):
-    """
-    Responsible for creating and updating reports via database.
-    XXX this is not yet fully implemented.
-    """
-
-    @web.asynchronous
-    @defer.inlineCallbacks
-    def post(self):
-        """
-        Creates a new report with the input to the database.
-        XXX this is not yet implemented.
-
-        * Request
-
-          {'software_name': 'XXX',
-           'software_version': 'XXX',
-           'test_name': 'XXX',
-           'test_version': 'XXX',
-           'progress': 'XXX',
-           'content': 'XXX'
-           }
-
-          Optional:
-            'test_helper': 'XXX'
-            'client_ip': 'XXX'
-
-          * Response
-
-          {'backend_version': 'XXX', 'report_id': 'XXX'}
-
-        """
-        report_data = json.loads(self.request.body)
-        new_report = models.Report()
-        log.debug("Got this request %s" % report_data)
-        result = yield new_report.new(report_data)
-        self.write(result)
-        self.finish()
-
-    def put(self):
-        """
-        Update an already existing report with the database.
-
-        XXX this is not yet implemented.
-
-          {'report_id': 'XXX',
-           'content': 'XXX'
-          }
-        """
-        pass
-
-
-reportingBackendAPI = [
-    (r"/report", file_collector.NewReportHandlerFile),
-    (r"/pcap", file_collector.PCAPReportHandler)
-]
-
-reportingBackend = web.Application(reportingBackendAPI, debug=True)
diff --git a/oonib/report/file_collector.py b/oonib/report/file_collector.py
deleted file mode 100644
index 6d5584c..0000000
--- a/oonib/report/file_collector.py
+++ /dev/null
@@ -1,193 +0,0 @@
-import random
-import string
-import json
-import re
-import os
-
-from twisted.internet import fdesc
-
-from cyclone import web
-
-from ooni.utils import randomStr
-from ooni import otime
-
-from oonib.report import MissingField, InvalidRequestField
-
-from oonib import config
-
-def parseUpdateReportRequest(request):
-    #db_report_id_regexp = re.compile("[a-zA-Z0-9]+$")
-
-    # this is the regexp for the reports that include the timestamp
-    report_id_regexp = re.compile("[a-zA-Z0-9_\-]+$")
-
-    # XXX here we are actually parsing a json object that could be quite big.
-    # If we want this to scale properly we only want to look at the test_id
-    # field.
-    # We are also keeping in memory multiple copies of the same object. A lot
-    # of optimization can be done.
-    parsed_request = json.loads(request)
-    try:
-        report_id = parsed_request['report_id']
-    except KeyError:
-        raise MissingField('report_id')
-
-    if not re.match(report_id_regexp, report_id):
-        raise InvalidRequestField('report_id')
-
-    return parsed_request
-
-
-
-def parseNewReportRequest(request):
-    """
-    Here we parse a new report request.
-    """
-    version_string = re.compile("[0-9A-Za-z_\-\.]+$")
-    name = re.compile("[a-zA-Z0-9_\- ]+$")
-    probe_asn = re.compile("AS[0-9]+$")
-
-    expected_request = {
-     'software_name': name,
-     'software_version': version_string,
-     'test_name': name,
-     'test_version': version_string,
-     'probe_asn': probe_asn
-    }
-
-    parsed_request = json.loads(request)
-    if not parsed_request.get('probe_asn'):
-        parsed_request['probe_asn'] = 'AS0'
-
-    for k, regexp in expected_request.items():
-        try:
-            value_to_check = parsed_request[k]
-        except KeyError:
-            raise MissingField(k)
-
-        print "Matching %s with %s | %s" % (regexp, value_to_check, k)
-        if re.match(regexp, str(value_to_check)):
-            continue
-        else:
-            raise InvalidRequestField(k)
-
-    return parsed_request
-
-class NewReportHandlerFile(web.RequestHandler):
-    """
-    Responsible for creating and updating reports by writing to flat file.
-    """
-
-    def post(self):
-        """
-        Creates a new report with the input
-
-        * Request
-
-          {'software_name': 'XXX',
-           'software_version': 'XXX',
-           'test_name': 'XXX',
-           'test_version': 'XXX',
-           'probe_asn': 'XXX'
-           'content': 'XXX'
-           }
-
-          Optional:
-            'test_helper': 'XXX'
-            'client_ip': 'XXX'
-
-          (not implemented, neither in the client nor in the backend)
-          The idea behind these two fields is that it would be interesting to
-          also collect how the request was observed from the collector's
-          point of view.
-
-          We use the client_ip address and a time window as a unique key. We
-          then tell the selected test_helper which client_ip to expect and to
-          watch for a connection from a probe in that time window.
-
-          Once the test_helper sees a connection from that client_ip it will
-          store the data it receives for that testing session.
-          When the probe completes the report (or the time window is over) the
-          final report will also include the data collected from the
-          collector's point of view.
-
-        * Response
-
-          {'backend_version': 'XXX', 'report_id': 'XXX'}
-
-        """
-        # XXX here we should validate and sanitize the request
-        try:
-            report_data = parseNewReportRequest(self.request.body)
-        except InvalidRequestField, e:
-            raise web.HTTPError(400, "Invalid Request Field %s" % e)
-        except MissingField, e:
-            raise web.HTTPError(400, "Missing Request Field %s" % e)
-
-        print "Parsed this data %s" % report_data
-        software_name = report_data['software_name']
-        software_version = report_data['software_version']
-        test_name = report_data['test_name']
-        test_version = report_data['test_version']
-        probe_asn = report_data['probe_asn']
-        content = report_data['content']
-
-        if not probe_asn:
-            probe_asn = "AS0"
-
-        report_id = otime.timestamp() + '_' \
-                + probe_asn + '_' \
-                + randomStr(50)
-
-        # The report filename contains the timestamp of the report plus a
-        # random nonce
-        report_filename = os.path.join(config.main.report_dir, report_id)
-
-        response = {'backend_version': config.backend_version,
-                'report_id': report_id
-        }
-
-        self.writeToReport(report_filename,
-                report_data['content'])
-
-        self.write(response)
-
-    def writeToReport(self, report_filename, data):
-        with open(report_filename, 'w+') as fd:
-            fdesc.setNonBlocking(fd.fileno())
-            fdesc.writeToFD(fd.fileno(), data)
-
-    def put(self):
-        """
-        Update an already existing report.
-
-          {
-           'report_id': 'XXX',
-           'content': 'XXX'
-          }
-        """
-        parsed_request = parseUpdateReportRequest(self.request.body)
-
-        report_id = parsed_request['report_id']
-
-        print "Got this request %s" % parsed_request
-        report_filename = os.path.join(config.main.report_dir,
-                report_id)
-
-        self.updateReport(report_filename, parsed_request['content'])
-
-    def updateReport(self, report_filename, data):
-        try:
-            with open(report_filename, 'a+') as fd:
-                fdesc.setNonBlocking(fd.fileno())
-                fdesc.writeToFD(fd.fileno(), data)
-        except IOError as e:
-            raise web.HTTPError(404, "Report not found")
-
-class PCAPReportHandler(web.RequestHandler):
-    def get(self):
-        pass
-
-    def post(self):
-        pass
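
For context, the flat-file collector removed above speaks plain JSON over HTTP.
A rough sketch of a client creating and then appending to a report; the
collector address is hypothetical and the `requests` dependency is an
assumption, not something oonib shipped:

    # Hypothetical client for the /report endpoints handled by
    # NewReportHandlerFile above.
    import json
    import requests

    collector = "http://127.0.0.1:8888"  # hypothetical collector address

    create = {
        "software_name": "ooniprobe",
        "software_version": "0.0.1",
        "test_name": "example_test",
        "test_version": "0.1",
        "probe_asn": "AS0",
        "content": "---\n",
    }
    resp = requests.post(collector + "/report", data=json.dumps(create)).json()
    report_id = resp["report_id"]

    # Append more report content to the same report file.
    update = {"report_id": report_id, "content": "- result: ok\n"}
    requests.put(collector + "/report", data=json.dumps(update))
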
diff --git a/oonib/requirements.txt b/oonib/requirements.txt
deleted file mode 100644
index da6ec81..0000000
--- a/oonib/requirements.txt
+++ /dev/null
@@ -1,24 +0,0 @@
-PyYAML>=3.10
-Pygments>=1.5
-Twisted>=12.2.0
-argparse>=1.2.1
-cyclone>=1.0-rc13
-distribute>=0.6.24
-docutils>=0.9.1
-ipaddr>=2.1.10
-pyOpenSSL>=0.13
-pygeoip>=0.2.5
-#
-# This is a Tor Project mirror with valid SSL/TLS certs that is stable and fast
-#
-# Originally fetched from the hg repo on secdev.org:
-#   https://hg.secdev.org/scapy/archive/tip.zip#egg=scapy
-# Mirrored on Tor's webserver:
-https://people.torproject.org/~ioerror/src/mirrors/ooniprobe/scapy-02-25-2013-tip.zip
-storm>=0.19
-transaction>=1.3.0
-txtorcon
-wsgiref>=0.1.2
-zope.component>=4.0.0
-zope.event>=4.0.0
-zope.interface>=4.0.1
diff --git a/oonib/runner.py b/oonib/runner.py
deleted file mode 100644
index 407f10d..0000000
--- a/oonib/runner.py
+++ /dev/null
@@ -1,81 +0,0 @@
-"""
-In here we define a runner for the oonib backend system.
-We are just extending the
-
-"""
-
-from twisted.internet import reactor
-from twisted.application import service, internet, app
-from twisted.python.runtime import platformType
-
-import txtorcon
-
-from oonib.report.api import reportingBackend
-
-from oonib import config
-from ooni.utils import log
-
-def txSetupFailed(failure):
-    log.err("Setup failed")
-    log.exception(failure)
-
-def setupCollector(tor_process_protocol):
-    def setup_complete(port):
-        print "Exposed collector Tor hidden service on httpo://%s" % port.onion_uri
-
-    torconfig = txtorcon.TorConfig(tor_process_protocol.tor_protocol)
-    public_port = 80
-    # XXX there is currently a bug in txtorcon that prevents data_dir from
-    # being passed properly. Details on the bug can be found here:
-    # https://github.com/meejah/txtorcon/pull/22
-    hs_endpoint = txtorcon.TCPHiddenServiceEndpoint(reactor, torconfig,
-            public_port, data_dir=config.main.tor_datadir)
-    hidden_service = hs_endpoint.listen(reportingBackend)
-    hidden_service.addCallback(setup_complete)
-    hidden_service.addErrback(txSetupFailed)
-
-def startTor():
-    def updates(prog, tag, summary):
-        print "%d%%: %s" % (prog, summary)
-
-    torconfig = txtorcon.TorConfig()
-    torconfig.SocksPort = 9055
-    if config.main.tor2webmode:
-        torconfig.Tor2webMode = 1
-    torconfig.save()
-    d = txtorcon.launch_tor(torconfig, reactor,
-            tor_binary=config.main.tor_binary,
-            progress_updates=updates)
-    d.addCallback(setupCollector)
-    d.addErrback(txSetupFailed)
-
-class OBaseRunner():
-    pass
-
-if platformType == "win32":
-    from twisted.scripts._twistw import ServerOptions, \
-                                WindowsApplicationRunner
-
-    OBaseRunner = WindowsApplicationRunner
-    # XXX Currently we don't support starting the Tor hidden service on Windows
-
-else:
-    from twisted.scripts._twistd_unix import ServerOptions, \
-                                UnixApplicationRunner
-    class OBaseRunner(UnixApplicationRunner):
-        def postApplication(self):
-            """
-            To be called after the application is created: start the
-            application and run the reactor. After the reactor stops,
-            clean up PID files and such.
-            """
-            self.startApplication(self.application)
-            # This is our addition. The rest is taken from
-            # twisted/scripts/_twistd_unix.py 12.2.0
-            startTor()
-            self.startReactor(None, self.oldstdout, self.oldstderr)
-            self.removePID(self.config['pidfile'])
-
-OBaseRunner.loggerFactory = log.LoggerFactory
-
-
diff --git a/oonib/testhelpers/__init__.py b/oonib/testhelpers/__init__.py
deleted file mode 100644
index 4dbb547..0000000
--- a/oonib/testhelpers/__init__.py
+++ /dev/null
@@ -1,5 +0,0 @@
-from . import dns_helpers
-from . import http_helpers
-from . import tcp_helpers
-
-__all__ = ['dns_helpers', 'http_helpers', 'tcp_helpers']
diff --git a/oonib/testhelpers/dns_helpers.py b/oonib/testhelpers/dns_helpers.py
deleted file mode 100644
index cb4ff9f..0000000
--- a/oonib/testhelpers/dns_helpers.py
+++ /dev/null
@@ -1,16 +0,0 @@
-from twisted.internet.protocol import Factory, Protocol
-from twisted.internet import reactor
-from twisted.names import dns
-from twisted.names import client, server
-
-class DNSTestHelper(server.DNSServerFactory):
-    def __init__(self, authorities = None,
-                 caches = None, clients = None,
-                 verbose = 0):
-        resolver = client.Resolver(servers=[('8.8.8.8', 53)])
-        server.DNSServerFactory.__init__(self, authorities = authorities,
-                                         caches = caches, clients = [resolver],
-                                         verbose = verbose)
-    def handleQuery(self, message, protocol, address):
-        print message, protocol, address
-        server.DNSServerFactory.handleQuery(self, message, protocol, address)
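
The DNS helper removed above simply forwards queries to 8.8.8.8 and logs what
it sees. A quick sketch of pointing a twisted.names resolver at it; the address
is an assumption and 57004 is the default helpers.dns.udp_port from config.py:

    # Query the DNS test helper with twisted.names and print the answer.
    from twisted.internet import reactor
    from twisted.names import client

    resolver = client.Resolver(servers=[("127.0.0.1", 57004)])

    def done(result):
        print(result)
        reactor.stop()

    resolver.lookupAddress("torproject.org").addBoth(done)
    reactor.run()
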
diff --git a/oonib/testhelpers/http_helpers.py b/oonib/testhelpers/http_helpers.py
deleted file mode 100644
index 3a76b9a..0000000
--- a/oonib/testhelpers/http_helpers.py
+++ /dev/null
@@ -1,154 +0,0 @@
-import json
-import random
-import string
-
-from twisted.application import internet, service
-from twisted.internet import protocol, reactor, defer
-from twisted.protocols import basic, policies
-from twisted.web import resource, server, static, http
-from twisted.web.http import Request
-
-from cyclone.web import RequestHandler, Application
-
-from ooni.utils import log
-
-class SimpleHTTPChannel(basic.LineReceiver, policies.TimeoutMixin):
-    """
-    This is a simplified version of twisted.web.http.HTTPChannel to overcome
-    header lowercase normalization. It does not actually implement the HTTP
-    protocol, but only the subset of it that we need for testing.
-
-    What this HTTP channel currently does is process the HTTP Request Line and
-    the Request Headers and returns them in a JSON datastructure in the order
-    we received them.
-
-    The returned JSON dict looks like so:
-
-    {
-        'request_headers':
-            [['User-Agent', 'IE6'], ['Content-Length', 200]],
-        'request_line':
-            'GET / HTTP/1.1',
-        'headers_dict':
-            {'User-Agent': ['IE6'], 'Content-Length': [200]}
-    }
-    """
-    requestFactory = Request
-    __first_line = 1
-    __header = ''
-    __content = None
-
-    length = 0
-    maxHeaders = 500
-    requestLine = ''
-    headers = []
-
-    timeOut = 60 * 60 * 12
-
-    def __init__(self):
-        self.requests = []
-
-    def connectionMade(self):
-        self.setTimeout(self.timeOut)
-
-    def lineReceived(self, line):
-        if self.__first_line:
-            self.requestLine = line
-            self.__first_line = 0
-        elif line == '':
-            # We have reached the end of the headers.
-            if self.__header:
-                self.headerReceived(self.__header)
-            self.__header = ''
-            self.allHeadersReceived()
-            self.setRawMode()
-        elif line[0] in ' \t':
-            # This is to support header field value folding over multiple lines
-            # as specified by rfc2616.
-            self.__header = self.__header+'\n'+line
-        else:
-            if self.__header:
-                self.headerReceived(self.__header)
-            self.__header = line
-
-    def headerReceived(self, line):
-        try:
-            header, data = line.split(':', 1)
-            self.headers.append((header, data.strip()))
-        except:
-            log.err("Got malformed HTTP Header request field")
-            log.err("%s" % line)
-
-    def allHeadersReceived(self):
-        headers_dict = {}
-        for k, v in self.headers:
-            if k not in headers_dict:
-                headers_dict[k] = []
-            headers_dict[k].append(v)
-
-        response = {'request_headers': self.headers,
-            'request_line': self.requestLine,
-            'headers_dict': headers_dict
-        }
-        json_response = json.dumps(response)
-        self.transport.write('HTTP/1.1 200 OK\r\n\r\n')
-        self.transport.write('%s' % json_response)
-        self.transport.loseConnection()
-
-
-class HTTPReturnJSONHeadersHelper(protocol.ServerFactory):
-    protocol = SimpleHTTPChannel
-    def buildProtocol(self, addr):
-        p = self.protocol()
-        p.headers = []
-        return p
-
-class HTTPTrapAll(RequestHandler):
-    def _execute(self, transforms, *args, **kwargs):
-        self._transforms = transforms
-        defer.maybeDeferred(self.prepare).addCallbacks(
-                    self._execute_handler,
-                    lambda f: self._handle_request_exception(f.value),
-                    callbackArgs=(args, kwargs))
-
-    def _execute_handler(self, r, args, kwargs):
-        if not self._finished:
-            args = [self.decode_argument(arg) for arg in args]
-            kwargs = dict((k, self.decode_argument(v, name=k))
-                            for (k, v) in kwargs.iteritems())
-            # This is where we do the patching
-            # XXX this is somewhat hackish
-            d = defer.maybeDeferred(self.all, *args, **kwargs)
-            d.addCallbacks(self._execute_success, self._execute_failure)
-            self.notifyFinish().addCallback(self.on_connection_close)
-
-
-class HTTPRandomPage(HTTPTrapAll):
-    """
-    This generates a random page of arbitrary length and containing the string
-    selected by the user.
-    /<length>/<keyword>
-    XXX this is currently disabled as it is not of use to any test.
-    """
-    isLeaf = True
-    def _gen_random_string(self, length):
-        return ''.join(random.choice(string.letters) for x in range(length))
-
-    def genRandomPage(self, length=100, keyword=None):
-        data = self._gen_random_string(length/2)
-        if keyword:
-            data += keyword
-        data += self._gen_random_string(length - length/2)
-        data += '\n'
-        return data
-
-    def all(self, length, keyword):
-        length = int(length)
-        if length > 100000:
-            length = 100000
-        return self.genRandomPage(length, keyword)
-
-HTTPRandomPageHelper = Application([
-    # XXX add regexps here
-    (r"/(.*)/(.*)", HTTPRandomPage)
-])
-
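
The SimpleHTTPChannel removed above echoes the request line and headers back as
JSON, preserving header case and order. A rough client-side sketch; the address
is an assumption and 57001 is the default http_return_request port from
config.py:

    # Send a request with mixed-case headers and read back the JSON echo.
    import json
    import socket

    s = socket.create_connection(("127.0.0.1", 57001))
    s.sendall(b"GET / HTTP/1.1\r\nUser-Agent: IE6\r\nX-Mixed-Case: kept\r\n\r\n")

    chunks = []
    while True:
        data = s.recv(4096)
        if not data:  # the helper closes the connection after answering
            break
        chunks.append(data)
    s.close()

    body = b"".join(chunks).decode().split("\r\n\r\n", 1)[1]
    echoed = json.loads(body)
    print(echoed["request_line"])     # GET / HTTP/1.1
    print(echoed["request_headers"])  # header names kept verbatim, in order
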
diff --git a/oonib/testhelpers/ssl_helpers.py b/oonib/testhelpers/ssl_helpers.py
deleted file mode 100644
index 5c74996..0000000
--- a/oonib/testhelpers/ssl_helpers.py
+++ /dev/null
@@ -1,9 +0,0 @@
-from twisted.internet import ssl
-from oonib import config
-
-class SSLContext(ssl.DefaultOpenSSLContextFactory):
-    def __init__(self, *args, **kw):
-        ssl.DefaultOpenSSLContextFactory.__init__(self, 
-                config.helpers.ssl.private_key,
-                config.helpers.ssl.certificate)
-
diff --git a/oonib/testhelpers/tcp_helpers.py b/oonib/testhelpers/tcp_helpers.py
deleted file mode 100644
index 4d32ae0..0000000
--- a/oonib/testhelpers/tcp_helpers.py
+++ /dev/null
@@ -1,72 +0,0 @@
-
-from twisted.internet.protocol import Protocol, Factory, ServerFactory
-from twisted.internet.error import ConnectionDone
-
-from oonib import config
-from ooni.utils import log
-from ooni.kit.daphn3 import Daphn3Protocol
-from ooni.kit.daphn3 import read_pcap, read_yaml
-
-class TCPEchoProtocol(Protocol):
-    def dataReceived(self, data):
-        self.transport.write(data)
-
-class TCPEchoHelper(Factory):
-    """
-    A very simple echo protocol implementation
-    """
-    protocol = TCPEchoProtocol
-
-if config.helpers.daphn3.yaml_file:
-    daphn3Steps = read_yaml(config.helpers.daphn3.yaml_file)
-
-elif config.helpers.daphn3.pcap_file:
-    daphn3Steps = read_pcap(config.helpers.daphn3.pcap_file)
-
-else:
-    daphn3Steps = [{'client': 'client_packet'}, 
-        {'server': 'server_packet'}]
-
-class Daphn3ServerProtocol(Daphn3Protocol):
-    def nextStep(self):
-        log.debug("Moving on to next step in the state walk")
-        self.current_data_received = 0
-        # Python why?
-        if self.current_step >= (len(self.steps) - 1):
-            log.msg("Reached the end of the state machine")
-            log.msg("Censorship fingerprint bisected!")
-            step_idx, mutation_idx = self.factory.mutation
-            log.msg("step_idx: %s | mutation_idx: %s" % (step_idx, mutation_idx))
-            #self.transport.loseConnection()
-            if self.report:
-                self.report['mutation_idx'] = mutation_idx
-                self.report['step_idx'] = step_idx
-            return
-        else:
-            self.current_step += 1
-        if self._current_step_role() == self.role:
-            # We need to send more data because we are again responsible for
-            # doing so.
-            self.sendPayload()
-
-class Daphn3Server(ServerFactory):
-    """
-    This is the main class that deals with the daphn3 server side component.
-    We keep track of global state of every client here.
-    Every client is identified by their IP address and the state of mutation is
-    stored using their IP address as a key. This may lead to some bugs if two
-    different clients share the same IP address, but hopefully such collisions
-    are rare.
-    """
-    protocol = Daphn3ServerProtocol
-    # step_idx, mutation_idx
-    mutation = [0, 0]
-    def buildProtocol(self, addr):
-        p = self.protocol()
-        p.steps = daphn3Steps
-        p.role = "server"
-        p.factory = self
-        return p
-
-
-
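
Finally, the TCP echo helper removed above just writes back whatever it
receives. A quick sketch of exercising it; the address is an assumption and
57002 is the default tcp_echo port from config.py:

    # Anything sent to the echo helper comes straight back.
    import socket

    s = socket.create_connection(("127.0.0.1", 57002))
    s.sendall(b"ping\n")
    print(s.recv(1024))  # b'ping\n'
    s.close()
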
