Re: [GNUnet-developers] vsftpd/wget/gnunet


From: Jan Marco Alkema
Subject: Re: [GNUnet-developers] vsftpd/wget/gnunet
Date: Sun, 29 Dec 2002 17:31:34 -0800

Igor,

Thank you for your feedback. It gives me new ideas and thoughts about this
subject ----)

>Your first private suggestion regarded, again, the use of
>ODBC/JDBC/whatever for GNUnet. There is just one point to
>consider: "whats the gain?". I, for one, can't see it.

In the past I have programmed some “flat file database” systems. At the time
I was very happy with them, but now, taking a helicopter view, I see that I
programmed things that are better handled by real database programs.

I have tested 10 million file footprints (size, name, MD5, path) in MySQL. It
was very fast, provided you put indexes on the attributes.
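
To give an idea (only a sketch, not existing GNUnet code; the table, index,
database and account names are made up for illustration), such a footprint
table with its indexes could be created over JDBC like this:

    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.Statement;

    // Sketch only: a footprint table (size, name, MD5, path) in MySQL with
    // indexes on the attributes you search on. All names are illustrative.
    public class FootprintSchema {
        public static void main(String[] args) throws Exception {
            Class.forName("com.mysql.jdbc.Driver");   // MySQL Connector/J driver
            Connection con = DriverManager.getConnection(
                    "jdbc:mysql://localhost/gnunet", "user", "password");
            Statement st = con.createStatement();
            st.executeUpdate("CREATE TABLE footprints ("
                    + " md5  CHAR(32)     NOT NULL,"  // hex MD5 of the file
                    + " name VARCHAR(255) NOT NULL,"
                    + " path VARCHAR(255) NOT NULL,"
                    + " size BIGINT       NOT NULL,"
                    + " PRIMARY KEY (md5, path),"     // also serves lookups by MD5
                    + " INDEX idx_name (name),"       // fast lookup by file name
                    + " INDEX idx_size (size))");     // fast lookup by size
            st.close();
            con.close();
        }
    }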

In the past end users did not have “professional” database applications.
Today, in my Office Professional 2000, I see SQL Server and Access. MySQL is
available for Windows, and on Linux you also have professional database
applications (PostgreSQL, MySQL, Oracle, etc.). My argument is that we have an
opportunity to use these programs. In my point of view they are not limited by
a fixed amount of anything; only the physical disk capacity can be a
restriction, and disk capacity is cheap nowadays ---)

The opportunity is not in how to put information into a database system, but
in how to get useful information out of the databases.

In the future you will link the available Internet databases with each other.
For example, you link URL = www.cs.purdue.edu/pub/redhat-8.0.dvd.img ->
West Lafayette, Indiana -> physical coordinates x = .., y = .., z = .. ->
telephone number = 317-123-4567.

>The truth is that even though the *industry* loves 3 and 4 letter words and
>drops them on every occasion it can, I don't think those with academic
>background (like some of the gnunet developers, I gather) so much appreciate
>it.

I would not claim that professional database applications lack an academic
background or academic people. In Dutch we say “schoenmaker blijf bij je
leest” (“cobbler, stick to your last”): do what you are good at. For instance,
“file sharing system builders” make file sharing systems and database
application builders make database applications.

>A business organization can probably make better sales if it has a new
>combination of four letter words to offer every year, but the fact is that
>in this case it hardly makes our product (gnunet) significantly
>better.

It does not matter to me what a “business organization” says. I have seen the
results with 10 million file “footprint” entries: it was pretty fast and easy
to implement.

>A set of services we need from a dbmg are very simple and our data is very
>uniform.

If you have, for example, 15 million host keys and 20,000 million file
footprints, you get a lot of problems if you do not have a good database
design with good indexes. I know that everything can be programmed by hand,
but a professional database makes it a lot easier for you. The other important
thing is the authorization of files: professional databases have grant
functions on tables and databases.
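
For example (again only a sketch; the database, table and account names are
assumptions), granting Richard read-only access to the footprint table is one
SQL statement:

    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.Statement;

    // Sketch only: let the database do the authorization. Richard gets
    // read-only access to the footprint table and nothing else.
    public class GrantSketch {
        public static void main(String[] args) throws Exception {
            Class.forName("com.mysql.jdbc.Driver");
            Connection con = DriverManager.getConnection(
                    "jdbc:mysql://localhost/mysql", "admin", "password");
            Statement st = con.createStatement();
            st.executeUpdate("GRANT SELECT ON gnunet.footprints"
                    + " TO 'richard'@'%' IDENTIFIED BY 'secret'");
            st.close();
            con.close();
        }
    }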

>Though I personally like the phrase about "not reinventing wheels", I am
>*even more* fond of the saying "if it works, don't fix it". That is, if we
>can pull the thing off with a simple and straight forward mechanism, that is
>what we should go for.

>"if it works, don't fix it".

I agree with you not to change a running gnunet system at this moment. In my
point of view the ODBC/JDBC database side could be developed in parallel: put
footprints of files in the database and create relations between the
footprints. For example, Redhat 8.0 Professional consists of x CD’s and 1 DVD.
N.B. ‘some’ files of Redhat 8.0 are also on previous versions of Redhat.
Integration of both things can be done eventually.
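
A possible shape for such relations (only a sketch, all names made up): one
table for collections such as a Redhat 8.0 CD or DVD image, and one table that
says which footprints belong to which collection, so a file that also appears
on an older release shows up under both:

    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.Statement;

    // Sketch only: relations between footprints via a collection table and a
    // membership table. All table and column names are illustrative.
    public class RelationSchema {
        public static void main(String[] args) throws Exception {
            Class.forName("com.mysql.jdbc.Driver");
            Connection con = DriverManager.getConnection(
                    "jdbc:mysql://localhost/gnunet", "user", "password");
            Statement st = con.createStatement();
            st.executeUpdate("CREATE TABLE collections ("
                    + " id    INT NOT NULL AUTO_INCREMENT,"
                    + " title VARCHAR(255) NOT NULL,"     // e.g. 'Redhat 8.0 DVD'
                    + " PRIMARY KEY (id))");
            st.executeUpdate("CREATE TABLE collection_members ("
                    + " collection_id INT NOT NULL,"      // refers to collections.id
                    + " md5           CHAR(32) NOT NULL," // refers to footprints.md5
                    + " PRIMARY KEY (collection_id, md5),"
                    + " INDEX idx_md5 (md5))");   // 'which collections contain this file?'
            st.close();
            con.close();
        }
    }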

>If someone needs to use mysql/postgresql/somesuch with
>GNUnet so that he/she can mouth statements like "oh boy
>I have triggers now and locks and nested queries and lots
>of things I don't understand", its very fortunate that
>GNUnet 0.5.0 is programmed in quite modular fashion.

I see that SQL has a lot of advantages over the gnunet flat file system. There
is a great deal of knowledge of SQL on the Internet and in people’s heads, and
learning something other than SQL costs a lot of effort. N.B. With the Windows
installer it is very easy for a normal end user to install a MySQL database
system.

>It should be relatively easy to code an optional *interface*
>for commercial or highend databases, if someone wants to use such.

I would not call professional databases “high-end databases”, because in 2002
they are available for end users to use. N.B. Years ago I was very happy with
an Intel 8088 processor; today a Pentium 4 at 2400 MHz makes me ‘happy’.

>Also, in the spirit of open source, volunteers
>are welcome to implement and submit such pieces of code
>to be included to the project, providing that they meet
>the necessary licence, quality, etc. considerations
>(where applicable).

If someone wants to help make a superior file sharing system, let me know.
Together we can make it happen ---)

>Its just that the existing developers go to directions they seem
>worthwhile, and in addition they have much of other concerns. Often the most
>honest advice to a more complex feature request is "if you
>want it, do it yourself".

>"if you want it, do it yourself".

I think that is a good suggestion. But if you know how to implement JDBC/ODBC,
it is not as difficult as it may seem. My advice is to use as much standard
code as possible. I suggest JDBC/ODBC because I have seen sites built directly
on MySQL switch to ODBC in new releases when they wanted to support more
database applications.
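
To show what I mean by standard code (a sketch under the assumption that the
MySQL Connector/J driver is installed): with JDBC only the driver class and
the URL are database-specific, the rest of the code stays the same when you
move to PostgreSQL or another backend:

    import java.sql.Connection;
    import java.sql.DriverManager;

    // Sketch only: database-independent connection setup with JDBC.
    public class PortableConnect {
        public static Connection open(String driver, String url,
                                      String user, String pass) throws Exception {
            Class.forName(driver);                 // register the backend's driver
            return DriverManager.getConnection(url, user, pass);
        }

        public static void main(String[] args) throws Exception {
            // Switching backends is a matter of swapping these two strings,
            // e.g. "org.postgresql.Driver" with "jdbc:postgresql://..." instead.
            Connection con = open("com.mysql.jdbc.Driver",
                    "jdbc:mysql://localhost/gnunet", "user", "password");
            con.close();
        }
    }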

> or low disk overhead (that was the reason for bloom filter implementation)

I don’t see the advantage of the bloom filter yet.

>Its the bandwidth whats the problem, and overhead in routing. If most users
>can offer a 2kb trickle, 2kb trickle it will be, or worse

In Holland we have cable (128 kbit/s upstream) and ADSL (the fast variant, 256
kbit/s upstream). The tendency is towards greater speed; on 1 January 2003 I
get more download speed on my ADSL connection. I know that if I download from
someone else, his uplink speed limits my download.

>If you can't even send queries out fast, you shouldn't be dreaming of
>receiving anything fast.

If you synchronize the footprint databases with each other, I don’t think you
need to send many queries out onto the network. Only if I don’t have it in my
‘local’ database do I query the network.
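
Something like this is what I have in mind (only a sketch; the table layout is
the made-up footprint table from above and the ‘network query’ is just a
placeholder):

    import java.sql.Connection;
    import java.sql.PreparedStatement;
    import java.sql.ResultSet;

    // Sketch only: look a file up in the local footprint database first and
    // only send a query out onto the network when it is not known locally.
    public class LocalFirstLookup {
        public static boolean knownLocally(Connection con, String md5) throws Exception {
            PreparedStatement ps = con.prepareStatement(
                    "SELECT path FROM footprints WHERE md5 = ?");
            ps.setString(1, md5);
            ResultSet rs = ps.executeQuery();
            boolean found = rs.next();
            rs.close();
            ps.close();
            return found;
        }

        public static void locate(Connection con, String md5) throws Exception {
            if (knownLocally(con, md5)) {
                System.out.println("Serve it from the local database");
            } else {
                System.out.println("Only now send a query out on the network");
            }
        }
    }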

>if only a few are using encryption, it will look like they have something
>to hide. Similarly, it would make anonymous traffic more suspicious than
>nonanonymous traffic.

I don’t think that is an argument for me.

For example, I got the latest Nationaletelefoongids (a 500 MB zip file). I
make an image with CloneCD and zip the image with WinZip. The result I put on
my FTP server. Richard logs in and retrieves it with wget.

In my opinion it should work like this: I put the “Nationaletelefoongids” CD
in my Windows PC and catalog the CD. In a Java GUI I grant Richard permission
to download it once. The system must copy the CD and zip it on my local system
before transferring it. If I click on “delete when transferred”, the CD image
will be deleted once the transfer is successful; the footprints will remain in
the database. In my opinion the gnunet system must manage the download: which
protocol (vsftpd, etc.) is best, and when a transfer must be suspended or
resumed. The user can give extra information, for example “transfer before
noon tomorrow”. In a transfer database (a screen in the Java GUI) I can see
the status of the transfer. In short: I put the “Nationaletelefoongids” CD in
my CD-ROM drive, I grant Richard, and Richard’s CD rewriter “automatically”
burns the CD in his Windows PC somewhere in Holland.
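
A rough sketch of the bookkeeping behind this (all table and column names are
made up, nothing of this exists yet): one table for the “download once” grants
and one for the transfer status that the Java GUI would show:

    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.Statement;

    // Sketch only: application-level grants and transfer status.
    public class TransferSchema {
        public static void main(String[] args) throws Exception {
            Class.forName("com.mysql.jdbc.Driver");
            Connection con = DriverManager.getConnection(
                    "jdbc:mysql://localhost/gnunet", "user", "password");
            Statement st = con.createStatement();
            st.executeUpdate("CREATE TABLE download_grants ("
                    + " md5       CHAR(32)    NOT NULL,"  // footprint of the CD image
                    + " grantee   VARCHAR(64) NOT NULL,"  // e.g. 'richard'
                    + " uses_left INT         NOT NULL,"  // 1 = may download it once
                    + " PRIMARY KEY (md5, grantee))");
            st.executeUpdate("CREATE TABLE transfers ("
                    + " md5      CHAR(32)    NOT NULL,"
                    + " grantee  VARCHAR(64) NOT NULL,"
                    + " protocol VARCHAR(16) NOT NULL,"   // e.g. 'ftp' (vsftpd) or other
                    + " status   VARCHAR(16) NOT NULL,"   // queued/running/suspended/done
                    + " deadline DATETIME,"               // e.g. 'before noon tomorrow'
                    + " INDEX idx_status (status))");
            st.close();
            con.close();
        }
    }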

>At least a careful design document (proof of concept) should be created
>before even attempting it. That document should issue things like what
>choices were made, why they were made, perhaps shed light on problems of
>other systems, how this thing works better than them, and what it
>gains/suffers when implemented on top of GNUnet.

A careful design document looks good to me. A lot of the things I want to put
on top of GNUnet already exist in some form; there is a lot of source code out
there. The great opportunity, and problem, is to integrate it with the rest.
If you look at http://www.grub.org, they have the same goals and problems: we
share files, they share URLs.

Maybe I have not made it clear: I want to make a mix between the Napster idea
(central database) and the Gnutella idea (distributed database). Everyone can
have a central database (by synchronizing with the databases of the other
nodes).
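
The synchronization could, very roughly, look like this (a sketch only; it
assumes the footprint table gets an extra ‘added’ timestamp column and a
primary key on md5 and path, and nothing of this is existing code):

    import java.sql.Connection;
    import java.sql.PreparedStatement;
    import java.sql.ResultSet;
    import java.sql.Timestamp;

    // Sketch only: copy every footprint a peer has added since the last
    // synchronization into the local database, so each node ends up with its
    // own 'central' copy of the shared catalog.
    public class FootprintSync {
        public static void pull(Connection peer, Connection local,
                                Timestamp lastSync) throws Exception {
            PreparedStatement sel = peer.prepareStatement(
                    "SELECT md5, name, path, size, added FROM footprints"
                    + " WHERE added > ?");
            sel.setTimestamp(1, lastSync);
            PreparedStatement ins = local.prepareStatement(
                    "INSERT IGNORE INTO footprints (md5, name, path, size, added)"
                    + " VALUES (?, ?, ?, ?, ?)");  // IGNORE skips rows we already have
            ResultSet rs = sel.executeQuery();
            while (rs.next()) {
                ins.setString(1, rs.getString("md5"));
                ins.setString(2, rs.getString("name"));
                ins.setString(3, rs.getString("path"));
                ins.setLong(4, rs.getLong("size"));
                ins.setTimestamp(5, rs.getTimestamp("added"));
                ins.executeUpdate();
            }
            rs.close();
            sel.close();
            ins.close();
        }
    }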

Maybe my mid-term goal is: put as many data/audio CDs and DVDs “on line” as
possible, with “footprints” of the files/tracks in a database environment.

If someone has comments or better ideas, please let me know.

Greetings Jan Marco



