From: Niklas Höglund
Subject: Re: [GNUnet-developers] slocate
Date: Tue, 12 Aug 2003 14:56:20 +0200
User-agent: Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US; rv:1.4) Gecko/20030624
jan marco alkema wrote:
>> This should probably be built as a separate application on top of
>> gnunet, ftp, gnutella, ...
>
> Hello Niklas, Christian,
>
> Thank you for your feedback ---) Niklas, I can see in your feedback
> that you have experience with file sharing systems --)
>
> In my opinion the applications gnunet, ftp, and gnutella should use
> the same MySQL database structure. I prefer that the projects make and
> maintain the database interface to this structure. The database
> structure should be some kind of rfc???? document.

Different programs need different database structures, because they use their data in different ways. In gnunet, all data is stored as encrypted 1k blocks, while in gnutella entire files are stored unencrypted.
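To illustrate the difference, here is a minimal sketch of a content-hash-key style encoding in the spirit of gnunet's 1k-block storage. This is not gnunet's actual on-wire format: the XOR "cipher", the helper names, and the exact hashes are stand-ins chosen for brevity. The idea it demonstrates is real, though: each block is encrypted with a key derived from its own plaintext, so storing nodes cannot read content, while anyone who knows the plaintext can derive both the key and the query hash.

```python
import hashlib
from typing import List, Tuple

BLOCK_SIZE = 1024  # gnunet stores content as 1k blocks


def xor_keystream(data: bytes, key: bytes) -> bytes:
    # Toy stand-in for a real cipher: XOR with a hash-expanded keystream.
    # Applying it twice with the same key decrypts.
    stream = bytearray()
    counter = 0
    while len(stream) < len(data):
        stream.extend(hashlib.sha256(key + counter.to_bytes(4, "big")).digest())
        counter += 1
    return bytes(a ^ b for a, b in zip(data, stream))


def encode_file(content: bytes) -> List[Tuple[bytes, bytes]]:
    """Split content into 1k blocks; encrypt each with the hash of its
    own plaintext (a CHK-like convention).

    Returns (query_hash, ciphertext) pairs: the query hash is what you
    ask peers for, and only someone who already knows the plaintext can
    derive the decryption key.
    """
    blocks = []
    for i in range(0, len(content), BLOCK_SIZE):
        plain = content[i:i + BLOCK_SIZE]
        key = hashlib.sha256(plain).digest()    # content-derived key
        query = hashlib.sha256(key).digest()    # opaque lookup handle
        blocks.append((query, xor_keystream(plain, key)))
    return blocks
```

An ftp server or a gnutella node, by contrast, would just index whole plaintext files, which is why one shared database schema is awkward to impose on all three.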
I guess that what you want is a unified way to search all this data. As someone wrote here earlier, there are projects that provide a single user interface on top of many peer-to-peer systems; I think "giFT" is the name of one.
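The "single user interface on top of many networks" idea can be sketched as a thin abstraction layer. The class names and the in-memory backend below are hypothetical, not giFT's actual API; the point is only that each network gets an adapter and the frontend merges results.

```python
from abc import ABC, abstractmethod
from typing import Iterable, List


class SearchBackend(ABC):
    """One adapter per network (gnunet, gnutella, an ftp index, ...)."""

    @abstractmethod
    def search(self, query: str) -> List[str]: ...


class StaticIndex(SearchBackend):
    # Hypothetical in-memory backend standing in for a real adapter.
    def __init__(self, files: Iterable[str]):
        self.files = list(files)

    def search(self, query: str) -> List[str]:
        return [f for f in self.files if query in f]


def unified_search(query: str, backends: List[SearchBackend]) -> List[str]:
    # Query every network and merge, dropping duplicate hits.
    seen, results = set(), []
    for backend in backends:
        for hit in backend.search(query):
            if hit not in seen:
                seen.add(hit)
                results.append(hit)
    return results
```

Each backend keeps its own storage model (encrypted blocks, plain files, ...); only the search interface is shared, which avoids forcing one database schema on every project.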
>> The problem with protocols like FTP is that when things get
>> overloaded it doesn't work any longer, while in gnunet the file would
>> be propagated to other nodes automatically. This could make gnunet
>> faster than FTP.
>
> If a gnunet node gets overloaded you maybe also have a problem. If
> this node is a single source, all packets must be put on the Internet.
> If one block is missing, you don't have the complete file.

No, but if another node has all but one block, the overloaded node only needs to transfer that one block for the transfer of the file to be completed. Another, faster node can provide all the other blocks.
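This block-level swarming argument can be made concrete with a toy download planner. It is a sketch, not gnunet's actual scheduling logic: it simply assigns each missing block to the least-loaded fast peer that has it, and falls back to the overloaded single source only for blocks nobody else can serve.

```python
from typing import Dict, List, Set


def plan_downloads(needed: Set[str],
                   peer_blocks: Dict[str, Set[str]],
                   slow_peer: str) -> Dict[str, List[str]]:
    """Assign each needed block to a peer, touching the slow/overloaded
    peer only for blocks that no other peer holds."""
    plan: Dict[str, List[str]] = {p: [] for p in peer_blocks}
    for block in sorted(needed):
        fast = [p for p, have in peer_blocks.items()
                if block in have and p != slow_peer]
        if fast:
            # Spread load: pick the fast peer with the fewest assignments.
            plan[min(fast, key=lambda p: len(plan[p]))].append(block)
        elif block in peer_blocks.get(slow_peer, set()):
            plan[slow_peer].append(block)
    return plan
```

With whole-file FTP, the single source would have to send every byte itself; here it only serves the one block that exists nowhere else.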
I have downloaded files of a couple of megabytes at decent speeds from gnunet. Those were probably files that were available from some fast node (or had migrated there).
>> Just look at what happens in gnutella when many people try to
>> download a popular file from a single source.
>
> This problem is the "same" as with the distribution of new releases.
> They copy the distribution to multiple ("mirror") computers. In your
> example, you want to download a file from "A". The single source "A"
> should copy it to other computers ("B", "C", "D", etc). You must be
> rerouted to computer "C" for the real ftp download.

Yes, but this happens automatically in gnunet, for every file, without human intervention.
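The "automatic mirroring" can be sketched as request-driven caching: when a reply travels back along the query path, intermediate nodes keep a copy, so popular content drifts toward the nodes that ask for it. This is a simplified model under assumed names (`Node`, `request`), not gnunet's actual migration code.

```python
from typing import List, Set


class Node:
    def __init__(self, name: str):
        self.name = name
        self.store: Set[str] = set()


def request(block: str, path: List[Node]) -> bool:
    """Route a request along a path of nodes. When a node on the path
    has the block, the reply travels back and every earlier hop caches
    a copy, so content migrates without human intervention."""
    for i, node in enumerate(path):
        if block in node.store:
            for hop in path[:i]:
                hop.store.add(block)  # opportunistic replication
            return True
    return False
```

After one successful request, the next requester is served by a nearer copy, which is exactly the mirror setup above, minus the manual rerouting.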
>> [...snip...] But, again, this is a bit further down the road, don't
>> expect this to happen this year :-).
>
> In "the Linux world" projects work together (against Microsoft, for
> example). If all the different projects worked together, this could be
> realized a lot faster than next year ---)

But in the world of programming, things usually take a factor of pi more time than expected...