Existing search engines do not support searching previous versions of indexed files such as HTML pages; they search only a cached and supposedly recent version of each file. We argue that Web content is becoming increasingly dynamic and is updated much more frequently than in the past. Rodi's functional requirements therefore include a file version manager that supports content search in previous versions of a file as well as in the current one.
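A minimal sketch of what such a version-aware index could look like. All names here (`VersionedIndex`, `store`, `search`) are hypothetical illustrations of the requirement, not Rodi's actual API:

```python
from collections import defaultdict

class VersionedIndex:
    """Toy index that keeps every stored version of a file searchable."""

    def __init__(self):
        # file path -> list of content snapshots, oldest first
        self.versions = defaultdict(list)

    def store(self, path, content):
        """Record a new version of the file instead of overwriting it."""
        self.versions[path].append(content)

    def search(self, keyword, current_only=False):
        """Return (path, version_number) pairs whose content contains keyword."""
        hits = []
        for path, snapshots in self.versions.items():
            if current_only:
                candidates = [(len(snapshots) - 1, snapshots[-1])]
            else:
                candidates = list(enumerate(snapshots))
            for n, content in candidates:
                if keyword in content:
                    hits.append((path, n))
        return hits

idx = VersionedIndex()
idx.store("news.html", "rodi release announced")
idx.store("news.html", "rodi release shipped")
print(idx.search("announced"))                     # [('news.html', 0)]
print(idx.search("announced", current_only=True))  # []
```

The point of the example is the contrast between the two calls: a conventional engine offers only the `current_only` behavior, while the requirement asks for the full-history search as well.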
In local networks a typical enterprise search engine requires files and folders to be shared. The Rodi search engine provides an additional option: running a daemon on the host. The daemon checks the access rights of the remote search engine and takes care of encryption. Rodi allows the user to specify the IP subnet(s) to scan, so it does not have to rely on exact IP addresses of the servers.
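Scanning by subnet rather than by explicit address can be sketched with the standard library's `ipaddress` module; the function name and the idea that each expanded host would then be probed for a running daemon are assumptions for illustration:

```python
import ipaddress

def hosts_to_scan(subnets):
    """Expand configured subnet strings into individual host addresses.

    Each returned address would then be probed for a listening daemon;
    the probe itself is out of scope for this sketch.
    """
    hosts = []
    for net in subnets:
        # hosts() excludes the network and broadcast addresses
        hosts.extend(str(h) for h in ipaddress.ip_network(net).hosts())
    return hosts

# A /30 yields exactly two usable host addresses
print(hosts_to_scan(["192.168.1.0/30"]))  # ['192.168.1.1', '192.168.1.2']
```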
Data distribution networks today provide at most search over file names and no content search; they were originally created for delivery of binary or otherwise unsearchable content. Rodi's functional requirements include context-sensitive content search. Because Rodi is a distributed network, keyword ratings, and consequently search results, can differ from publisher to publisher. One can view the Rodi network as a group of loosely related or completely unrelated search engines. Publishers belonging to the same Rodi House can use the same function when calculating keyword ratings.
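The consequence of sharing a rating function within a House can be shown with a deliberately simple example. The term-frequency formula below is a stand-in chosen for illustration, not the rating function Rodi actually specifies:

```python
def simple_rate(keyword, document_words):
    """Toy rating function: share of words equal to the keyword."""
    if not document_words:
        return 0.0
    return document_words.count(keyword) / len(document_words)

# Two publishers in the same House apply the same function, so they
# assign the same rating to the same document; a publisher in another
# House, using a different function, may rank the document differently.
doc = "rodi search rodi network".split()
print(simple_rate("rodi", doc))  # 0.5
```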
Security is a major problem for existing networks, and Rodi has two answers. A publisher can hide behind a bidirectional or unidirectional bouncer (also known as a proxy) and spoof the source IP address of sent packets. Additional tools include DSA-based authentication and end-to-end encryption. Using these options a publisher can effectively hide the IP address of the server and prevent some types of DDoS attacks.
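The shape of such authentication is a challenge-response exchange: the verifier sends a fresh nonce, the peer returns a signature over it. The sketch below uses the standard library's HMAC as a stand-in for a DSA signature purely to keep the example self-contained; with real DSA the shared key would be replaced by a private/public key pair, and the flow and names here are illustrative, not Rodi's protocol:

```python
import hmac, hashlib, os

def sign(key, challenge):
    # Stand-in for producing a DSA signature over the verifier's challenge.
    return hmac.new(key, challenge, hashlib.sha256).digest()

def verify(key, challenge, tag):
    # Stand-in for verifying the signature with the peer's public key.
    return hmac.compare_digest(sign(key, challenge), tag)

shared_key = os.urandom(32)   # with DSA: a key pair instead of a shared key
challenge = os.urandom(16)    # fresh nonce sent by the verifying peer
tag = sign(shared_key, challenge)
print(verify(shared_key, challenge, tag))  # True
```

Because the challenge is fresh for every exchange, a captured response cannot be replayed, which is the property that matters regardless of whether HMAC or DSA provides the signature.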
Rodi supports NAT penetration and works in firewalled environments, actively avoiding traffic analyzers. Traffic analyzers use simple rules based on IP address and port number to collect statistics, or even drop packets if the ISP decides that the traffic is illegal or parasitic. More advanced analyzers support "deep inspection of packets, including the identification of layer-7 patterns and sequences". The Rodi client can use a simple encoding algorithm to obfuscate its packets so that they do not match such patterns.
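As a toy illustration of why even a trivial encoding defeats pattern-matching inspection, consider a repeating-key XOR over the payload. This is not Rodi's actual algorithm, and the packet contents and key are invented for the example:

```python
import itertools

def xor_encode(payload: bytes, key: bytes) -> bytes:
    """XOR the payload with a repeating key; applying it twice decodes."""
    return bytes(b ^ k for b, k in zip(payload, itertools.cycle(key)))

packet = b"RODI-HELLO v1.0"     # hypothetical recognizable protocol header
key = b"\x5a\xa5"               # illustrative per-session key

encoded = xor_encode(packet, key)
assert b"RODI" not in encoded          # the layer-7 signature is gone
assert xor_encode(encoded, key) == packet  # symmetric: same call decodes
```

Such encoding offers no cryptographic secrecy; its only purpose here is to break the fixed byte patterns a deep-inspection rule would match on, which is why the text calls it an encoding rather than encryption.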