I was quite unhappy about the storage and processing power required to load network traffic data into a general-purpose database, and about how inefficient it is to query network flows that way. In an ISP, for example, NetFlow generates a significant amount of data that grows with the number of flows, easily millions of flows per hour. NetFlow is considered the de facto standard for network accounting/billing, and more recently for anomaly detection schemes, so general-purpose database structures are simply not efficient for processing large network traces. That sent me looking for relevant projects in the academic world. AT&T Research has a project called Gigascope (still couldn't get hold of the source code :) ), but I found a similar project that looks interesting: the CoMo project. This area of research follows the "Network Data Streaming" database model.
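To make the contrast concrete, here is a minimal sketch (not taken from Gigascope or CoMo, and with made-up IPs and a hypothetical `flow_stream` source) of the streaming idea: answer a query in one pass over the flow records as they arrive, keeping only small per-key counters, instead of first storing every record in a general database and querying it later.

```python
# Hypothetical sketch of one-pass streaming aggregation over
# NetFlow-style records; data and function names are invented.
from collections import defaultdict

def flow_stream():
    # Stand-in for a live NetFlow export: (src_ip, dst_ip, bytes) tuples.
    yield ("10.0.0.1", "10.0.0.9", 1500)
    yield ("10.0.0.2", "10.0.0.9", 400)
    yield ("10.0.0.1", "10.0.0.9", 600)

def top_talkers(flows):
    # One pass, O(number of sources) state: only a per-source byte
    # counter is kept, no flow record is ever stored.
    totals = defaultdict(int)
    for src, _dst, nbytes in flows:
        totals[src] += nbytes
    return sorted(totals.items(), key=lambda kv: kv[1], reverse=True)

print(top_talkers(flow_stream()))
# [('10.0.0.1', 2100), ('10.0.0.2', 400)]
```

The point is that the state stays tiny no matter how many flows stream past, which is exactly why this model scales where a row-per-flow database table does not.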
Let's explore it: "http://como.sourceforge.net/publications.php" :)