Advanced Cyber Data Management

The Internet is expanding every day. Trillions of bytes of data are already available online as web pages, uploaded files, software, and much else, and a large share of that data is becoming obsolete. The truth about websites is that, of all the sites created so far, only an estimated 10-15% receive most of the visits; the rest are rarely viewed. A huge number of files uploaded to file-sharing sites (like RapidShare or Hotfile) are seldom downloaded, and older versions of software are hardly ever fetched by users. Yet the servers hosting this data run around the clock, consuming hardware resources, electricity, and maintenance effort. If we keep going like this, do we have any idea what will happen after another 20 years? Our Internet might end up burdened with an enormous amount of unused or rarely used data.

A possible solution to this problem is for Internet/IEEE engineers to design something like DPTP (Data Priority Tagging Protocol) and HLPP (Hardware on Low Power Protocol). DPTP would run on every server hosting files and websites. It would build tags or indexes of frequently used data based on Alexa page rankings, an internal DPTP algorithm, and statistical analysis of the last few days of traffic. All the data stored on a given server would then be sorted, and the most frequently used data separated from the rest. This would be a physical separation: frequently used data and files would be stored on a selected portion of the storage, and most of the bandwidth would be allocated to that portion.
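The tagging step described above can be sketched in a few lines. This is a minimal illustration, not a real protocol: the function name, the traffic window, and the 15% "hot" fraction are all assumptions chosen for the example, and files with no recent requests are implicitly treated as cold.

```python
from collections import Counter
from datetime import datetime, timedelta

def tag_by_priority(access_log, window_days=7, hot_fraction=0.15, now=None):
    """Hypothetical DPTP-style tagger.

    access_log: list of (path, timestamp) tuples of recent requests.
    Returns (hot, cold): paths ranked by request count within the window,
    split so the top hot_fraction of ranked paths form the "hot" set.
    """
    now = now or datetime.utcnow()
    cutoff = now - timedelta(days=window_days)
    # Count only requests inside the recent traffic window.
    counts = Counter(path for path, ts in access_log if ts >= cutoff)
    ranked = [path for path, _ in counts.most_common()]
    n_hot = max(1, int(len(ranked) * hot_fraction))
    return ranked[:n_hot], ranked[n_hot:]
```

A server could rerun this periodically and physically relocate the "hot" list to the fast, well-connected portion of its storage.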

The job of HLPP would be to manage the least-requested data. Less-used data on a server would stay on standby: the RPM of spinning storage devices could be reduced, or, in the case of solid-state disks (the most likely future replacement for current storage options), the device could be temporarily shut down by HLPP. This would save electricity and prolong hardware lifetime. If a request arrived for low-priority data, DPTP would detect it and send a wake-up call to HLPP to fetch that data and answer the request. There is one drawback to this method, however: the user at the other end might experience a delay.
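The wake-up interaction can be modeled with a tiny state machine. Everything here is an illustrative assumption: the class and function names, the simulated spin-up delay, and the idea of representing hot storage as a plain dictionary are mine, not part of any real HLPP specification.

```python
import time

class ColdStorage:
    """Hypothetical HLPP-managed device that powers on only on demand."""

    def __init__(self, files, spin_up_seconds=0.0):
        self.files = files                    # path -> file contents
        self.spin_up_seconds = spin_up_seconds
        self.powered_on = False               # starts in low-power standby

    def wake(self):
        # Simulate the spin-up / power-on latency the post warns about.
        if not self.powered_on:
            time.sleep(self.spin_up_seconds)
            self.powered_on = True

    def read(self, path):
        self.wake()                           # DPTP's "wake-up call" to HLPP
        return self.files[path]

def serve(hot_store, cold_store, path):
    """Serve from hot storage when possible; otherwise wake cold storage,
    accepting the extra delay for low-priority data."""
    if path in hot_store:
        return hot_store[path]
    return cold_store.read(path)
```

A request for hot data returns immediately; a request for cold data pays the wake-up cost once, after which the device stays powered until HLPP idles it again.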

The Dump Backup Server is another concept: an offline backup server that could be used to archive obsolete files transferred from other servers. It would be built from old computer hard drives and unused USB drives that people have thrown away, which means an unreliable server full of low-value data. Storage for this type of server would be assembled through a multi-port backbone interface consisting of thousands of USB, SATA, or IDE ports. At set intervals the server would come online, synchronize with the main servers, back up very old files, and then go offline again.

All of these are just concepts for designing a cost-effective and environmentally friendly Internet for the future. Can you think of something better?

