
FTP Server [With its own file Database] ?

Discussion in 'General Software' started by newfellow, Mar 31, 2010.

  1. newfellow New Member

    Joined:
    Aug 28, 2009
    Messages:
    314 (0.17/day)
    Thanks Received:
    17
    Well, I've tested probably more than 20, close to 30, different server systems found via Google, some on Linux, most on Windows, and I cannot believe this, but I am unable to find a single piece of FTP server software that is intelligent about indexing the file system:

    A. Cache all directories/folders/files in its own INDEX or database.
    B. Read only the database as long as there is no transfer. NEVER touch the hard drive before actual file data is required.

    The next thing I will try is Oracle Database File System, which takes a while to build, but by the look of the documentation even that will not serve this purpose. The good question is why any FTP server software out there EVER uses the hard drive to read the directory structure: it's incredibly slow, and there is absolutely no point in touching the hard drive whatsoever unless a transfer occurs.

    So, does anyone have any suggestions?
    (Please skip the NTFS / file system part completely in any suggestions. Any contact with the file system automatically wakes the hardware, and no, Windows Indexing doesn't exactly help here either. The whole concept of relying on the file system is wrong for this.)
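    To make the request concrete, here is a minimal sketch of the idea being asked for (my own illustration, not taken from any existing FTP server): walk the tree once, cache every directory listing in memory, and answer listing requests purely from that cache, so the disk stays idle until an actual file transfer happens.

    ```python
    import os

    def build_index(root):
        """Walk the tree ONCE and cache (name, kind, size) per directory."""
        index = {}
        for dirpath, dirnames, filenames in os.walk(root):
            entries = [(name, "dir", 0) for name in dirnames]
            for name in filenames:
                st = os.stat(os.path.join(dirpath, name))
                entries.append((name, "file", st.st_size))
            # Key by path relative to the share root, like an FTP path.
            index[os.path.relpath(dirpath, root)] = entries
        return index

    def list_dir(index, rel_path):
        """Serve a directory listing purely from memory -- no disk access."""
        return index.get(rel_path, [])
    ```

    After the one-time walk, `list_dir` never touches the drive; only an actual `RETR` would.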
     
  2. newtekie1 Semi-Retired Folder

    Joined:
    Nov 22, 2005
    Messages:
    20,029 (6.15/day)
    Thanks Received:
    6,094
    I don't see a reason why the FTP server needs to index the files beforehand rather than just reading the directory when it is needed.

    Of course, the main argument against this is that any such system wouldn't be up to date. Once the file system is indexed, if changes are made by anything other than the FTP server itself, the FTP server's index would be outdated.

    We see similar issues with Windows Indexing: a file can be deleted and still show up in the index, or a file can be created and not show up in the index.

    Reading the actual file structure the moment it is requested ensures that the file list is accurate, and it really uses very little in the way of system resources.
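    For contrast, the on-demand approach described here can be sketched like this (again my own illustration): read the directory only when a client asks, so the listing is always current. On most platforms `os.scandir` returns cached stat information with each entry, so one listing costs roughly one directory read rather than a separate stat call per file.

    ```python
    import os

    def list_dir_live(path):
        """Read the directory at request time -- always accurate."""
        entries = []
        with os.scandir(path) as it:
            for entry in it:
                kind = "dir" if entry.is_dir(follow_symlinks=False) else "file"
                size = entry.stat(follow_symlinks=False).st_size
                entries.append((entry.name, kind, size))
        return entries
    ```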
     
  3. newfellow New Member

    Joined:
    Aug 28, 2009
    Messages:
    314 (0.17/day)
    Thanks Received:
    17
    I actually asked this on RaidenFTPd support and at a few other FTP server software companies. It seems all of them rely on the file system, or on some sort of action-triggered (memory-based) caching. I simply cannot understand why none of these companies sees the point of 30-62 ns DDR access times, or of a Windows Registry style construction of the FTP file listing. Instead of actually utilizing the resources they have, they use hard drives with 9-20 ms access times; even the most capable SSDs won't come close to the speed of true indexing.

    The whole idea of a database-backed index is that it would be a lightning-fast solution for bigger server systems, and also incredibly useful for rebuilding the 'cache' only when needed, probably filtering out most of the bad added content while at it.
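    One way to sketch "rebuilding the cache only when needed" (my own interpretation, not any real server's behavior): store each directory's mtime alongside its cached listing, and re-read a directory only when its mtime has changed since it was cached. Note this still costs one cheap metadata stat per check, a compromise between a pure in-memory index and always reading on demand.

    ```python
    import os

    _cache = {}  # path -> (mtime_at_caching, listing)

    def cached_listing(path):
        """Return a cached listing; re-read only if the directory changed."""
        mtime = os.stat(path).st_mtime
        hit = _cache.get(path)
        if hit is not None and hit[0] == mtime:
            return hit[1]                   # still fresh: no directory read
        listing = sorted(os.listdir(path))  # stale or missing: re-read once
        _cache[path] = (mtime, listing)
        return listing
    ```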
     
  4. newfellow New Member

    Joined:
    Aug 28, 2009
    Messages:
    314 (0.17/day)
    Thanks Received:
    17
    Consider if you were running an FTP server with, say, 10-100 hard drives. What happens when, for example, a user goes for 'Folder A'?

    1. The hard drive has to power up, because there is no cache.
    2. The hard drive has to read and list the directory, which causes most of the wear that breaks hard drives.

    ^ Those two alone are reason enough to build software costing hundreds of thousands. So why not simply add caching code to FTP software?


    To access the file system index, the hardware has to be powered up = a pretty big no-no.
    The FTP server index should always be out of date, of course: timed checks only, or a manual update, to avoid bad copies being added. It's also generally good to know when newly added data will appear on the server, instead of it appearing instantly.


    Yes, but I was saying the FTP index should work exactly like the Windows Registry, not like Windows Indexing: directly in memory.


    Yes, this is true, but it defeats everything I am searching for: an index would be read from a single location, whereas reading 'on-site' could mean hundreds of thousands of locations, which would spin up every single piece of hardware along the way.


    -edit-

    and on top of that, I am testing some personal code here (trust me, I suck at coding; it takes a lot even when I try, hehe). Anyway, building such an index database actually takes less than 40 seconds for 300,000 pictures within a 21,000-folder structure. This list includes all the necessary information for <content>, attributes and size, and could be served cleanly straight from the server.

    According to that calculation, no matter how large the actual cache/index gets, it would still be incredibly fast, and it could be improved very much further with something like MySQL.
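    The post suggests backing the index with a database such as MySQL. Here is a self-contained sketch of the same idea using SQLite from the standard library as a stand-in (my own illustration; table name and schema are invented for the example): one walk populates the table, and directory listings then become plain SQL queries against the database rather than disk reads.

    ```python
    import os
    import sqlite3

    def build_db_index(root, db_path=":memory:"):
        """Walk the tree once and store every entry in a database table."""
        con = sqlite3.connect(db_path)
        con.execute("""CREATE TABLE IF NOT EXISTS files
                       (parent TEXT, name TEXT, is_dir INTEGER, size INTEGER)""")
        con.execute("DELETE FROM files")  # full rebuild, as the post describes
        for dirpath, dirnames, filenames in os.walk(root):
            parent = os.path.relpath(dirpath, root)
            rows = [(parent, d, 1, 0) for d in dirnames]
            rows += [(parent, f, 0, os.stat(os.path.join(dirpath, f)).st_size)
                     for f in filenames]
            con.executemany("INSERT INTO files VALUES (?, ?, ?, ?)", rows)
        con.commit()
        return con

    def db_list(con, parent):
        """Serve a listing as a SQL query -- no file system access."""
        cur = con.execute(
            "SELECT name, is_dir, size FROM files WHERE parent = ? ORDER BY name",
            (parent,))
        return cur.fetchall()
    ```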
     
