IPFS optimization for single- and multi-user file storage, web hosting, and multi-user sites for easier downloads and site creation #1590
Labels
need/triage
Is your feature request related to a problem? Please describe.
Multiple users can upload a large number of files to one page, up to hundreds of gigabytes, and all of that data is mixed together. There is no proper security mechanism to prevent one user from modifying another user's files. On multi-user websites, you cannot have the base website load on its own while files uploaded by other users are downloaded separately, on demand. Another problem is that sharing and restoring files would be easier if everything lived inside a single folder. Solving this would also open the door to decentralized social media platforms such as forums, video-sharing and file-sharing sites. There is currently no option for users to delete or modify their own content, and IPFS pages are very difficult to recover.
Describe the solution you'd like
- A central folder named after the website hash: for example, the site https://ipfs.io/ipfs/QmXoypizjW3WknFiJnKLwHCnL72vedxjQkDDP1mXWo6uco/ would use the folder QmXoypizjW3WknFiJnKLwHCnL72vedxjQkDDP1mXWo6uco. Each user could freely modify their own content there: files would be updated simply by copying them into the folder and authenticating to the site with a private key.
- A key file, such as a .json file containing a private key, with which the owner signs the page and prevents other users from modifying it.
- On multi-user sites, the central folder would contain a users folder in which each user has their own folder and their own key. That content could be modified only by the site administrator and by the user it belongs to.
- If the IPFS client is installed, download the full contents of the base directory when a user loads an IPFS site; if it is not installed, offer to install IPFS on every IPFS site. On multi-user sites, download a file only when the user clicks it, and if the content is unavailable, keep resuming the download until the user deletes the in-progress transfer. This is very important for distributing files.
- A separate download mechanism for single-user sites: 1. download only the current page; 2. download the full website with all of its files at a single button press, offered in a clearly visible place when the user loads the page. Single-user sites are at most 1-2 GB, so anyone can download them easily; no user designs a site so heavy that other users cannot download it.
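One possible on-disk layout for the central folder and per-user folders described above (every name here is hypothetical, chosen only to illustrate the idea):

```
QmXoypizjW3WknFiJnKLwHCnL72vedxjQkDDP1mXWo6uco/   <- central folder, named after the site hash
├── index.html        <- base website that loads first
├── site.key.json     <- proposed key file; the owner signs the page with it
└── users/
    ├── alice/        <- writable only with alice's key
    └── bob/          <- writable only with bob's key
```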
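To make the key-file idea concrete, here is a minimal stdlib-only sketch of how a client might fingerprint a user's files into a manifest for the owner to sign. The function and field names are made up for illustration, and the actual signing step (IPNS keys are Ed25519) is deliberately left out:

```python
import hashlib
import json

def build_manifest(files: dict) -> dict:
    """Map each path in a user's folder to its SHA-256 digest.

    Signing this manifest with the private key from the proposed
    .json key file would let anyone detect tampering by other users.
    `files` maps a path inside the user's folder to that file's bytes.
    """
    return {path: hashlib.sha256(data).hexdigest()
            for path, data in sorted(files.items())}

# Example: one file published under a hypothetical per-user folder.
manifest = build_manifest({"users/alice/index.html": b"<h1>hello</h1>"})
print(json.dumps(manifest, indent=2))
```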
2. On multi-user sites, allow all files to be downloaded optionally: when the user clicks a file, resume its download until the user deletes it. This way even multi-terabyte websites could work, because a user downloads the central page and then, optionally within it, only the content of selected users. The IPFS client would then not need to treat many gigabytes of data as a single page and try to download all of it, and it would be unrealistic for one user to download many gigabytes per site anyway.
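The resume-until-deleted behaviour above could be sketched as a small helper that requests only the missing tail of a partial download. The HTTP Range approach is an assumption purely for illustration (native IPFS transfers effectively resume at the block level, since already-fetched blocks stay in the local cache):

```python
def resume_header(bytes_on_disk: int) -> dict:
    """Build an HTTP Range header requesting only the missing tail.

    A downloader would loop: check the size of the partial file on
    disk, request the rest with this header, append the response, and
    retry later if peers are unavailable, stopping only when the user
    deletes the in-progress download.
    """
    return {"Range": f"bytes={bytes_on_disk}-"} if bytes_on_disk else {}

print(resume_header(0))          # fresh download: no Range header needed
print(resume_header(1_048_576))  # resume after the first mebibyte
```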
- Regular automatic updates and republishing when new content appears on a given page.
- Make it possible to build decentralized social networking sites on this technology (forums, video-sharing services, etc.) using .js files in the central folder.
- Easy recovery of websites by selecting them from the central folder, or by pasting a site into a central folder where the pages are collected.
- A list of websites in the IPFS client showing the sites we want to download and have already downloaded, with delete and update options, and per-file options inside each website.
- A fully decentralized way to run MySQL and PHP would be great for setting up forum software such as phpBB, Vanilla, MyBB, etc. No one has done this yet; the MySQL database would be decentralized as well.
Describe alternatives you've considered
The ZeroNet platform offers a working solution to this.