Check out localgoogoo-go, a Go package that lets you use localGoogoo from the command line.
If you keep offline versions of websites like MDN, W3Schools, or PHP.net, then this is a must-have tool for you.
localGoogoo is a minimal search engine that saves you the stress of manually going through your offline websites looking for information.
With localGoogoo you crawl/index these offline websites once, and a single search query then gets you the information you need.
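If you'd rather search from a terminal, the Go package mentioned above can be fetched the usual way. This is only a sketch; the module path is assumed from the repository name, so check the localgoogoo-go repo for the exact one:

```sh
# Assumed module path; verify against the localgoogoo-go repository.
go install github.com/kodejuice/localgoogoo-go@latest
```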
- PHP: >= v7.x
- WebServer: Apache or Nginx
- Database: MySQL or MariaDB
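Before installing, you can quickly confirm these requirements from a shell (the exact binary names depend on your system):

```sh
php -v                # should report PHP 7.x or later
composer --version    # Composer is needed for the install step below
mysql --version       # or: mariadb --version
apache2 -v            # or: httpd -v / nginx -v, depending on your web server
```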
```sh
git clone https://github.com/kodejuice/localgoogoo.git
cd localgoogoo
composer install
```
Make sure the localgoogoo folder is placed somewhere under your local web document root. Your offline websites should also live under the document root; localGoogoo won't be able to crawl them if they're not accessible via the http:// protocol.
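For example, on a typical Linux setup where the document root is /var/www/html (an assumption; yours may differ), that could look like:

```sh
# Assumes /var/www/html is your web document root.
sudo mv localgoogoo /var/www/html/

# Offline sites must also be reachable over http://, so keep them under the root too.
sudo mv ~/offline-sites/mdn /var/www/html/
```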
Next, set up your database information in the config.json file found in the root directory:
```json
{
    "DB_HOST": "localhost",
    "DB_USER": "root",
    "DB_PASSWORD": "",
    "DB_NAME": "localgoogoo"
}
```
Note: You can also set up your database information by running `./bin/localgoogoo config`
You don't have to create the database manually; localGoogoo does that automatically.
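If you'd rather not connect as root, one option is a dedicated MySQL user. A minimal sketch, where the googoo user and its password are placeholders (localGoogoo still creates the localgoogoo database itself):

```sh
# Placeholder credentials; localGoogoo creates the database, we only grant rights on it.
mysql -u root -p -e "CREATE USER 'googoo'@'localhost' IDENTIFIED BY 'change-me';
GRANT ALL PRIVILEGES ON localgoogoo.* TO 'googoo'@'localhost';
FLUSH PRIVILEGES;"
```

Then point DB_USER and DB_PASSWORD in config.json at that account.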
After setup, visit http://localhost/path/to/localgoogoo and you should see the localGoogoo search page.
And that's it: head over to the crawled websites page to crawl/index your websites and make your life easier.
If you're new to the offline-websites thing, you should check out HTTrack, a tool that downloads a website from the Internet to a local directory, recursively rebuilding its directory structure and fetching HTML, images, and other files from the server to your computer.
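For instance, mirroring a documentation site into a folder under your document root might look like this (the URL and output path are placeholders):

```sh
# Mirror a site into the web document root so localGoogoo can crawl it later.
httrack "https://example.com/docs/" -O /var/www/html/example-docs
```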
localGoogoo is licensed under the MIT license.