
Scraping Websites with GO! #127161

Closed Answered by tech-guru42
bigoxdev asked this question in Programming Help

In Go, you can use the goquery library along with the standard library's net/http package to scrape websites.

First, you'll need to send an HTTP request to the desired website and retrieve the HTML content. Then, you can use goquery to parse the HTML and extract the information you want.

Here's an example snippet in Go:

package main

import (
	"fmt"
	"log"
	"net/http"

	"github.com/PuerkitoBio/goquery"
)

func main() {
	url := "https://example.com"

	// Fetch the page over HTTP.
	response, err := http.Get(url)
	if err != nil {
		log.Fatal(err)
	}
	defer response.Body.Close()

	// Parse the HTML body into a goquery document.
	doc, err := goquery.NewDocumentFromReader(response.Body)
	if err != nil {
		log.Fatal(err)
	}

	// Print the text of every <h1> element on the page.
	doc.Find("h1").Each(func(i int, s *goquery.Selection) {
		fmt.Println(s.Text())
	})
}
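
You'll need the package in your module first (go get github.com/PuerkitoBio/goquery). The same pattern extends to attributes, not just element text. Here's a minimal sketch, still assuming the placeholder https://example.com URL, that collects the href of every link on the page using goquery's Selection.Attr:

package main

import (
	"fmt"
	"log"
	"net/http"

	"github.com/PuerkitoBio/goquery"
)

func main() {
	// Placeholder URL; swap in the site you actually want to scrape.
	response, err := http.Get("https://example.com")
	if err != nil {
		log.Fatal(err)
	}
	defer response.Body.Close()

	doc, err := goquery.NewDocumentFromReader(response.Body)
	if err != nil {
		log.Fatal(err)
	}

	// Print the href attribute of every <a> element that has one.
	doc.Find("a").Each(func(i int, s *goquery.Selection) {
		if href, ok := s.Attr("href"); ok {
			fmt.Println(href)
		}
	})
}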


Replies: 2 comments

This comment was marked as off-topic.
