
docs: add gmaps scraping blog. #2772

Status: Open. souravjain540 wants to merge 3 commits into master.
Conversation

souravjain540 (Collaborator):

approved by adam and marketing.

souravjain540 requested a review from vdusek on December 13, 2024 at 06:09.
@janbuchar (Contributor) left a comment:

Reviewed the article, it is very good and reads well, but it doesn't use the full potential of Crawlee in some places - let's improve that 🙂

Contributor:

this is huge, isn't there a more adequate format than gif?

souravjain540 (Collaborator, Author):

i don't know. any suggestions? gif kinda fits here

Contributor:

Could this be webp as well?

souravjain540 (Collaborator, Author):

yes

Comment on lines +167 to +169
@crawler.router.default_handler
async def default_handler(context):
    await scrape_google_maps(context)
Contributor:

the default_handler does not make much sense here...

Suggested change:
-   @crawler.router.default_handler
-   async def default_handler(context):
-       await scrape_google_maps(context)
+   crawler.router.default_handler(scrape_google_maps)

This should be enough if you want to keep the handler definition outside of the main function.
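
For context, a minimal sketch of the suggested registration. The handler name `scrape_google_maps` comes from the article; the start URL and the import path are assumptions based on Crawlee for Python around the time of this PR:

```python
import asyncio

from crawlee.playwright_crawler import PlaywrightCrawler, PlaywrightCrawlingContext


async def scrape_google_maps(context: PlaywrightCrawlingContext) -> None:
    # ... the article's scraping logic goes here ...
    ...


async def main() -> None:
    crawler = PlaywrightCrawler()
    # Register the existing coroutine directly; no one-line wrapper needed.
    crawler.router.default_handler(scrape_google_maps)
    await crawler.run(['https://www.google.com/maps/search/restaurants'])  # illustrative URL


if __name__ == '__main__':
    asyncio.run(main())
```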

"""
page = context.page
await page.goto(context.request.url)
print("Connected to:", context.request.url)
Contributor:

Suggested change:
-   print("Connected to:", context.request.url)
+   print("Processing: ", context.request.url)

Comment on lines +285 to +287
# Pretty-print the data
print(json.dumps(data, indent=4))
print("\n")
Contributor:

This is not how it's supposed to be done - it'd be better to use `context.push_data(data)`.
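
For illustration, a minimal sketch of that approach inside the handler. Here `data` stands in for whatever dict the article's extraction code builds; `push_data` stores each record in the crawler's default dataset:

```python
async def scrape_google_maps(context: PlaywrightCrawlingContext) -> None:
    # ... extract one record per place into `data`, as in the article ...
    data = {'name': '...', 'rating': '...'}  # placeholder fields
    # Store the record instead of printing it; Crawlee collects it in the
    # default dataset so it can be exported after the crawl.
    await context.push_data(data)
```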

Comment on lines +370 to +371
with open('google_maps_data.json', 'w', encoding='utf-8') as f:
    json.dump(all_data, f, ensure_ascii=False, indent=2)
Contributor:

If you use the default dataset for this, you can simply do `crawler.export_data_json('path', ensure_ascii=False, indent=2)`.
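
Roughly, using the helper the reviewer names, the export would happen once after the crawl finishes. The file name below is illustrative, and the extra keyword arguments are passed through to the JSON serializer:

```python
await crawler.run(start_urls)  # start_urls as defined by the article
# Export every record pushed via context.push_data() to a single JSON file.
await crawler.export_data_json('google_maps_data.json', ensure_ascii=False, indent=2)
```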

First, we need a function that can handle the scrolling and detect when we've hit the bottom. Copy-paste this new function into the `gmap_scraper.py` file:

```python
async def load_more_items(page) -> bool:
    ...
```
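
Since the excerpt cuts the body off, here is a hedged sketch of what such a scroll-and-detect helper typically looks like with Playwright. The feed selector, the timeout, and the end-of-list marker text are assumptions, not the article's values:

```python
async def load_more_items(page) -> bool:
    # Scroll the results feed; return True once the end of the list is reached.
    feed = page.locator('div[role="feed"]')  # assumed results container
    await feed.evaluate('el => el.scrollTo(0, el.scrollHeight)')
    await page.wait_for_timeout(2000)  # give newly loaded results time to render
    # Google Maps shows a marker like this when there is nothing more to load.
    end_marker = page.get_by_text("You've reached the end of the list")
    return await end_marker.count() > 0
```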
Contributor:

  1. The article does not mention where the function should be called.
  2. Crawlee already has `context.infinite_scroll()` - does it not work in this case? (See the sketch below.)
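
For reference, a minimal sketch of what the built-in helper might look like in the handler, replacing a hand-rolled scrolling loop. The surrounding handler code is assumed, not quoted from the article:

```python
async def scrape_google_maps(context: PlaywrightCrawlingContext) -> None:
    # PlaywrightCrawler has already navigated the page when this handler runs.
    # Scroll until no new content loads, instead of a custom load_more_items().
    await context.infinite_scroll()
    # ... extract listing data and `await context.push_data(...)` as before ...
```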
