Leetcode scraper

Fetches a random LeetCode question filtered by a given set of companies and difficulties.

Requires a LeetCode Premium session token (company-tagged question lists are a Premium feature).

The URL of the selected problem is printed to the console and/or posted as a message to a Discord webhook.

Usage (CLI)

 cargo run --package leetcode_scraper --bin bootstrap -- --companies company1 company2 --difficulties difficulty1 difficulty2

Arguments

Companies:

--companies 

Company slugs as used in the GraphQL request for fetching a company's problems, i.e. the slug in the company page URL (e.g. https://leetcode.com/company/google/ → google)

Difficulties:

--difficulties

One or more of Easy, Medium or Hard

Example invocation:

 cargo run --package leetcode_scraper --bin bootstrap -- --companies google facebook --difficulties Easy Medium

Environment variables

Leetcode session:

The value can be copied from the LEETCODE_SESSION cookie stored by your browser after logging in to LeetCode

export LEETCODE_SESSION=value
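
A minimal sketch of how the token might be attached to requests (assuming the reqwest crate with the blocking feature; the repo's actual client code may differ):

    use std::env;

    // Hypothetical sketch: send the LEETCODE_SESSION value as a cookie on a
    // request to LeetCode's GraphQL endpoint.
    fn fetch_with_session(query_body: String) -> Result<String, Box<dyn std::error::Error>> {
        let session = env::var("LEETCODE_SESSION")?;
        let client = reqwest::blocking::Client::new();
        let response = client
            .post("https://leetcode.com/graphql")
            .header("Cookie", format!("LEETCODE_SESSION={}", session))
            .header("Content-Type", "application/json")
            .body(query_body)
            .send()?;
        Ok(response.text()?)
    }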

Discord (Optional):

Create a Discord webhook for the target channel and set the following variable:

export DISCORD_WEBHOOK_URL_KEY=value
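
Posting the problem URL then amounts to a single JSON POST to that webhook; a minimal sketch (assuming reqwest with the blocking and json features plus serde_json; illustrative, not the repo's exact code):

    use std::env;

    // Hypothetical sketch: post the problem URL as a message to the Discord
    // webhook. Discord webhooks accept a JSON body with a "content" field.
    fn post_to_discord(problem_url: &str) -> Result<(), Box<dyn std::error::Error>> {
        let webhook_url = env::var("DISCORD_WEBHOOK_URL_KEY")?;
        let client = reqwest::blocking::Client::new();
        client
            .post(webhook_url)
            .json(&serde_json::json!({ "content": problem_url }))
            .send()?
            .error_for_status()?;
        Ok(())
    }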

Modes

CLI

AWS Lambda

To run in AWS Lambda mode, set the following environment variable:

export RUN_MODE="AWS_LAMBDA"
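
A rough sketch of how the two modes could be wired together in main (assuming the tokio, lambda_runtime and serde_json crates; function and field names here are illustrative placeholders, not the repo's actual code):

    use lambda_runtime::{service_fn, Error, LambdaEvent};
    use serde_json::{json, Value};

    // Placeholder for the actual scraping logic.
    async fn fetch_random_problem() -> Result<String, Error> {
        Ok("https://leetcode.com/problems/example/".to_string())
    }

    // Hypothetical handler invoked for each scheduled Lambda event.
    async fn handler(_event: LambdaEvent<Value>) -> Result<Value, Error> {
        let url = fetch_random_problem().await?;
        Ok(json!({ "problem_url": url }))
    }

    #[tokio::main]
    async fn main() -> Result<(), Error> {
        if std::env::var("RUN_MODE").as_deref() == Ok("AWS_LAMBDA") {
            // Lambda mode: hand control to the Lambda runtime.
            lambda_runtime::run(service_fn(handler)).await
        } else {
            // CLI mode: run once and print the problem URL to the console.
            println!("{}", fetch_random_problem().await?);
            Ok(())
        }
    }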

Building the zip for lambda:

docker run --rm -v $PWD:/code -v $HOME/.cargo/registry:/root/.cargo/registry -v $HOME/.cargo/git:/root/.cargo/git rustserverless/lambda-rust

Deploy the Lambda function on AWS and set up a rule with a schedule expression (cron) to trigger it periodically.
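
For example, a schedule expression such as the following (an illustrative value) triggers the function once a day at 09:00 UTC:

    cron(0 9 * * ? *)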
