
Make a linkedin api call to grab linked in photos #1029

Closed
2 of 4 tasks
jbubar opened this issue Feb 10, 2021 · 6 comments
Labels
Complexity: Large · P-Feature: Wins Page (https://www.hackforla.org/wins/) · role: back end/devOps (Tasks for back-end developers) · size: 3pt (Can be done in 13-18 hours)

Comments

@jbubar
Member

jbubar commented Feb 10, 2021

Dependency

Overview

Create a LinkedIn scraper to grab a LinkedIn photo, given someone's LinkedIn profile, for the Wins page.

Action Items

  • write a LinkedIn scraper
  • figure out the best way to save the LinkedIn photos
  • create a GitHub Action to execute it

Resources/Instructions

LinkedIn Profile Scraper
Data Scraping With GitHub Actions
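
The "create a GitHub Action to execute it" item above could be wired up roughly like this. This is only a sketch: the workflow name, script path (`scripts/fetch_linkedin_photos.py`), secret names, and schedule are all assumptions for illustration, not files or settings that exist in this repo.

```yaml
# .github/workflows/linkedin-photos.yml  (hypothetical path)
name: Fetch LinkedIn photos

on:
  schedule:
    - cron: "0 6 * * 1"   # weekly cadence is an assumption
  workflow_dispatch:       # allow manual runs for testing

jobs:
  scrape:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: actions/setup-python@v5
        with:
          python-version: "3.x"
      - run: pip install -r requirements.txt
      # Hypothetical scraper script; credentials for the bot account
      # would live in repo secrets, never in the workflow file.
      - run: python scripts/fetch_linkedin_photos.py
        env:
          LINKEDIN_EMAIL: ${{ secrets.LINKEDIN_EMAIL }}
          LINKEDIN_PASSWORD: ${{ secrets.LINKEDIN_PASSWORD }}
      - name: Commit updated data
        run: |
          git config user.name "github-actions[bot]"
          git config user.email "github-actions[bot]@users.noreply.github.com"
          git add _data/linkedin_data.json
          git diff --cached --quiet || git commit -m "Update LinkedIn photo data"
          git push
```

The `git diff --cached --quiet ||` guard means the commit step is skipped when the scraper produced no changes, so the scheduled run doesn't create empty commits.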

@jbubar jbubar changed the title Make a linkedin scraper to grab linked in photos Make a linkedin scraper to grab linked in photos Feb 10, 2021
@jbubar jbubar added role: back end/devOps Tasks for back-end developers Complexity: Large labels Feb 10, 2021
@akibrhast
Member

akibrhast commented Feb 13, 2021

@jbubar

use linkedin photos on wins page

figure out what is the best way to save the linkedin photos

  • What about saving it as _data/linkedin_data.json?

    ```json
    {
      "id": "<linkedin_id>",
      "image": "<linkedin_image_url>"
    }
    ```
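
Producing and consuming the proposed `_data/linkedin_data.json` shape is straightforward. A minimal Python sketch, assuming the scraper (not yet written) yields one `{id, image}` dict per opted-in wins-page submitter; the sample entry below is made up for illustration:

```python
import json
import os

# Hypothetical output of the yet-to-be-written scraper:
# one entry per person who opted in on the wins submission form.
scraped_profiles = [
    {"id": "jane-doe-123", "image": "https://media.licdn.com/photo.jpg"},
]

# Write the list in the shape proposed above: {id, image} per person.
os.makedirs("_data", exist_ok=True)
with open("_data/linkedin_data.json", "w") as f:
    json.dump(scraped_profiles, f, indent=2)

# Reading it back, e.g. in a later GitHub Actions step:
with open("_data/linkedin_data.json") as f:
    data = json.load(f)
print(data[0]["id"])  # -> jane-doe-123
```

Keeping the photo data in its own file (instead of folding it into wins-data.json) means the scraper can rewrite it wholesale without touching the submissions data.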
    

To scrape the page we are going to need their LinkedIn id/url/username. Currently the wins page submission form provides an optional field for the URL, but it has two other fields which are non-optional. This would allow a submission to state that they allow the use of the LinkedIn profile picture without providing a LinkedIn URL, which might throw off conditionals that use the wins-data.json file. (Also, the same question is asked twice?)
[screenshot: wins page submission form]

I was thinking the form could be reworked so that a non-optional LinkedIn URL input is shown only if the user answers yes to "Could we use your linkedin profile picture next to your story?"

@jbubar
Member Author

jbubar commented Feb 14, 2021

@akibrhast That is great thinking! Yes, yes, and yes!! I think you are right on all of the above. I like the idea of having a linkedin_data.json file, and separating out the issue of connecting it to the wins page. I think this would be a medium-sized issue. And I agree with your idea to rework the form. Which one of those things would you like to do?

@akibrhast
Member

akibrhast commented Feb 17, 2021

👽 👽 👽 @ahdithebomb @IAgbaje @pawan92 @tarang100 👽 👽 👽

  1. After looking into writing a scraper for a person's LinkedIn photo, I have discovered that it will not be possible to write an anonymous scraper for LinkedIn. LinkedIn has made it impossible to crawl their site unless you are logged in.
  2. As such, I believe we are going to have to create a hackforla LinkedIn account for the bot to scrape with (it's something Bonnie needs to do, I believe).
  3. If setting up a bot LinkedIn account is not too difficult, I'd like to take this issue on now. Otherwise, I'll add it as a dependency to this issue.

@akibrhast
Member

> @akibrhast That is great thinking! Yes, Yes, and Yes!! I think you are right on all of the above. I like the idea of having a linkedin_data.json file, and separating out the issue of connecting it to the wins page. I think this would be a medium sized issue. And I agree with your idea to rework the form. Which one of those things would you like to do?

@jbubar If you could please take care of separating 'use linkedin photos on wins page' into its own issue and adding this as a dependency, that would be awesome!

Once we have a bot LinkedIn account I would like to take care of the scraper and the GitHub Actions.

@akibrhast
Member

Update

According to @ahdithebomb

> I believe this is in line with a letter we are aiming to write to LinkedIn for access to their API

@akibrhast akibrhast added the Dependencies Pull requests that update a dependency file label Feb 17, 2021
@akibrhast akibrhast added status: Blockers and removed Dependencies Pull requests that update a dependency file labels Feb 24, 2021
@ExperimentsInHonesty ExperimentsInHonesty changed the title Make a linkedin scraper to grab linked in photos Make a linkedin api call to grab linked in photos Mar 26, 2021
@ExperimentsInHonesty ExperimentsInHonesty added Dependency An issue is blocking the completion or starting of another issue and removed status: Blockers labels Mar 26, 2021
@sayalikotkar sayalikotkar added the Feature Missing This label means that the issue needs to be linked to a precise feature label. label Jun 20, 2021
@ExperimentsInHonesty ExperimentsInHonesty added P-Feature: Wins Page https://www.hackforla.org/wins/ and removed Feature Missing This label means that the issue needs to be linked to a precise feature label. labels Jun 20, 2021
@Sihemgourou Sihemgourou added this to the z. Excellent level milestone Aug 9, 2021
@SAUMILDHANKAR SAUMILDHANKAR added the size: 3pt Can be done in 13-18 hours label Jun 18, 2022
@ExperimentsInHonesty
Member

This is not possible and is a violation of their terms of service.

@ExperimentsInHonesty ExperimentsInHonesty removed the Dependency An issue is blocking the completion or starting of another issue label Oct 6, 2024
@ExperimentsInHonesty ExperimentsInHonesty closed this as not planned Oct 6, 2024