Feat: modify workflow updateContributor to be triggered by a repository dispatch event and extract contributors from all public repos of frontendmu #248

Open — wants to merge 3 commits into base: main
2 changes: 2 additions & 0 deletions .github/workflows/update-contributors.yml
@@ -4,6 +4,8 @@ on:
   pull_request_target:
     types:
       - closed
+  repository_dispatch:
+    types: [trigger-workflow]

 jobs:
   update-contributors:
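With the `repository_dispatch` trigger added above, the workflow can be started from outside GitHub's UI by POSTing to the dispatches endpoint. A minimal sketch, assuming a personal access token with repo scope (the token handling and helper name here are illustrative, not part of this PR); the `event_type` must match the value listed under `repository_dispatch.types`:

```javascript
// Sketch: build the request that fires the `trigger-workflow`
// repository_dispatch event via the GitHub REST API.
// Owner, repo, and token values are illustrative assumptions.
function buildDispatchRequest(owner, repo, token) {
  return {
    url: `https://api.github.com/repos/${owner}/${repo}/dispatches`,
    options: {
      method: "POST",
      headers: {
        Accept: "application/vnd.github+json",
        Authorization: `Bearer ${token}`,
      },
      // event_type must match repository_dispatch.types in the workflow
      body: JSON.stringify({ event_type: "trigger-workflow" }),
    },
  };
}

const req = buildDispatchRequest("frontendmu", "frontend.mu", "<TOKEN>");
// To actually dispatch: await fetch(req.url, req.options);
console.log(req.url);
```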
13 changes: 13 additions & 0 deletions packages/frontendmu-data/scripts/update-contributors.config.json
@@ -0,0 +1,13 @@
+{
+  "includedRepositories": [
+    "frontend.mu",
+    "video",
+    "google-photos-sync",
+    "conference-2024",
+    "frontendmu-ticket"
+  ],
+  "excludedContributors": [
+    "actions-user",
+    "github-actions[bot]"
+  ]
+}
61 changes: 42 additions & 19 deletions packages/frontendmu-data/scripts/update-contributors.js
@@ -1,43 +1,62 @@
 import fs from "fs";
 import { execSync } from "child_process";

-const owner = "Front-End-Coders-Mauritius";
-const repo = "frontend.mu";
+const owner = "frontendmu";
 const branch = "main"; // Replace with the default branch of your repository

 const contributorsFile = "./data/contributors.json";
+const configSource = `https://raw.githubusercontent.com/${owner}/frontend.mu/${branch}/packages/frontendmu-data/scripts/update-contributors.config.json`

 async function updateContributors() {
   try {
-    const response = await fetch(
-      `https://api.github.com/repos/${owner}/${repo}/contributors`
+    const config = await loadConfig();
+    const includedRepositories = config.includedRepositories;
+    const excludedContributors = config.excludedContributors;
+
+    const allPublicRepositoriesList = await fetch(
+      `https://api.github.com/users/${owner}/repos`
+    ).then((response) => response.json());
Comment on lines +16 to +18
⚠️ Potential issue

Handle potential pagination when fetching repositories

The GitHub API returns up to 30 repositories per request by default. If the user has more than 30 repositories, some repositories may be missed. Consider handling pagination or increasing the per_page parameter to ensure all repositories are fetched.

You can modify the fetch URL to include ?per_page=100:

- `https://api.github.com/users/${owner}/repos`
+ `https://api.github.com/users/${owner}/repos?per_page=100`

Or implement pagination as follows:

let allPublicRepositoriesList = [];
let page = 1;
let per_page = 100;
let hasMore = true;

while (hasMore) {
  const response = await fetch(
    `https://api.github.com/users/${owner}/repos?page=${page}&per_page=${per_page}`
  );
  const repos = await response.json();
  if (repos.length > 0) {
    allPublicRepositoriesList = allPublicRepositoriesList.concat(repos);
    page++;
  } else {
    hasMore = false;
  }
}


+    const allPublicRepositories = allPublicRepositoriesList.map(
+      (repo) => repo.name
Comment on lines +20 to +21
⚠️ Potential issue

Add error handling for fetching repositories

There's no check to verify if the repository fetch request was successful. If the API call fails or returns an error, the script may throw an exception when trying to process the response.

Consider adding error handling:

const response = await fetch(
  `https://api.github.com/users/${owner}/repos`
);
if (!response.ok) {
  throw new Error(`Error fetching repositories: ${response.status} ${response.statusText}`);
}
const allPublicRepositoriesList = await response.json();

+    );

-    const result = await response.json();
-    const contributors = result
-      .map((contributor) => {
-        return {
-          username: contributor.login,
-          contributions: contributor.contributions,
-        };
-      })
-      .filter((contributor) => {
-        // Exclude the following contributors from the list
-        const excludedContributors = ["actions-user", "github-actions[bot]"];
-        return !excludedContributors.includes(contributor.username);
+    const contributorsMap = {};
+
+    for (const repo of allPublicRepositories) {
+      if (!includedRepositories.includes(repo)) {
+        continue;
+      }
+      const contributorsList = await fetch(
+        `https://api.github.com/repos/${owner}/${repo}/contributors`
+      ).then((response) => response.json());
+
+      contributorsList.forEach((contributor) => {
+        if (!excludedContributors.includes(contributor.login)) {
+          if (contributorsMap[contributor.login]) {
+            contributorsMap[contributor.login] += contributor.contributions;
+          } else {
+            contributorsMap[contributor.login] = contributor.contributions;
+          }
+        }
Comment on lines +34 to +41
🛠️ Refactor suggestion

Refactor contributor aggregation logic using a Map

While the current logic works, using a Map can make the code more efficient and cleaner when accumulating contributions.

Here's how you can implement it:

const contributorsMap = new Map();

contributorsList.forEach((contributor) => {
  if (!excludedContributors.includes(contributor.login)) {
    const currentContributions = contributorsMap.get(contributor.login) || 0;
    contributorsMap.set(contributor.login, currentContributions + contributor.contributions);
  }
});

Then, when creating updatedContributors:

const updatedContributors = Array.from(contributorsMap, ([username, contributions]) => ({
  username,
  contributions
}));

+      });
+    }
+    const updatedContributors = Object.entries(contributorsMap).map(([username, contributions]) => ({
+      username,
+      contributions
+    }));
Comment on lines +24 to +47
⚠️ Potential issue

Add error handling when fetching contributors.

While the logic for aggregating contributions has been improved, there's still no error handling when fetching contributors for each repository. Consider adding try-catch blocks or checking the response status to handle potential API errors or rate limiting issues.

Example:

const contributorsResponse = await fetch(`https://api.github.com/repos/${owner}/${repo}/contributors`);
if (!contributorsResponse.ok) {
  console.error(`Failed to fetch contributors for ${repo}: ${contributorsResponse.status}`);
  continue;
}
const contributorsList = await contributorsResponse.json();

+    const contributorsData = JSON.stringify(updatedContributors, null, 2);

-    const updatedContributors = [...new Set(contributors)];
+    console.log(contributorsData)

     if (
       JSON.stringify(updatedContributors) !==
       JSON.stringify(getExistingContributors())
     ) {
       fs.writeFileSync(
         contributorsFile,
-        JSON.stringify(updatedContributors, null, 2)
+        contributorsData
       );
       console.log("Contributors file updated.");

       // Configure Git user and email for the commit
       execSync('git config user.name "GitHub Action"');
@@ -70,4 +89,8 @@ function getExistingContributors() {
   return [];
 }

+async function loadConfig() {
+  return await fetch(configSource).then((response) => response.json());
+}
Comment on lines +92 to +94
🛠️ Refactor suggestion

Enhance the loadConfig function with error handling and timeout.

While the loadConfig function is concise, it lacks error handling and a timeout setting. Consider adding these to improve robustness:

async function loadConfig() {
  try {
    // Note: fetch has no `timeout` option; abort via AbortSignal.timeout instead
    const response = await fetch(configSource, { signal: AbortSignal.timeout(5000) });
    if (!response.ok) {
      throw new Error(`HTTP error! status: ${response.status}`);
    }
    return await response.json();
  } catch (error) {
    console.error('Failed to load config:', error);
    throw error;
  }
}


updateContributors();
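One subtlety in the script above: it decides whether to rewrite the file by comparing `JSON.stringify` output, so any instability in the order of `contributorsMap` entries could produce spurious commits. A minimal sketch of sorting before serializing (the helper name is hypothetical, not part of this PR):

```javascript
// Hypothetical helper: sort aggregated contributors so serialization is
// deterministic — by contribution count descending, then username — and
// the string comparison only changes when the counts actually change.
function toSortedContributors(contributorsMap) {
  return Object.entries(contributorsMap)
    .map(([username, contributions]) => ({ username, contributions }))
    .sort(
      (a, b) =>
        b.contributions - a.contributions ||
        a.username.localeCompare(b.username)
    );
}

const sample = { alice: 5, bob: 12, carol: 5 };
console.log(toSortedContributors(sample).map((c) => c.username));
// ["bob", "alice", "carol"]
```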