Map reports to rows in aggregate csv report [WIP] #133
Conversation
```js
@@ -100,17 +103,28 @@ const processResults = (processObj) => {
} else {
  filePath = path.join(tempFilePath, replacedUrl + '.mobile.report.' + opts.output);
}
// @TODO Async is better, but we aren't currently awaiting the promise properly
```
With the original version, I don't think the promise was being properly `await`ed, so I was getting errors when my new code tried to interact with the last individual report. I haven't had time to study it closely, but I think there was a race condition that just didn't matter until I added more moving pieces downstream. I could be wrong, but switching to the synchronous version of `writeFile` fixed the problem immediately.
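For illustration, here is a minimal sketch of the difference being described (the file name and payload are hypothetical, not this PR's actual code):

```js
const fs = require('fs');

const reportJson = JSON.stringify({ score: 0.95 });

// Async version: if the returned promise is never awaited, downstream
// code may try to read the report before the write has finished.
fs.promises.writeFile('report.csv', reportJson); // promise dropped -> possible race

// Sync version: blocks until the file is fully written, so anything
// that runs afterwards can safely read the report.
fs.writeFileSync('report.csv', reportJson);
```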
```js
'helpers',
'lighthouse',
'7_15_2020_6_15_05PM',
'tgiles.github.io_web-resume.html.desktop.report.csv'
```
To reduce brittleness, this test should probably use its own mocked report. (I reused an existing one to save time.)
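As a sketch of what a dedicated mock might look like (the directory and header names are illustrative, not from this PR):

```js
const fs = require('fs');
const os = require('os');
const path = require('path');

// Write a tiny, purpose-built mock report into a temp directory so the
// test no longer depends on a shared fixture that other tests reuse.
const mockDir = fs.mkdtempSync(path.join(os.tmpdir(), 'auto-lighthouse-'));
const mockReportPath = path.join(mockDir, 'example.com.desktop.report.csv');
fs.writeFileSync(
  mockReportPath,
  'category,score\nperformance,0.95\naccessibility,1\n'
);
```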
```js
  });
});

describe("reportToRowHeaders", () => {
```
This is testing implementation more than anything. We should actually be testing the headers of the generated CSV...
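One way to test behavior instead of implementation might look like this (a hedged sketch; `generateAggregateReport`, `mockReportPath`, and the header names are assumptions, not the module's actual API):

```js
const fs = require('fs');

// Hypothetical: exercise the public entry point and assert on the CSV
// it produces, rather than calling reportToRowHeaders directly.
test('aggregate CSV starts with the expected header row', () => {
  const csvPath = generateAggregateReport(mockReportPath); // assumed API
  const [headerRow] = fs.readFileSync(csvPath, 'utf8').split('\n');
  expect(headerRow.split(',')).toEqual(
    expect.arrayContaining(['URL', 'performance'])
  );
});
```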
Thanks for your contributions to the tool, @emersonthis. I've recently started a new full-time role, so my responses may be delayed and whatnot. When you're ready for me to review this, please let me know.
@TGiles I'm in a similar situation. This patch implements the functionality that we needed for the client project we're working on. I need to finish the rest of that project before I can loop back and add missing tests, etc. I'm honestly not sure when that will be. I recommend that you review this as-is, whenever you're ready. Assuming it's a direction you want to pursue, hopefully our combined review comments will make it clear what else needs to be done before this is ready to be merged. Or, perhaps you feel comfortable merging it in something close to its current state, and we just open an issue to remind us to address the loose ends later. That would prevent it from "rotting" if the PR sits open for a long time...
Ah, I think I found the root of the async stuff. Running Lighthouse concurrently causes some weird behavior, according to this bug.
Sounds like the issue you referenced might be the cause of #134. I think the async issue I mentioned in the comments above is actually the same as #134. However, I'm not sure the supposed fix (switching to sync) in this PR fits with that story. My gut feeling is that we should figure out #134 first and then loop back to this. What do you think?
I agree that the root cause should be solved first, whatever that fix looks like. I'll probably need to refactor the main Lighthouse loop so that Lighthouse is invoked by child processes.
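A rough sketch of that shape, assuming a hypothetical `lighthouse-worker.js` module (not code from this repo):

```js
const { fork } = require('child_process');

// Run each Lighthouse audit in its own child process, one at a time,
// so concurrent runs can't interfere with each other.
function runInChild(url) {
  return new Promise((resolve, reject) => {
    const child = fork('lighthouse-worker.js', [url]); // hypothetical worker
    child.on('exit', (code) =>
      code === 0 ? resolve() : reject(new Error(`Lighthouse failed for ${url}`))
    );
  });
}

async function runAll(urls) {
  for (const url of urls) {
    await runInChild(url); // serial, isolated runs
  }
}
```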
Somebody think of the PRs!
This works, but I consider it still to be a proof of concept because there are a couple of loose ends (see comments) and I haven't had time to use it much (or write much test code). So I don't recommend merging this exactly as-is, but I wanted to make sure this work made it upstream somehow in case I get sidetracked and don't have a chance to tidy it up.
Thanks for your help thus far. And thanks for maintaining this cool tool.