Date column in mm/dd/yyyy format in salesforce report produces NA in results tibble #93
Not sure how I missed it the first time, but I now see that guess_types is an argument to sf_execute_report() that ultimately controls whether readr::type_convert() is applied. In a local copy of sf_run_report() I added ye olde guess_types as an argument and passed it through in the call to sf_execute_report() inside sf_run_report(), which lets me get usable data that I can convert later. I can see a use case for specifying the type conversions by column to correct for issues like this. Any thoughts on the best way to add that functionality to the package? If I have time, I could give it a spin, get it working for me, and then figure out pull requests so it can be reviewed for inclusion in your package.
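For reference, the shape of that local change might look like this (a minimal sketch only; the real sf_run_report() has many more arguments, and the simplified call chain here is an assumption based on this thread):

```r
# Sketch of a local wrapper that forwards guess_types through to
# sf_execute_report() -- the argument names come from this thread;
# everything else is a simplifying assumption.
my_sf_run_report <- function(report_id, guess_types = TRUE, ...) {
  salesforcer::sf_execute_report(report_id,
                                 guess_types = guess_types,
                                 ...)
}
```

With guess_types = FALSE, columns come back as character, so the mm/dd/yyyy strings survive and can be converted later.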
@lorenze3 Thanks for flagging. Yes, the interim solution is to use the code at lines 747 to 765 in commit acffaff.
In this function, there is already a line of code to handle cases where reports return a dash "-" instead of a null value for integer and numeric columns. I'll do the same to handle any date fields.
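In base R, the combined dash-and-date handling could be sketched as follows (a hypothetical helper illustrating the idea, not salesforcer's actual code):

```r
# Hypothetical helper: Salesforce reports return "-" for NULL, so
# swap it for NA before parsing the column with an explicit
# mm/dd/yyyy format.
clean_sf_date <- function(x, date_format = "%m/%d/%Y") {
  x[x == "-"] <- NA_character_
  as.Date(x, format = date_format)
}

clean_sf_date(c("03/14/2023", "-", "12/01/2022"))
#> [1] "2023-03-14" NA           "2022-12-01"
```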
@StevenMMortimer Just clarifying (I see I wrote some lousy English) -- I did have to modify the sf_run_report function to be able to pass guess_types through to the sf_execute_report function. In my local copy I did this by adding the ellipsis argument to sf_run_report and then passing that through to the call to sf_execute_report. I might take a swing at modifying utils-query.R to add the date handling you described. If you are OK with a dependency on the anytime package, I think I can directly copy your logic above and use anytime::anydate() in the mutate call (line 758 in the block for replacing "-"). Thanks for the excellent package, btw. It's going to be a huge help at work.
Thanks to you both for this discussion. It helped me with a similar issue. It looks like this fix was implemented for
@ckelly17 -- My workaround is a custom version of sf_run_report which adds the ... argument and passes it through to sf_get_report_instance_results. The full function is pasted below, but again my edits are only the ... in two places: in the argument definition and in the call to sf_get_report_instance_results. That requires some commas to be added too.
Thanks! Will look into replicating. |
Crap -- just in case you aren't straight up copying and pasting that: I just saw that I had to add , ... in a third place, namely the call to sf_execute_report that's used in the synchronous branch of the conditional. Which is actually the branch I use. Sorry for any confusion, and I hope it works well for you.
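Putting the three edits together, the modified function has roughly this shape (a skeleton only, not the real salesforcer source; the branching logic and the instance object's fields are assumptions for illustration):

```r
# Skeleton marking the three places where `...` is threaded through.
sf_run_report_patched <- function(report_id, async = TRUE, ...) {     # (1) argument list
  if (async) {
    instance <- salesforcer::sf_execute_report(report_id, async = TRUE)
    salesforcer::sf_get_report_instance_results(report_id,
                                                instance$id,          # hypothetical field
                                                ...)                  # (2) async branch
  } else {
    salesforcer::sf_execute_report(report_id, ...)                    # (3) synchronous branch
  }
}
```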
Thanks! Copy/paste worked just fine. Really appreciate it! |
With apologies for the no doubt numerous GitHub issue style violations below, the issue I have is that when I pull a sales report with a date column, I get a column of NAs in R.
From Salesforce's front end, I see a Created Date column with values in mm/dd/yyyy format. The resulting tibble has 0 non-NA values in the same column. Using sf_describe_report(), I can see the column's metadata has it described as a date. Based on cruising the utils-datatypes.R file, I suspect there is an underlying assumption that all date columns will be in yyyy-mm-dd format.
I'd love a pointer to the function(s) responsible for selecting data types. I thought I could do a quick rewrite of the sf_format_date() function, but either I don't understand the underlying issue or that function isn't called as part of the sf_run_report() chain of events.
Here's my actual R script:
And the relevant sf column metadata on the Created Date column from the report_details list is:

Any advice appreciated!
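The NA behavior is easy to reproduce in base R: parsing an mm/dd/yyyy string with the ISO format yields NA, while supplying the matching format recovers the date.

```r
as.Date("2023-03-14")                       # ISO order parses with the default format
as.Date("03/14/2023", format = "%Y-%m-%d") # format mismatch -> NA
as.Date("03/14/2023", format = "%m/%d/%Y") # explicit format -> "2023-03-14"
```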