
Request: elaborate on Evaluate #182

Open
richelbilderbeek opened this issue Jan 17, 2025 · 0 comments
Comments

@richelbilderbeek

Dear SPLASH,

Great to see this initiative!

I was reading the 'Evaluate' step and found this:

Balanced Evaluation: Combine quantitative and qualitative data to gain a comprehensive understanding of training effectiveness.

After reading this sentence, I wished I knew which data (in this case, mostly survey questions, I guess?) one should combine to gain an understanding of training effectiveness.

As far as I can see, this is quite a big field with some counterintuitive results.

One example: some institutions use SETs ('Student Evaluation of Teachers') on the assumption that high SETs correlate with better teaching. However, according to this meta-analysis [Uttl, Bob, Carmela A. White, and Daniela Wong Gonzalez. "Meta-analysis of faculty's teaching effectiveness: Student evaluation of teaching ratings and student learning are not related." Studies in Educational Evaluation 54 (2017): 22-42.], the two are unrelated. Note that this is just the best paper I found on the subject, and I am happy to be corrected.

So I hope this sentence can be elaborated on a bit more, as that would make its advice easier to apply. That would be great!

Thanks and cheers, Richel
