Interactive Design Meeting 3/27/15 & 4/3/15 #1892
Comments
Why ...? What role does ...? Is this just for coding environments like Visual Studio, or for other package uses like ...?
Hi @kuhlenh thanks for sharing this. I have a few concerns/questions on the proposal.

First, on using #load to pull NuGet packages in directly from script code: I'd prefer package references to live in a separate, well-known place rather than scattered through the scripts. One option I can think of is establishing a convention called Packages.csx, which is where all package references would live.

The idea of the entry-level script when a package is loaded is interesting and I can see the init script being useful. On the other hand, it makes package installation more brittle, i.e. if an exception occurs in executing that code, then that has to be handled somehow. npm has install scripts, but many people also want to get rid of them for a similar reason.

On the decision to only run the deepest script if the parent script loads it, that seems a little strange / violating separation of concerns. Why should the parent script have to have internal knowledge of the package it depends on? Can you give a bit more detail on this decision?

One big thing I am not seeing in the proposal is an ability to introduce shared functions via scripts in a package. With scriptcs we support this today via script packs.
Forgot to mention the other question, which was about #u and where the imports it defines are visible.
@glennblock Re #load of nuget package: It's up to the host to support this. If your host prefers to list nuget packages separately in package.json and then consult that file before compiling the script, that's fine. However, the advantage of having nuget references directly in the script (in the #load directive) is that the script file is all I need. Everything is specified there. I don't need to carry around another file.

Re tooling: tooling can scan the script and find #load directives in the file.

Re init script: it doesn't make installation brittle. The script is not run at install time. The script is included into the compilation of the script that contains #load, and the code is thus executed at the time the script is executing. All packages have to be installed before that.

Re shared functions: the init script is the place to define those functions. Since the init script is included into the compilation that #loads the package, the functions will be available to the including script.
@glennblock Re #u: the init script uses #u to define imports available globally, that is to the script including the package.
@tmat Yes it could scan, but it also means individuals have to scan through all files to find the packages that are used. Having a separate place for packages is a well-defined pattern in node, Ruby, Java (pom.xml), etc. Developers are accustomed to this pattern. They know to go to the xxxx file to find whether packages are present or not.

Also there's the question related to installation. If I want to support an experience where the installer knows where to store the package name when I load from the CLI, it is easy to do with a separate file. Not so if it is embedded in the code, i.e. if I have multiple files, it has to determine what is the top-level file. And what if I have multiple top-level files? How does it know where to inject the #load?

As to the init script, ok, I didn't realize this ran at execution time. OK, that does solve my install concern.

In terms of shared functions, the problem you introduce then is ambiguity, as two init scripts can have functions (or other members) with the same name which are in completely different packages. We solve this in scriptcs by wrapping what a package exposes in a generated class. How do you see avoiding the ambiguity/conflict if other public members like functions are present?
My simple-minded question regarding packages: where do they get restored to?
@glennblock I get there are scenarios where a separate packages.json is advantageous. And we should support them. However, we should imo also support nuget #load in the script, so that one can create standalone scripts (simple samples, shell scripts, etc.).

@glennblock Obviously, some convention to prevent pollution would be a good idea. We were thinking about a convention similar to Python's. When Python imports a module my_module, a variable my_module is introduced and the functions defined in the module are encapsulated in it, I believe. If an init script of a package wishes to expose some functions, it would define a top-level symbol with the same name as the package that exposes this functionality. That symbol can be a class for example, or it could be a property or a global import. The init script could look like:

#r "MyLibrary"
#u MyLibrary.SomeNamespace
public static class MyPackage
{
public static void MyHelper() { ... }
}
// some initialization code

Would it work?
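A consuming script could then be as simple as the following (a sketch; the "nuget:" package-reference string is an assumption, since the exact format is host-interpreted):

// consumer.csx
#load "nuget:MyPackage"   // includes the package's init.csx into this compilation

MyPackage.MyHelper();     // the wrapper class defined in init.csx is now in scope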
@stirno In the world of dnx, we would restore packages to your %UserProfile%\.dnx\packages directory anyway, instead of having a packages folder in your solution. (We're also hoping to do this for non-ASP.NET projects 😄)
@tmat no objection to supporting inline package syntax for simplicity / REPL.
@Pilchie as long as I don't end up with a mini-GAC hell I am OK with that. Based on what we saw with our own global package woes, I worry about that.
@Pilchie the nice thing about a local packages folder is everything is completely isolated. You don't accidentally get dependencies you didn't bet on, but which were satisfied based on a heuristic.
@tmat yes having a wrapper class can work, that is basically what we do in scriptcs, only we create it for you and name it based on a convention that uses the init filename. In our case the init name is StartXXXX.csx and the XXXX is used for the wrapper class name. This was in order to simplify the code and not require you to embed your own class. It gives it a more lightweight feel. You can use static methods on that class but we allow instance methods as well. If we did what you are suggesting, it would be good to enforce that no top-level methods are exposed in the init script.
@glennblock I think we could start with the explicit class. In case a class defined in the assembly included in the package already contains the helpers you want to expose, you can do #u MyPackage = MyNamespace.Helpers;. If it is too verbose we could consider an "into" clause:
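Roughly along these lines (a sketch of the syntax under discussion; the exact spelling was not pinned down in the thread):

#load "nuget:MyPackage" into Foo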
Which would do the wrapping of top-level stuff into a Foo singleton class accessible thru a global Foo property. This might actually be better than the other solutions, since the script that is including the other script is in control. I can start with a #load directive without the clause and, if I get an ambiguity I need to resolve, I can add the into clause.
Oooh I really like that.
This would def solve the problem with the added benefit that it would allow the consumer to resolve conflicts. 👍
Really like the into idea.
One of the design principles we talked about is the desire to have the scripts be sharable across various hosts. So we should "try" to not make the scripts dependent on some global state that the host/environment sets up; otherwise the scripts will fail to run, or behave differently, in different hosts/environments. I wanted to mention this as it was explicitly stated in the notes.
@ManishJayaswal I'd say we need to word it a bit more precisely. Of course, any extension points we provide via hosting API potentially make scripts host specific. I think we need to distinguish classes of scripts, each class designed for a specific kind of host. A different instance of the same host should compile the same script with the same semantics.

For example, let's say that I design an app that is scriptable. I expose a host object MyApp that has certain API (object model) that the script can access. Then indeed scripts using that API will be host specific and won't run in other hosts that don't expose such API. But these scripts should compile in all instances of MyApp correctly.

Similarly with extensibility of #r and #load -- since the strings are host-interpreted, not all hosts will implement them the same. And that's fine.

Of course, even if we limit ourselves to a particular class of scripts there might be failures due to machine state accessed from the script thru IO APIs etc. I'd call them run-time failures, as opposed to compile-time failures.
Having shebang syntax would be beneficial. It describes the app the script is supposed to be executed by.
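For example (hypothetical; this assumes a csx host executable named scriptcs is on the PATH and is resolved via env):

#!/usr/bin/env scriptcs
System.Console.WriteLine("hello from a self-describing script");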
OK, I've read through the lot 😄 My feedback:

Definitions

IMHO we should drop the assumption that scripting is only for 'simple and small programs'. The considerable ecosystem of apps running on node, ruby, python, etc. shows that this isn't the case. It's certainly lightweight, one of its strengths, but what people will go on to build with it, no one knows.

Proposal for #r and #load

I like this behaviour since I do believe that nuget packages should become the units of reference rather than individual assemblies or scripts. I think it's also important that the search path is controllable. E.g. for scriptcs we would like this to be ...

Seeding a script with other scripts for execution context

I'm not sure I understand the issue here or why ...

Nested init.csx

I think this is a no brainer. If a script calls ...

Into

I really like this but I'm a little concerned about the idea of a singleton accessible via a global property. I'm guessing this would be a property on the host object? My concern here is portability across hosts. E.g. what if the host also provides a method or property named ...?

I haven't thought too much about the implications of that though. Just thinking out loud.

Nested dependencies

This is outside the scope of the notes you posted, but whilst we're on the subject of referencing and loading NuGet packages, I'd like to plant the seed regarding nested dependencies, if it hasn't already been planted. This would work similarly to node, i.e. each package gets the version of the package which it defines as a dependency, in its own nested packages folder. This would remove dependency (I'll resist using 'DLL') hell for packages which only contain scripts, but challenges would remain for packages which expose assemblies. Just some food for thought.
I assumed into was creating a new type not an instance. So you would do
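i.e. something like this (my reading of the suggestion; the package and member names are placeholders):

#load "nuget:MyPackage" into Foo

Foo.MyHelper();   // members accessed statically on the generated Foo type,
                  // rather than through an instance hanging off a global property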
@glennblock that could also work
I am still on the fence about the described behavior of a nested init.csx; I don't agree that not loading it is a no-brainer. As a package author who includes an init.csx, I do it with the intent that it should always be run. Similarly in node, if I include code in index.js in my module, I expect that to always be run.
@glennblock whether or not there should even be a facility to consume a package without running init.csx is another matter and you may have a good point there. @kuhlenh / @tmat can you provide the reasoning behind that proposal? I only meant that it is a no-brainer in the case where this facility exists, and when a caller script #loads N1, but N1 does not #load N2, that we should not take the liberty of running init.csx from N2, since it is an indirect dependency.
Gotcha
@glennblock @adamralph The recommended way to consume packages would be to use #load, which will run the init.csx. The #r option is to give flexibility to the consumer when they do not want to run the init script. Say init.csx sets some global usings via #u which the consumer does not want because they introduce ambiguity in their code, but they still want to reference the libraries. This option gives them a way to do so.
So what you are saying is #u namespaces are only in scope within init.csx?
@glennblock No. #u is a compilation-level using (similar to a VB project-level import). Manish meant that if the consumer doesn't want the convenient usings and helpers that init.csx defines for them, they have the option to use the assemblies packaged in the nuget directly via #r.
Essentially, #r is equivalent to adding a reference to a .csproj in VS.
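To make the two options concrete, a sketch (package and namespace names are placeholders; the "nuget:" directive string is an assumption):

// Full experience: runs init.csx, picks up its #u usings and its helper class
#load "nuget:MyPackage"

// Assemblies only: no init.csx run, no usings injected
#r "nuget:MyPackage"
using MyPackage.SomeNamespace;   // opt in to just the namespaces you want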
As to #r, is this flexibility for security reasons? Or something else?

I can see a few reasons one would write code in init.csx.

One: To do some first-time setup the first time the library is used. I do ...

Two: Expose some helper functions to the client; this would be similar to ...

Are there other scenarios you are thinking of?
OK let me up-level and see if I am getting this. You are envisioning a way to have standard NuGet packages expose a scripty interface as well, through init.csx. Today in scriptcs land there are generally 2 packages, the original one and the one that adds the scripty interface; for example, scriptcs.nancy adds a scripty interface to Nancy. In the new world, there could be one package which includes both the non-scripty and scripty interface. If one doesn't want to have the scripty interface loaded / the default usings injected, they just use #r on the package and load the assemblies directly. Is that kind of it?
Made some updates to my previous comment
@glennblock Yup.
@tmat OK, thanks for clarifying, that helps. The changes make sense in this context.
So, with regard to init.csx itself, do you envisage this supporting arbitrary script or will it have to conform to a certain shape? E.g.

// init.csx
var x = 2;
public class Bar { }
public int Baz() { return x; }
x * x

As an arbitrarily invoked script, I'd expect this to provide a ...

I guess ...
I assume that ...

The more I think about it, the more I favour @glennblock's suggestion of ...

or, to be more language agnostic: ...
Interactive Design Meeting 3/27/15 & 4/3/15
Definitions
Design Team
The following individuals are members of the dream team:
Anthony D. Green
Chuck Stoner
Dustin Campbell
Kasey Uhlenhuth
Kevin Halverson
Kevin Pilch-Bisson
Lincoln Atkinson
Manish Jayaswal
Matt Warren
Tomas Matousek
Agenda: Scripting Scenarios
The purpose of these meetings was to figure out what scripting scenarios we want to support and propose solutions. Once we have this artifact, we can then have concrete discussions with scriptcs for possible future collaboration.
Proposal for #r and #load
We use the #load directive for sources (i.e., "include a script source") and the #r directive for referencing assemblies (i.e., "reference the assembly and no scripts run"). A developer can actually use #load or #r on a NuGet, but each directive will lead to a different behavior. We would recommend a developer default to #load for NuGet packages--which will run any scripts present, and these scripts can #r assembly references. If a developer uses #load on a NuGet that contains no scripts, we will #r the package. Using #r on a NuGet will reference all assemblies inside and no script will be run.

NuGets are used everywhere in .NET now. We want to use this packaging paradigm in our scripts in order to seed them with execution context and reference assemblies. NuGet packages have the following directory structure: tools, lib, content, and build. The tools directory contains scripts and we would like to allow NuGet authors to put a default .csx script in there (e.g., "init.csx").
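For illustration, a package following this convention might be laid out as follows (a sketch; only the tools placement of init.csx is what the proposal prescribes, the rest is standard NuGet layout and the names are placeholders):

MyPackage/
  lib/net45/MyLibrary.dll
  tools/init.csx      // default script picked up when the package is #load-ed
  content/
  build/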
Examples
Example 1: If I want to reference an assembly--for example, Generic Lists--in my script, I would use:
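(The directive itself is missing from these notes; presumably an #r with the assembly name or path, where the name below is only illustrative.)

#r "MyAssembly.dll"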
Example 2: If I wrote a script that sets up execution context for my current script, I would use:
#load "myContextScript.csx"
Example 3: If the NuGet package I want to load has an initializer script init.csx that I want to run to help me set up my scripting environment, I would use:
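(Again the directive is missing from the notes; presumably a package-level load, where the package name and the "nuget:" prefix are illustrative guesses at the syntax.)

#load "nuget:MyPackage"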
Example 4: If the NuGet package I want to load has an initializer script init.csx that I do not want to run, I would use:
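(Presumably the #r counterpart, with the same caveat about the illustrative syntax.)

#r "nuget:MyPackage"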
Seeding a script with other scripts for execution context
A problem we foresee with this scenario is that the user would have to manually #r every assembly from the seed script to use in the current script. To remedy this, we propose making a new "global directive" #u. The #u directive can only be used in scripts that initialize an environment or as a top-level script, in order to prevent polluting the global namespace.
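A sketch of how the pieces would fit together (names and the "nuget:" syntax are placeholders):

// init.csx inside the package
#r "MyLibrary.dll"
#u MyLibrary.SomeNamespace    // import exported to the script that loads this package

// consumer.csx
#load "nuget:MyPackage"
// members of MyLibrary.SomeNamespace are usable here without a local using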
To be continued...

We still are unclear on our solution for the scenario of running the initializer script init.csx for packages that have dependencies on other packages. We were only able to briefly touch on this, but our general proposal is:

If you #load a NuGet N1 with a dependency on another NuGet N2, we will only run N2's init.csx if N1's init.csx used #load for N2. In other words, using #load on N1 does not mean you will run all N1's dependent packages' init scripts.
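A sketch of the proposed rule (package names are placeholders; the "nuget:" syntax is an assumption):

// N1's init.csx
#load "nuget:N2"    // only because of this line does N2's init.csx run

// consumer.csx
#load "nuget:N1"    // runs N1's init.csx; N2's init.csx runs only if N1 loaded it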