RFC for *-EnvironmentVariable cmdlets. #92
Conversation
List of cmdlets:
Get-EnvironmentVariable
Set-EnvironmentVariable
New-EnvironmentVariable
Add-EnvironmentVariable
Remove-EnvironmentVariable
Just wanted to comment that a related concept, the sub-class of "environment variables that contain paths", has a whole host of its own special challenges. In fact, due to the special processing Windows does on "PATH" (e.g. merging user + system values to build the process path, expanding environment variables, etc.), I think PATH might be a further sub-class of "environment variables that contain paths". Personally, I think these should be separate cmdlets so that the functionality to handle the special cases is easy to discover. If it were buried in general environment variable cmdlets it would risk becoming the greatest secret feature in Windows history :) Here is the article that outlines the many issues and is backed by working PowerShell functions:
Three other commands you "might" consider are:

```powershell
Push-EnvironmentBlock -Desc "Before importing VS 2015 vars"
PSCX\Invoke-BatchFile "${env:VS140COMNTOOLS}..\..\vc\vcvarsall.bat"
# Do stuff with compiler, etc and when done:
Pop-EnvironmentBlock
# Env block set back to the state when Push-EnvironmentBlock was called
```

This kind of thing is very handy when you need to bounce between different versions of VS compilers and tools (without your process $env:path growing like crazy).
Add the new cmdlets to PowerShell to get\set\create the environment variables for machine\process\user.

Currently users are forced to use the .NET APIs to retrieve\set the value of environment variables:
Suggest "... retrieve\set the value of environment variables for the user and machine scopes"
The environment provider works fine for setting process-scoped environment variables:

```powershell
$ENV:foo = "bar"
```
Currently users are forced to use the .NET APIs to retrieve\set the value of environment variables:

```powershell
[System.Environment]::GetEnvironmentVariables()
[System.Environment]::SetEnvironmentVariable("foo","bar")
```
I suggest changing the example to show user or machine scope.
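For illustration, a scoped variant of the example might look like this (a sketch using the overloads that take a target scope; 'FOO' is a placeholder name):

```powershell
# Persist a variable for the current user (no elevation required):
[System.Environment]::SetEnvironmentVariable('FOO', 'bar', 'User')

# Persist a variable machine-wide (requires elevation on Windows):
[System.Environment]::SetEnvironmentVariable('FOO', 'bar', 'Machine')

# Read a value from a specific scope:
[System.Environment]::GetEnvironmentVariable('FOO', 'User')
```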
For the PATH-style manipulations, how about Update-EnvironmentVariable?
I think paths should have their own switches or cmdlets - it should not manipulate only PATH, because there are custom path variables. It would be nice if scope took multiple values, or at least if there were a cmdlet to also update the current process.
Hi, I think it is important that we have an easy way to manage environment variables easily, and for this, these ones seem just fine to me:

Get-EnvironmentVariable
Set-EnvironmentVariable
New-EnvironmentVariable
Add-EnvironmentVariable
Remove-EnvironmentVariable

The feature I would really like to have in the end with these cmdlets is an easier way to update the path variable. But in my opinion this affects not only the path variable.

Adding to an existing array environment variable

The suggestion that @DarwinJS mentions is too specific to the path environment variable. This behavior would be identical for any environment variable that contains an array ($PsModulePath or $PathExt are other examples, but a custom one would be the same). I think the solution (this module / these cmdlets) should take into consideration the fact that we want to handle environment variables that contain arrays in an easier way. That's why I like the idea of @BrucePay for Update-EnvironmentVariable:

```powershell
# Let's say the current value of $env:path is ->
# C:\Program Files (x86)\Microsoft VS Code\bin;C:\Users\stephanevg\AppData\Local\Microsoft\WindowsApps;C:\Scripts

$Array = @("C:\Scripts","D:\MyFolder")
Update-EnvironmentVariable -Name "Path" -Value $Array
```

I would imagine, in the case above, that it would simply try to add both of the values to the existing value. I imagine it returning something like this:

```powershell
Get-EnvironmentVariable -Name Path
C:\Program Files (x86)\Microsoft VS Code\bin;C:\Users\stephanevg\AppData\Local\Microsoft\WindowsApps;C:\Scripts;D:\MyFolder
```

Overriding existing values

I think it could also be interesting to have a way to override the existing variable with a new value.

```powershell
# Let's say the current value of $env:path is ->
# C:\Program Files (x86)\Microsoft VS Code\bin;C:\Users\stephanevg\AppData\Local\Microsoft\WindowsApps;C:\Scripts

$Array = @("C:\Scripts","D:\MyFolder")
Update-EnvironmentVariable -Name "Path" -Value $Array -Force
```

The command above would result in deleting the values that are not present in the source $Array. I would expect something like this:

```powershell
Get-EnvironmentVariable -Name Path
C:\Scripts;D:\MyFolder
```
@Stephanevg - I think your idea of array handling is great! However, this part - "....in deleting the values that are not present in the source $array ..." - is not a good idea. It is a fundamental thinking flaw when engineering paths to think that one knows what else is in the path. Although I no longer do mass desktop OS deployments, many users are capable of adding software to their systems, and this would cause mass havoc. The same is true across very large numbers of servers - no one individual knows what should be on the path for all machines. So I would not even make such functionality available, as it will create many problems. It would be OK to remove more than one stipulated path at a time - but not to remove all other paths besides a given list. There are a couple of places where paths are not like simple arrays:

Because of #2, any path update should always require that a scope be specified (User, Computer or Process). Also because of #2, on removals it would be nice to be able to specify that a path should be removed from both the user and machine scopes in case it appears in both (so that it is really, really gone).
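To make the "remove from both scopes" idea concrete, here is a minimal sketch using the .NET API directly. The directory is a placeholder, writing the Machine scope requires elevation, and note that this simple approach rewrites the value as a plain string, losing any unexpanded %...% references:

```powershell
$entry = 'C:\OldTool\bin'   # placeholder: the directory to remove

foreach ($scope in 'User', 'Machine') {
    $current = [System.Environment]::GetEnvironmentVariable('Path', $scope)
    if (-not $current) { continue }

    # Keep every element that does not match the entry (case-insensitive, trailing '\' ignored).
    $kept = $current -split ';' | Where-Object {
        $_ -and ($_.TrimEnd('\') -ne $entry.TrimEnd('\'))
    }

    [System.Environment]::SetEnvironmentVariable('Path', ($kept -join ';'), $scope)
}
```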
To support "array" env variables we could add
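As a rough illustration of what treating a delimited variable as an array could mean (not a proposed API; the folder name is a placeholder):

```powershell
# Split a delimited variable into an array using the platform's separator
# (';' on Windows, ':' on Linux/macOS):
$paths = @($env:PSModulePath -split [IO.Path]::PathSeparator)

# Manipulate it as an array, e.g. append a folder only if it is not already present:
if ($paths -notcontains 'C:\MyModules') { $paths += 'C:\MyModules' }

# Re-join and write back (process scope only):
$env:PSModulePath = $paths -join [IO.Path]::PathSeparator
```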
In PSCX, we have the following command to support path env vars. I would love to see equivalent functionality in the core set of commands:

The
I would also love to have the PathVariable management be a bit more simplified. I like the fact of having cmdlets just for that. It would make things more discoverable, and anybody could simply find it by tabbing in the console.

Also, a user-defined variable containing a delimited list will also not be handled by the

Perhaps it makes sense to have both of the cmdlet sets:

AND
Thinking about this more, I would like to make the case that there should NOT be any cmdlet whose primary way of working is to replace the entire path with another value.

Background: When thinking about this challenge I always conceive of myself running the code on hundreds or thousands of production snowflaked machines. Desktop machines will always be snowflakes, and this should work well there - and many servers are too. Running on a massive number of machines should also work well - or else I'm just back to using my own custom stuff, as this is when the real risks of messing up are the most likely and simultaneously the most damaging.

As @Stephanevg stated, the goal of dedicated cmdlets would be to ease the issue and be discoverable. However, if cmdlets are provided which replace the path value, it becomes an anti-pattern, as it makes catastrophic damage very easy. In other words, I believe helping people succeed has two edges

I think cleaning out the entire path should take a specific, dedicated action that makes one think about what they are doing. For instance, "Remove-Path -All" would remove the entire path, but would require -Force to actually work and would give a warning that this is a very unusual and probably damaging action to take. And it should never remove Windows and System32, even when this action is taken. Then, to hurt themselves with a complete path replacement, the following code is required (and gives one reason for pause):

Here are some other features to keep people in the pit of success:

Also, I think cmdlet names should be descriptive of what you are doing - you are adding a path to a variable, not adding a new path variable. So either "Add-Path" or "Add-PathToVariable":

```powershell
Add-Path -Variable -Path -Scope -Position -NumberOfBackups
# Position is important when the path add must be first, last, or before or after something.
```

Not giving this capability means a massive rabbit hole of engineering if I need specific path placement. I'm not sure what other cmdlets would be needed. Here are some other thoughts with regard to the cmdlet set:
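To make the positional requirement concrete, here is a rough sketch of the insertion logic such a command would need. The function name, parameters, and behavior are illustrative assumptions, not a proposed design, and it only covers first/last placement, not before/after or backups:

```powershell
function Add-PathEntry {
    param(
        [Parameter(Mandatory)] [string] $Path,                          # entry to add
        [string] $Variable = 'Path',                                    # target variable
        [ValidateSet('First', 'Last')] [string] $Position = 'Last',
        [ValidateSet('Process', 'User', 'Machine')] [string] $Scope = 'Process'
    )

    $current = [System.Environment]::GetEnvironmentVariable($Variable, $Scope)
    $entries = @($current -split ';' | Where-Object { $_ })

    # Only add the entry if it is not already present.
    if ($entries -notcontains $Path) {
        $entries = if ($Position -eq 'First') { @($Path) + $entries } else { $entries + $Path }
        [System.Environment]::SetEnvironmentVariable($Variable, ($entries -join ';'), $Scope)
    }
}

# Example (assumed usage): put a tools folder at the front of the user's persisted PATH.
Add-PathEntry -Path 'C:\Tools\bin' -Variable Path -Scope User -Position First
```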
First, +100 to everything @DarwinJS says above - although Ensure-OnPath could probably be Add-Path's default behavior (only add if not already present), with -Force to override.

Second, we also need this: https://stackoverflow.com/questions/171588/is-there-a-cmmand-to-refresh-environment-variables-from-the-command-prompt-in-w

I'm trying to write scripts to provision dev machines on Windows, but that requires that tool A's path modifications are available immediately so I can install tool B using tool A, i.e. install Python (which installs pip) and then use pip to install Conan. So yeah, an
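For reference, the usual stop-gap is to rebuild the process value from the persisted Machine and User values; a simplified sketch that deliberately ignores the expansion and ordering subtleties raised below:

```powershell
# Re-read the persisted PATH values and rebuild the current process value.
# Caveat: this drops process-only additions and does not reproduce Windows'
# exact Machine+User overlay behavior - it is only an approximation.
$machine  = [System.Environment]::GetEnvironmentVariable('Path', 'Machine')
$user     = [System.Environment]::GetEnvironmentVariable('Path', 'User')
$env:Path = ($machine, $user | Where-Object { $_ }) -join ';'
```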
Nice catch @rkeithhill - I actually thought of that and then forgot about it while writing my last post. I would be in favor of a little more specific cmdlet name like:

And what if you want to update the path, but not take on the risk of other variable changes?

I would also say that if there is a way to call the Windows API to do that path refresh, that would be best. A workaround, if the recently added path contains NO environment variables, is to simply add the same path to the process Path scope. But that does not handle more complex scenarios such as environment variables in the path, nor does it emulate the EXACT placement of the path in the list when Windows overlays machine and user. I'm not sure, but I would also guess that environment variables in the MACHINE path are either:

Chocolatey has a built-in refreshenv - I don't know if it is calling Windows APIs or has its own code. In any case, I would be very hesitant to try to reverse engineer exactly how Windows builds the final, expanded, overlaid "Process" path - if the native Windows algorithm is accessible via an API, it should be used.
I believe we should consider separately
I don't know if anyone has started this yet, but for what it's worth, I think that any

Finally, anything that's messing with "path" variables is different from other array variables in that it should probably have a switch to allow cleaning up the path and removing entries that point to paths that don't actually exist. It's also worth mentioning that path variables can contain variables (e.g. %windir%), which can complicate value comparisons (when deduplicating or removing values).

Obviously I should add a link to my limited implementation of this stuff for reference too 😉
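A rough sketch of those two points - expanding %...% references for comparison and dropping entries whose directory no longer exists - applied to an arbitrary value (the sample value is a placeholder):

```powershell
# A persisted PATH-style value may contain unexpanded references such as %windir%.
$raw = 'C:\Tools;%windir%\system32;C:\NoLongerExists'   # placeholder value

$cleaned = @($raw -split ';' | Where-Object { $_ }) | Where-Object {
    # Expand %...% references only for the existence check; the original text is kept.
    $expanded = [System.Environment]::ExpandEnvironmentVariables($_)
    Test-Path -LiteralPath $expanded -PathType Container
}

$cleaned -join ';'   # -> 'C:\Tools;%windir%\system32' if both directories exist
```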
In spirit, I agree with a lot of what's being said here about safety of updates, but I feel that core cmdlets are the wrong place for it. Rather, if one is making mass updates to workstations, one should be looking at DSC, which is specifically designed for things like idempotence, guaranteed state, and performing updates in a "safe" manner.

In general, I believe simplicity should be favored for core cmdlets, along with behavior that models the input of the user as exactly as possible, keeping in mind the principle of least astonishment and the question "What would I expect this cmdlet to do if I had encountered it before?". With this in mind, I certainly would not expect PowerShell to ever ignore my inputs, as was suggested for one earlier edge case.

I'm hesitant to agree with the inclusion of the suggested

Regarding the default scope of the write-oriented cmdlets, I find myself in an unfortunate position of preferring

With the
I think making this available only via DSC is too limiting. When we bootstrap/configure a build environment for a dev, we are doing this for each particular Git repo, not machine-wide. For this scenario, a set of environment variable management commands is the way to go IMO.
I agree with @rkeithhill that DSC is too limiting - in fact, this should be in the smallest PowerShell footprint, because environment variables are a foundational part of an operating system.
Let me try to summarize the existing discussion and make some recommendations. Three distinct areas of functionality have emerged:

(a) general creation, retrieval, updating, and removal of environment variables (the original topic of this RFC)
(b) management of PATH-like variables that contain lists of values
(c) refreshing the current session's environment variables from their persisted values
I suggest keeping this RFC focused on (a) only; see my suggestions in a follow-up post. Below is a summary and food for thought for spinning off (b) and (c).

(b), due to the many pitfalls and subtleties involved, as detailed by @DarwinJS, definitely deserves its own (set of) cmdlet(s):
```
Update-PathVariable
  [[-Name] <string> = 'PATH'] [-Remove <string[]>] [-Prepend <string[]>] [-Append <string[]>]
  [-InsertBefore <string[]>] [-InsertAfter <string[]>]
  [-Scope = 'Process'|'User'|'Machine']
  [-Force]
```

This makes it clear that changes should be limited to modifications of the existing value, as opposed to (potentially catastrophic) full replacement.

Scopes: Apart from

The platform-appropriate (and confusingly named)

One challenge on Windows is how to handle unexpanded entries such as

Not finding targeted entries should be a quiet no-op (with

As for (c):
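Hypothetical invocations of the syntax sketched above might look like this (illustrative only; no such cmdlet exists today):

```powershell
# Append a directory to the current user's persisted PATH:
Update-PathVariable -Append 'C:\Tools\bin' -Scope User

# Remove an entry from the current process's PATH only:
Update-PathVariable -Remove 'C:\OldTool\bin' -Scope Process

# Operate on a PATH-like variable other than PATH itself:
Update-PathVariable -Name PSModulePath -Append 'C:\MyModules' -Scope User
```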
As for (a), the original topic of this RFC: Instead of creating new cmdlets, I suggest extending the existing environment provider. The

```powershell
# Create:
New-Item env:foo -Value bar        # same as: $env:foo = 'bar' (implicit creation)

# Create / update:
Set-Item env:foo -Value baz        # same as: $env:foo = 'baz'
# Set-Content env:foo -Value baz would work too.

# Get:
Get-Item env:foo       # returns a [System.Collections.DictionaryEntry] instance with .Key/.Name and .Value
Get-Content env:foo    # same as: $env:foo

# Remove:
Remove-Item env:foo    # same as: $env:foo = ''
```

With the addition of a dynamic

This means that namespace notation - e.g.,

Given that manipulating persisted values should be a more deliberate action anyway, that distinction strikes me as reasonable.

As stated, the

More things to consider:
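If the provider cmdlets gained a scope parameter along these lines, usage might look like the following (purely hypothetical syntax - no such parameter exists today):

```powershell
# Hypothetical: persist a variable for the current user via the provider cmdlets.
Set-Item env:foo -Value 'bar' -Scope User

# Hypothetical: read the machine-scope value.
Get-Item env:foo -Scope Machine
```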
@PowerShell/powershell-committee discussed this. In general, we are not against having multiple ways to manage env vars, so having cmdlets doesn't preclude improving the env provider (and adding the dynamic
Given that scopes are an existing concept that typically has the prefix pattern
@vexx32 using the existing syntax would make sense.
What would happen when you embed an env var reference in the "$env:machine:path" nomenclature? (FYI, these must be regular Windows shell references like "%WINDIR%".) We should be very careful to disambiguate the very special scope of the PATH variable, and all its challenges, versus a general environment variable.
Are those cases handled at all, currently?
Can I ask why you're for having multiple ways? What is the benefit of introducing multiple PowerShell solutions in this case (using the .NET types directly is always an option)? The downsides are potential confusion (why are there different approaches, and how do they relate to each other?) and additional implementation and documentation effort. Or do you think that the provider improvements cannot provide all the required functionality? In general, I think it's worth giving the provider model a little more love.
Several providers do have their own cmdlets. However, I completely agree that improving the provider should come first.
One thing that is getting lost here is that I opined that the PATH environment variable is NOT LIKE ANY OTHER environment variable, due to all these issues: https://cloudywindows.io/post/straight-and-narrow-code-for-safe-windows-path-updates/

With regard to all other environment variables:

However, PowerShell's promiscuous typing and methods hold to the pragmatics of valuing discovery of how to do things over and above "technical purity" (of non-duplication of APIs, et al.). In that, I know I've personally had much more benefit from the pragmatic focus than from the technical purity one.
@mklement0 not being against something doesn't mean we are "for" something. The discussion we had is that PSProviders may be more difficult to use than cmdlets, so having cmdlets in addition to the provider isn't necessarily a bad thing. Personally, I'm OK with the provider-only solution, as that's what I'm accustomed to, but I probably don't represent the general user.
As @vexx32 rightfully points out, this duplication has ample precedent. Yet precedent isn't in and of itself justification: my guess is that the duplication is part of the problem of neglecting the provider paradigm. If every area of functionality has dedicated cmdlets, the provider-based alternative becomes irrelevant, and the powerful abstraction underlying the provider model isn't broadly recognized.

@DarwinJS: The discovery problem is well worth considering, but I would argue that sticking with the provider model is not about technical purity - on the contrary: it's about providing useful abstractions that can be applied to different areas of functionality, reducing learning effort. Now, there may be areas where the provider model becomes too restrictive, but management of environment variables seems like a good fit, especially given that the current - limited - access is wholly based on the provider model, via namespace variable notation (

As for your concerns re

If we apply the same safeguard to namespace notation - e.g.,

As for reading / writing the unexpanded values (raw
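For context on the last point: on Windows, the raw (unexpanded) persisted values can be read by querying the registry with expansion suppressed; a minimal, Windows-only sketch:

```powershell
# User-scope environment values live under HKCU:\Environment;
# machine-scope ones under the Session Manager key (reading needs no elevation).
$userKey    = Get-Item -LiteralPath 'HKCU:\Environment'
$machineKey = Get-Item -LiteralPath 'HKLM:\SYSTEM\CurrentControlSet\Control\Session Manager\Environment'

# RegistryValueOptions.DoNotExpandEnvironmentNames returns REG_EXPAND_SZ values as-is.
$rawUserPath    = $userKey.GetValue('Path', $null, 'DoNotExpandEnvironmentNames')
$rawMachinePath = $machineKey.GetValue('Path', $null, 'DoNotExpandEnvironmentNames')

$rawUserPath
$rawMachinePath
```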
It's interesting to note that
We think some of this is doable, but not incredibly necessary or high-priority at this point. @charub had written some of the code originally, before the RFC was written; I'm going to follow up offline with her to see if there's a starting place / minimum viable that we can ship.
Spoke to @charub today, and she thinks she might have the code lying around somewhere for us to give it a look.
@JamesWTruher is going to be releasing an equivalent in the Gallery as a place to incubate the module before we decide whether and when it should be included as part of PowerShell. There will be a new RFC describing the behavior of that module if we decide we want to take it in.