Consider removing the default -Depth value from ConvertTo-Json

Summary of the proposal:

  • Remove the default value for -Depth

    • The hard-coded internal limit of 100, which reports an error when exceeded, is sufficient to prevent "runaway" JSON strings stemming from object trees with cyclic dependencies.
    • Typical input objects will then be fully serialized by default, which is usually the intent.
  • Use -Depth solely at the user's discretion in order to:

    • Intentionally truncate the input object tree at the specified depth (see the sketch after this list).
    • On rare occasions, allow serialization of object trees that are deeper than 100 levels (this could also be a solution to #3181).
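
A minimal sketch of what explicit -Depth usage would look like under this proposal; the default (no -Depth) behavior shown is the proposed behavior, not the current one, and the commented-out value above 100 is purely illustrative:

```powershell
$tree = @{ a = @{ b = @{ c = @{ d = 'leaf' } } } }

# Proposed default: no -Depth argument, full serialization up to the internal
# safety limit of 100 (today this input would be truncated at depth 2).
$tree | ConvertTo-Json

# Truncation becomes an explicit, intentional opt-in:
$tree | ConvertTo-Json -Depth 2

# On rare occasions, -Depth could raise the limit beyond 100 (proposed):
# $veryDeepTree | ConvertTo-Json -Depth 150
```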

Motivation

-Depth defaulting to 2 in ConvertTo-Json has caused much confusion and frustration over the years; @iRon7 has recently tried to create a "canonical" post on SO, which also shows how frequently the issue comes up.

Currently, an input object tree that exceeds the (default) depth doesn't cause an error, but results in near-useless .psobject.ToString() serialization of property values that exceed the depth (see #8381 for a proposal to visualize the cut-off differently).
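
For illustration, a minimal reproduction of that truncation with the current default -Depth of 2:

```powershell
# Four levels of nesting; the value of L3 lies beyond the default depth of 2,
# so it is rendered via .ToString() rather than serialized:
@{ L1 = @{ L2 = @{ L3 = @{ L4 = 'leaf' } } } } | ConvertTo-Json
# "L3" ends up as the string "System.Collections.Hashtable" in the output.
```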

In combination with the low default -Depth value of 2, that makes for frequent user frustration, because the behavior often amounts to quiet de-facto failure that may not be discovered until later.

The seemingly arbitrary and quiet truncation is surprising to most users, and having to account for it in every ConvertTo-Json call is an unnecessary burden.

Backward-compatibility impact

The only way in which I think existing code could be impacted is that payloads generated with ConvertTo-Json could now increase in depth (and size), if users previously relied on the implicit cut-off at depth 2 - that strikes me as a Bucket 3: Unlikely Grey Area change.

39 Answers

✔️Accepted Answer

Here's a pretty reasonable argument I'm not hearing enough of: MY DATA IS GONE.

I ran this cmdlet and, without so much as an error, pumping any nested JSON with more than 2 levels into it, changing any value, and writing it back to file results in LOST DATA. This is due to an "invisible" default that silently throws anything at level 3+ in the trash.

No error, no warning - a complete, SUCCESSFUL operation... until you actually need that data again and realize it's gone, because you forgot what is apparently a MANDATORY param that to this day is incorrectly marked as optional.
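
A sketch of the round trip being described (file and property names are illustrative):

```powershell
# Read nested JSON, tweak one value, write it back.
$config = Get-Content -Raw -Path settings.json | ConvertFrom-Json
$config.name = 'updated'   # hypothetical top-level property

# Without an explicit -Depth, anything nested more than 2 levels deep is
# silently replaced by its .ToString() value on the way back out:
$config | ConvertTo-Json | Set-Content -Path settings.json

# The workaround that avoids the loss (up to the internal limit of 100):
# $config | ConvertTo-Json -Depth 100 | Set-Content -Path settings.json
```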

Bad. Coding.

It is 100% reproducible and has been complained about for YEARS by many. It should never have been allowed into production builds of PowerShell, let alone survive THIS long without an immediate fix. IMO nobody should EVER use these PowerShell cmdlets to manipulate JSON; use other means until/unless this is ever properly corrected.

The correction should be one of two things: either simply change the existing depth param to mandatory, or leave it optional and remove the absurd hard-coded default of 2, which in my testing almost ALWAYS results in silent data loss for anything but the most basic JSON. In the latter case, the default should be what is normally the industry standard for cmdlet params that can truncate data when no limiting value is assigned - the maximum, which in this case still appears to be 100. Note that this is the ONLY acceptable option that would NEVER result in possible silent data loss, because anything over 100 then DOES finally result in notification via a warning/error.

Then, as is the industry standard for cmdlet params, if the developer wants LESS, they can add the OPTIONAL depth param. But also consider adding code that at the very least provides the warning (or ErrorAction) feedback that is missing today, to make clear that the data passed into the cmdlet has exceeded the depth limit and will be lost - even if only at the verbose/debug stream level.
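
As a user-side stopgap, the kind of feedback being asked for can be approximated with a wrapper; this is only an illustrative sketch, and the function name and the truncation heuristic are made up here:

```powershell
function ConvertTo-JsonChecked {
    param(
        [Parameter(Mandatory, ValueFromPipeline)] $InputObject,
        [int] $Depth = 100
    )
    process {
        $json = $InputObject | ConvertTo-Json -Depth $Depth
        # Crude heuristic: a stringified type name is a telltale sign that a
        # nested object was truncated rather than serialized.
        if ($json -match 'System\.Collections\.Hashtable|System\.Management\.Automation\.PSCustomObject') {
            Write-Warning "Output may have been truncated at depth $Depth."
        }
        $json
    }
}
```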

In the end NO other cmdlet behaves this way. Continuing to leave this behavior in place is tantamount to agreeing that it is the acceptable new standard for all cmdlets in this category - in other words, to changing ConvertTo-Csv, ConvertTo-Xml, Add-Content, and Set-Content to behave exactly as these JSON cmdlets do today, complete with a matching optional depth param defaulting to 2, so that everyone whose scripts are missing a -Depth XX (because it's optional and they have no idea) finds all their output files exactly two lines long. If you agree that scenario is absurd and probably wouldn't fly, then perhaps ask yourself - why does this? :)

Other Answers:

Hello,

I would also like to up-vote this issue. Please remove this unexpected behavior. Day to day, it costs developers hours or even days to track down the bugs this causes.

I would have even preferred my program to die with an error instead of silently messing with the data structure.

Kind regards

Konstantin

@iRon7:

I see the problem with the depth of 100, but my guess is that it's fine to lower this to a more reasonable number that covers the majority of use cases while preventing infinite recursion.
(Again, only people with excessively deep trees who rely on default truncation at depth 2 would be affected, which doesn't strike me as likely.)

As for circular references and the default value preventing "runaway" strings when serializing arbitrary .NET types:
It doesn't really make sense to use ConvertTo-Json for that - use Export-CliXml instead.
So I'm not sure that -RecurringDepth is needed, but, as @SteveL-MSFT says, adding parameters is an option.
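
A minimal example of the alternative mentioned above for round-tripping arbitrary .NET objects (the file name is illustrative):

```powershell
# CLIXML handles arbitrary object graphs, including ones JSON is a poor fit for:
Get-Process -Id $PID | Export-Clixml -Path process.clixml
$restored = Import-Clixml -Path process.clixml
```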

However, the gist of my proposal is to bring sane default behavior to ConvertTo-Json, and that won't be possible without what is technically a breaking change - fingers crossed for bucket 3, though.

The typical use case is to use custom-made data-transfer objects ("property bags"; hashtables or custom objects) that you want serialized in full - the internal hard limit is there as a safety belt.

Cutting off the input object tree at a given depth should always be an explicit decision, not quietly applied default behavior that may initially even go unnoticed.

The fact that concerns about incautious, atypical use dictate default behavior that:

  • defies user expectations to begin with,
  • is a challenge to remember,
  • is a nuisance to work around, because you need to figure out the actual depth of your data and keep the parameter in sync with later changes (unless you go for -Depth 100, which shouldn't be necessary)

has led to longstanding frustration with the cmdlet.

I have long suspected the default Depth was too low. That was until I experimented with setting the Depth to 100 in $PSDefaultParameterValues. The result was a marked increase in execution times for a bunch of automation I was responsible for; in some cases, the scripts would run indefinitely. Large numbers of deep and wide (and sometimes infinitely deep) objects can account for that. Changing this could result in user complaints that scripts take longer after upgrading to a pwsh version where the depth is set too high. In some cases, they may even have scripts that never finish executing.
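
The experiment described above amounts to a session-wide default, roughly:

```powershell
# Raise the default depth for every ConvertTo-Json call in the session;
# deep/wide (or effectively unbounded) object graphs then get serialized in
# full, which is where the reported slowdown came from.
$PSDefaultParameterValues['ConvertTo-Json:Depth'] = 100
```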

While I definitely feel the pain of the default depth being low and silently truncating, I disagree with the assertion that this would not be an impactful change.

I think the default behavior from the beginning should have been to error instead of silently truncate. I agree that 2 is too low. I disagree that increasing the default to 100 is a good idea.

As was discussed in another issue or PR, there are two concepts at play here: the depth, and the action taken when that depth is reached. I think for most uses, slightly increasing the default depth and changing the default behavior to error would be sufficient to solve the majority of pain points. This would need to come with the ability to truncate as an option, as without it infinitely deep objects could never be serialized.

I believe that the assertion that users rarely serialize arbitrary .NET types is false. I have seen plenty of logging that lazily throws any and all objects through ConvertTo-Json -Compress. I would caution that this assertion be investigated thoroughly before any decisions are made.
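
For example, a common "lazy" logging pattern along the lines described above (the log path is illustrative):

```powershell
# Arbitrary .NET objects funneled straight into compact JSON log lines:
$entry = Get-Process -Id $PID
"$(Get-Date -Format o) $($entry | ConvertTo-Json -Compress)" |
    Add-Content -Path app.log
```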

Adding a warning seems reasonable.
