Unexpected ConvertTo-Json results? Answer: it has a default -Depth of 2
Question
Why do I get unexpected ConvertTo-Json results, why do I get values like System.Collections.Hashtable, and/or why does a round-trip ($Json | ConvertFrom-Json | ConvertTo-Json) fail?
Stack Overflow has a good mechanism to prevent duplicate questions, but as far as I can see there is no mechanism to prevent questions that have a duplicate cause. Take this question as an example: almost every week a new question comes in with the same cause, yet it is often difficult to mark it as a duplicate because the question itself is slightly different. Nevertheless, I wouldn't be surprised if this question/answer itself ends up as a duplicate (or off-topic), but unfortunately Stack Overflow offers no way to write an article that prevents other programmers from continuing to write questions caused by this "known" pitfall.
A few examples of similar questions with the same common cause:
- PowerShell ConvertTo-Json does not convert array as expected (yesterday)
- Powershell ConvertTo-json with embedded hashtable
- powershell "ConvertTo-Json" messes up the json format output
- Nested arrays and ConvertTo-Json
- Powershell ConvertTo-JSON missing nested level
- How to save a JSON object to a file using Powershell?
- Cannot convert PSCustomObjects within array back to JSON correctly
- ConvertTo-Json flattens arrays over 3 levels deep
- Add an array of objects to a PSObject at once
- Why does ConvertTo-Json drop values
- How to round-trip this JSON to a PSObject and back in Powershell
- …
So, where does this "self-answered" question differ from the above duplicates?
It has the common cause in the title, and with that it might better prevent repeated questions due to the same cause.
Answer

ConvertTo-Json has a -Depth parameter:
Specifies how many levels of contained objects are included in the JSON representation.
The default value is 2.
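The effect of that default can be seen with a small nested hashtable; any container deeper than two levels is not serialized but rendered via its .Net type name (the property names below are just for illustration):

```powershell
# A hashtable with containers nested 3 levels deep
$Object = @{ L1 = @{ L2 = @{ L3 = @{ L4 = 'value' } } } }

# With the default -Depth of 2, the hashtable at the third level
# is not expanded; it is stringified to "System.Collections.Hashtable":
$Object | ConvertTo-Json

# Explicitly increasing -Depth serializes the full structure:
$Object | ConvertTo-Json -Depth 4
```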
Example

To do a full round-trip with a JSON file you need to increase the -Depth for the ConvertTo-Json cmdlet:
$Json | ConvertFrom-Json | ConvertTo-Json -Depth 9
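A minimal file round-trip along these lines might look as follows (the file name is a hypothetical example; without the explicit -Depth, any branch deeper than 2 levels would come back out flattened to type-name strings):

```powershell
# Read the whole file as a single string and parse it
$Json = Get-Content -Raw -Path '.\settings.json'   # hypothetical file
$Data = $Json | ConvertFrom-Json

# ... modify $Data here ...

# Write it back with enough depth to cover every nested level
$Data | ConvertTo-Json -Depth 9 | Set-Content -Path '.\settings.json'
```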
TL;DR

Probably because ConvertTo-Json terminates branches that are deeper than the default -Depth (2) with a (.Net) full type name, programmers assume a bug or a cmdlet limitation and do not read the help or about.
Personally, I think a string with a simple ellipsis (three dots: …) at the end of the cut-off branch would have a clearer meaning (see also: GitHub issue: 8381).
This issue often ends up in another discussion as well: why is the depth limited at all?
Some objects have circular references, meaning that a child object can refer to a parent (or one of its grandparents), causing an infinite loop if it were serialized to JSON.
Take for example the following hash table with a Parent property that refers to the object itself:
$Test = @{Guid = New-Guid}
$Test.Parent = $Test
If you execute $Test | ConvertTo-Json, it will conveniently stop at a depth level of 2 by default:
{
    "Guid": "a274d017-5188-4d91-b960-023c06159dcc",
    "Parent": {
        "Guid": "a274d017-5188-4d91-b960-023c06159dcc",
        "Parent": {
            "Guid": "a274d017-5188-4d91-b960-023c06159dcc",
            "Parent": "System.Collections.Hashtable"
        }
    }
}
This is why it is not a good idea to automatically set the -Depth to a large value.