Google Cloud SSH Keys


Problem Description


I have set up my new server with Google Compute Engine. I added a user and their public key into the metadata in the Google Cloud console (sshKeys).

I attempted to replace a public key in the metadata, but now the old one seems to be the only one able to ssh into my server (using PuTTY). The new one doesn't seem to be updated.

Now, even if I remove the whole metadata or type gibberish text into the sshKeys field, it will still work!

Could it be that it requires some time for the metadata to be pushed to the server (my previous attempts were instantaneous)?

Solution

To understand how Google Compute Engine manages the ssh keys, you have to understand how GCE manages the metadata (since, as you wrote, they are in the metadata store).

And more specifically, the difference between project and instance metadata is crucial. To quote the documentation (see previous links):

Metadata can be assigned at both the project and instance level. Project level metadata propagates to all virtual machine instances within the project, while instance level metadata only impacts that instance. You can set both project and instance level metadata but if you set the same key for both your project and instance metadata, Compute Engine will use the instance metadata.

While this seems rather logical and straightforward, one has to pay very close attention to the terms used:

Project level metadata propagates to all virtual machine instances within the project [...]

and

You can set both [...] but if you set the same key for both [...], Compute Engine will use the instance metadata.

If you consider both assertions, it means two things:

  1. If you set the metadata at the project level ONLY, it will propagate in your instances.
  2. If you set the metadata at the instance level, it will take precedence over the project level ones, and nothing will be propagated.
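The precedence rule described above can be sketched as a simple dictionary merge. This is only an illustration of the behavior, not an actual GCE API call, and the metadata values are hypothetical placeholders:

```python
# Sketch of GCE metadata precedence: for any key present at both
# levels, the instance-level value wins; project-level keys that the
# instance does not override still propagate.

def effective_metadata(project: dict, instance: dict) -> dict:
    """Merge project- and instance-level metadata; instance keys take precedence."""
    merged = dict(project)   # start from project-level metadata
    merged.update(instance)  # instance-level entries override the same keys
    return merged

# Hypothetical placeholder values, not real keys.
project_md = {"sshKeys": "alice:ssh-rsa AAA... alice", "env": "prod"}
instance_md = {"sshKeys": "bob:ssh-rsa BBB... bob"}

print(effective_metadata(project_md, instance_md))
```

With these inputs, the merged view keeps the project-level `env` entry but uses the instance-level `sshKeys` value, which is exactly why an instance-level `sshKeys` entry shadows everything set at the project level.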

As a direct consequence of that, the GCE platform takes care of placing/removing your ssh keys in the instance (and creating the relevant users when placing them, while just removing the key from the ~user/.ssh/authorized_keys file when removing them - so you don't lose any data for ~user) ONLY when you do not specify your own keys (at instance creation or later). If you do, the GCE platform will consider the ssh key management as manual, and nothing will be kept in sync with the metadata store.
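For reference, the legacy `sshKeys` metadata value is a newline-separated list of `username:public-key` entries. A minimal parser for that format (the sample keys below are placeholders, and the exact format is an assumption based on the legacy `sshKeys` convention; newer images use the `ssh-keys` key instead) might look like:

```python
def parse_ssh_keys(metadata_value: str) -> list:
    """Split a sshKeys metadata value (newline-separated 'user:key' lines)
    into (username, public_key) pairs, skipping malformed lines."""
    pairs = []
    for line in metadata_value.strip().splitlines():
        user, sep, key = line.partition(":")
        if sep and user and key:  # a line without 'user:key' shape is ignored
            pairs.append((user.strip(), key.strip()))
    return pairs

# Hypothetical placeholder entries, not real keys.
sample = "alice:ssh-rsa AAAA... alice@laptop\nbob:ssh-rsa BBBB... bob@desktop"
for user, key in parse_ssh_keys(sample):
    print(user, "->", key)
```

Note how a gibberish line with no `user:key` shape simply yields nothing to act on, which is consistent with the observation in the question that garbage in the field does not revoke already-placed keys.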

Fortunately, the GCE platform is well designed, and therefore you do not need to re-create an instance to get your keys managed by the GCE platform: you only need to remove the instance-level metadata for the sshKeys key.

In the same way, if you add some instance-level metadata with the key sshKeys, it will disable the GCE platform's ssh key management until you remove that instance-level metadata.

Concerning the delay question: so far, I haven't seen any delay other than network delay (so no noticeable platform execution delay). The platform may well have delays from time to time, but that seems unlikely to be the cause of your problem.


Additional note:

Some distributions (such as ubuntu) include a specific user (in ubuntu's case: ~ubuntu), with which every user listed in the project-level ssh keys can log in; but that user's authorized_keys is generated at instance creation time, and the GCE platform never seems to change it again. IMHO, the automatic ssh key management should be preferred.


Source: personal experience with GCE, terraform, and the Google Developer Console
