How to load and store nvarchar


Problem Description

Stack: Installed HDP-2.3.2.0-2950 using Ambari 2.1. The steps that I am following:

  1. Load the SQL Server table into HDFS using Sqoop
  2. Create an EXTERNAL table in Hive

I didn't use anything pertaining to charset/unicode/utf-8 while executing the sqoop import commands, and the import was successful.
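For reference, a plain import of that kind might look roughly like the sketch below; the host, database, credentials, table name and target directory are hypothetical placeholders, not details from the question:

# Sketch only: a basic Sqoop import of a SQL Server table into HDFS.
# Host, database, credentials, table and target directory are placeholders.
sqoop import \
  --connect "jdbc:sqlserver://<host>:1433;database=<db>" \
  --username <user> \
  --password <password> \
  --table source_table \
  --target-dir /user/hdfs/source_table \
  --fields-terminated-by ',' \
  -m 1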

While creating the Hive external table, I was wondering what data type I should select for the nvarchar columns of the original SQL Server table, and now I am worried that this may also need to be addressed in Sqoop while importing.


  1. Couldn't find relevant charset/nvarchar etc. options in Sqoop import
  2. In Hive, can varchar/string blindly be used in place of nvarchar? (see the sketch below)
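
For context, the kind of external table DDL being weighed in point 2 might look like the following sketch; the column names, non-string types, delimiter and location are hypothetical, with STRING used where the SQL Server source column is nvarchar:

-- Sketch with hypothetical column names, types and location; STRING (or
-- VARCHAR(n)) stands in for the nvarchar columns of the source table.
CREATE EXTERNAL TABLE table_name (
    col1 STRING,
    col2 INT,
    col3 STRING,
    col4 STRING
)
ROW FORMAT DELIMITED
FIELDS TERMINATED BY ','
LOCATION '/user/hdfs/table_name';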


Recommended Answer

We know that the nvarchar type is not understood by Sqoop, so we just need to cast it to varchar, e.g.

select
    CAST(col1 AS varchar(255)) AS col1,  -- specify a length: in SQL Server, CAST AS varchar without one defaults to 30 characters and may truncate the data
    col2,
    col3,
    col4
from table_name
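
In practice this cast can be embedded in a free-form query passed to sqoop import; a minimal sketch follows, where the connection details, column names and target directory are hypothetical placeholders rather than values from the question:

# Sketch only: the cast query is passed via --query. $CONDITIONS is required
# by Sqoop in free-form queries, and the single quotes keep the shell from
# expanding it. With -m 1 no --split-by column is needed.
sqoop import \
  --connect "jdbc:sqlserver://<host>:1433;database=<db>" \
  --username <user> \
  --password <password> \
  --query 'select CAST(col1 AS varchar(255)) AS col1, col2, col3, col4 from table_name where $CONDITIONS' \
  --target-dir /user/hdfs/table_name \
  -m 1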

