Convert encoded std::string from Base16 to Base10?


I have a std::string with a large integer encoded in Base16:

bbb91c1c95b656f386b19ab284b9c0f66598e7761cd71569734bb72b6a7153b77613a6cef8e63
e9bd9bb1e0e53a0fd8fa2162b160fcb7b461689afddf098bfc32300cf6808960127f1d9f0e287
f948257f7e0574b56585dd1efe1192d784b9c93f9c2215bd4867062ea30f034265374fa013ab4
5af06cd8554fd55f1c442c2ed

I want a std::string with a large integer encoded in Base10:

13182363340585954094154991955162141609757130565683854218475776626603716062690
50741824486137510938646762753180989129520441058729412931959771922633699694948
46611764803267065720664398942078304585998290003537553345030144535441671492050
01138054588415687622649540474976282005406232907125282540703919964112809484362
9

How do I convert the strings from Base16 to Base10?

Solution

Convert encoded std::string from Base16 to Base10?

The following should work for you. The code below shows you how to do it with C-style strings, which is easy to conceptualize. Your previous question at Convert CryptoPP::Integer to LPCTSTR has the references.

#include <iostream>
#include <string>
using namespace std;

#include "cryptlib.h"
#include "integer.h"
using namespace CryptoPP;

int main(int argc, char* argv[])
{
  string s2, s1 =
      "bbb91c1c95b656f386b19ab284b9c0f66598e7761cd71569734bb72b6a7153b77613a6cef8e63"
      "e9bd9bb1e0e53a0fd8fa2162b160fcb7b461689afddf098bfc32300cf6808960127f1d9f0e287"
      "f948257f7e0574b56585dd1efe1192d784b9c93f9c2215bd4867062ea30f034265374fa013ab4"
      "5af06cd8554fd55f1c442c2ed";

  // Append 'h' to indicate Base16
  // Integer n((s1 + "h").c_str());

  // Prepend '0x' to indicate Base16
  Integer n(("0x" + s1).c_str());

  // Convert to Base10
  s2 = IntToString<Integer>(n, 10);
  cout << s2 << endl;

  return 0;
}


The code above shows you how to do it with C-style strings, which is easy to conceptualize. Another way to do it uses a Crypto++ Pipeline to convert the ASCII string into a big-endian array of bytes.

#include <iostream>
#include <string>
using namespace std;

#include "cryptlib.h"
#include "integer.h"
#include "filters.h"
#include "hex.h"
using namespace CryptoPP;

int main(int argc, char* argv[])
{
  string s3, s2, s1 =
      "bbb91c1c95b656f386b19ab284b9c0f66598e7761cd71569734bb72b6a7153b77613a6cef8e63"
      "e9bd9bb1e0e53a0fd8fa2162b160fcb7b461689afddf098bfc32300cf6808960127f1d9f0e287"
      "f948257f7e0574b56585dd1efe1192d784b9c93f9c2215bd4867062ea30f034265374fa013ab4"
      "5af06cd8554fd55f1c442c2ed";

  // Use a HexDecoder to convert to big-endian array
  StringSource ss(s1, true, new HexDecoder(new StringSink(s2)));

  // Use big-endian array to construct n
  Integer n((const byte*)s2.data(), s2.size());

  // Convert to Base10
  s3 = IntToString<Integer>(n, 10);
  cout << s3 << endl;

  return 0;
}


Here's another way to perform the conversion using a Crypto++ Pipeline.

#include <iostream>
#include <string>
using namespace std;

#include "cryptlib.h"
#include "integer.h"
#include "filters.h"
#include "hex.h"
using namespace CryptoPP;

int main(int argc, char* argv[])
{
  string s2, s1 =
      "bbb91c1c95b656f386b19ab284b9c0f66598e7761cd71569734bb72b6a7153b77613a6cef8e63"
      "e9bd9bb1e0e53a0fd8fa2162b160fcb7b461689afddf098bfc32300cf6808960127f1d9f0e287"
      "f948257f7e0574b56585dd1efe1192d784b9c93f9c2215bd4867062ea30f034265374fa013ab4"
      "5af06cd8554fd55f1c442c2ed";

  // Use a source to convert to big-endian array
  StringSource ss(s1, true, new HexDecoder);

  // Use big-endian array to construct n
  Integer n;
  n.Decode(ss, ss.MaxRetrievable());

  // Convert to Base10
  s2 = IntToString<Integer>(n, 10);
  cout << s2 << endl;

  return 0;
}


If you are interested in the algorithm that converts the ASCII string into the byte array used for the internal representation, then see StringToInteger in integer.cpp. It repeatedly divides by the base (2, 8, 10, 16, etc.).
