Merge csv Files with TimeStamps
Question
Data File 1:
data_20150801.csv
Time Header Header Header Header
2015-08-01 07:00 14.4 14.4 14.4 68
2015-08-01 07:01 14.4 14.4 14.4 68
Data File 2:
data2_20150801.csv
Time Header Header
2015-08-01 00:00 90 12312
2015-08-01 00:01 232 13213
......
2015-08-01 07:00 1000 1500
2015-08-01 07:01 2312 1245
2015-08-01 07:02 1232 1232
2015-08-01 07:03 1231 1232
I'd like to merge those two .csv files to get a file that looks like this:
Time Header Header Header Header Header Header
2015-08-01 07:00 14.4 14.4 14.4 68 1000 1500
So basically I need to copy the rows from data2_ and insert them at the right time points in data_. I tried it manually with Notepad++, but the problem is that sometimes there's no entry for a given minute in data2_, so I'd need to find the missing timestep and skip that point manually.
I've done some things in Python, but I'm still a noob, so I lack the experience to know how to start tackling a problem like this.
I'm using a Mac and found the cat command, which combines the .csv files in a folder into one csv file. Is there a way to do this line by line while preserving the timestamps?
Answer
You could use Python Pandas to do this quite easily, though it's probably overengineering:
import pandas as pd

d_one = pd.read_csv('data.csv', sep=',', engine='python', header=0)
d_two = pd.read_csv('data2.csv', sep=',', engine='python', header=0)
# merge on the shared timestamp column (named 'Time' in the sample files)
d_three = pd.merge(d_one, d_two, on='Time')
d_three.to_csv('output.csv', sep=',', index=False)
I haven't had the chance to test this code, but it should do what you want. You may need to swap commas for tabs (depending on the file), etc.
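One note on the missing-minute problem from the question: a plain merge keeps only timestamps present in both files. Passing how='left' to pd.merge keeps every row of the first file and fills the gaps with NaN instead of dropping them. A minimal sketch, using made-up inline data in place of the real files (column names A/B/C/D are placeholders for the OP's headers):

```python
import io
import pandas as pd

# Hypothetical inline samples standing in for data.csv / data2.csv;
# in practice you would read the real files with pd.read_csv(path).
data1 = io.StringIO(
    "Time,A,B\n"
    "2015-08-01 07:00,14.4,68\n"
    "2015-08-01 07:01,14.4,68\n"
    "2015-08-01 07:02,14.5,67\n"
)
data2 = io.StringIO(
    "Time,C,D\n"
    "2015-08-01 07:00,1000,1500\n"
    "2015-08-01 07:02,1232,1232\n"  # note: 07:01 is missing here
)

d_one = pd.read_csv(data1)
d_two = pd.read_csv(data2)

# how='left' keeps every row of d_one; minutes absent from d_two
# come through as NaN rather than being skipped.
merged = pd.merge(d_one, d_two, on='Time', how='left')
print(merged)
```

The 07:01 row survives the merge with empty C and D columns, so no manual gap-hunting is needed.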