No space on my local disk to save .asc files totaling around 300 GB

I have written a Python script to download .tar files in ASCII format, and I tried to extract them into a local folder. But my local disk does not have enough space to hold all the files, which total around 300 GB (data files from 2005 to 2021). Is there any way to store these files in the cloud or in a database for free? If so, I could read the files directly from the cloud or the database in my Python code. Any leads would be highly appreciated.

Thanks in advance.

Hello,

I am not aware of any free options for storing that volume of data. One of the larger options I have seen is Zenodo, which has a 50 GB maximum per deposit (you could split your 300 GB across multiple parts):
https://zenodo.org/deposit/new

You might just leave the files compressed and iterate over the archive programmatically, decompressing on the fly.
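
For example, Python's tarfile module lets you read the members of an archive as file-like objects without ever writing the extracted files to disk. This is only a minimal sketch; the archive name "radolan_2005.tar" is a placeholder for whatever your files are actually called:

    import io
    import tarfile

    # Sketch: stream .asc members out of a tar archive without extracting
    # anything to disk. "radolan_2005.tar" is a placeholder filename.
    with tarfile.open("radolan_2005.tar") as archive:
        for member in archive:                 # iterate over archive entries
            if not member.name.endswith(".asc"):
                continue
            raw = archive.extractfile(member)  # binary file-like object, read on the fly
            if raw is None:                    # skip directories and special entries
                continue
            with io.TextIOWrapper(raw, encoding="ascii") as f:
                for line in f:
                    ...                        # process each line; nothing hits the disk

tarfile.open in the default read mode also transparently handles gzip/bz2/xz compression, so the archives never need to be fully unpacked.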

Another question: why do you need to use ASCII files, which take up almost 10 to 12 times the space of the equivalent binary files?

I don’t know exactly what you are doing, but another way to tackle such a problem would be to download one file, do whatever you need to do with it, delete it, then download the next file, and so on. That way you never exhaust your local disk. Of course, this does not work in every situation; it depends on what you are doing, but it is one possible approach.
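
In code, that loop could look roughly like this. BASE_URL and the "radolan_<year>.tar" naming scheme are assumptions, not the real server layout; substitute the source you are actually downloading from:

    import os
    import urllib.request

    # Sketch of the download/process/delete loop; BASE_URL and the
    # "radolan_<year>.tar" naming are placeholders, not the real server.
    BASE_URL = "https://example.com/radolan"

    def process(path):
        # whatever analysis you need to run on one archive
        ...

    for year in range(2005, 2022):
        local = f"radolan_{year}.tar"
        urllib.request.urlretrieve(f"{BASE_URL}/{local}", local)  # fetch one archive
        try:
            process(local)   # work on it while it is the only file on disk
        finally:
            os.remove(local) # free the space before fetching the next year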

I need to use the ASCII format. It is part of my thesis.

Hi, I am Shahidh from AP, India. I would like to talk to you about some doubts regarding the thesis you are doing and also about this RADOLAN data. I am doing a similar kind of project to yours, I guess, so a little help would be appreciated.
Thank you.

What is your doubt? Please post it here. If I can help, I definitely will.