Copying only 10 GB of files from a folder that contains, say, 100 GB using Python
Question:
I have a script that scans all the files in a folder and copies every file with the .txt extension to another folder. The problem is that most of the time there is over 100 GB of files in that one folder, and I would like to copy only a specific total amount, say only 10 GB of .txt files. I just cannot seem to find from searching how to achieve this.
Answers:
You could do something like this:
import os
import shutil

def copy_up_to_10GB_of_files(source_directory: str, destination_directory: str) -> None:
    GB: int = 1 << 30           # 1 GiB in bytes
    limit: int = 10 * GB        # the 10 GB budget
    total_size: int = 0
    for filename in os.listdir(source_directory):
        if not filename.endswith(".txt"):
            continue  # only .txt files, per the question
        src_path: str = os.path.join(source_directory, filename)
        if not os.path.isfile(src_path):
            continue  # skip subdirectories
        size: int = os.path.getsize(src_path)
        if total_size + size > limit:
            return  # copying this file would exceed the budget
        shutil.copy(src_path, os.path.join(destination_directory, filename))
        total_size += size