How to extract all .rar files inside a folder (gdrive) from google colab?
To use a Python variable with !unrar, you need to interpolate it like this:
file_path = '/content/drive/MyDrive/Colab Notebooks/123.rar'
!unrar x '$file_path'
So in your situation you would have the following code:
import os

data_path = "/content/gdrive/My Drive/folder"
for file in os.scandir(data_path):
    !unrar x "{file.path}" "/content/drive/path/output_folder/"

Note the indentation: the !unrar line must be inside the loop body, and {file.path} (rather than $file.path) is needed because IPython's $-expansion only handles simple variable names, not attribute access.
How To Extract Password Protected Rar File Using Patool On Google Colab
For 7z, you can use
! 7z e -pPASSWORD "path/to/file.zip"
to extract a password-protected archive. Note that there is no space between -p and the password itself.
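If you are building that 7z invocation from Python variables, a small helper keeps the flag quirks in one place. This is a sketch; the helper name seven_zip_cmd is my own, but the 7z flag behavior it encodes (x vs e, and the fused -p/-o flags) matches the 7z command line:

```python
def seven_zip_cmd(archive, password=None, out_dir=None, keep_paths=True):
    """Build an argument list for 7z extraction."""
    # "x" preserves the archive's directory structure;
    # "e" flattens every file into the output directory.
    cmd = ["7z", "x" if keep_paths else "e", archive]
    if password is not None:
        cmd.append(f"-p{password}")  # -p takes the password with no space
    if out_dir is not None:
        cmd.append(f"-o{out_dir}")   # -o likewise fuses with the directory
    return cmd
```

The resulting list can be passed straight to subprocess.run, avoiding shell-quoting issues with spaces in Drive paths.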
Unzip failed to finish in Google Colab
As @BobSmith said, I moved my entire dataset to Google Colab's local disk first and extracted it using:
!unzip -u -q /content/syn_train_3.zip
and for .rar files, using unrar:
!unrar e real_train_500_2.rar train_dir
The extraction proved much faster. I then split the dataset into .npy files and saved them back to Drive.
I found that Google Colab accesses Drive through Google Drive File Stream, much like Backup and Sync on your desktop, so waiting for a dataset to sync between Colab and Drive can be painful. Careful: don't let files appearing under "/drive/My Drive" in Google Colab fool you into thinking they are already saved to Google Drive; syncing takes time!