Given a pandas DataFrame that contains some data, what is the best way to store this data in Firebase?
Should I convert the DataFrame to a local file (e.g. .csv, .txt) and then upload it to Firebase Storage, or is it also possible to store the pandas DataFrame directly without conversion? Or are there better practices?
Update 01/03 - So far I've come up with the solution below, which requires writing a csv file locally, reading it back in, uploading it, and then deleting the local file. However, I doubt this is the most efficient method, so I would like to know whether it can be done better and faster.
import os
import firebase_admin
from firebase_admin import db, storage

cred = firebase_admin.credentials.Certificate(cert_json)
app = firebase_admin.initialize_app(cred, config)
bucket = storage.bucket(app=app)


def upload_df(df, data_id):
    """Upload a Dataframe as a csv to Firebase Storage
    :return: storage_ref
    """
    # Storage location + extension
    storage_ref = data_id + ".csv"

    # Store locally
    df.to_csv(data_id)

    # Upload to Firebase Storage
    blob = bucket.blob(storage_ref)
    with open(data_id, 'rb') as local_file:
        blob.upload_from_file(local_file)

    # Delete locally
    os.remove(data_id)

    return storage_ref
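One variation I've been considering, to avoid the temporary file entirely, is uploading the CSV text straight from memory. This is only a rough, untested sketch: it assumes df.to_csv() called without a path returns the CSV as a string, and that the blob object exposed by the firebase_admin storage bucket supports upload_from_string (as in the underlying google-cloud-storage client). Would something like this be a better approach?

def upload_df_in_memory(df, data_id):
    """Upload a Dataframe as a csv to Firebase Storage without a local temp file
    :return: storage_ref
    """
    # Storage location + extension
    storage_ref = data_id + ".csv"

    # Serialise the DataFrame to a CSV string in memory instead of writing a file
    csv_data = df.to_csv()

    # Upload the string directly to Firebase Storage
    blob = bucket.blob(storage_ref)
    blob.upload_from_string(csv_data, content_type="text/csv")

    return storage_ref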