Facing MemoryError while reading data from Google BigQuery
I am trying to read 1.8 million rows of data from Google BigQuery in Python using pandas_gbq, but I am running into a MemoryError.
import pandas_gbq as pgq

# read the whole view into one DataFrame; env_config supplies the project id and key file path
a = pgq.read_gbq(viewsql, project_id=env_config['projectid'],
                 private_key=env_config['service_account_key_file'] + ".json")
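For comparison, here is a minimal sketch of pulling the same result set page by page rather than in one shot, assuming a reasonably recent google-cloud-bigquery client library is installed; viewsql and env_config are the names from the snippet above, and process() is a placeholder for whatever per-chunk work is needed:

from google.cloud import bigquery

# Stream the query result page by page instead of materialising all
# 1.8M rows in a single DataFrame at once.
client = bigquery.Client.from_service_account_json(
    env_config['service_account_key_file'] + ".json",
    project=env_config['projectid'],
)
rows = client.query(viewsql).result(page_size=100000)

for chunk in rows.to_dataframe_iterable():
    # each chunk is a pandas DataFrame holding one page of rows
    process(chunk)  # placeholder: aggregate, write to disk, etc.

Because each page can be discarded once processed, peak memory stays around the size of a single chunk rather than the whole result set.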
@anky_91 can we do the same for the BigQuery table?
– HarshHErtZ
Aug 27 at 11:42
I think so. You can look it up on the web, or this link might help: stackoverflow.com/questions/34685577/…
– anky_91
Aug 27 at 11:49
You can read in chunks in such cases: reader = pd.read_csv('filename', chunksize=5000), where chunksize is however many rows you want per chunk (e.g. 5000).
– anky_91
Aug 27 at 9:38
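To make the suggestion above concrete, here is a short sketch of the chunked read_csv pattern ('data.csv' and the chunk size of 5000 are placeholder values): with chunksize set, read_csv returns an iterator of DataFrames instead of one big frame, so each chunk can be processed and released before the next is loaded.

import pandas as pd

total_rows = 0
# chunksize=5000 yields DataFrames of up to 5000 rows each
for chunk in pd.read_csv('data.csv', chunksize=5000):
    total_rows += len(chunk)  # replace with whatever per-chunk processing is needed
print(total_rows)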