Facing Memory Error while reading the file from google bigquery




I am trying to read 1.8M rows of data from Google BigQuery in Python, but I am running into a MemoryError. I am using pandas_gbq for this.


import pandas_gbq as pgq

a = pgq.read_gbq(
    viewsql,
    project_id=env_config['projectid'],
    private_key=env_config['service_account_key_file'] + ".json",
)





You can read the file in chunks in such cases: df = pd.read_csv('filename', chunksize=5000) (use whatever chunk size you want, e.g. 5000).
– anky_91
Aug 27 at 9:38
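
A minimal sketch of the chunked read suggested in the comment above, assuming a local CSV file; the file name 'data.csv' and the per-chunk processing are placeholders:

import pandas as pd

# Read the file in chunks of 5,000 rows instead of loading everything at once.
# 'data.csv' is a placeholder file name for illustration.
row_count = 0
for chunk in pd.read_csv('data.csv', chunksize=5000):
    row_count += len(chunk)  # replace with real per-chunk processing
print(row_count)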





@anky_91 Can we do the same for the BigQuery table?
– HarshHErtZ
Aug 27 at 11:42






I think so. You can look it up on the web, or this link might help: stackoverflow.com/questions/34685577/…
– anky_91
Aug 27 at 11:49
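
For the BigQuery case itself, one option (not shown in the thread, so treat it as an assumption rather than the linked answer) is to skip pandas_gbq and page through the query results with the google-cloud-bigquery client, converting one page at a time into a DataFrame. The query string and key file path below are placeholders mirroring the question:

from google.cloud import bigquery

# Placeholder query and key file path, standing in for viewsql and
# env_config['service_account_key_file'] + ".json" from the question.
viewsql = "SELECT * FROM `my_project.my_dataset.my_view`"
client = bigquery.Client.from_service_account_json("service_account_key.json")

query_job = client.query(viewsql)

# Fetch the result set page by page (50,000 rows per page here) instead of
# materialising all 1.8M rows in one DataFrame; peak memory stays roughly
# bounded by the page size.
for chunk in query_job.result(page_size=50000).to_dataframe_iterable():
    print(len(chunk))  # replace with real per-chunk processing

to_dataframe_iterable() needs a reasonably recent version of google-cloud-bigquery; on older versions, iterating query_job.result(page_size=...) row by row gives the same effect without building intermediate DataFrames.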









