
MemoryError odoo.db.dump #4

Open
sebalix opened this issue Oct 3, 2017 · 4 comments

sebalix commented Oct 3, 2017

From @eqms on October 7, 2016 16:30

Hi,

With small databases the dump function works very well, but when I try to back up a bigger system (the zip file downloaded through the front end is 1.6 GB), I get this memory error:

```
Traceback (most recent call last):
  File "./backup_odoorpc.py", line 62, in <module>
    dump = odoo.db.dump(mydbpwd, mydb, 'zip')
  File "/usr/local/lib/python2.7/dist-packages/odoorpc/db.py", line 131, in dump
    'args': args})
  File "/usr/local/lib/python2.7/dist-packages/odoorpc/odoo.py", line 264, in json
    data = self._connector.proxy_json(url, params)
  File "/usr/local/lib/python2.7/dist-packages/odoorpc/rpc/jsonrpclib.py", line 94, in __call__
    return json.load(decode_data(response))
  File "/usr/lib/python2.7/json/__init__.py", line 290, in load
    **kw)
  File "/usr/lib/python2.7/json/__init__.py", line 338, in loads
    return _default_decoder.decode(s)
  File "/usr/lib/python2.7/json/decoder.py", line 366, in decode
    obj, end = self.raw_decode(s, idx=_w(s, 0).end())
  File "/usr/lib/python2.7/json/decoder.py", line 382, in raw_decode
    obj, end = self.scan_once(s, idx)
MemoryError
```

Is there a way to increase the memory?

Copied from original issue: osiell/odoorpc#31


sebalix commented Oct 3, 2017

Hi,

Are you running on a 32-bit system?
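For instance, a quick way to check the interpreter itself rather than just the kernel (what matters for a `MemoryError` is the process: a 32-bit process tops out at 2-4 GB of address space regardless of the machine). Both modules are stdlib:

```python
# Sketch: report whether the running Python is 32- or 64-bit.
import platform
import struct

print(platform.architecture()[0])   # '32bit' or '64bit'
print(struct.calcsize("P") * 8)     # pointer width in bits: 32 or 64
```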


sebalix commented Oct 3, 2017

From @eqms on October 10, 2016 8:32

No

```
$ uname --all
Linux myodoo 3.16.0-4-amd64 #1 SMP Debian 3.16.36-1+deb8u1 (2016-09-03) x86_64 GNU/Linux

Distributor ID: Debian
Description:    Debian GNU/Linux 8.5 (jessie)
Release:        8.5
Codename:       jessie
```


sebalix commented Oct 3, 2017

It is difficult to say what is happening; a JSON-RPC query may simply not be well suited to such a volume of data (the whole dump comes back base64-encoded inside a JSON string, which has to be decoded in memory). The web client instead does a standard HTTP query on /web/database/backup (on 8.0), but that approach requires a token. So for now I have no immediate solution... Maybe test on Python 3 to see whether the memory consumption is better?
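As a possible workaround on later versions, a sketch of the plain-HTTP route: on Odoo 9.0+ the /web/database/backup endpoint accepts the master password directly as a form field (no token), so the zip can be streamed to disk in chunks without going through JSON-RPC at all. The field names below follow the 9.0+ database manager, and the URL/credentials are placeholders:

```python
# Sketch: stream the backup over plain HTTP instead of JSON-RPC, so the
# dump is written to disk chunk by chunk and never held as one big string.
import requests

def stream_backup(base_url, master_pwd, db_name, dest_path):
    resp = requests.post(
        base_url + "/web/database/backup",
        data={
            "master_pwd": master_pwd,      # database master password
            "name": db_name,               # database to dump
            "backup_format": "zip",
        },
        stream=True,
    )
    resp.raise_for_status()
    with open(dest_path, "wb") as fh:
        for chunk in resp.iter_content(chunk_size=1024 * 1024):
            fh.write(chunk)

stream_backup("http://localhost:8069", "admin_passwd", "mydb", "mydb.zip")
```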


azawawi commented Jan 29, 2019

Confirmed: Python 3 memory consumption is much better. Our backup process started taking more CPU time and 16x to 18x the zipped backup's size in memory as the backup grew (it was at ~185 MB when the problem occurred). Part of this is due to the inefficient memory model of Python 2's json library.

On 64-bit Python 2.7.12, memory usage reached 3.2 GB for a 185 MB zipped backup.
On 64-bit Python 3.5.2, it takes about 1.8 GB for the same workload. Not optimal, but it is a solution.

On the backup machine, remember to increase the swap space to handle future spikes in the backup process's memory usage, and upgrade to 64-bit Python 3.
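If you stay with odoorpc, note that db.dump() returns a file-like object, so at least the copy to disk can be chunked rather than read in one go (the JSON decode itself still happens in memory, which is the step measured above). A minimal sketch, with placeholder host and credentials:

```python
# Sketch: write the decoded dump to disk in fixed-size chunks via
# shutil.copyfileobj instead of calling .read() on the whole object.
import shutil

import odoorpc

odoo = odoorpc.ODOO("localhost", port=8069)
dump = odoo.db.dump("admin_passwd", "mydb", "zip")  # file-like object
with open("mydb.zip", "wb") as fh:
    shutil.copyfileobj(dump, fh, length=1024 * 1024)
```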

Please close this issue. Thanks.
