
Big memory usage in Debian vs Ubuntu when mounting 120 DwarFS images #154

Open · godane opened this issue Jul 20, 2023 · 16 comments

godane commented Jul 20, 2023

So I noticed that Debian uses a lot more RAM than Ubuntu when running the same 120 dwarfs mounts. Ubuntu is at 418 MB, but Debian is at 1.69 GB. My guess is that the Debian kernel is the problem, because FUSE is a kernel module there and not built into the kernel vmlinuz image.

Here is the Debian RAM usage: [screenshot screen03]

Here is the Ubuntu RAM usage: [screenshot screen04]

The Debian kernel is 6.1.0-10-amd64 and the Ubuntu kernel is 6.2.0-25-generic. I hope this helps.
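
(A hedged aside, not from the original report: whether FUSE is a loadable module or built in can be read directly from the shipped kernel config on each machine.)

$ grep CONFIG_FUSE_FS /boot/config-$(uname -r)   # "=m" means loadable module, "=y" means built into vmlinuz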

mhx commented Jul 20, 2023

Hi!

Am I understanding correctly that your use case is to run 120 separate dwarfs instances?

Have you accessed any files in the mounted file system images yet? Or is this just after you've mounted them?

Maybe you can elaborate on your use case and why you're running that many instances?

Depending on how big the underlying DwarFS images are and also depending on the access pattern, I'd expect each instance to use up to the configured cachesize (512 MB by default). So with 120 instances, assuming the default cache size, somewhat large file system images, and all of the mounts actually being accessed, I'd expect the total amount of memory to eventually approach 120 × 512 MB, i.e. around 60 GB. :)
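
(A minimal sketch, not taken from this thread, of capping the per-instance cache so many mounts fit a memory budget; the image and mount point names are placeholders.)

$ dwarfs media01.dwarfs /mnt/media01 -o cachesize=64m   # with 120 such mounts, cache memory is roughly 120 x 64 MB total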

godane commented Jul 20, 2023

I'm setting up my live SSD to have videos and PDFs in modules. There are no files opened on Debian or Ubuntu, they are just mounted. Also, the cachesize is set to 32 MB, so it's even less than the default.

mhx commented Jul 20, 2023

I'm setting up my live SSD to have videos and PDFs in modules.

Why?

Also, what "modules"?

godane commented Jul 20, 2023

I made a fork of Tomas-M's linux-live scripts; it's based on that. Modules/bundles are just what I call any squashfs/dwarfs files that are added to the root (/) filesystem.

The 'why' is because I want to help Flash Drives For Freedom on the software side: https://flashdrivesforfreedom.org/

I think this filesystem could be helpful in hiding data for them, because the officials are not likely to have dwarfs.

mhx commented Jul 20, 2023

I think this filesystem could be helpful in hiding data for them, because the officials are not likely to have dwarfs.

I'm not convinced that this is a good idea.

Why not just use proper encryption?

You'll even get write support and, more importantly, your data is actually safe.
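
(For illustration only, with a hypothetical device name: the "proper encryption" route could be a plain LUKS container holding the media.)

$ cryptsetup luksFormat /dev/sdX2        # /dev/sdX2 is a placeholder partition
$ cryptsetup open /dev/sdX2 media
$ mkfs.ext4 /dev/mapper/media
$ mount /dev/mapper/media /mnt/media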

All that aside, why does it have to be 120 "modules" then? What's the benefit over just one?

godane commented Jul 20, 2023

It's because I'm trying to organize the media collection going into this. I don't want everything to be one blob, because that blob would also have to fit onto 16/32/64 GB sticks. It's a mess as it is.

I have write support using overlayfs to save changes. I use this filesystem as my daily driver, so everything is just saved to the changes and home folders.
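
(A hedged illustration, not taken from the thread: an overlayfs setup for write support over a read-only module typically looks like this, with placeholder paths.)

$ mount -t overlay overlay \
      -o lowerdir=/mnt/module,upperdir=/changes/upper,workdir=/changes/work \
      /merged   # lowerdir is the read-only dwarfs/squashfs mount; upper/work hold the changes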

Second, I think the best they're going to get is security through obscurity. Booting a live USB with media, without complicated things, is the best I can do.

mhx commented Jul 20, 2023

Coming back to the original issue, I don't see why the same binary would use different amounts of memory under the same circumstances. The only thing that comes to my mind right now would be different stack sizes. Would you mind running

$ ulimit -a

on both machines and posting the output?

godane commented Jul 20, 2023

Debian: [screenshot screen05]

Ubuntu: [screenshot screen06]

NOTE: The xubuntu hostname appears because that is my main system's hostname when building Debian.

mhx commented Jul 20, 2023

Yeah, those look pretty identical to me. I still have no idea where the difference in RSS comes from.

One thing you could try is to just use the dwarfs binary from the regular tarball instead of the universal binary. Unpacking the universal binary is going to add to the memory footprint. Doing this on my machine, I see the RSS drop from 13 MB to 7 MB. I've also tried the universal binary on an Ubuntu (22.04) box and I'm seeing pretty much the same RSS as on my main machine.
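
(A rough way, assumed rather than quoted from the thread, to compare the RSS numbers being discussed is to sum them over all running dwarfs processes.)

$ ps -C dwarfs -o rss= | awk '{sum += $1} END {printf "%.0f MB\n", sum/1024}'   # ps reports RSS in KiB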

godane commented Jul 21, 2023

Nothing really changed with RAM usage on Debian 12.

[screenshot screen07]

Here is the kernel config diff between the Debian and Ubuntu kernels, using this command: diff -aur config-6.1.0-10-amd64 config-6.2.0-24-generic
[attachment: diff.txt]

I think looking at the kernel config may give more insight into why this is happening.
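
(A hedged follow-up on that idea: the diff can be narrowed to options that plausibly affect FUSE or memory behaviour, using the same config file names as above.)

$ diff -u config-6.1.0-10-amd64 config-6.2.0-24-generic | grep -iE 'fuse|hugepage|slab|swap'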

mhx commented Jul 22, 2023

Do you notice the same thing with other programs? Or other fuse drivers?

godane commented Jul 22, 2023

So it looks like dwarfs2 does better on memory on Debian than dwarfs with fuse3. The bad news is that it's still at 1.26 GB, versus 420 MB on Ubuntu with dwarfs using fuse3.

[screenshot screen08]

@alexmyczko

Are you using the binaries or building from source? Same options? Same compiler?

godane commented Apr 13, 2024

I was using the static binary at the time. I have since moved back to squashfs for my live OS, and I'm also using Ubuntu now. From what I can remember, this is most likely a Debian bug, but I'm not sure.

My best guess for someone testing this is to make 100+ dwarfs images and mount them all on Debian to test the RAM usage. Hope this helps.
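
(A reproduction sketch along those lines; paths, counts, and naming are made up.)

#!/bin/bash
# build and mount 120 small images, then compare memory usage on Debian vs Ubuntu
for i in $(seq 1 120); do
    mkdwarfs -i /srv/media/part$i -o /srv/images/part$i.dwarfs
    mkdir -p /mnt/part$i
    dwarfs /srv/images/part$i.dwarfs /mnt/part$i
done
free -m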

@alexmyczko

Once I have the ITP done, I will consider testing this.

mhx commented May 4, 2024

The ideas outlined in #219 will likely help with this issue by allowing multiple images to be mounted in the same process.
