Huge memory usage of nuxt-link #1750

Closed
AndruxaSazonov opened this issue Sep 28, 2017 · 23 comments
AndruxaSazonov commented Sep 28, 2017

Sorry for the general explanation of the case, but I'm struggling with SSR memory leaks (probably in my own code). The project has only bootstrap-vue and axios as modules, no other Nuxt dependencies.

The actual code is (without it, there seem to be no leaks):

async asyncData ({ app, store }) {
    const [, fetchSecond] = await storeChecker(app,
      [
        {
          state: 'FirstData',
          check: current => current && (current.length > 0),
          save: (store, data) => store.commit('FirstData', Object.assign({}, data.data)),
          promiser: axios => axios.get('/FirstData')
        },
        {
          promiser: axios => axios.get('/SecondData?' + createFilter({
            filter: {
              limit: 20,
              order: 'createdAt desc',
              fields: ['name', 'id']
            }
          }))
        }
      ]
    )
    // next code is only grouping entities, tried without it with no luck
    const categories = convertCategoriesArrays([
      item => '/first/' + item.id,
      item => '/second/' + item.id
    ], store.state.FirstData)
    return {
      firstDataType1: categories[0],
      firstDataType2: categories[1],
      secondData: fetchSecond.data
    }
  }

and where storeChecker is defined as follows:

function storeChecker (app, promisesOptions) {
  const awaiters = []

  const axios = app.$axios
  const store = app.store

  for (const option of promisesOptions) {
    if (option.state) {
      if (option.check(store.state[option.state])) {
        awaiters.push(store.state[option.state])
      } else {
        awaiters.push(option.promiser(axios).then(data => option.save(store, data)))
      }
    } else {
      awaiters.push(option.promiser(axios))
    }
  }

  return Promise.all(awaiters)
}
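To make the intended behaviour of storeChecker concrete, here is the same logic restated as a self-contained sketch with hypothetical mock app/store objects (the mocks are not from the issue): an option whose check passes resolves straight from the store without a request, while the others hit the network.

```javascript
// Hypothetical mocks standing in for Nuxt's app.$axios and app.store.
const mockApp = {
  $axios: { get: url => Promise.resolve({ data: 'fetched:' + url }) },
  store: {
    state: { FirstData: [{ id: 1 }] }, // already populated, so check passes
    commit (key, data) { this.state[key] = data }
  }
}

function storeChecker (app, promisesOptions) {
  const axios = app.$axios
  const store = app.store
  const awaiters = promisesOptions.map(option => {
    if (option.state && option.check(store.state[option.state])) {
      return store.state[option.state] // reuse cached state, no request made
    }
    if (option.state) {
      // note: this branch resolves to whatever option.save returns,
      // matching the original code; the caller reads the value from the store
      return option.promiser(axios).then(data => option.save(store, data))
    }
    return option.promiser(axios) // plain fetch, no caching
  })
  return Promise.all(awaiters)
}

storeChecker(mockApp, [
  { state: 'FirstData',
    check: current => current && current.length > 0,
    save: (store, data) => store.commit('FirstData', data.data),
    promiser: axios => axios.get('/FirstData') },
  { promiser: axios => axios.get('/SecondData') }
]).then(([first, second]) => {
  console.log(first)       // cached array from the store
  console.log(second.data) // 'fetched:/SecondData'
})
```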

Starting two clustered pm2 instances and benchmarking with various ApacheBench runs, such as

ab -n 100 -c 25 -r -k http://localhost:3000/

I see (using pm2 monit or similar) memory growing in a linear manner. Profiling one instance and comparing heap snapshots gives a picture where the context is copied on every request... Between one request and 20 requests there is a difference of 20 saved data strings in memory.

(heap dump screenshots)

I have tried to avoid closures by putting all the code in the asyncData method, but memory consumption did not change. Searching through the code for a few days didn't give me a hint.

I am using the latest Nuxt version (rc11). I guess the SSR context is possibly shared via a global between requests, but I cannot prove it. The nearest line in the profile is Vue.use, but that isn't much of a hint either.

Any thoughts are appreciated. This is a real problem for me on the server side, since 50 requests (10 concurrent) need ~500 MB of memory each time, and the memory usage accumulates.

This question is available on Nuxt.js community (#c1572)

igtm commented Sep 28, 2017

Same problem!

As a workaround, I helplessly installed graceful-cluster to forcibly shut down the leaking processes.


AndruxaSazonov commented Sep 29, 2017

> installed graceful-cluster to shut down leaking processes forcedly as workaround

yes, it may be a solution for the time being. pm2 also restarts processes when --max-old-space-size is exceeded for a Node instance, but with a signal and laggy performance at the edge of the limit. graceful-cluster is better in this case, thanks.

But I haven't lost hope of finding the root cause. Heap dumps point me to the regenerator-runtime package, and these words in its code:

// Rather than returning an object with a next method, we keep
// things simple and return the next function itself.

and I see many next()'s in the dump comparison... but I don't know what to do about it. I tried changing the Babel presets with no effect. Probably this issue can shed light on what is happening... Investigating.

In addition, I have tried plugging in the idle-gc module for forced GC... it does not collect. BTW, it would be nice if nuxt.config.js had a 'startup()' method (as Meteor has).


AndruxaSazonov commented Oct 2, 2017

Digging through the Vue SSR issues didn't help. I tried reducing the LRU cache size and age, but it's not the component cache, since the memory space is not reused (it accumulates on each request). I also did some experiments with runInNewContext: 'once', with no luck.

Maybe at some point I will find more than an hour to profile and report back what the heck is going on.
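For reference, the runInNewContext experiment mentioned above is a one-line config change. A sketch of the relevant nuxt.config.js fragment, which passes bundle-renderer options through to vue-server-renderer:

```javascript
// nuxt.config.js — bundleRenderer options are forwarded to
// vue-server-renderer's createBundleRenderer.
module.exports = {
  render: {
    bundleRenderer: {
      // Evaluate the server bundle in a fresh V8 context only once,
      // instead of on every request (the default in older versions).
      runInNewContext: 'once'
    }
  }
}
```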


AndruxaSazonov commented Oct 2, 2017

Okay. Comparing the heaps possibly cracked the case. The problem is in <nuxt-link/>. I have a very big catalog, which was rendered with Nuxt (router-) links.

According to this, vue-router copies the previous route -- during SSR too.

When I changed all nuxt-links to plain HTML anchors, the additional memory usage went back to the initial values. Possibly this question should be marked as an issue. I have also noticed that the event loop speeds up at least three to four times, as does the time of each request!

(heap comparison screenshots)

@AndruxaSazonov (Author)

It seems that this issue, vuejs/vue-router#1279, sheds light on what happens.

I will try to update vue-router manually, if possible, and report back. With a reproduction or a solution :)

@AndruxaSazonov (Author)

Made separate pages for the different cases. There are actually two separate aspects here:

  1. Large memory usage with nuxt-link (vue-router history mode on the server side), but it doesn't leak;
  2. A combination of a data fetch/creation promise whose result then goes to nuxt-links in a closure. This prevents the process memory size from returning to its initial values.

The repository here: https://github.com/AndruxaSazonov/leak-repro


igtm commented Oct 3, 2017

I tried your repo and reproduced the memory leak and the high CPU usage.
So <nuxt-link /> is the culprit! Great job!
You should change the title of this issue :)

but why??

@AndruxaSazonov (Author)

> but why??

Don't know. The router in Nuxt contains a reference to the app, and rendering a router-link may use some child data of the context. This is a hypothesis, nothing more.

@AndruxaSazonov AndruxaSazonov changed the title [Question] What may cause a leak? Huge memory usage of nuxt-link Oct 4, 2017
@AndruxaSazonov (Author)

Looking through the Nuxt code didn't help. It is probably not a Nuxt problem but a Vue SSR one. I have no time for now to check router-link in plain SSR.

One more note: changing nuxt-link to router-link gives the same picture, hence the thoughts above.


AndruxaSazonov commented Oct 4, 2017

More and more interesting!! Please check some additions in the repro:
AndruxaSazonov/leak-repro@119b1ef

It seems it is not actually the link: the rendering itself copies the context (if it is copied at all). The problem appears when you nest components!... :(

And why is $router available in the included component...


AndruxaSazonov commented Oct 4, 2017

Found time to test on vue-ssr-boilerplate with the latest dependency set in package.json... better if I hadn't :/

The same story with the behaviour of router-link versus an HTML anchor in the src/views/Home.vue tests... Anchors are 14 times faster.

This problem is a show-stopper in my projects in any case, since I have a very large number of internal site links on each page (it is a catalog).

I will use manual hooks on anchors as a workaround for the time being. And graceful-cluster too.
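The "manual hooks on anchors" workaround could look something like the following sketch (the wiring is an assumption, not code from the issue): one delegated click listener that routes internal links through router.push while leaving external links to the browser. The `isInternalLink` helper and the `window.$nuxt` access are hypothetical names for illustration.

```javascript
// Decide whether an href points inside the site and should be handled
// client-side by the router. `origin` is the site origin, e.g.
// 'https://example.com' (hypothetical).
function isInternalLink (href, origin) {
  if (!href || href.startsWith('#')) return false
  if (href.startsWith('/')) return true
  return href === origin || href.startsWith(origin + '/')
}

// Browser-only wiring: a single delegated listener instead of one
// <nuxt-link> component instance per link.
if (typeof document !== 'undefined') {
  document.addEventListener('click', event => {
    const anchor = event.target.closest('a')
    if (!anchor) return
    const href = anchor.getAttribute('href')
    if (isInternalLink(href, window.location.origin)) {
      event.preventDefault()
      // Assumes the Nuxt app instance is exposed on window (hypothetical).
      window.$nuxt.$router.push(href)
    }
  })
}
```

The trade-off: plain anchors render cheaply during SSR, but you give up prefetching and active-class handling that nuxt-link provides.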

@arenddeboer

Thanks @AndruxaSazonov for your detailed description of this issue. Did you ever find a solution?
Adding 600 router-links to my app brings it down very quickly.

@arenddeboer

Update: found the issue to be unrelated to router-link; adding many router-links just multiplies the leak n times. The culprit was VeeValidate.


ryouaki commented Aug 24, 2018

I got the same issue, but no solution.

@arenddeboer

Finding memory leaks is one of the more difficult tasks when working with long-running programs.
One of the best ways to see which types of objects increase over time is taking heap dumps, as the original poster of this issue has done. There are many guides on how to do this, like this one: https://marmelab.com/blog/2018/04/03/how-to-track-and-fix-memory-leak-with-nodejs.html

In Nuxt.js it will probably be an ever-increasing number of Vue instances, which might not shed much light on the core issue. My advice would be to first find a way to quickly test the issue. This can be done with a tool like ApacheBench (again, as described in this issue) to send many requests per second to your server, while watching memory consumption over time with, for instance, htop -p <your node process id>, or a Node.js memory-logging module from npm. Make sure you disable caching, as this can prevent the leak from showing up. Next, disable third-party plugins one at a time and see if the problem disappears. If not, disable your own plugins or components.

I found this approach more practical than sifting through heap dumps. If you use git, of course first create a new branch, as your modifications to the project can quickly add up.


AndruxaSazonov commented Aug 26, 2018 via email


ryouaki commented Aug 27, 2018

Could you please explain how to fix this issue?


besnikh commented Aug 28, 2018

@AndruxaSazonov I've read most of your comments about the leak and the steps you took to find it, since I am now facing the same problem in production.

I am using Kubernetes on Google Cloud, and this is what my deployment looks like:
(screenshot of the GKE deployment's memory graph)

Note that it takes about 90 minutes from the initial start to the 4 GB memory limit, then the pod auto-restarts, etc.

What is your suggestion for finding the memory leak in my case, or how did you solve the issue?
I am using Vuetify and I am not using axios-module.

Cheers


ryouaki commented Aug 29, 2018

@besnikh I think this is the reason for the memory leak: an issue in vue-router.

Do you use beforeRouteEnter in your components? That path can end up in an infinite loop in the poll function in vue-router.common.js.


AndruxaSazonov commented Aug 29, 2018 via email


besnikh commented Aug 29, 2018

@ryouaki I use Vuetify, which uses vue-router, but I do not use beforeRouteEnter. I use an auth middleware; I guess that's like beforeRouteEnter?
I do not use the official auth middleware, but something I created using Firebase auth.
Do you think this might be the problem?

@AndruxaSazonov I am also using the latest Nuxt and the latest Vuetify, but my instances still climb all the way up in memory usage and then drop. It's maybe not an issue now, but it will be once I get a lot of traffic :(

Are you still using <v-btn>, since that component uses nuxt-link?
Btw, I also have a lot of links, about 100...

I really don't know what to do :(


ryouaki commented Aug 29, 2018

@besnikh Yes, the problem comes from beforeRouteEnter.

When vue-router finds that the user has added a beforeRouteEnter function on an async component, it invokes the poll function, which sets a timer callback every 16 ms until the async component loads successfully. Sometimes the async component fails to load, and then the poll cannot be broken out of. I am not sure why this happens.

I am investigating.
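To illustrate the failure mode described above (a simplified model, not vue-router's actual code): a poll that retries until a component instance appears has no exit path if the async component never loads; bounding the retries is one way out.

```javascript
// Simplified model of the poll loop: retry until `getInstance` returns
// something, up to `maxAttempts` ticks. The real router retries via
// setTimeout(..., 16) and has no such bound; we recurse synchronously
// here for clarity.
function poll (getInstance, onReady, onGiveUp, maxAttempts, attempt = 0) {
  const instance = getInstance()
  if (instance) return onReady(instance)
  if (attempt >= maxAttempts) return onGiveUp() // the missing escape hatch
  poll(getInstance, onReady, onGiveUp, maxAttempts, attempt + 1)
}

// Simulate an async component that becomes available on the third tick.
let ticksUntilLoaded = 3
const result = []
poll(
  () => (--ticksUntilLoaded <= 0 ? { name: 'AsyncPage' } : null),
  instance => result.push('ready:' + instance.name),
  () => result.push('gave up'),
  10
)
console.log(result) // ['ready:AsyncPage']
```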


lock bot commented Oct 31, 2018

This thread has been automatically locked since there has not been any recent activity after it was closed. Please open a new issue for related bugs.

@lock lock bot locked as resolved and limited conversation to collaborators Oct 31, 2018
@danielroe danielroe added the 2.x label Jan 18, 2023