Proposal

Problem

Current scrape logic:
```go
p.Metric(&lset)
hash = lset.Hash()

// Hash label set as it is seen local to the target. Then add target labels
// and relabeling and store the final label set.
lset = sl.sampleMutator(lset)

// The label set may be set to empty to indicate dropping.
if lset.IsEmpty() {
	sl.cache.addDropped(met)
	continue
}

if !lset.Has(labels.MetricName) {
	err = errNameLabelMandatory
	break loop
}
if !lset.IsValid() {
	err = fmt.Errorf("invalid metric name or label names: %s", lset.String())
	break loop
}

// If any label limits is exceeded the scrape should fail.
if err = verifyLabelLimits(lset, sl.labelLimits); err != nil {
	sl.metrics.targetScrapePoolExceededLabelLimits.Inc()
	break loop
}

// Append metadata for new series if they were present.
updateMetadata(lset, true)
```
Duplicate labels.Builder work. `p.Metric(&lset)` uses a `ScratchBuilder` to build the label set, and `sampleMutator` then converts the resulting `labels.Labels` back into a `labels.Builder`, causing a second sort and a second memory allocation for the labels; the same conversion happens again in `relabel.Process`.
lset is traversed multiple times. Since `lset` is a `[]Label`, every traversal is a linear scan, which is expensive; the metric-name lookup (`lset.Has(labels.MetricName)`), the validity check (`lset.IsValid()`), and `verifyLabelLimits` could be merged into a single traversal.
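To illustrate the merge, a single pass over the label slice can check the presence of the metric name, the UTF-8 validity of names and values, and the label-count limit at once. This is a simplified sketch with a stand-in `Label` type and a hypothetical `validateOnePass` helper, not the actual Prometheus `labels` package:

```go
package main

import (
	"errors"
	"fmt"
	"unicode/utf8"
)

// Label is a stand-in mirroring the shape of labels.Label.
type Label struct {
	Name, Value string
}

const metricName = "__name__"

var errNameLabelMandatory = errors.New("missing metric name label")

// validateOnePass walks the label slice once, performing the checks the
// scrape loop currently runs as three separate traversals: presence of
// __name__, validity of every name and value, and the label-count limit.
func validateOnePass(lset []Label, maxLabels int) error {
	if maxLabels > 0 && len(lset) > maxLabels {
		return fmt.Errorf("label limit exceeded: %d > %d", len(lset), maxLabels)
	}
	hasName := false
	for _, l := range lset {
		if l.Name == metricName {
			hasName = true
		}
		if !utf8.ValidString(l.Name) || !utf8.ValidString(l.Value) {
			return fmt.Errorf("invalid label: %q=%q", l.Name, l.Value)
		}
	}
	if !hasName {
		return errNameLabelMandatory
	}
	return nil
}

func main() {
	lset := []Label{{metricName, "http_requests_total"}, {"job", "api"}}
	fmt.Println(validateOnePass(lset, 10)) // one traversal instead of three
}
```

The real checks are stricter (legacy name validation, per-label length limits), but they all fit the same single-loop shape.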
Memory allocation is not precise enough. Some call sites could pass a more precise capacity to `make` to avoid the cost of growing slices, for example in `sampleMutator` and `ScratchBuilder`.
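The pre-sizing point can be sketched as follows: when the final number of labels is known before building, passing that capacity to `make` means `append` never has to grow and copy the backing array. This is illustrative stand-in code (hypothetical `buildPresized`), not the real `ScratchBuilder`:

```go
package main

import "fmt"

// Label is a stand-in mirroring the shape of labels.Label.
type Label struct{ Name, Value string }

// buildPresized allocates the destination slice once with the exact
// capacity, so append never reallocates while filling it.
func buildPresized(names, values []string) []Label {
	out := make([]Label, 0, len(names)) // precise capacity up front
	for i, n := range names {
		out = append(out, Label{Name: n, Value: values[i]})
	}
	return out
}

func main() {
	ls := buildPresized([]string{"__name__", "job"}, []string{"up", "api"})
	fmt.Println(len(ls), cap(ls)) // prints "2 2": no over-allocation
}
```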
Advice

- Keep using labels.Builder for the whole cache-miss path, instead of converting back and forth between `labels.Labels` and `labels.Builder`, until the cache-miss logic is finished.
- Could the underlying data structure of labels.Builder use something more efficient, such as a map?
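To make the map idea concrete: a builder backed by a map gives O(1) `Set`/`Del` instead of linear scans over add/del slices, at the cost of one sort when the label set is materialized. This is a toy sketch (hypothetical `MapBuilder`), not a drop-in replacement for `labels.Builder`:

```go
package main

import (
	"fmt"
	"sort"
)

// Label is a stand-in mirroring the shape of labels.Label.
type Label struct{ Name, Value string }

// MapBuilder is a toy label builder backed by a map: Set and Del are
// O(1) lookups, and sorting happens only once, in Labels(), when the
// final set is materialized.
type MapBuilder struct {
	m map[string]string
}

func NewMapBuilder(base []Label) *MapBuilder {
	b := &MapBuilder{m: make(map[string]string, len(base))}
	for _, l := range base {
		b.m[l.Name] = l.Value
	}
	return b
}

func (b *MapBuilder) Set(name, value string) { b.m[name] = value }
func (b *MapBuilder) Del(name string)        { delete(b.m, name) }

// Labels returns the sorted label set, allocated with exact capacity.
func (b *MapBuilder) Labels() []Label {
	out := make([]Label, 0, len(b.m))
	for n, v := range b.m {
		out = append(out, Label{n, v})
	}
	sort.Slice(out, func(i, j int) bool { return out[i].Name < out[j].Name })
	return out
}

func main() {
	b := NewMapBuilder([]Label{{"job", "api"}, {"instance", "a:9090"}})
	b.Set("__name__", "up")
	b.Del("instance")
	fmt.Println(b.Labels()) // prints "[{__name__ up} {job api}]"
}
```

A map trades per-operation cost for randomized iteration order, so the sort in `Labels()` is mandatory; whether it wins overall depends on how many `Set`/`Del` calls happen per scrape.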