author: Bjørn Erik Pedersen <bjorn.erik.pedersen@gmail.com> 2023-12-24 19:11:05 +0100
committer: Bjørn Erik Pedersen <bjorn.erik.pedersen@gmail.com> 2024-01-27 16:28:14 +0100
commit: 7285e74090852b5d52f25e577850fa75f4aa8573 (patch)
tree: 54d07cb4a7de2db5c89f2590266595f0aca6cbd6 /hugolib/hugo_sites.go
parent: 5fd1e7490305570872d3899f5edda950903c5213 (diff)
all: Rework page store, add a dynacache, improve partial rebuilds, and some general spring cleaning (develop2024)
There are some breaking changes in this commit, see #11455.
Closes #11455
Closes #11549
This fixes a set of bugs (see the issue list below) and also pays down some technical debt accumulated over the years. We now build with Staticcheck enabled in the CI build.
Performance should be about the same as before for regular-sized Hugo sites, but it should perform and scale much better on larger data sets, as objects that use lots of memory (e.g. rendered Markdown, or big JSON files read into maps with transform.Unmarshal) will now be automatically garbage collected if needed. Performance of partial rebuilds when running the server in fast render mode should be the same, but change detection should be much more accurate.
A list of the notable new features:
* A new dependency tracker that covers (almost) all of Hugo's API and is used to do fine grained partial rebuilds when running the server.
* A new and simpler tree document store which allows fast lookups and prefix-walking in all dimensions (e.g. language) concurrently.
* You can now configure an upper memory limit, allowing for much larger data sets and/or running on lower-specced PCs.
* We have lifted the "no resources in sub folders" restriction for branch bundles (e.g. sections).
Memory Limit
* Hugo will, by default, set aside a quarter of the total system memory, but you can override this via the OS environment variable HUGO_MEMORYLIMIT (in gigabytes). This is backed by a partitioned LRU cache used throughout Hugo, a cache that gets dynamically resized in low-memory situations, allowing Go's garbage collector to free the memory.
New Dependency Tracker: Hugo has had a rule-based, coarse-grained approach to server rebuilds that has mostly worked well, but there have been some surprises (e.g. stale content). This is now revamped with a new dependency tracker that can quickly calculate the delta given a changed resource (e.g. a content file, template, JS file etc.). It handles transitive relations, e.g. $page -> js.Build -> JS import, or $page1.Content -> render hook -> site.GetPage -> $page2.Title, or $page1.Content -> shortcode -> partial -> site.RegularPages -> $page2.Content -> shortcode ..., and should also handle changes to aggregated values (e.g. site.Lastmod) effectively.
This covers all of Hugo's API with two known exceptions (a list that may not be fully exhaustive):
* Changes to files loaded with the template func os.ReadFile may not be handled correctly. We recommend loading resources with resources.Get instead.
* Changes to Hugo objects (e.g. Page) passed in the template context to lang.Translate may not be detected correctly. We recommend keeping i18n templates simple and passing in only simple types such as strings and numbers.
Note that the cachebuster configuration (when A changes, then rebuild B) works well with the above, but we recommend that you revise that configuration, as in most situations it should no longer be needed. One example where it is still needed is TailwindCSS, where changes to hugo_stats.json are used to trigger new CSS rebuilds.
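For reference, a cachebuster setup of the kind mentioned above might look like the following in the site configuration. This is a sketch based on Hugo's documented [[build.cachebusters]] source/target keys; the exact file path in the regular expression is an assumption and depends on where your setup writes hugo_stats.json.

```toml
[build]
  [[build.cachebusters]]
    # When hugo_stats.json changes, rebuild targets matching "css".
    source = "assets/notwatching/hugo_stats\\.json"
    target = "css"
```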
Document Store: Previously, simplifying a little, we split the document store (where we store pages and resources) into one tree per language. This worked pretty well, but the structure made some operations harder than they needed to be. We have now restructured it into one Radix tree for all languages. Internally, the language is considered a dimension of that tree, and the tree can be viewed in all dimensions concurrently. This makes some language-related operations simpler (e.g. finding translations is just a slice range), but the idea is that it should also be relatively inexpensive to add more dimensions if needed (e.g. role).
Fixes #10169
Fixes #10364
Fixes #10482
Fixes #10630
Fixes #10656
Fixes #10694
Fixes #10918
Fixes #11262
Fixes #11439
Fixes #11453
Fixes #11457
Fixes #11466
Fixes #11540
Fixes #11551
Fixes #11556
Fixes #11654
Fixes #11661
Fixes #11663
Fixes #11664
Fixes #11669
Fixes #11671
Fixes #11807
Fixes #11808
Fixes #11809
Fixes #11815
Fixes #11840
Fixes #11853
Fixes #11860
Fixes #11883
Fixes #11904
Fixes #7388
Fixes #7425
Fixes #7436
Fixes #7544
Fixes #7882
Fixes #7960
Fixes #8255
Fixes #8307
Fixes #8863
Fixes #8927
Fixes #9192
Fixes #9324
Diffstat (limited to 'hugolib/hugo_sites.go')
-rw-r--r-- | hugolib/hugo_sites.go | 560 |
1 file changed, 156 insertions, 404 deletions
diff --git a/hugolib/hugo_sites.go b/hugolib/hugo_sites.go index f3f5c3eb2..80e754453 100644 --- a/hugolib/hugo_sites.go +++ b/hugolib/hugo_sites.go @@ -1,4 +1,4 @@ -// Copyright 2019 The Hugo Authors. All rights reserved. +// Copyright 2024 The Hugo Authors. All rights reserved. // // Licensed under the Apache License, Version 2.0 (the "License"); // you may not use this file except in compliance with the License. @@ -17,27 +17,25 @@ import ( "context" "fmt" "io" - "path/filepath" - "sort" "strings" "sync" "sync/atomic" "github.com/bep/logg" + "github.com/gohugoio/hugo/cache/dynacache" "github.com/gohugoio/hugo/config/allconfig" "github.com/gohugoio/hugo/hugofs/glob" + "github.com/gohugoio/hugo/hugolib/doctree" "github.com/fsnotify/fsnotify" - "github.com/gohugoio/hugo/identity" - - radix "github.com/armon/go-radix" - "github.com/gohugoio/hugo/output" "github.com/gohugoio/hugo/parser/metadecoders" "github.com/gohugoio/hugo/common/hugo" + "github.com/gohugoio/hugo/common/maps" "github.com/gohugoio/hugo/common/para" + "github.com/gohugoio/hugo/common/types" "github.com/gohugoio/hugo/hugofs" "github.com/gohugoio/hugo/source" @@ -47,9 +45,7 @@ import ( "github.com/gohugoio/hugo/helpers" "github.com/gohugoio/hugo/lazy" - "github.com/gohugoio/hugo/resources/kinds" "github.com/gohugoio/hugo/resources/page" - "github.com/gohugoio/hugo/resources/page/pagemeta" ) // HugoSites represents the sites to build. Each site represents a language. @@ -74,13 +70,19 @@ type HugoSites struct { // As loaded from the /data dirs data map[string]any - contentInit sync.Once - content *pageMaps + // Cache for page listings. + cachePages *dynacache.Partition[string, page.Pages] - postRenderInit sync.Once + // Before Hugo 0.122.0 we managed all translations in a map using a translationKey + // that could be overridden in front matter. + // Now the different page dimensions (e.g. language) are built-in to the page trees above. 
+ // But we sill need to support the overridden translationKey, but that should + // be relatively rare and low volume. + translationKeyPages *maps.SliceCache[page.Page] - // Keeps track of bundle directories and symlinks to enable partial rebuilding. - ContentChanges *contentChangeMap + pageTrees *pageTrees + + postRenderInit sync.Once // File change events with filename stored in this map will be skipped. skipRebuildForFilenamesMu sync.Mutex @@ -88,11 +90,12 @@ type HugoSites struct { init *hugoSitesInit - workers *para.Workers - numWorkers int + workersSite *para.Workers + numWorkersSites int + numWorkers int *fatalErrorHandler - *testCounters + *buildCounters } // ShouldSkipFileChangeEvent allows skipping filesystem event early before @@ -103,31 +106,17 @@ func (h *HugoSites) ShouldSkipFileChangeEvent(ev fsnotify.Event) bool { return h.skipRebuildForFilenames[ev.Name] } -func (h *HugoSites) getContentMaps() *pageMaps { - h.contentInit.Do(func() { - h.content = newPageMaps(h) - }) - return h.content -} - // Only used in tests. -type testCounters struct { - contentRenderCounter uint64 - pageRenderCounter uint64 +type buildCounters struct { + contentRenderCounter atomic.Uint64 + pageRenderCounter atomic.Uint64 } -func (h *testCounters) IncrContentRender() { - if h == nil { - return +func (c *buildCounters) loggFields() logg.Fields { + return logg.Fields{ + {Name: "pages", Value: c.pageRenderCounter.Load()}, + {Name: "content", Value: c.contentRenderCounter.Load()}, } - atomic.AddUint64(&h.contentRenderCounter, 1) -} - -func (h *testCounters) IncrPageRender() { - if h == nil { - return - } - atomic.AddUint64(&h.pageRenderCounter, 1) } type fatalErrorHandler struct { @@ -172,16 +161,6 @@ type hugoSitesInit struct { // Loads the Git info and CODEOWNERS for all the pages if enabled. gitInfo *lazy.Init - - // Maps page translations. 
- translations *lazy.Init -} - -func (h *hugoSitesInit) Reset() { - h.data.Reset() - h.layouts.Reset() - h.gitInfo.Reset() - h.translations.Reset() } func (h *HugoSites) Data() map[string]any { @@ -192,6 +171,41 @@ func (h *HugoSites) Data() map[string]any { return h.data } +// Pages returns all pages for all sites. +func (h *HugoSites) Pages() page.Pages { + key := "pages" + v, err := h.cachePages.GetOrCreate(key, func(string) (page.Pages, error) { + var pages page.Pages + for _, s := range h.Sites { + pages = append(pages, s.Pages()...) + } + page.SortByDefault(pages) + return pages, nil + }) + if err != nil { + panic(err) + } + return v +} + +// Pages returns all regularpages for all sites. +func (h *HugoSites) RegularPages() page.Pages { + key := "regular-pages" + v, err := h.cachePages.GetOrCreate(key, func(string) (page.Pages, error) { + var pages page.Pages + for _, s := range h.Sites { + pages = append(pages, s.RegularPages()...) + } + page.SortByDefault(pages) + + return pages, nil + }) + if err != nil { + panic(err) + } + return v +} + func (h *HugoSites) gitInfoForPage(p page.Page) (source.GitInfo, error) { if _, err := h.init.gitInfo.Do(context.Background()); err != nil { return source.GitInfo{}, err @@ -283,16 +297,24 @@ func (h *HugoSites) PrintProcessingStats(w io.Writer) { func (h *HugoSites) GetContentPage(filename string) page.Page { var p page.Page - h.getContentMaps().walkBundles(func(b *contentNode) bool { - if b.p == nil || b.fi == nil { + h.withPage(func(s string, p2 *pageState) bool { + if p2.File() == nil { return false } - if b.fi.Meta().Filename == filename { - p = b.p + if p2.File().FileInfo().Meta().Filename == filename { + p = p2 return true } + for _, r := range p2.Resources().ByType(pageResourceType) { + p3 := r.(page.Page) + if p3.File() != nil && p3.File().FileInfo().Meta().Filename == filename { + p = p3 + return true + } + } + return false }) @@ -320,20 +342,10 @@ func (h *HugoSites) loadGitInfo() error { // Reset resets the 
sites and template caches etc., making it ready for a full rebuild. func (h *HugoSites) reset(config *BuildCfg) { - if config.ResetState { - for _, s := range h.Sites { - if r, ok := s.Fs.PublishDir.(hugofs.Reseter); ok { - r.Reset() - } - } - } - h.fatalErrorHandler = &fatalErrorHandler{ h: h, donec: make(chan bool), } - - h.init.Reset() } // resetLogs resets the log counters etc. Used to do a new build on the same sites. @@ -345,43 +357,42 @@ func (h *HugoSites) resetLogs() { } func (h *HugoSites) withSite(fn func(s *Site) error) error { - if h.workers == nil { - for _, s := range h.Sites { - if err := fn(s); err != nil { - return err - } + for _, s := range h.Sites { + if err := fn(s); err != nil { + return err } - return nil } + return nil +} - g, _ := h.workers.Start(context.Background()) - for _, s := range h.Sites { - s := s - g.Run(func() error { - return fn(s) - }) - } - return g.Wait() +func (h *HugoSites) withPage(fn func(s string, p *pageState) bool) { + h.withSite(func(s *Site) error { + w := &doctree.NodeShiftTreeWalker[contentNodeI]{ + Tree: s.pageMap.treePages, + LockType: doctree.LockTypeRead, + Handle: func(s string, n contentNodeI, match doctree.DimensionFlag) (bool, error) { + return fn(s, n.(*pageState)), nil + }, + } + return w.Walk(context.Background()) + }) } // BuildCfg holds build options used to, as an example, skip the render step. type BuildCfg struct { - // Reset site state before build. Use to force full rebuilds. - ResetState bool // Skip rendering. Useful for testing. SkipRender bool // Use this to indicate what changed (for rebuilds). whatChanged *whatChanged - // This is a partial re-render of some selected pages. This means - // we should skip most of the processing. + // This is a partial re-render of some selected pages. PartialReRender bool // Set in server mode when the last build failed for some reason. ErrRecovery bool // Recently visited URLs. This is used for partial re-rendering. 
- RecentlyVisited map[string]bool + RecentlyVisited *types.EvictingStringQueue // Can be set to build only with a sub set of the content source. ContentInclusionFilter *glob.FilenameFilter @@ -389,174 +400,95 @@ type BuildCfg struct { // Set when the buildlock is already acquired (e.g. the archetype content builder). NoBuildLock bool - testCounters *testCounters + testCounters *buildCounters } -// shouldRender is used in the Fast Render Mode to determine if we need to re-render -// a Page: If it is recently visited (the home pages will always be in this set) or changed. -// Note that a page does not have to have a content page / file. -// For regular builds, this will always return true. -// TODO(bep) rename/work this. +// shouldRender returns whether this output format should be rendered or not. func (cfg *BuildCfg) shouldRender(p *pageState) bool { - if p == nil { - return false - } - - if p.forceRender { - return true - } - - if len(cfg.RecentlyVisited) == 0 { - return true - } - - if cfg.RecentlyVisited[p.RelPermalink()] { + if !p.renderOnce { return true } - if cfg.whatChanged != nil && !p.File().IsZero() { - return cfg.whatChanged.files[p.File().Filename()] - } - - return false -} - -func (h *HugoSites) renderCrossSitesSitemap() error { - if h.Conf.IsMultihost() || !(h.Conf.DefaultContentLanguageInSubdir() || h.Conf.IsMultiLingual()) { - return nil - } - - sitemapEnabled := false - for _, s := range h.Sites { - if s.conf.IsKindEnabled(kinds.KindSitemap) { - sitemapEnabled = true - break - } - } - - if !sitemapEnabled { - return nil - } + // The render state is incremented on render and reset when a related change is detected. + // Note that this is set per output format. + shouldRender := p.renderState == 0 - s := h.Sites[0] - // We don't have any page context to pass in here. 
- ctx := context.Background() - - templ := s.lookupLayouts("sitemapindex.xml", "_default/sitemapindex.xml", "_internal/_default/sitemapindex.xml") - return s.renderAndWriteXML(ctx, &s.PathSpec.ProcessingStats.Sitemaps, "sitemapindex", - s.conf.Sitemap.Filename, h.Sites, templ) -} - -func (h *HugoSites) renderCrossSitesRobotsTXT() error { - if h.Configs.IsMultihost { - return nil - } - if !h.Configs.Base.EnableRobotsTXT { - return nil + if !shouldRender { + return false } - s := h.Sites[0] + fastRenderMode := cfg.RecentlyVisited.Len() > 0 - p, err := newPageStandalone(&pageMeta{ - s: s, - kind: kinds.KindRobotsTXT, - urlPaths: pagemeta.URLPath{ - URL: "robots.txt", - }, - }, - output.RobotsTxtFormat) - if err != nil { - return err + if !fastRenderMode { + // Not in fast render mode or first time render. + return shouldRender } if !p.render { - return nil + // Not be to rendered for this output format. + return false } - templ := s.lookupLayouts("robots.txt", "_default/robots.txt", "_internal/_default/robots.txt") - - return s.renderAndWritePage(&s.PathSpec.ProcessingStats.Pages, "Robots Txt", "robots.txt", p, templ) -} - -func (h *HugoSites) removePageByFilename(filename string) { - h.getContentMaps().withMaps(func(m *pageMap) error { - m.deleteBundleMatching(func(b *contentNode) bool { - if b.p == nil { - return false - } - - if b.fi == nil { - return false - } - - return b.fi.Meta().Filename == filename - }) - return nil - }) -} + if p.outputFormat().IsHTML { + // This is fast render mode and the output format is HTML, + // rerender if this page is one of the recently visited. + return cfg.RecentlyVisited.Contains(p.RelPermalink()) + } -func (h *HugoSites) createPageCollections() error { - allPages := newLazyPagesFactory(func() page.Pages { - var pages page.Pages - for _, s := range h.Sites { - pages = append(pages, s.Pages()...) + // In fast render mode, we want to avoid re-rendering the sitemaps etc. and + // other big listings whenever we e.g. 
change a content file, + // but we want partial renders of the recently visited pages to also include + // alternative formats of the same HTML page (e.g. RSS, JSON). + for _, po := range p.pageOutputs { + if po.render && po.f.IsHTML && cfg.RecentlyVisited.Contains(po.RelPermalink()) { + return true } - - page.SortByDefault(pages) - - return pages - }) - - allRegularPages := newLazyPagesFactory(func() page.Pages { - return h.findPagesByKindIn(kinds.KindPage, allPages.get()) - }) - - for _, s := range h.Sites { - s.PageCollections.allPages = allPages - s.PageCollections.allRegularPages = allRegularPages } - return nil + return false } func (s *Site) preparePagesForRender(isRenderingSite bool, idx int) error { var err error - s.pageMap.withEveryBundlePage(func(p *pageState) bool { - if err = p.initOutputFormat(isRenderingSite, idx); err != nil { - return true + + initPage := func(p *pageState) error { + if err = p.shiftToOutputFormat(isRenderingSite, idx); err != nil { + return err } - return false - }) - return nil -} + return nil + } -// Pages returns all pages for all sites. 
-func (h *HugoSites) Pages() page.Pages { - return h.Sites[0].AllPages() + return s.pageMap.forEeachPageIncludingBundledPages(nil, + func(p *pageState) (bool, error) { + return false, initPage(p) + }, + ) } -func (h *HugoSites) loadData(fis []hugofs.FileMetaInfo) (err error) { - spec := source.NewSourceSpec(h.PathSpec, nil, nil) - +func (h *HugoSites) loadData() error { h.data = make(map[string]any) - for _, fi := range fis { - basePath := fi.Meta().Path - fileSystem := spec.NewFilesystemFromFileMetaInfo(fi) - files, err := fileSystem.Files() - if err != nil { - return err - } - for _, r := range files { - if err := h.handleDataFile(basePath, r); err != nil { - return err - } - } - } + w := hugofs.NewWalkway( + hugofs.WalkwayConfig{ + Fs: h.PathSpec.BaseFs.Data.Fs, + WalkFn: func(path string, fi hugofs.FileMetaInfo) error { + if fi.IsDir() { + return nil + } + pi := fi.Meta().PathInfo + if pi == nil { + panic("no path info") + } + return h.handleDataFile(source.NewFileInfo(fi)) + }, + }) - return + if err := w.Walk(); err != nil { + return err + } + return nil } -func (h *HugoSites) handleDataFile(basePath string, r source.File) error { +func (h *HugoSites) handleDataFile(r *source.File) error { var current map[string]any f, err := r.FileInfo().Meta().Open() @@ -567,8 +499,8 @@ func (h *HugoSites) handleDataFile(basePath string, r source.File) error { // Crawl in data tree to insert data current = h.data - dataPath := filepath.Join(basePath, r.Dir()) - keyParts := strings.Split(dataPath, helpers.FilePathSeparator) + dataPath := r.FileInfo().Meta().PathInfo.Dir()[1:] + keyParts := strings.Split(dataPath, "/") for _, key := range keyParts { if key != "" { @@ -635,17 +567,12 @@ func (h *HugoSites) handleDataFile(basePath string, r source.File) error { return nil } -func (h *HugoSites) errWithFileContext(err error, f source.File) error { - fim, ok := f.FileInfo().(hugofs.FileMetaInfo) - if !ok { - return err - } - realFilename := fim.Meta().Filename - - return 
herrors.NewFileErrorFromFile(err, realFilename, h.SourceSpec.Fs.Source, nil) +func (h *HugoSites) errWithFileContext(err error, f *source.File) error { + realFilename := f.FileInfo().Meta().Filename + return herrors.NewFileErrorFromFile(err, realFilename, h.Fs.Source, nil) } -func (h *HugoSites) readData(f source.File) (any, error) { +func (h *HugoSites) readData(f *source.File) (any, error) { file, err := f.FileInfo().Meta().Open() if err != nil { return nil, fmt.Errorf("readData: failed to open data file: %w", err) @@ -656,178 +583,3 @@ func (h *HugoSites) readData(f source.File) (any, error) { format := metadecoders.FormatFromString(f.Ext()) return metadecoders.Default.Unmarshal(content, format) } - -func (h *HugoSites) findPagesByKindIn(kind string, inPages page.Pages) page.Pages { - return h.Sites[0].findPagesByKindIn(kind, inPages) -} - -func (h *HugoSites) resetPageState() { - h.getContentMaps().walkBundles(func(n *contentNode) bool { - if n.p == nil { - return false - } - p := n.p - for _, po := range p.pageOutputs { - if po.cp == nil { - continue - } - po.cp.Reset() - } - - return false - }) -} - -func (h *HugoSites) resetPageStateFromEvents(idset identity.Identities) { - h.getContentMaps().walkBundles(func(n *contentNode) bool { - if n.p == nil { - return false - } - p := n.p - OUTPUTS: - for _, po := range p.pageOutputs { - if po.cp == nil { - continue - } - for id := range idset { - if po.cp.dependencyTracker.Search(id) != nil { - po.cp.Reset() - continue OUTPUTS - } - } - } - - if p.shortcodeState == nil { - return false - } - - for _, s := range p.shortcodeState.shortcodes { - for _, templ := range s.templs { - sid := templ.(identity.Manager) - for id := range idset { - if sid.Search(id) != nil { - for _, po := range p.pageOutputs { - if po.cp != nil { - po.cp.Reset() - } - } - return false - } - } - } - } - return false - }) -} - -// Used in partial reloading to determine if the change is in a bundle. 
-type contentChangeMap struct { - mu sync.RWMutex - - // Holds directories with leaf bundles. - leafBundles *radix.Tree - - // Holds directories with branch bundles. - branchBundles map[string]bool - - pathSpec *helpers.PathSpec - - // Hugo supports symlinked content (both directories and files). This - // can lead to situations where the same file can be referenced from several - // locations in /content -- which is really cool, but also means we have to - // go an extra mile to handle changes. - // This map is only used in watch mode. - // It maps either file to files or the real dir to a set of content directories - // where it is in use. - symContentMu sync.Mutex - symContent map[string]map[string]bool -} - -func (m *contentChangeMap) add(dirname string, tp bundleDirType) { - m.mu.Lock() - if !strings.HasSuffix(dirname, helpers.FilePathSeparator) { - dirname += helpers.FilePathSeparator - } - switch tp { - case bundleBranch: - m.branchBundles[dirname] = true - case bundleLeaf: - m.leafBundles.Insert(dirname, true) - default: - m.mu.Unlock() - panic("invalid bundle type") - } - m.mu.Unlock() -} - -func (m *contentChangeMap) resolveAndRemove(filename string) (string, bundleDirType) { - m.mu.RLock() - defer m.mu.RUnlock() - - // Bundles share resources, so we need to start from the virtual root. - relFilename := m.pathSpec.RelContentDir(filename) - dir, name := filepath.Split(relFilename) - if !strings.HasSuffix(dir, helpers.FilePathSeparator) { - dir += helpers.FilePathSeparator - } - - if _, found := m.branchBundles[dir]; found { - delete(m.branchBundles, dir) - return dir, bundleBranch - } - - if key, _, found := m.leafBundles.LongestPrefix(dir); found { - m.leafBundles.Delete(key) - dir = string(key) - return dir, bundleLeaf - } - - fileTp, isContent := classifyBundledFile(name) - if isContent && fileTp != bundleNot { - // A new bundle. 
- return dir, fileTp - } - - return dir, bundleNot -} - -func (m *contentChangeMap) addSymbolicLinkMapping(fim hugofs.FileMetaInfo) { - meta := fim.Meta() - if !meta.IsSymlink { - return - } - m.symContentMu.Lock() - - from, to := meta.Filename, meta.OriginalFilename - if fim.IsDir() { - if !strings.HasSuffix(from, helpers.FilePathSeparator) { - from += helpers.FilePathSeparator - } - } - - mm, found := m.symContent[from] - - if !found { - mm = make(map[string]bool) - m.symContent[from] = mm - } - mm[to] = true - m.symContentMu.Unlock() -} - -func (m *contentChangeMap) GetSymbolicLinkMappings(dir string) []string { - mm, found := m.symContent[dir] - if !found { - return nil - } - dirs := make([]string, len(mm)) - i := 0 - for dir := range mm { - dirs[i] = dir - i++ - } - - sort.Strings(dirs) - - return dirs -} |