author    Bjørn Erik Pedersen <bjorn.erik.pedersen@gmail.com>    2023-12-24 19:11:05 +0100
committer Bjørn Erik Pedersen <bjorn.erik.pedersen@gmail.com>    2024-01-27 16:28:14 +0100
commit    7285e74090852b5d52f25e577850fa75f4aa8573 (patch)
tree      54d07cb4a7de2db5c89f2590266595f0aca6cbd6 /resources/page
parent    5fd1e7490305570872d3899f5edda950903c5213 (diff)
all: Rework page store, add a dynacache, improve partial rebuilds, and some general spring cleaning
There are some breaking changes in this commit, see #11455.

Closes #11455
Closes #11549

This fixes a set of bugs (see the issue list below) and also pays down some technical debt accumulated over the years. We now build with Staticcheck enabled in the CI build.

The performance should be about the same as before for regular-sized Hugo sites, but it should perform and scale much better to larger data sets, as objects that use lots of memory (e.g. rendered Markdown, big JSON files read into maps with transform.Unmarshal, etc.) will now get automatically garbage collected if needed. Performance on partial rebuilds when running the server in fast render mode should be the same, but the change detection should be much more accurate.

A list of the notable new features:

* A new dependency tracker that covers (almost) all of Hugo's API and is used to do fine-grained partial rebuilds when running the server.
* A new and simpler tree document store which allows fast lookups and prefix-walking in all dimensions (e.g. language) concurrently.
* You can now configure an upper memory limit, allowing for much larger data sets and/or running on lower-specced PCs.

We have lifted the "no resources in sub folders" restriction for branch bundles (e.g. sections).

Memory Limit: Hugo will, by default, set aside a quarter of the total system memory, but you can set this via the OS environment variable HUGO_MEMORYLIMIT (in gigabytes). This is backed by a partitioned LRU cache used throughout Hugo, a cache that gets dynamically resized in low-memory situations, allowing Go's garbage collector to free the memory.

New Dependency Tracker: Hugo has had a rule-based, coarse-grained approach to server rebuilds that has worked mostly pretty well, but there have been some surprises (e.g. stale content). This is now revamped with a new dependency tracker that can quickly calculate the delta given a changed resource (e.g. a content file, template, JS file, etc.). It handles transitive relations, e.g. $page -> js.Build -> JS import, or $page1.Content -> render hook -> site.GetPage -> $page2.Title, or $page1.Content -> shortcode -> partial -> site.RegularPages -> $page2.Content -> shortcode ..., and should also handle changes to aggregated values (e.g. site.Lastmod) effectively.

This covers all of Hugo's API with 2 known exceptions (a list that may not be fully exhaustive):

* Changes to files loaded with the template func os.ReadFile may not be handled correctly. We recommend loading resources with resources.Get.
* Changes to Hugo objects (e.g. Page) passed in the template context to lang.Translate may not be detected correctly. We recommend having simple i18n templates without too much data context passed in, other than simple types such as strings and numbers.

Note that the cachebuster configuration (when A changes, then rebuild B) works well with the above, but we recommend that you revise that configuration, as in most situations it should not be needed. One example where it is still needed is with TailwindCSS, using changes to hugo_stats.json to trigger new CSS rebuilds.

Document Store: Previously, a little simplified, we split the document store (where we store pages and resources) into one tree per language. This worked pretty well, but the structure made some operations harder than they needed to be. We have now restructured it into one Radix tree for all languages. Internally, the language is considered a dimension of that tree, and the tree can be viewed in all dimensions concurrently. This makes some language-related operations simpler (e.g. finding translations is just a slice range), but the idea is that it should also be relatively inexpensive to add more dimensions if needed (e.g. role).

Fixes #10169 Fixes #10364 Fixes #10482 Fixes #10630 Fixes #10656 Fixes #10694 Fixes #10918 Fixes #11262 Fixes #11439 Fixes #11453 Fixes #11457 Fixes #11466 Fixes #11540 Fixes #11551 Fixes #11556 Fixes #11654 Fixes #11661 Fixes #11663 Fixes #11664 Fixes #11669 Fixes #11671 Fixes #11807 Fixes #11808 Fixes #11809 Fixes #11815 Fixes #11840 Fixes #11853 Fixes #11860 Fixes #11883 Fixes #11904 Fixes #7388 Fixes #7425 Fixes #7436 Fixes #7544 Fixes #7882 Fixes #7960 Fixes #8255 Fixes #8307 Fixes #8863 Fixes #8927 Fixes #9192 Fixes #9324
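As a rough sketch of the memory-limit behavior described above: the helper below only encodes what the message states (HUGO_MEMORYLIMIT in gigabytes, defaulting to a quarter of total system memory); the function name memoryLimitBytes and its plain-Go shape are illustrative assumptions, not Hugo's actual dynacache code.

package main

import (
	"fmt"
	"os"
	"strconv"
)

// memoryLimitBytes resolves the memory budget: HUGO_MEMORYLIMIT (in
// gigabytes) when set and valid, otherwise a quarter of the total system
// memory. Hypothetical helper for illustration only.
func memoryLimitBytes(totalSystemBytes uint64) uint64 {
	if s := os.Getenv("HUGO_MEMORYLIMIT"); s != "" {
		if gb, err := strconv.ParseFloat(s, 64); err == nil && gb > 0 {
			return uint64(gb * float64(1<<30))
		}
	}
	return totalSystemBytes / 4
}

func main() {
	// With 16 GiB of system memory and HUGO_MEMORYLIMIT unset,
	// the default budget is 4 GiB.
	fmt.Println(memoryLimitBytes(16 << 30))
}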
Diffstat (limited to 'resources/page')
-rw-r--r--  resources/page/page.go | 47
-rw-r--r--  resources/page/page_generate/generate_page_wrappers.go | 96
-rw-r--r--  resources/page/page_lazy_contentprovider.go | 4
-rw-r--r--  resources/page/page_marshaljson.autogen.go | 180
-rw-r--r--  resources/page/page_matcher.go | 4
-rw-r--r--  resources/page/page_nop.go | 42
-rw-r--r--  resources/page/page_paths.go | 381
-rw-r--r--  resources/page/page_paths_test.go | 295
-rw-r--r--  resources/page/pagegroup.go | 4
-rw-r--r--  resources/page/pagemeta/page_frontmatter.go | 12
-rw-r--r--  resources/page/pagemeta/page_frontmatter_test.go | 69
-rw-r--r--  resources/page/pages.go | 4
-rw-r--r--  resources/page/pages_related.go | 3
-rw-r--r--  resources/page/pages_sort.go | 26
-rw-r--r--  resources/page/pages_sort_test.go | 3
-rw-r--r--  resources/page/permalinks.go | 14
-rw-r--r--  resources/page/permalinks_integration_test.go | 5
-rw-r--r--  resources/page/permalinks_test.go | 5
-rw-r--r--  resources/page/site.go | 50
-rw-r--r--  resources/page/siteidentities/identities.go | 34
-rw-r--r--  resources/page/taxonomy.go | 2
-rw-r--r--  resources/page/testhelpers_page_test.go | 38
-rw-r--r--  resources/page/testhelpers_test.go | 40
-rw-r--r--  resources/page/zero_file.autogen.go | 72
24 files changed, 467 insertions, 963 deletions
diff --git a/resources/page/page.go b/resources/page/page.go
index b5af489f1..56ba04d74 100644
--- a/resources/page/page.go
+++ b/resources/page/page.go
@@ -19,16 +19,14 @@ import (
"context"
"html/template"
- "github.com/gohugoio/hugo/identity"
"github.com/gohugoio/hugo/markup/converter"
"github.com/gohugoio/hugo/markup/tableofcontents"
"github.com/gohugoio/hugo/config"
- "github.com/gohugoio/hugo/tpl"
"github.com/gohugoio/hugo/common/maps"
+ "github.com/gohugoio/hugo/common/paths"
"github.com/gohugoio/hugo/compare"
- "github.com/gohugoio/hugo/hugofs/files"
"github.com/gohugoio/hugo/navigation"
"github.com/gohugoio/hugo/related"
@@ -122,7 +120,7 @@ type ContentRenderer interface {
type FileProvider interface {
// File returns the source file for this Page,
// or a zero File if this Page is not backed by a file.
- File() source.File
+ File() *source.File
}
// GetPageProvider provides the GetPage method.
@@ -133,9 +131,6 @@ type GetPageProvider interface {
// This will return nil when no page could be found, and will return
// an error if the ref is ambiguous.
GetPage(ref string) (Page, error)
-
- // GetPageWithTemplateInfo is for internal use only.
- GetPageWithTemplateInfo(info tpl.Info, ref string) (Page, error)
}
// GitInfoProvider provides Git info.
@@ -166,6 +161,12 @@ type OutputFormatsProvider interface {
OutputFormats() OutputFormats
}
+// PageProvider provides access to a Page.
+// Implemented by shortcodes and others.
+type PageProvider interface {
+ Page() Page
+}
+
// Page is the core interface in Hugo and what you get as the top level data context in your templates.
type Page interface {
ContentProvider
@@ -175,7 +176,7 @@ type Page interface {
type PageFragment interface {
resource.ResourceLinksProvider
- resource.ResourceMetaProvider
+ resource.ResourceNameTitleProvider
}
// PageMetaProvider provides page metadata, typically provided via front matter.
@@ -187,7 +188,7 @@ type PageMetaProvider interface {
Aliases() []string
// BundleType returns the bundle type: `leaf`, `branch` or an empty string.
- BundleType() files.ContentClass
+ BundleType() string
// A configured description.
Description() string
@@ -224,9 +225,8 @@ type PageMetaProvider interface {
// to the source of this Page. It will be relative to any content root.
Path() string
- // This is just a temporary bridge method. Use Path in templates.
- // Pathc is for internal usage only.
- Pathc() string
+ // This is for internal use only.
+ PathInfo() *paths.Path
// The slug, typically defined in front matter.
Slug() string
@@ -240,13 +240,6 @@ type PageMetaProvider interface {
// Section returns the first path element below the content root.
Section() string
- // Returns a slice of sections (directories if it's a file) to this
- // Page.
- SectionsEntries() []string
-
- // SectionsPath is SectionsEntries joined with a /.
- SectionsPath() string
-
// Sitemap returns the sitemap configuration for this page.
// This is for internal use only.
Sitemap() config.SitemapConfig
@@ -332,9 +325,6 @@ type PageWithoutContent interface {
// e.g. GetTerms("categories")
GetTerms(taxonomy string) Pages
- // Used in change/dependency tracking.
- identity.Provider
-
// HeadingsFiltered returns the headings for this page when a filter is set.
// This is currently only triggered with the Related content feature
// and the "fragments" type of index.
@@ -430,7 +420,7 @@ type TranslationsProvider interface {
type TreeProvider interface {
// IsAncestor returns whether the current page is an ancestor of other.
// Note that this method is not relevant for taxonomy lists and taxonomy terms pages.
- IsAncestor(other any) (bool, error)
+ IsAncestor(other any) bool
// CurrentSection returns the page's current section or the page itself if home or a section.
// Note that this will return nil for pages that is not regular, home or section pages.
@@ -438,7 +428,7 @@ type TreeProvider interface {
// IsDescendant returns whether the current page is a descendant of other.
// Note that this method is not relevant for taxonomy lists and taxonomy terms pages.
- IsDescendant(other any) (bool, error)
+ IsDescendant(other any) bool
// FirstSection returns the section on level 1 below home, e.g. "/docs".
// For the home page, this will return itself.
@@ -447,7 +437,7 @@ type TreeProvider interface {
// InSection returns whether other is in the current section.
// Note that this will always return false for pages that are
// not either regular, home or section pages.
- InSection(other any) (bool, error)
+ InSection(other any) bool
// Parent returns a section's parent section or a page's section.
// To get a section's subsections, see Page's Sections method.
@@ -463,6 +453,13 @@ type TreeProvider interface {
// Page returns a reference to the Page itself, kept here mostly
// for legacy reasons.
Page() Page
+
+ // Returns a slice of sections (directories if it's a file) to this
+ // Page.
+ SectionsEntries() []string
+
+ // SectionsPath is SectionsEntries joined with a /.
+ SectionsPath() string
}
// PageWithContext is a Page with a context.Context.
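Taken together, the page.go changes above are the caller-facing part of the breaking changes: File() now returns a nilable *source.File instead of a zero File value, and InSection/IsAncestor/IsDescendant return a plain bool rather than (bool, error). A minimal caller-side sketch under those assumptions; describe is a hypothetical helper, not part of Hugo.

package example

import (
	"fmt"

	"github.com/gohugoio/hugo/resources/page"
)

// describe shows the effect of the interface changes above: nil-check
// File() instead of testing for a zero value, and use the bool-returning
// section helpers directly.
func describe(p, other page.Page) {
	// File() *source.File: nil now means "not backed by a file".
	if f := p.File(); f != nil {
		fmt.Println("backed by", f.Filename())
	}

	// InSection returns bool, not (bool, error), after this commit.
	if p.InSection(other) {
		fmt.Println(p.Path(), "is in the same section as", other.Path())
	}
}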
diff --git a/resources/page/page_generate/generate_page_wrappers.go b/resources/page/page_generate/generate_page_wrappers.go
index 2449cf28d..d720b8a42 100644
--- a/resources/page/page_generate/generate_page_wrappers.go
+++ b/resources/page/page_generate/generate_page_wrappers.go
@@ -14,19 +14,14 @@
package page_generate
import (
- "bytes"
"errors"
"fmt"
"os"
"path/filepath"
"reflect"
- "github.com/gohugoio/hugo/common/maps"
-
"github.com/gohugoio/hugo/codegen"
"github.com/gohugoio/hugo/resources/page"
- "github.com/gohugoio/hugo/resources/resource"
- "github.com/gohugoio/hugo/source"
)
const header = `// Copyright 2019 The Hugo Authors. All rights reserved.
@@ -46,7 +41,7 @@ const header = `// Copyright 2019 The Hugo Authors. All rights reserved.
`
var (
- pageInterface = reflect.TypeOf((*page.Page)(nil)).Elem()
+ pageInterface = reflect.TypeOf((*page.PageMetaProvider)(nil)).Elem()
packageDir = filepath.FromSlash("resources/page")
)
@@ -56,10 +51,6 @@ func Generate(c *codegen.Inspector) error {
return fmt.Errorf("failed to generate JSON marshaler: %w", err)
}
- if err := generateFileIsZeroWrappers(c); err != nil {
- return fmt.Errorf("failed to generate file wrappers: %w", err)
- }
-
return nil
}
@@ -73,25 +64,7 @@ func generateMarshalJSON(c *codegen.Inspector) error {
includes := []reflect.Type{pageInterface}
- // Exclude these methods
- excludes := []reflect.Type{
- // Leave this out for now. We need to revisit the author issue.
- reflect.TypeOf((*page.AuthorProvider)(nil)).Elem(),
-
- reflect.TypeOf((*resource.ErrProvider)(nil)).Elem(),
-
- // navigation.PageMenus
-
- // Prevent loops.
- reflect.TypeOf((*page.SitesProvider)(nil)).Elem(),
- reflect.TypeOf((*page.Positioner)(nil)).Elem(),
-
- reflect.TypeOf((*page.ChildCareProvider)(nil)).Elem(),
- reflect.TypeOf((*page.TreeProvider)(nil)).Elem(),
- reflect.TypeOf((*page.InSectionPositioner)(nil)).Elem(),
- reflect.TypeOf((*page.PaginatorProvider)(nil)).Elem(),
- reflect.TypeOf((*maps.Scratcher)(nil)).Elem(),
- }
+ excludes := []reflect.Type{}
methods := c.MethodsFromTypes(
includes,
@@ -123,71 +96,6 @@ package page
return nil
}
-func generateFileIsZeroWrappers(c *codegen.Inspector) error {
- filename := filepath.Join(c.ProjectRootDir, packageDir, "zero_file.autogen.go")
- f, err := os.Create(filename)
- if err != nil {
- return err
- }
- defer f.Close()
-
- // Generate warnings for zero file access
-
- warning := func(name string, tp reflect.Type) string {
- msg := fmt.Sprintf(".File.%s on zero object. Wrap it in if or with: {{ with .File }}{{ .%s }}{{ end }}", name, name)
-
- // We made this a Warning in 0.92.0.
- // When we remove this construct in 0.93.0, people will get a nil pointer.
- return fmt.Sprintf("z.log.Warnln(%q)", msg)
- }
-
- var buff bytes.Buffer
-
- methods := c.MethodsFromTypes([]reflect.Type{reflect.TypeOf((*source.File)(nil)).Elem()}, nil)
-
- for _, m := range methods {
- if m.Name == "IsZero" || m.Name == "Classifier" {
- continue
- }
- fmt.Fprint(&buff, m.DeclarationNamed("zeroFile"))
- fmt.Fprintln(&buff, " {")
- fmt.Fprintf(&buff, "\t%s\n", warning(m.Name, m.Owner))
- if len(m.Out) > 0 {
- fmt.Fprintln(&buff, "\treturn")
- }
- fmt.Fprintln(&buff, "}")
-
- }
-
- pkgImports := append(methods.Imports(), "github.com/gohugoio/hugo/common/loggers", "github.com/gohugoio/hugo/source")
-
- fmt.Fprintf(f, `%s
-
-package page
-
-%s
-
-// ZeroFile represents a zero value of source.File with warnings if invoked.
-type zeroFile struct {
- log loggers.Logger
-}
-
-func NewZeroFile(log loggers.Logger) source.File {
- return zeroFile{log: log}
-}
-
-func (zeroFile) IsZero() bool {
- return true
-}
-
-
-%s
-
-`, header, importsString(pkgImports), buff.String())
-
- return nil
-}
-
func importsString(imps []string) string {
if len(imps) == 0 {
return ""
diff --git a/resources/page/page_lazy_contentprovider.go b/resources/page/page_lazy_contentprovider.go
index 2d647e90c..665b2d003 100644
--- a/resources/page/page_lazy_contentprovider.go
+++ b/resources/page/page_lazy_contentprovider.go
@@ -77,7 +77,6 @@ func (lcp *LazyContentProvider) Reset() {
func (lcp *LazyContentProvider) TableOfContents(ctx context.Context) template.HTML {
lcp.init.Do(ctx)
return lcp.cp.TableOfContents(ctx)
-
}
func (lcp *LazyContentProvider) Fragments(ctx context.Context) *tableofcontents.Fragments {
@@ -131,7 +130,7 @@ func (lcp *LazyContentProvider) Len(ctx context.Context) int {
}
func (lcp *LazyContentProvider) Render(ctx context.Context, layout ...string) (template.HTML, error) {
- lcp.init.Do(context.TODO())
+ lcp.init.Do(ctx)
return lcp.cp.Render(ctx, layout...)
}
@@ -149,6 +148,7 @@ func (lcp *LazyContentProvider) ParseContent(ctx context.Context, content []byte
lcp.init.Do(ctx)
return lcp.cp.ParseContent(ctx, content)
}
+
func (lcp *LazyContentProvider) RenderContent(ctx context.Context, content []byte, doc any) (converter.ResultRender, bool, error) {
lcp.init.Do(ctx)
return lcp.cp.RenderContent(ctx, content, doc)
diff --git a/resources/page/page_marshaljson.autogen.go b/resources/page/page_marshaljson.autogen.go
index bc9b5cc0f..18ed2a75d 100644
--- a/resources/page/page_marshaljson.autogen.go
+++ b/resources/page/page_marshaljson.autogen.go
@@ -17,27 +17,12 @@ package page
import (
"encoding/json"
- "github.com/gohugoio/hugo/common/maps"
- "github.com/gohugoio/hugo/config"
- "github.com/gohugoio/hugo/hugofs/files"
- "github.com/gohugoio/hugo/identity"
- "github.com/gohugoio/hugo/langs"
- "github.com/gohugoio/hugo/media"
- "github.com/gohugoio/hugo/navigation"
- "github.com/gohugoio/hugo/source"
"time"
+
+ "github.com/gohugoio/hugo/config"
)
func MarshalPageToJSON(p Page) ([]byte, error) {
- rawContent := p.RawContent()
- resourceType := p.ResourceType()
- mediaType := p.MediaType()
- permalink := p.Permalink()
- relPermalink := p.RelPermalink()
- name := p.Name()
- title := p.Title()
- params := p.Params()
- data := p.Data()
date := p.Date()
lastmod := p.Lastmod()
publishDate := p.PublishDate()
@@ -54,128 +39,65 @@ func MarshalPageToJSON(p Page) ([]byte, error) {
isNode := p.IsNode()
isPage := p.IsPage()
path := p.Path()
- pathc := p.Pathc()
+ pathc := p.Path()
slug := p.Slug()
lang := p.Lang()
isSection := p.IsSection()
section := p.Section()
- sectionsEntries := p.SectionsEntries()
- sectionsPath := p.SectionsPath()
sitemap := p.Sitemap()
typ := p.Type()
weight := p.Weight()
- language := p.Language()
- file := p.File()
- gitInfo := p.GitInfo()
- codeOwners := p.CodeOwners()
- outputFormats := p.OutputFormats()
- alternativeOutputFormats := p.AlternativeOutputFormats()
- menus := p.Menus()
- translationKey := p.TranslationKey()
- isTranslated := p.IsTranslated()
- allTranslations := p.AllTranslations()
- translations := p.Translations()
- store := p.Store()
- getIdentity := p.GetIdentity()
s := struct {
- RawContent string
- ResourceType string
- MediaType media.Type
- Permalink string
- RelPermalink string
- Name string
- Title string
- Params maps.Params
- Data interface{}
- Date time.Time
- Lastmod time.Time
- PublishDate time.Time
- ExpiryDate time.Time
- Aliases []string
- BundleType files.ContentClass
- Description string
- Draft bool
- IsHome bool
- Keywords []string
- Kind string
- Layout string
- LinkTitle string
- IsNode bool
- IsPage bool
- Path string
- Pathc string
- Slug string
- Lang string
- IsSection bool
- Section string
- SectionsEntries []string
- SectionsPath string
- Sitemap config.SitemapConfig
- Type string
- Weight int
- Language *langs.Language
- File source.File
- GitInfo source.GitInfo
- CodeOwners []string
- OutputFormats OutputFormats
- AlternativeOutputFormats OutputFormats
- Menus navigation.PageMenus
- TranslationKey string
- IsTranslated bool
- AllTranslations Pages
- Translations Pages
- Store *maps.Scratch
- GetIdentity identity.Identity
+ Date time.Time
+ Lastmod time.Time
+ PublishDate time.Time
+ ExpiryDate time.Time
+ Aliases []string
+ BundleType string
+ Description string
+ Draft bool
+ IsHome bool
+ Keywords []string
+ Kind string
+ Layout string
+ LinkTitle string
+ IsNode bool
+ IsPage bool
+ Path string
+ Pathc string
+ Slug string
+ Lang string
+ IsSection bool
+ Section string
+ Sitemap config.SitemapConfig
+ Type string
+ Weight int
}{
- RawContent: rawContent,
- ResourceType: resourceType,
- MediaType: mediaType,
- Permalink: permalink,
- RelPermalink: relPermalink,
- Name: name,
- Title: title,
- Params: params,
- Data: data,
- Date: date,
- Lastmod: lastmod,
- PublishDate: publishDate,
- ExpiryDate: expiryDate,
- Aliases: aliases,
- BundleType: bundleType,
- Description: description,
- Draft: draft,
- IsHome: isHome,
- Keywords: keywords,
- Kind: kind,
- Layout: layout,
- LinkTitle: linkTitle,
- IsNode: isNode,
- IsPage: isPage,
- Path: path,
- Pathc: pathc,
- Slug: slug,
- Lang: lang,
- IsSection: isSection,
- Section: section,
- SectionsEntries: sectionsEntries,
- SectionsPath: sectionsPath,
- Sitemap: sitemap,
- Type: typ,
- Weight: weight,
- Language: language,
- File: file,
- GitInfo: gitInfo,
- CodeOwners: codeOwners,
- OutputFormats: outputFormats,
- AlternativeOutputFormats: alternativeOutputFormats,
- Menus: menus,
- TranslationKey: translationKey,
- IsTranslated: isTranslated,
- AllTranslations: allTranslations,
- Translations: translations,
- Store: store,
- GetIdentity: getIdentity,
+ Date: date,
+ Lastmod: lastmod,
+ PublishDate: publishDate,
+ ExpiryDate: expiryDate,
+ Aliases: aliases,
+ BundleType: bundleType,
+ Description: description,
+ Draft: draft,
+ IsHome: isHome,
+ Keywords: keywords,
+ Kind: kind,
+ Layout: layout,
+ LinkTitle: linkTitle,
+ IsNode: isNode,
+ IsPage: isPage,
+ Path: path,
+ Pathc: pathc,
+ Slug: slug,
+ Lang: lang,
+ IsSection: isSection,
+ Section: section,
+ Sitemap: sitemap,
+ Type: typ,
+ Weight: weight,
}
return json.Marshal(&s)
diff --git a/resources/page/page_matcher.go b/resources/page/page_matcher.go
index 4c861cbd7..f5e8e2697 100644
--- a/resources/page/page_matcher.go
+++ b/resources/page/page_matcher.go
@@ -63,7 +63,7 @@ func (m PageMatcher) Matches(p Page) bool {
if m.Path != "" {
g, err := glob.GetGlob(m.Path)
// TODO(bep) Path() vs filepath vs leading slash.
- p := strings.ToLower(filepath.ToSlash(p.Pathc()))
+ p := strings.ToLower(filepath.ToSlash(p.Path()))
if !(strings.HasPrefix(p, "/")) {
p = "/" + p
}
@@ -123,7 +123,6 @@ func DecodeCascadeConfig(in any) (*config.ConfigNamespace[[]PageMatcherParamsCon
}
return config.DecodeNamespace[[]PageMatcherParamsConfig](in, buildConfig)
-
}
// DecodeCascade decodes in which could be either a map or a slice of maps.
@@ -161,7 +160,6 @@ func mapToPageMatcherParamsConfig(m map[string]any) (PageMatcherParamsConfig, er
}
}
return pcfg, pcfg.init()
-
}
// decodePageMatcher decodes m into v.
diff --git a/resources/page/page_nop.go b/resources/page/page_nop.go
index 735d6eea8..a8f42e4d3 100644
--- a/resources/page/page_nop.go
+++ b/resources/page/page_nop.go
@@ -21,19 +21,17 @@ import (
"html/template"
"time"
- "github.com/gohugoio/hugo/identity"
+ "github.com/gohugoio/hugo/hugofs/files"
"github.com/gohugoio/hugo/markup/converter"
"github.com/gohugoio/hugo/markup/tableofcontents"
- "github.com/gohugoio/hugo/hugofs/files"
- "github.com/gohugoio/hugo/tpl"
-
"github.com/gohugoio/hugo/hugofs"
"github.com/gohugoio/hugo/navigation"
"github.com/gohugoio/hugo/common/hugo"
"github.com/gohugoio/hugo/common/maps"
+ "github.com/gohugoio/hugo/common/paths"
"github.com/gohugoio/hugo/source"
"github.com/gohugoio/hugo/config"
@@ -59,6 +57,8 @@ var (
// PageNop implements Page, but does nothing.
type nopPage int
+var noOpPathInfo = paths.Parse(files.ComponentFolderContent, "no-op.md")
+
func (p *nopPage) Err() resource.ResourceError {
return nil
}
@@ -103,7 +103,7 @@ func (p *nopPage) BaseFileName() string {
return ""
}
-func (p *nopPage) BundleType() files.ContentClass {
+func (p *nopPage) BundleType() string {
return ""
}
@@ -163,10 +163,8 @@ func (p *nopPage) Extension() string {
return ""
}
-var nilFile *source.FileInfo
-
-func (p *nopPage) File() source.File {
- return nilFile
+func (p *nopPage) File() *source.File {
+ return nil
}
func (p *nopPage) FileInfo() hugofs.FileMetaInfo {
@@ -189,10 +187,6 @@ func (p *nopPage) GetPage(ref string) (Page, error) {
return nil, nil
}
-func (p *nopPage) GetPageWithTemplateInfo(info tpl.Info, ref string) (Page, error) {
- return nil, nil
-}
-
func (p *nopPage) GetParam(key string) any {
return nil
}
@@ -221,16 +215,16 @@ func (p *nopPage) Hugo() (h hugo.HugoInfo) {
return
}
-func (p *nopPage) InSection(other any) (bool, error) {
- return false, nil
+func (p *nopPage) InSection(other any) bool {
+ return false
}
-func (p *nopPage) IsAncestor(other any) (bool, error) {
- return false, nil
+func (p *nopPage) IsAncestor(other any) bool {
+ return false
}
-func (p *nopPage) IsDescendant(other any) (bool, error) {
- return false, nil
+func (p *nopPage) IsDescendant(other any) bool {
+ return false
}
func (p *nopPage) IsDraft() bool {
@@ -357,8 +351,8 @@ func (p *nopPage) Path() string {
return ""
}
-func (p *nopPage) Pathc() string {
- return ""
+func (p *nopPage) PathInfo() *paths.Path {
+ return noOpPathInfo
}
func (p *nopPage) Permalink() string {
@@ -529,13 +523,10 @@ func (p *nopPage) WordCount(context.Context) int {
return 0
}
-func (p *nopPage) GetIdentity() identity.Identity {
- return identity.NewPathIdentity("content", "foo/bar.md")
-}
-
func (p *nopPage) Fragments(context.Context) *tableofcontents.Fragments {
return nil
}
+
func (p *nopPage) HeadingsFiltered(context.Context) tableofcontents.Headings {
return nil
}
@@ -550,6 +541,7 @@ func (r *nopContentRenderer) ParseAndRenderContent(ctx context.Context, content
func (r *nopContentRenderer) ParseContent(ctx context.Context, content []byte) (converter.ResultParse, bool, error) {
return nil, false, nil
}
+
func (r *nopContentRenderer) RenderContent(ctx context.Context, content []byte, doc any) (converter.ResultRender, bool, error) {
return nil, false, nil
}
diff --git a/resources/page/page_paths.go b/resources/page/page_paths.go
index 1bc16fe35..8052287c6 100644
--- a/resources/page/page_paths.go
+++ b/resources/page/page_paths.go
@@ -17,7 +17,9 @@ import (
"path"
"path/filepath"
"strings"
+ "sync"
+ "github.com/gohugoio/hugo/common/paths"
"github.com/gohugoio/hugo/common/urls"
"github.com/gohugoio/hugo/helpers"
"github.com/gohugoio/hugo/output"
@@ -39,16 +41,14 @@ type TargetPathDescriptor struct {
Type output.Format
Kind string
- Sections []string
+ Path *paths.Path
+ Section *paths.Path
// For regular content pages this is either
// 1) the Slug, if set,
// 2) the file base name (TranslationBaseName).
BaseName string
- // Source directory.
- Dir string
-
// Typically a language prefix added to file paths.
PrefixFilePath string
@@ -74,7 +74,6 @@ type TargetPathDescriptor struct {
// TODO(bep) move this type.
type TargetPaths struct {
-
// Where to store the file on disk relative to the publish dir. OS slashes.
TargetFilename string
@@ -107,237 +106,347 @@ func (p TargetPaths) PermalinkForOutputFormat(s *helpers.PathSpec, f output.Form
return s.PermalinkForBaseURL(p.Link, baseURLstr)
}
-func isHtmlIndex(s string) bool {
- return strings.HasSuffix(s, "/index.html")
-}
-
func CreateTargetPaths(d TargetPathDescriptor) (tp TargetPaths) {
- if d.Type.Name == "" {
- panic("CreateTargetPath: missing type")
- }
-
// Normalize all file Windows paths to simplify what's next.
- if helpers.FilePathSeparator != slash {
- d.Dir = filepath.ToSlash(d.Dir)
+ if helpers.FilePathSeparator != "/" {
d.PrefixFilePath = filepath.ToSlash(d.PrefixFilePath)
-
}
- if d.URL != "" && !strings.HasPrefix(d.URL, "/") {
+ if !d.Type.Root && d.URL != "" && !strings.HasPrefix(d.URL, "/") {
// Treat this as a context relative URL
d.ForcePrefix = true
}
- pagePath := slash
- fullSuffix := d.Type.MediaType.FirstSuffix.FullSuffix
+ if d.URL != "" {
+ d.URL = filepath.ToSlash(d.URL)
+ if strings.Contains(d.URL, "..") {
+ d.URL = path.Join("/", d.URL)
+ }
+ }
+
+ if d.Type.Root && !d.ForcePrefix {
+ d.PrefixFilePath = ""
+ d.PrefixLink = ""
+ }
+
+ pb := getPagePathBuilder(d)
+ defer putPagePathBuilder(pb)
- var (
- pagePathDir string
- link string
- linkDir string
- )
+ pb.fullSuffix = d.Type.MediaType.FirstSuffix.FullSuffix
// The top level index files, i.e. the home page etc., needs
// the index base even when uglyURLs is enabled.
needsBase := true
- isUgly := d.UglyURLs && !d.Type.NoUgly
- baseNameSameAsType := d.BaseName != "" && d.BaseName == d.Type.BaseName
+ pb.isUgly = (d.UglyURLs || d.Type.Ugly) && !d.Type.NoUgly
+ pb.baseNameSameAsType = !d.Path.IsBundle() && d.BaseName != "" && d.BaseName == d.Type.BaseName
- if d.ExpandedPermalink == "" && baseNameSameAsType {
- isUgly = true
+ if d.ExpandedPermalink == "" && pb.baseNameSameAsType {
+ pb.isUgly = true
}
- if d.Kind != kinds.KindPage && d.URL == "" && len(d.Sections) > 0 {
+ if d.Type == output.HTTPStatusHTMLFormat || d.Type == output.SitemapFormat || d.Type == output.RobotsTxtFormat {
+ pb.noSubResources = true
+ } else if d.Kind != kinds.KindPage && d.URL == "" && d.Section.Base() != "/" {
if d.ExpandedPermalink != "" {
- pagePath = pjoin(pagePath, d.ExpandedPermalink)
+ pb.Add(d.ExpandedPermalink)
} else {
- pagePath = pjoin(d.Sections...)
+ pb.Add(d.Section.Base())
}
needsBase = false
}
if d.Type.Path != "" {
- pagePath = pjoin(pagePath, d.Type.Path)
+ pb.Add(d.Type.Path)
}
if d.Kind != kinds.KindHome && d.URL != "" {
- pagePath = pjoin(pagePath, d.URL)
+ pb.Add(paths.FieldsSlash(d.URL)...)
if d.Addends != "" {
- pagePath = pjoin(pagePath, d.Addends)
+ pb.Add(d.Addends)
}
- pagePathDir = pagePath
- link = pagePath
hasDot := strings.Contains(d.URL, ".")
- hasSlash := strings.HasSuffix(d.URL, slash)
+ hasSlash := strings.HasSuffix(d.URL, "/")
if hasSlash || !hasDot {
- pagePath = pjoin(pagePath, d.Type.BaseName+fullSuffix)
+ pb.Add(d.Type.BaseName + pb.fullSuffix)
} else if hasDot {
- pagePathDir = path.Dir(pagePathDir)
+ pb.fullSuffix = paths.Ext(d.URL)
}
- if !isHtmlIndex(pagePath) {
- link = pagePath
- } else if !hasSlash {
- link += slash
+ if pb.IsHtmlIndex() {
+ pb.linkUpperOffset = 1
}
- linkDir = pagePathDir
-
if d.ForcePrefix {
// Prepend language prefix if not already set in URL
- if d.PrefixFilePath != "" && !strings.HasPrefix(d.URL, slash+d.PrefixFilePath) {
- pagePath = pjoin(d.PrefixFilePath, pagePath)
- pagePathDir = pjoin(d.PrefixFilePath, pagePathDir)
+ if d.PrefixFilePath != "" && !strings.HasPrefix(d.URL, "/"+d.PrefixFilePath) {
+ pb.prefixPath = d.PrefixFilePath
}
- if d.PrefixLink != "" && !strings.HasPrefix(d.URL, slash+d.PrefixLink) {
- link = pjoin(d.PrefixLink, link)
- linkDir = pjoin(d.PrefixLink, linkDir)
+ if d.PrefixLink != "" && !strings.HasPrefix(d.URL, "/"+d.PrefixLink) {
+ pb.prefixLink = d.PrefixLink
}
}
-
- } else if d.Kind == kinds.KindPage {
-
+ } else if !kinds.IsBranch(d.Kind) {
if d.ExpandedPermalink != "" {
- pagePath = pjoin(pagePath, d.ExpandedPermalink)
+ pb.Add(d.ExpandedPermalink)
} else {
- if d.Dir != "" {
- pagePath = pjoin(pagePath, d.Dir)
+ if dir := d.Path.ContainerDir(); dir != "" {
+ pb.Add(dir)
}
if d.BaseName != "" {
- pagePath = pjoin(pagePath, d.BaseName)
+ pb.Add(d.BaseName)
+ } else {
+ pb.Add(d.Path.BaseNameNoIdentifier())
}
}
if d.Addends != "" {
- pagePath = pjoin(pagePath, d.Addends)
- }
-
- link = pagePath
-
- // TODO(bep) this should not happen after the fix in https://github.com/gohugoio/hugo/issues/4870
- // but we may need some more testing before we can remove it.
- if baseNameSameAsType {
- link = strings.TrimSuffix(link, d.BaseName)
+ pb.Add(d.Addends)
}
- pagePathDir = link
- link = link + slash
- linkDir = pagePathDir
-
- if isUgly {
- pagePath = addSuffix(pagePath, fullSuffix)
+ if pb.isUgly {
+ pb.ConcatLast(pb.fullSuffix)
} else {
- pagePath = pjoin(pagePath, d.Type.BaseName+fullSuffix)
+ pb.Add(d.Type.BaseName + pb.fullSuffix)
}
- if !isHtmlIndex(pagePath) {
- link = pagePath
+ if pb.IsHtmlIndex() {
+ pb.linkUpperOffset = 1
}
if d.PrefixFilePath != "" {
- pagePath = pjoin(d.PrefixFilePath, pagePath)
- pagePathDir = pjoin(d.PrefixFilePath, pagePathDir)
+ pb.prefixPath = d.PrefixFilePath
}
if d.PrefixLink != "" {
- link = pjoin(d.PrefixLink, link)
- linkDir = pjoin(d.PrefixLink, linkDir)
+ pb.prefixLink = d.PrefixLink
}
-
} else {
if d.Addends != "" {
- pagePath = pjoin(pagePath, d.Addends)
+ pb.Add(d.Addends)
}
needsBase = needsBase && d.Addends == ""
- // No permalink expansion etc. for node type pages (for now)
- base := ""
-
- if needsBase || !isUgly {
- base = d.Type.BaseName
- }
-
- pagePathDir = pagePath
- link = pagePath
- linkDir = pagePathDir
-
- if base != "" {
- pagePath = path.Join(pagePath, addSuffix(base, fullSuffix))
+ if needsBase || !pb.isUgly {
+ pb.Add(d.Type.BaseName + pb.fullSuffix)
} else {
- pagePath = addSuffix(pagePath, fullSuffix)
+ pb.ConcatLast(pb.fullSuffix)
}
- if !isHtmlIndex(pagePath) {
- link = pagePath
- } else {
- link += slash
+ if pb.IsHtmlIndex() {
+ pb.linkUpperOffset = 1
}
if d.PrefixFilePath != "" {
- pagePath = pjoin(d.PrefixFilePath, pagePath)
- pagePathDir = pjoin(d.PrefixFilePath, pagePathDir)
+ pb.prefixPath = d.PrefixFilePath
}
if d.PrefixLink != "" {
- link = pjoin(d.PrefixLink, link)
- linkDir = pjoin(d.PrefixLink, linkDir)
+ pb.prefixLink = d.PrefixLink
}
}
- pagePath = pjoin(slash, pagePath)
- pagePathDir = strings.TrimSuffix(path.Join(slash, pagePathDir), slash)
-
- hadSlash := strings.HasSuffix(link, slash)
- link = strings.Trim(link, slash)
- if hadSlash {
- link += slash
- }
-
- if !strings.HasPrefix(link, slash) {
- link = slash + link
- }
-
- linkDir = strings.TrimSuffix(path.Join(slash, linkDir), slash)
-
// if page URL is explicitly set in frontmatter,
// preserve its value without sanitization
if d.Kind != kinds.KindPage || d.URL == "" {
// Note: MakePathSanitized will lower case the path if
// disablePathToLower isn't set.
- pagePath = d.PathSpec.MakePathSanitized(pagePath)
- pagePathDir = d.PathSpec.MakePathSanitized(pagePathDir)
- link = d.PathSpec.MakePathSanitized(link)
- linkDir = d.PathSpec.MakePathSanitized(linkDir)
+ pb.Sanitize()
}
+ link := pb.Link()
+ pagePath := pb.PathFile()
+
tp.TargetFilename = filepath.FromSlash(pagePath)
- tp.SubResourceBaseTarget = filepath.FromSlash(pagePathDir)
- tp.SubResourceBaseLink = linkDir
- tp.Link = d.PathSpec.URLizeFilename(link)
+ if !pb.noSubResources {
+ tp.SubResourceBaseTarget = pb.PathDir()
+ tp.SubResourceBaseLink = pb.LinkDir()
+ }
+ if d.URL != "" {
+ tp.Link = paths.URLEscape(link)
+ } else {
+ // This is slightly faster for when we know we don't have any
+ // query or scheme etc.
+ tp.Link = paths.PathEscape(link)
+ }
if tp.Link == "" {
- tp.Link = slash
+ tp.Link = "/"
}
return
}
-func addSuffix(s, suffix string) string {
- return strings.Trim(s, slash) + suffix
+// When adding state here, remember to update putPagePathBuilder.
+type pagePathBuilder struct {
+ els []string
+
+ d TargetPathDescriptor
+
+ // Builder state.
+ isUgly bool
+ baseNameSameAsType bool
+ noSubResources bool
+ fullSuffix string // File suffix including any ".".
+ prefixLink string
+ prefixPath string
+ linkUpperOffset int
+}
+
+func (p *pagePathBuilder) Add(el ...string) {
+ // Filter empty and slashes.
+ n := 0
+ for _, e := range el {
+ if e != "" && e != slash {
+ el[n] = e
+ n++
+ }
+ }
+ el = el[:n]
+
+ p.els = append(p.els, el...)
}
-// Like path.Join, but preserves one trailing slash if present.
-func pjoin(elem ...string) string {
- hadSlash := strings.HasSuffix(elem[len(elem)-1], slash)
- joined := path.Join(elem...)
- if hadSlash && !strings.HasSuffix(joined, slash) {
- return joined + slash
+func (p *pagePathBuilder) ConcatLast(s string) {
+ if len(p.els) == 0 {
+ p.Add(s)
+ return
}
- return joined
+ old := p.els[len(p.els)-1]
+ if old == "" {
+ p.els[len(p.els)-1] = s
+ return
+ }
+ if old[len(old)-1] == '/' {
+ old = old[:len(old)-1]
+ }
+ p.els[len(p.els)-1] = old + s
+}
+
+func (p *pagePathBuilder) IsHtmlIndex() bool {
+ return p.Last() == "index.html"
+}
+
+func (p *pagePathBuilder) Last() string {
+ if p.els == nil {
+ return ""
+ }
+ return p.els[len(p.els)-1]
+}
+
+func (p *pagePathBuilder) Link() string {
+ link := p.Path(p.linkUpperOffset)
+
+ if p.baseNameSameAsType {
+ link = strings.TrimSuffix(link, p.d.BaseName)
+ }
+
+ if p.prefixLink != "" {
+ link = "/" + p.prefixLink + link
+ }
+
+ if p.linkUpperOffset > 0 && !strings.HasSuffix(link, "/") {
+ link += "/"
+ }
+
+ return link
+}
+
+func (p *pagePathBuilder) LinkDir() string {
+ if p.noSubResources {
+ return ""
+ }
+
+ pathDir := p.PathDirBase()
+
+ if p.prefixLink != "" {
+ pathDir = "/" + p.prefixLink + pathDir
+ }
+
+ return pathDir
+}
+
+func (p *pagePathBuilder) Path(upperOffset int) string {
+ upper := len(p.els)
+ if upperOffset > 0 {
+ upper -= upperOffset
+ }
+ pth := path.Join(p.els[:upper]...)
+ return paths.AddLeadingSlash(pth)
+}
+
+func (p *pagePathBuilder) PathDir() string {
+ dir := p.PathDirBase()
+ if p.prefixPath != "" {
+ dir = "/" + p.prefixPath + dir
+ }
+ return dir
+}
+
+func (p *pagePathBuilder) PathDirBase() string {
+ if p.noSubResources {
+ return ""
+ }
+
+ dir := p.Path(0)
+ isIndex := strings.HasPrefix(p.Last(), p.d.Type.BaseName+".")
+
+ if isIndex {
+ dir = paths.Dir(dir)
+ } else {
+ dir = strings.TrimSuffix(dir, p.fullSuffix)
+ }
+
+ if dir == "/" {
+ dir = ""
+ }
+
+ return dir
+}
+
+func (p *pagePathBuilder) PathFile() string {
+ dir := p.Path(0)
+ if p.prefixPath != "" {
+ dir = "/" + p.prefixPath + dir
+ }
+ return dir
+}
+
+func (p *pagePathBuilder) Prepend(el ...string) {
+ p.els = append(p.els[:0], append(el, p.els[0:]...)...)
+}
+
+func (p *pagePathBuilder) Sanitize() {
+ for i, el := range p.els {
+ p.els[i] = p.d.PathSpec.MakePathSanitized(el)
+ }
+}
+
+var pagePathBuilderPool = &sync.Pool{
+ New: func() any {
+ return &pagePathBuilder{}
+ },
+}
+
+func getPagePathBuilder(d TargetPathDescriptor) *pagePathBuilder {
+ b := pagePathBuilderPool.Get().(*pagePathBuilder)
+ b.d = d
+ return b
+}
+
+func putPagePathBuilder(b *pagePathBuilder) {
+ b.els = b.els[:0]
+ b.fullSuffix = ""
+ b.baseNameSameAsType = false
+ b.isUgly = false
+ b.noSubResources = false
+ b.prefixLink = ""
+ b.prefixPath = ""
+ b.linkUpperOffset = 0
+ pagePathBuilderPool.Put(b)
}
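The builder above is pooled with sync.Pool and fully reset in putPagePathBuilder before it goes back into the pool. A small self-contained sketch of that get/reset/put pattern, using generic names rather than Hugo's unexported types.

package example

import (
	"path"
	"sync"
)

// builder mirrors the shape of the pooled pagePathBuilder: accumulated
// path elements plus some per-use state.
type builder struct {
	els    []string
	prefix string
}

var builderPool = &sync.Pool{
	New: func() any { return &builder{} },
}

func getBuilder(prefix string) *builder {
	b := builderPool.Get().(*builder)
	b.prefix = prefix
	return b
}

// putBuilder resets every field before returning the value to the pool,
// as putPagePathBuilder does; a forgotten field here would leak state
// between unrelated path builds.
func putBuilder(b *builder) {
	b.els = b.els[:0]
	b.prefix = ""
	builderPool.Put(b)
}

// buildPath joins a prefix and a set of elements into a leading-slash path.
func buildPath(prefix string, els ...string) string {
	b := getBuilder(prefix)
	defer putBuilder(b)
	b.els = append(b.els, els...)
	return "/" + path.Join(append([]string{b.prefix}, b.els...)...)
}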
diff --git a/resources/page/page_paths_test.go b/resources/page/page_paths_test.go
deleted file mode 100644
index dd6457f77..000000000
--- a/resources/page/page_paths_test.go
+++ /dev/null
@@ -1,295 +0,0 @@
-// Copyright 2019 The Hugo Authors. All rights reserved.
-//
-// Licensed under the Apache License, Version 2.0 (the "License");
-// you may not use this file except in compliance with the License.
-// You may obtain a copy of the License at
-// http://www.apache.org/licenses/LICENSE-2.0
-//
-// Unless required by applicable law or agreed to in writing, software
-// distributed under the License is distributed on an "AS IS" BASIS,
-// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
-// See the License for the specific language governing permissions and
-// limitations under the License.
-
-package page_test
-
-import (
- "fmt"
- "path/filepath"
- "strings"
- "testing"
-
- "github.com/gohugoio/hugo/media"
- "github.com/gohugoio/hugo/resources/kinds"
- "github.com/gohugoio/hugo/resources/page"
-
- "github.com/gohugoio/hugo/output"
-)
-
-func TestPageTargetPath(t *testing.T) {
- pathSpec := newTestPathSpec()
-
- noExtNoDelimMediaType := media.WithDelimiterAndSuffixes(media.Builtin.TextType, "", "")
- noExtNoDelimMediaType.Delimiter = ""
-
- // Netlify style _redirects
- noExtDelimFormat := output.Format{
- Name: "NER",
- MediaType: noExtNoDelimMediaType,
- BaseName: "_redirects",
- }
-
- for _, langPrefixPath := range []string{"", "no"} {
- for _, langPrefixLink := range []string{"", "no"} {
- for _, uglyURLs := range []bool{false, true} {
-
- tests := []struct {
- name string
- d page.TargetPathDescriptor
- expected page.TargetPaths
- }{
- {"JSON home", page.TargetPathDescriptor{Kind: kinds.KindHome, Type: output.JSONFormat}, page.TargetPaths{TargetFilename: "/index.json", SubResourceBaseTarget: "", Link: "/index.json"}},
- {"AMP home", page.TargetPathDescriptor{Kind: kinds.KindHome, Type: output.AMPFormat}, page.TargetPaths{TargetFilename: "/amp/index.html", SubResourceBaseTarget: "/amp", Link: "/amp/"}},
- {"HTML home", page.TargetPathDescriptor{Kind: kinds.KindHome, BaseName: "_index", Type: output.HTMLFormat}, page.TargetPaths{TargetFilename: "/index.html", SubResourceBaseTarget: "", Link: "/"}},
- {"Netlify redirects", page.TargetPathDescriptor{Kind: kinds.KindHome, BaseName: "_index", Type: noExtDelimFormat}, page.TargetPaths{TargetFilename: "/_redirects", SubResourceBaseTarget: "", Link: "/_redirects"}},
- {"HTML section list", page.TargetPathDescriptor{
- Kind: kinds.KindSection,
- Sections: []string{"sect1"},
- BaseName: "_index",
- Type: output.HTMLFormat,
- }, page.TargetPaths{TargetFilename: "/sect1/index.html", SubResourceBaseTarget: "/sect1", Link: "/sect1/"}},
- {"HTML taxonomy term", page.TargetPathDescriptor{
- Kind: kinds.KindTerm,
- Sections: []string{"tags", "hugo"},
- BaseName: "_index",
- Type: output.HTMLFormat,
- }, page.TargetPaths{TargetFilename: "/tags/hugo/index.html", SubResourceBaseTarget: "/tags/hugo", Link: "/tags/hugo/"}},
- {"HTML taxonomy", page.TargetPathDescriptor{
- Kind: kinds.KindTaxonomy,
- Sections: []string{"tags"},
- BaseName: "_index",
- Type: output.HTMLFormat,
- }, page.TargetPaths{TargetFilename: "/tags/index.html", SubResourceBaseTarget: "/tags", Link: "/tags/"}},
- {
- "HTML page", page.TargetPathDescriptor{
- Kind: kinds.KindPage,
- Dir: "/a/b",
- BaseName: "mypage",
- Sections: []string{"a"},
- Type: output.HTMLFormat,
- }, page.TargetPaths{TargetFilename: "/a/b/mypage/index.html", SubResourceBaseTarget: "/a/b/mypage", Link: "/a/b/mypage/"},
- },
-
- {
- "HTML page with index as base", page.TargetPathDescriptor{
- Kind: kinds.KindPage,
- Dir: "/a/b",
- BaseName: "index",
- Sections: []string{"a"},
- Type: output.HTMLFormat,
- }, page.TargetPaths{TargetFilename: "/a/b/index.html", SubResourceBaseTarget: "/a/b", Link: "/a/b/"},
- },
-
- {
- "HTML page with special chars", page.TargetPathDescriptor{
- Kind: kinds.KindPage,
- Dir: "/a/b",
- BaseName: "My Page!",
- Type: output.HTMLFormat,
- }, page.TargetPaths{TargetFilename: "/a/b/my-page/index.html", SubResourceBaseTarget: "/a/b/my-page", Link: "/a/b/my-page/"},
- },
- {"RSS home", page.TargetPathDescriptor{Kind: "rss", Type: output.RSSFormat}, page.TargetPaths{TargetFilename: "/index.xml", SubResourceBaseTarget: "", Link: "/index.xml"}},
- {"RSS section list", page.TargetPathDescriptor{
- Kind: "rss",
- Sections: []string{"sect1"},
- Type: output.RSSFormat,
- }, page.TargetPaths{TargetFilename: "/sect1/index.xml", SubResourceBaseTarget: "/sect1", Link: "/sect1/index.xml"}},
- {
- "AMP page", page.TargetPathDescriptor{
- Kind: kinds.KindPage,
- Dir: "/a/b/c",
- BaseName: "myamp",
- Type: output.AMPFormat,
- }, page.TargetPaths{TargetFilename: "/amp/a/b/c/myamp/index.html", SubResourceBaseTarget: "/amp/a/b/c/myamp", Link: "/amp/a/b/c/myamp/"},
- },
- {
- "AMP page with URL with suffix", page.TargetPathDescriptor{
- Kind: kinds.KindPage,
- Dir: "/sect/",
- BaseName: "mypage",
- URL: "/some/other/url.xhtml",
- Type: output.HTMLFormat,
- }, page.TargetPaths{TargetFilename: "/some/other/url.xhtml", SubResourceBaseTarget: "/some/other", Link: "/some/other/url.xhtml"},
- },
- {
- "JSON page with URL without suffix", page.TargetPathDescriptor{
- Kind: kinds.KindPage,
- Dir: "/sect/",
- BaseName: "mypage",
- URL: "/some/other/path/",
- Type: output.JSONFormat,
- }, page.TargetPaths{TargetFilename: "/some/other/path/index.json", SubResourceBaseTarget: "/some/other/path", Link: "/some/other/path/index.json"},
- },
- {
- "JSON page with URL without suffix and no trailing slash", page.TargetPathDescriptor{
- Kind: kinds.KindPage,
- Dir: "/sect/",
- BaseName: "mypage",
- URL: "/some/other/path",
- Type: output.JSONFormat,
- }, page.TargetPaths{TargetFilename: "/some/other/path/index.json", SubResourceBaseTarget: "/some/other/path", Link: "/some/other/path/index.json"},
- },
- {
- "HTML page with URL without suffix and no trailing slash", page.TargetPathDescriptor{
- Kind: kinds.KindPage,
- Dir: "/sect/",
- BaseName: "mypage",
- URL: "/some/other/path",
- Type: output.HTMLFormat,
- }, page.TargetPaths{TargetFilename: "/some/other/path/index.html", SubResourceBaseTarget: "/some/other/path", Link: "/some/other/path/"},
- },
- {
- "HTML page with URL containing double hyphen", page.TargetPathDescriptor{
- Kind: kinds.KindPage,
- Dir: "/sect/",
- BaseName: "mypage",
- URL: "/some/other--url/",
- Type: output.HTMLFormat,
- }, page.TargetPaths{TargetFilename: "/some/other--url/index.html", SubResourceBaseTarget: "/some/other--url", Link: "/some/other--url/"},
- },
- {
- "HTML page with expanded permalink", page.TargetPathDescriptor{
- Kind: kinds.KindPage,
- Dir: "/a/b",
- BaseName: "mypage",
- ExpandedPermalink: "/2017/10/my-title/",
- Type: output.HTMLFormat,
- }, page.TargetPaths{TargetFilename: "/2017/10/my-title/index.html", SubResourceBaseTarget: "/2017/10/my-title", Link: "/2017/10/my-title/"},
- },
- {
- "Paginated HTML home", page.TargetPathDescriptor{
- Kind: kinds.KindHome,
- BaseName: "_index",
- Type: output.HTMLFormat,
- Addends: "page/3",
- }, page.TargetPaths{TargetFilename: "/page/3/index.html", SubResourceBaseTarget: "/page/3", Link: "/page/3/"},
- },
- {
- "Paginated Taxonomy terms list", page.TargetPathDescriptor{
- Kind: kinds.KindTerm,
- BaseName: "_index",
- Sections: []string{"tags", "hugo"},
- Type: output.HTMLFormat,
- Addends: "page/3",
- }, page.TargetPaths{TargetFilename: "/tags/hugo/page/3/index.html", SubResourceBaseTarget: "/tags/hugo/page/3", Link: "/tags/hugo/page/3/"},
- },
- {
- "Regular page with addend", page.TargetPathDescriptor{
- Kind: kinds.KindPage,
- Dir: "/a/b",
- BaseName: "mypage",
- Addends: "c/d/e",
- Type: output.HTMLFormat,
- }, page.TargetPaths{TargetFilename: "/a/b/mypage/c/d/e/index.html", SubResourceBaseTarget: "/a/b/mypage/c/d/e", Link: "/a/b/mypage/c/d/e/"},
- },
- }
-
- for i, test := range tests {
- t.Run(fmt.Sprintf("langPrefixPath=%s,langPrefixLink=%s,uglyURLs=%t,name=%s", langPrefixPath, langPrefixLink, uglyURLs, test.name),
- func(t *testing.T) {
- test.d.ForcePrefix = true
- test.d.PathSpec = pathSpec
- test.d.UglyURLs = uglyURLs
- test.d.PrefixFilePath = langPrefixPath
- test.d.PrefixLink = langPrefixLink
- test.d.Dir = filepath.FromSlash(test.d.Dir)
- isUgly := uglyURLs && !test.d.Type.NoUgly
-
- expected := test.expected
-
- // TODO(bep) simplify
- if test.d.Kind == kinds.KindPage && test.d.BaseName == test.d.Type.BaseName {
- } else if test.d.Kind == kinds.KindHome && test.d.Type.Path != "" {
- } else if test.d.Type.MediaType.FirstSuffix.Suffix != "" && (!strings.HasPrefix(expected.TargetFilename, "/index") || test.d.Addends != "") && test.d.URL == "" && isUgly {
- expected.TargetFilename = strings.Replace(expected.TargetFilename,
- "/"+test.d.Type.BaseName+"."+test.d.Type.MediaType.FirstSuffix.Suffix,
- "."+test.d.Type.MediaType.FirstSuffix.Suffix, 1)
- expected.Link = strings.TrimSuffix(expected.Link, "/") + "." + test.d.Type.MediaType.FirstSuffix.Suffix
-
- }
-
- if test.d.PrefixFilePath != "" && !strings.HasPrefix(test.d.URL, "/"+test.d.PrefixFilePath) {
- expected.TargetFilename = "/" + test.d.PrefixFilePath + expected.TargetFilename
- expected.SubResourceBaseTarget = "/" + test.d.PrefixFilePath + expected.SubResourceBaseTarget
- }
-
- if test.d.PrefixLink != "" && !strings.HasPrefix(test.d.URL, "/"+test.d.PrefixLink) {
- expected.Link = "/" + test.d.PrefixLink + expected.Link
- }
-
- expected.TargetFilename = filepath.FromSlash(expected.TargetFilename)
- expected.SubResourceBaseTarget = filepath.FromSlash(expected.SubResourceBaseTarget)
-
- pagePath := page.CreateTargetPaths(test.d)
-
- if !eqTargetPaths(pagePath, expected) {
- t.Fatalf("[%d] [%s] targetPath expected\n%#v, got:\n%#v", i, test.name, expected, pagePath)
- }
- })
- }
- }
- }
- }
-}
-
-func TestPageTargetPathPrefix(t *testing.T) {
- pathSpec := newTestPathSpec()
- tests := []struct {
- name string
- d page.TargetPathDescriptor
- expected page.TargetPaths
- }{
- {
- "URL set, prefix both, no force",
- page.TargetPathDescriptor{Kind: kinds.KindPage, Type: output.JSONFormat, URL: "/mydir/my.json", ForcePrefix: false, PrefixFilePath: "pf", PrefixLink: "pl"},
- page.TargetPaths{TargetFilename: "/mydir/my.json", SubResourceBaseTarget: "/mydir", SubResourceBaseLink: "/mydir", Link: "/mydir/my.json"},
- },
- {
- "URL set, prefix both, force",
- page.TargetPathDescriptor{Kind: kinds.KindPage, Type: output.JSONFormat, URL: "/mydir/my.json", ForcePrefix: true, PrefixFilePath: "pf", PrefixLink: "pl"},
- page.TargetPaths{TargetFilename: "/pf/mydir/my.json", SubResourceBaseTarget: "/pf/mydir", SubResourceBaseLink: "/pl/mydir", Link: "/pl/mydir/my.json"},
- },
- }
-
- for i, test := range tests {
- t.Run(fmt.Sprintf(test.name),
- func(t *testing.T) {
- test.d.PathSpec = pathSpec
- expected := test.expected
- expected.TargetFilename = filepath.FromSlash(expected.TargetFilename)
- expected.SubResourceBaseTarget = filepath.FromSlash(expected.SubResourceBaseTarget)
-
- pagePath := page.CreateTargetPaths(test.d)
-
- if pagePath != expected {
- t.Fatalf("[%d] [%s] targetPath expected\n%#v, got:\n%#v", i, test.name, expected, pagePath)
- }
- })
- }
-}
-
-func eqTargetPaths(p1, p2 page.TargetPaths) bool {
- if p1.Link != p2.Link {
- return false
- }
-
- if p1.SubResourceBaseTarget != p2.SubResourceBaseTarget {
- return false
- }
-
- if p1.TargetFilename != p2.TargetFilename {
- return false
- }
-
- return true
-}
diff --git a/resources/page/pagegroup.go b/resources/page/pagegroup.go
index d091c6bef..e691a112e 100644
--- a/resources/page/pagegroup.go
+++ b/resources/page/pagegroup.go
@@ -244,7 +244,7 @@ func (p Pages) groupByDateField(format string, sorter func(p Pages) Pages, getDa
return nil, nil
}
- firstPage := sp[0].(Page)
+ firstPage := sp[0]
date := getDate(firstPage)
// Pages may be a mix of multiple languages, so we need to use the language
@@ -258,7 +258,7 @@ func (p Pages) groupByDateField(format string, sorter func(p Pages) Pages, getDa
i := 0
for _, e := range sp[1:] {
- date = getDate(e.(Page))
+ date = getDate(e)
formatted := formatter.Format(date, format)
if r[i].Key.(string) != formatted {
r = append(r, PageGroup{Key: formatted})
diff --git a/resources/page/pagemeta/page_frontmatter.go b/resources/page/pagemeta/page_frontmatter.go
index 98ab6b222..d804f27a7 100644
--- a/resources/page/pagemeta/page_frontmatter.go
+++ b/resources/page/pagemeta/page_frontmatter.go
@@ -47,9 +47,8 @@ type FrontMatterHandler struct {
// FrontMatterDescriptor describes how to handle front matter for a given Page.
// It has pointers to values in the receiving page which gets updated.
type FrontMatterDescriptor struct {
-
- // This the Page's front matter.
- Frontmatter map[string]any
+ // This is the Page's params.
+ Params map[string]any
// This is the Page's base filename (BaseFilename), e.g. page.md., or
// if page is a leaf bundle, the bundle folder name (ContentBaseName).
@@ -63,9 +62,6 @@ type FrontMatterDescriptor struct {
// The below are pointers to values on Page and will be modified.
- // This is the Page's params.
- Params map[string]any
-
// This is the Page's dates.
Dates *resource.Dates
@@ -365,7 +361,7 @@ type frontmatterFieldHandlers int
func (f *frontmatterFieldHandlers) newDateFieldHandler(key string, setter func(d *FrontMatterDescriptor, t time.Time)) frontMatterFieldHandler {
return func(d *FrontMatterDescriptor) (bool, error) {
- v, found := d.Frontmatter[key]
+ v, found := d.Params[key]
if !found {
return false, nil
@@ -396,7 +392,7 @@ func (f *frontmatterFieldHandlers) newDateFilenameHandler(setter func(d *FrontMa
setter(d, date)
- if _, found := d.Frontmatter["slug"]; !found {
+ if _, found := d.Params["slug"]; !found {
// Use slug from filename
d.PageURLs.Slug = slug
}
diff --git a/resources/page/pagemeta/page_frontmatter_test.go b/resources/page/pagemeta/page_frontmatter_test.go
index f040af163..1aff8b511 100644
--- a/resources/page/pagemeta/page_frontmatter_test.go
+++ b/resources/page/pagemeta/page_frontmatter_test.go
@@ -29,11 +29,10 @@ import (
func newTestFd() *pagemeta.FrontMatterDescriptor {
return &pagemeta.FrontMatterDescriptor{
- Frontmatter: make(map[string]any),
- Params: make(map[string]any),
- Dates: &resource.Dates{},
- PageURLs: &pagemeta.URLPath{},
- Location: time.UTC,
+ Params: make(map[string]any),
+ Dates: &resource.Dates{},
+ PageURLs: &pagemeta.URLPath{},
+ Location: time.UTC,
}
}
@@ -106,13 +105,13 @@ func TestFrontMatterDatesHandlers(t *testing.T) {
case ":git":
d.GitAuthorDate = d1
}
- d.Frontmatter["date"] = d2
+ d.Params["date"] = d2
c.Assert(handler.HandleDates(d), qt.IsNil)
c.Assert(d.Dates.FDate, qt.Equals, d1)
c.Assert(d.Params["date"], qt.Equals, d2)
d = newTestFd()
- d.Frontmatter["date"] = d2
+ d.Params["date"] = d2
c.Assert(handler.HandleDates(d), qt.IsNil)
c.Assert(d.Dates.FDate, qt.Equals, d2)
c.Assert(d.Params["date"], qt.Equals, d2)
@@ -120,54 +119,6 @@ func TestFrontMatterDatesHandlers(t *testing.T) {
}
}
-func TestFrontMatterDatesCustomConfig(t *testing.T) {
- t.Parallel()
-
- c := qt.New(t)
-
- cfg := config.New()
- cfg.Set("frontmatter", map[string]any{
- "date": []string{"mydate"},
- "lastmod": []string{"publishdate"},
- "publishdate": []string{"publishdate"},
- })
-
- conf := testconfig.GetTestConfig(nil, cfg)
- handler, err := pagemeta.NewFrontmatterHandler(nil, conf.GetConfigSection("frontmatter").(pagemeta.FrontmatterConfig))
- c.Assert(err, qt.IsNil)
-
- testDate, err := time.Parse("2006-01-02", "2018-02-01")
- c.Assert(err, qt.IsNil)
-
- d := newTestFd()
- d.Frontmatter["mydate"] = testDate
- testDate = testDate.Add(24 * time.Hour)
- d.Frontmatter["date"] = testDate
- testDate = testDate.Add(24 * time.Hour)
- d.Frontmatter["lastmod"] = testDate
- testDate = testDate.Add(24 * time.Hour)
- d.Frontmatter["publishdate"] = testDate
- testDate = testDate.Add(24 * time.Hour)
- d.Frontmatter["expirydate"] = testDate
-
- c.Assert(handler.HandleDates(d), qt.IsNil)
-
- c.Assert(d.Dates.FDate.Day(), qt.Equals, 1)
- c.Assert(d.Dates.FLastmod.Day(), qt.Equals, 4)
- c.Assert(d.Dates.FPublishDate.Day(), qt.Equals, 4)
- c.Assert(d.Dates.FExpiryDate.Day(), qt.Equals, 5)
-
- c.Assert(d.Params["date"], qt.Equals, d.Dates.FDate)
- c.Assert(d.Params["mydate"], qt.Equals, d.Dates.FDate)
- c.Assert(d.Params["publishdate"], qt.Equals, d.Dates.FPublishDate)
- c.Assert(d.Params["expirydate"], qt.Equals, d.Dates.FExpiryDate)
-
- c.Assert(handler.IsDateKey("date"), qt.Equals, false) // This looks odd, but is configured like this.
- c.Assert(handler.IsDateKey("mydate"), qt.Equals, true)
- c.Assert(handler.IsDateKey("publishdate"), qt.Equals, true)
- c.Assert(handler.IsDateKey("pubdate"), qt.Equals, true)
-}
-
func TestFrontMatterDatesDefaultKeyword(t *testing.T) {
t.Parallel()
@@ -186,10 +137,10 @@ func TestFrontMatterDatesDefaultKeyword(t *testing.T) {
testDate, _ := time.Parse("2006-01-02", "2018-02-01")
d := newTestFd()
- d.Frontmatter["mydate"] = testDate
- d.Frontmatter["date"] = testDate.Add(1 * 24 * time.Hour)
- d.Frontmatter["mypubdate"] = testDate.Add(2 * 24 * time.Hour)
- d.Frontmatter["publishdate"] = testDate.Add(3 * 24 * time.Hour)
+ d.Params["mydate"] = testDate
+ d.Params["date"] = testDate.Add(1 * 24 * time.Hour)
+ d.Params["mypubdate"] = testDate.Add(2 * 24 * time.Hour)
+ d.Params["publishdate"] = testDate.Add(3 * 24 * time.Hour)
c.Assert(handler.HandleDates(d), qt.IsNil)
diff --git a/resources/page/pages.go b/resources/page/pages.go
index 77e56a062..088abb9ac 100644
--- a/resources/page/pages.go
+++ b/resources/page/pages.go
@@ -66,9 +66,7 @@ func ToPages(seq any) (Pages, error) {
return v.Pages, nil
case []Page:
pages := make(Pages, len(v))
- for i, vv := range v {
- pages[i] = vv
- }
+ copy(pages, v)
return pages, nil
case []any:
pages := make(Pages, len(v))
diff --git a/resources/page/pages_related.go b/resources/page/pages_related.go
index 217aced47..3322a4fbf 100644
--- a/resources/page/pages_related.go
+++ b/resources/page/pages_related.go
@@ -35,7 +35,6 @@ var (
// A PageGenealogist finds related pages in a page collection. This interface is implemented
// by Pages and PageGroup, which makes it available as `{{ .RegularRelated . }}` etc.
type PageGenealogist interface {
-
// Template example:
// {{ $related := .RegularPages.Related . }}
Related(ctx context.Context, opts any) (Pages, error)
@@ -76,7 +75,6 @@ func (p Pages) Related(ctx context.Context, optsv any) (Pages, error) {
}
return result, nil
-
}
// RelatedIndices searches the given indices with the search keywords from the
@@ -186,6 +184,7 @@ func (s *RelatedDocsHandler) getIndex(p Pages) *related.InvertedIndex {
}
return nil
}
+
func (s *RelatedDocsHandler) getOrCreateIndex(ctx context.Context, p Pages) (*related.InvertedIndex, error) {
s.mu.RLock()
cachedIndex := s.getIndex(p)
diff --git a/resources/page/pages_sort.go b/resources/page/pages_sort.go
index 32b1b3895..3f4875702 100644
--- a/resources/page/pages_sort.go
+++ b/resources/page/pages_sort.go
@@ -54,6 +54,19 @@ func getOrdinals(p1, p2 Page) (int, int) {
return p1o.Ordinal(), p2o.Ordinal()
}
+func getWeight0s(p1, p2 Page) (int, int) {
+ p1w, ok1 := p1.(resource.Weight0Provider)
+ if !ok1 {
+ return -1, -1
+ }
+ p2w, ok2 := p2.(resource.Weight0Provider)
+ if !ok2 {
+ return -1, -1
+ }
+
+ return p1w.Weight0(), p2w.Weight0()
+}
+
// Sort stable sorts the pages given the receiver's sort order.
func (by pageBy) Sort(pages Pages) {
ps := &pageSorter{
@@ -72,12 +85,17 @@ var (
if o1 != o2 && o1 != -1 && o2 != -1 {
return o1 < o2
}
+ // Weight0, as by the weight of the taxonomy entry in the front matter.
+ w01, w02 := getWeight0s(p1, p2)
+ if w01 != w02 && w01 != -1 && w02 != -1 {
+ return w01 < w02
+ }
if p1.Weight() == p2.Weight() {
if p1.Date().Unix() == p2.Date().Unix() {
c := collatorStringCompare(func(p Page) string { return p.LinkTitle() }, p1, p2)
if c == 0 {
- if p1.File().IsZero() || p2.File().IsZero() {
- return p1.File().IsZero()
+ if p1.File() == nil || p2.File() == nil {
+ return p1.File() == nil
}
return compare.LessStrings(p1.File().Filename(), p2.File().Filename())
}
@@ -102,7 +120,7 @@ var (
if p1.Date().Unix() == p2.Date().Unix() {
c := compare.Strings(p1.LinkTitle(), p2.LinkTitle())
if c == 0 {
- if !p1.File().IsZero() && !p2.File().IsZero() {
+ if p1.File() != nil && p2.File() != nil {
return compare.LessStrings(p1.File().Filename(), p2.File().Filename())
}
}
@@ -192,7 +210,6 @@ var collatorStringLess = func(p Page) (less func(s1, s2 string) bool, close func
func() {
coll.Unlock()
}
-
}
// ByWeight sorts the Pages by weight and returns a copy.
@@ -406,7 +423,6 @@ func (p Pages) ByParam(paramsKey any) Pages {
s2 := cast.ToString(v2)
return stringLess(s1, s2)
-
}
pages, _ := spc.get(key, pageBy(paramsKeyComparator).Sort, p)
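
Note on the Weight0 tie-breaker added above: it only applies when both pages implement resource.Weight0Provider; getWeight0s returns -1 for both otherwise, and the comparison falls through to the usual Weight/Date/LinkTitle chain. A self-contained sketch of that contract (Weight0Provider here mirrors the interface asserted above; term is a hypothetical stand-in for a page carrying a front matter weight):

package main

import (
	"fmt"
	"sort"
)

// Weight0Provider mirrors resource.Weight0Provider for this sketch.
type Weight0Provider interface{ Weight0() int }

type term struct {
	name string
	w0   int
}

func (t term) Weight0() int { return t.w0 }

// weight0Of returns -1 when the value has no Weight0, matching getWeight0s.
func weight0Of(v any) int {
	if p, ok := v.(Weight0Provider); ok {
		return p.Weight0()
	}
	return -1
}

func main() {
	terms := []term{{"tag-b", 20}, {"tag-a", 10}}
	sort.SliceStable(terms, func(i, j int) bool {
		w1, w2 := weight0Of(terms[i]), weight0Of(terms[j])
		if w1 != w2 && w1 != -1 && w2 != -1 {
			return w1 < w2 // same rule as the tie-breaker added above
		}
		return terms[i].name < terms[j].name
	})
	fmt.Println(terms) // [{tag-a 10} {tag-b 20}]
}
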
diff --git a/resources/page/pages_sort_test.go b/resources/page/pages_sort_test.go
index 728237230..12fa4a1e1 100644
--- a/resources/page/pages_sort_test.go
+++ b/resources/page/pages_sort_test.go
@@ -109,7 +109,6 @@ func TestSortByN(t *testing.T) {
byLen := func(p Pages) Pages {
return p.ByLength(ctx)
-
}
for i, this := range []struct {
@@ -273,7 +272,7 @@ func createSortTestPages(num int) Pages {
for i := 0; i < num; i++ {
p := newTestPage()
p.path = fmt.Sprintf("/x/y/p%d.md", i)
- p.title = fmt.Sprintf("Title %d", i%(num+1/2))
+ p.title = fmt.Sprintf("Title %d", i%((num+1)/2))
p.params = map[string]any{
"arbitrarily": map[string]any{
"nested": ("xyz" + fmt.Sprintf("%v", 100-i)),
diff --git a/resources/page/permalinks.go b/resources/page/permalinks.go
index 4577f5240..1677d3a90 100644
--- a/resources/page/permalinks.go
+++ b/resources/page/permalinks.go
@@ -120,12 +120,18 @@ func (l PermalinkExpander) Expand(key string, p Page) (string, error) {
return expand(p)
}
+// Allow " " and / to represent the root section.
+var sectionCutSet = " /"
+
+func init() {
+ if string(os.PathSeparator) != "/" {
+ sectionCutSet += string(os.PathSeparator)
+ }
+}
+
func (l PermalinkExpander) parse(patterns map[string]string) (map[string]func(Page) (string, error), error) {
expanders := make(map[string]func(Page) (string, error))
- // Allow " " and / to represent the root section.
- const sectionCutSet = " /" + string(os.PathSeparator)
-
for k, pattern := range patterns {
k = strings.Trim(k, sectionCutSet)
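
The cutset is now built once in an init func rather than on every parse call, and it no longer duplicates '/' on Unix, where os.PathSeparator is already '/'. A minimal, self-contained sketch of the trimming behaviour:

package main

import (
	"fmt"
	"os"
	"strings"
)

// Same construction as in permalinks.go: " /" plus the OS separator if it differs.
var sectionCutSet = " /"

func init() {
	if string(os.PathSeparator) != "/" {
		sectionCutSet += string(os.PathSeparator)
	}
}

func main() {
	for _, k := range []string{"/", " /blog/ ", "blog"} {
		fmt.Printf("%q -> %q\n", k, strings.Trim(k, sectionCutSet))
	}
	// "/" -> "", " /blog/ " -> "blog", "blog" -> "blog"
}
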
@@ -295,7 +301,7 @@ func (l PermalinkExpander) pageToPermalinkSections(p Page, _ string) (string, er
}
func (l PermalinkExpander) translationBaseName(p Page) string {
- if p.File().IsZero() {
+ if p.File() == nil {
return ""
}
return p.File().TranslationBaseName()
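
Throughout this diff, p.File().IsZero() checks become plain nil checks, since File() now returns a *source.File that is nil for pages without a backing content file (the zeroFile shim is removed further down). A minimal sketch of the guard, assuming the resources/page package as a dependency:

package permalinksketch

import "github.com/gohugoio/hugo/resources/page"

// translationBaseNameOf mirrors the guard used in PermalinkExpander above:
// File() is nil-checked instead of calling the removed IsZero().
func translationBaseNameOf(p page.Page) string {
	if p.File() == nil {
		return "" // page has no backing content file
	}
	return p.File().TranslationBaseName()
}
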
diff --git a/resources/page/permalinks_integration_test.go b/resources/page/permalinks_integration_test.go
index 6c2411ad7..9a76ac602 100644
--- a/resources/page/permalinks_integration_test.go
+++ b/resources/page/permalinks_integration_test.go
@@ -1,4 +1,4 @@
-// Copyright 2023 The Hugo Authors. All rights reserved.
+// Copyright 2024 The Hugo Authors. All rights reserved.
//
// Licensed under the Apache License, Version 2.0 (the "License");
// you may not use this file except in compliance with the License.
@@ -102,7 +102,6 @@ slug: "mytagslug"
"taxonomy": {"tags": "/tagsslug/:slug/"},
"term": {"tags": "/tagsslug/tag/:slug/"},
})
-
}
func TestPermalinksOldSetup(t *testing.T) {
@@ -145,7 +144,6 @@ slug: "p1slugvalue"
"taxonomy": {},
"term": {"withpageslug": "/pageslug/:slug/"},
})
-
}
func TestPermalinksNestedSections(t *testing.T) {
@@ -194,5 +192,4 @@ List.
b.AssertFileContent("public/libros/index.html", "List.")
b.AssertFileContent("public/libros/fiction/index.html", "List.")
b.AssertFileContent("public/libros/fiction/2023/book1/index.html", "Single.")
-
}
diff --git a/resources/page/permalinks_test.go b/resources/page/permalinks_test.go
index 194387d5c..a3a45bb88 100644
--- a/resources/page/permalinks_test.go
+++ b/resources/page/permalinks_test.go
@@ -1,4 +1,4 @@
-// Copyright 2023 The Hugo Authors. All rights reserved.
+// Copyright 2024 The Hugo Authors. All rights reserved.
//
// Licensed under the Apache License, Version 2.0 (the "License");
// you may not use this file except in compliance with the License.
@@ -202,7 +202,6 @@ func TestPermalinkExpansionSliceSyntax(t *testing.T) {
c.Assert(fn1("[:last]"), qt.DeepEquals, []string{})
c.Assert(fn1("[1:last]"), qt.DeepEquals, []string{})
c.Assert(fn1("[1]"), qt.DeepEquals, []string{})
-
})
c.Run("Out of bounds", func(c *qt.C) {
@@ -218,9 +217,7 @@ func TestPermalinkExpansionSliceSyntax(t *testing.T) {
c.Assert(fn4("[]"), qt.IsNil)
c.Assert(fn4("[1:}"), qt.IsNil)
c.Assert(fn4("foo"), qt.IsNil)
-
})
-
}
func BenchmarkPermalinkExpand(b *testing.B) {
diff --git a/resources/page/site.go b/resources/page/site.go
index 0480ce674..9ef76505d 100644
--- a/resources/page/site.go
+++ b/resources/page/site.go
@@ -21,7 +21,6 @@ import (
"github.com/gohugoio/hugo/config/privacy"
"github.com/gohugoio/hugo/config/services"
"github.com/gohugoio/hugo/identity"
- "github.com/gohugoio/hugo/tpl"
"github.com/gohugoio/hugo/config"
@@ -88,8 +87,12 @@ type Site interface {
Taxonomies() TaxonomyList
// Returns the last modification date of the content.
+ // Deprecated: Use .Lastmod instead.
LastChange() time.Time
+ // Returns the last modification date of the content.
+ Lastmod() time.Time
+
// Returns the Menus for this site.
Menus() navigation.Menus
@@ -108,10 +111,6 @@ type Site interface {
// Returns the site config.
Config() SiteConfig
- // Returns the identity of this site.
- // This is for internal use only.
- GetIdentity() identity.Identity
-
// Author is deprecated and will be removed in a future release.
Author() map[string]interface{}
@@ -127,9 +126,6 @@ type Site interface {
// Deprecated: Use Config().Privacy.Disqus instead.
DisqusShortname() string
- // For internal use only.
- GetPageWithTemplateInfo(info tpl.Info, ref ...string) (Page, error)
-
// BuildDrafts is deprecated and will be removed in a future release.
BuildDrafts() bool
@@ -154,6 +150,9 @@ func (s Sites) First() Site {
return s[0]
}
+// Some additional interfaces implemented by siteWrapper that are not on Site.
+var _ identity.ForEeachIdentityByNameProvider = (*siteWrapper)(nil)
+
type siteWrapper struct {
s Site
}
@@ -165,6 +164,10 @@ func WrapSite(s Site) Site {
return &siteWrapper{s: s}
}
+func (s *siteWrapper) Key() string {
+ return s.s.Language().Lang
+}
+
func (s *siteWrapper) Social() map[string]string {
return s.s.Social()
}
@@ -260,7 +263,11 @@ func (s *siteWrapper) Taxonomies() TaxonomyList {
}
func (s *siteWrapper) LastChange() time.Time {
- return s.s.LastChange()
+ return s.s.Lastmod()
+}
+
+func (s *siteWrapper) Lastmod() time.Time {
+ return s.s.Lastmod()
}
func (s *siteWrapper) Menus() navigation.Menus {
@@ -283,14 +290,6 @@ func (s *siteWrapper) Data() map[string]any {
return s.s.Data()
}
-func (s *siteWrapper) GetIdentity() identity.Identity {
- return s.s.GetIdentity()
-}
-
-func (s *siteWrapper) GetPageWithTemplateInfo(info tpl.Info, ref ...string) (Page, error) {
- return s.s.GetPageWithTemplateInfo(info, ref...)
-}
-
func (s *siteWrapper) BuildDrafts() bool {
return s.s.BuildDrafts()
}
@@ -312,6 +311,11 @@ func (s *siteWrapper) RSSLink() template.URL {
return s.s.RSSLink()
}
+// For internal use only.
+func (s *siteWrapper) ForEeachIdentityByName(name string, f func(identity.Identity) bool) {
+ s.s.(identity.ForEeachIdentityByNameProvider).ForEeachIdentityByName(name, f)
+}
+
type testSite struct {
h hugo.HugoInfo
l *langs.Language
@@ -341,6 +345,10 @@ func (testSite) LastChange() (t time.Time) {
return
}
+func (testSite) Lastmod() (t time.Time) {
+ return
+}
+
func (t testSite) Title() string {
return "foo"
}
@@ -386,10 +394,6 @@ func (t testSite) MainSections() []string {
return nil
}
-func (t testSite) GetIdentity() identity.Identity {
- return identity.KeyValueIdentity{Key: "site", Value: t.l.Lang}
-}
-
// Deprecated: use hugo.IsServer instead
func (t testSite) IsServer() bool {
return false
@@ -439,10 +443,6 @@ func (s testSite) Config() SiteConfig {
return SiteConfig{}
}
-func (testSite) GetPageWithTemplateInfo(info tpl.Info, ref ...string) (Page, error) {
- return nil, nil
-}
-
// Deprecated: Use .Site.Config.Services.Disqus.Shortname instead
func (testSite) DisqusShortname() string {
return ""
diff --git a/resources/page/siteidentities/identities.go b/resources/page/siteidentities/identities.go
new file mode 100644
index 000000000..8481999cf
--- /dev/null
+++ b/resources/page/siteidentities/identities.go
@@ -0,0 +1,34 @@
+// Copyright 2024 The Hugo Authors. All rights reserved.
+//
+// Licensed under the Apache License, Version 2.0 (the "License");
+// you may not use this file except in compliance with the License.
+// You may obtain a copy of the License at
+// http://www.apache.org/licenses/LICENSE-2.0
+//
+// Unless required by applicable law or agreed to in writing, software
+// distributed under the License is distributed on an "AS IS" BASIS,
+// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+// See the License for the specific language governing permissions and
+// limitations under the License.
+
+package siteidentities
+
+import (
+ "github.com/gohugoio/hugo/identity"
+)
+
+const (
+ // Identifies site.Data.
+ // The change detection in /data is currently very coarse-grained.
+ Data = identity.StringIdentity("site.Data")
+)
+
+// FromString returns the identity from the given string,
+// or identity.Anonymous if not found.
+func FromString(name string) (identity.Identity, bool) {
+ switch name {
+ case "Data":
+ return Data, true
+ }
+ return identity.Anonymous, false
+}
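
A minimal usage sketch of the new siteidentities package; "Taxonomies" below is just an arbitrary example of a name without a mapping:

package main

import (
	"fmt"

	"github.com/gohugoio/hugo/resources/page/siteidentities"
)

func main() {
	if id, ok := siteidentities.FromString("Data"); ok {
		fmt.Println(id) // the site.Data identity
	}
	if _, ok := siteidentities.FromString("Taxonomies"); !ok {
		fmt.Println("no identity mapped; FromString returned identity.Anonymous")
	}
}
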
diff --git a/resources/page/taxonomy.go b/resources/page/taxonomy.go
index 3aa0c7a7b..66c9e6fae 100644
--- a/resources/page/taxonomy.go
+++ b/resources/page/taxonomy.go
@@ -1,4 +1,4 @@
-// Copyright 2023 The Hugo Authors. All rights reserved.
+// Copyright 2024 The Hugo Authors. All rights reserved.
//
// Licensed under the Apache License, Version 2.0 (the "License");
// you may not use this file except in compliance with the License.
diff --git a/resources/page/testhelpers_page_test.go b/resources/page/testhelpers_page_test.go
deleted file mode 100644
index 95124cb58..000000000
--- a/resources/page/testhelpers_page_test.go
+++ /dev/null
@@ -1,38 +0,0 @@
-// Copyright 2023 The Hugo Authors. All rights reserved.
-//
-// Licensed under the Apache License, Version 2.0 (the "License");
-// you may not use this file except in compliance with the License.
-// You may obtain a copy of the License at
-// http://www.apache.org/licenses/LICENSE-2.0
-//
-// Unless required by applicable law or agreed to in writing, software
-// distributed under the License is distributed on an "AS IS" BASIS,
-// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
-// See the License for the specific language governing permissions and
-// limitations under the License.
-
-package page_test
-
-import (
- "github.com/gohugoio/hugo/common/loggers"
- "github.com/gohugoio/hugo/config"
- "github.com/gohugoio/hugo/config/testconfig"
- "github.com/gohugoio/hugo/helpers"
- "github.com/gohugoio/hugo/hugofs"
- "github.com/spf13/afero"
-)
-
-func newTestPathSpec() *helpers.PathSpec {
- return newTestPathSpecFor(config.New())
-}
-
-func newTestPathSpecFor(cfg config.Provider) *helpers.PathSpec {
- mfs := afero.NewMemMapFs()
- conf := testconfig.GetTestConfig(mfs, cfg)
- fs := hugofs.NewFrom(mfs, conf.BaseConfig())
- ps, err := helpers.NewPathSpec(fs, conf, loggers.NewDefault())
- if err != nil {
- panic(err)
- }
- return ps
-}
diff --git a/resources/page/testhelpers_test.go b/resources/page/testhelpers_test.go
index ca2c4ff53..e80ed422d 100644
--- a/resources/page/testhelpers_test.go
+++ b/resources/page/testhelpers_test.go
@@ -1,4 +1,4 @@
-// Copyright 2023 The Hugo Authors. All rights reserved.
+// Copyright 2024 The Hugo Authors. All rights reserved.
//
// Licensed under the Apache License, Version 2.0 (the "License");
// you may not use this file except in compliance with the License.
@@ -21,10 +21,7 @@ import (
"path/filepath"
"time"
- "github.com/gohugoio/hugo/hugofs/files"
- "github.com/gohugoio/hugo/identity"
"github.com/gohugoio/hugo/markup/tableofcontents"
- "github.com/gohugoio/hugo/tpl"
"github.com/gohugoio/hugo/resources/resource"
@@ -32,6 +29,7 @@ import (
"github.com/gohugoio/hugo/common/hugo"
"github.com/gohugoio/hugo/common/maps"
+ "github.com/gohugoio/hugo/common/paths"
"github.com/gohugoio/hugo/config"
"github.com/gohugoio/hugo/hugofs"
"github.com/gohugoio/hugo/langs"
@@ -54,7 +52,7 @@ func newTestPage() *testPage {
func newTestPageWithFile(filename string) *testPage {
filename = filepath.FromSlash(filename)
- file := source.NewTestFile(filename)
+ file := source.NewFileInfoFrom(filename, filename)
l, err := langs.NewLanguage(
"en",
@@ -107,7 +105,7 @@ type testPage struct {
params map[string]any
data map[string]any
- file source.File
+ file *source.File
currentSection *testPage
sectionEntries []string
@@ -141,7 +139,7 @@ func (p *testPage) BaseFileName() string {
panic("testpage: not implemented")
}
-func (p *testPage) BundleType() files.ContentClass {
+func (p *testPage) BundleType() string {
panic("testpage: not implemented")
}
@@ -201,7 +199,7 @@ func (p *testPage) Extension() string {
panic("testpage: not implemented")
}
-func (p *testPage) File() source.File {
+func (p *testPage) File() *source.File {
return p.file
}
@@ -225,10 +223,6 @@ func (p *testPage) GetPage(ref string) (Page, error) {
panic("testpage: not implemented")
}
-func (p *testPage) GetPageWithTemplateInfo(info tpl.Info, ref string) (Page, error) {
- panic("testpage: not implemented")
-}
-
func (p *testPage) GetParam(key string) any {
panic("testpage: not implemented")
}
@@ -261,15 +255,15 @@ func (p *testPage) Hugo() hugo.HugoInfo {
panic("testpage: not implemented")
}
-func (p *testPage) InSection(other any) (bool, error) {
+func (p *testPage) InSection(other any) bool {
panic("testpage: not implemented")
}
-func (p *testPage) IsAncestor(other any) (bool, error) {
+func (p *testPage) IsAncestor(other any) bool {
panic("testpage: not implemented")
}
-func (p *testPage) IsDescendant(other any) (bool, error) {
+func (p *testPage) IsDescendant(other any) bool {
panic("testpage: not implemented")
}
@@ -301,6 +295,10 @@ func (p *testPage) IsTranslated() bool {
panic("testpage: not implemented")
}
+func (p *testPage) Ancestors() Pages {
+ panic("testpage: not implemented")
+}
+
func (p *testPage) Keywords() []string {
return nil
}
@@ -415,16 +413,12 @@ func (p *testPage) Parent() Page {
panic("testpage: not implemented")
}
-func (p *testPage) Ancestors() Pages {
- panic("testpage: not implemented")
-}
-
func (p *testPage) Path() string {
return p.path
}
-func (p *testPage) Pathc() string {
- return p.path
+func (p *testPage) PathInfo() *paths.Path {
+ panic("testpage: not implemented")
}
func (p *testPage) Permalink() string {
@@ -604,10 +598,6 @@ func (p *testPage) WordCount(context.Context) int {
panic("testpage: not implemented")
}
-func (p *testPage) GetIdentity() identity.Identity {
- panic("testpage: not implemented")
-}
-
func createTestPages(num int) Pages {
pages := make(Pages, num)
diff --git a/resources/page/zero_file.autogen.go b/resources/page/zero_file.autogen.go
index 72d98998e..4b7c034a1 100644
--- a/resources/page/zero_file.autogen.go
+++ b/resources/page/zero_file.autogen.go
@@ -14,75 +14,3 @@
// This file is autogenerated.
package page
-
-import (
- "github.com/gohugoio/hugo/common/loggers"
- "github.com/gohugoio/hugo/hugofs"
- "github.com/gohugoio/hugo/source"
-)
-
-// ZeroFile represents a zero value of source.File with warnings if invoked.
-type zeroFile struct {
- log loggers.Logger
-}
-
-func NewZeroFile(log loggers.Logger) source.File {
- return zeroFile{log: log}
-}
-
-func (zeroFile) IsZero() bool {
- return true
-}
-
-func (z zeroFile) Path() (o0 string) {
- z.log.Warnln(".File.Path on zero object. Wrap it in if or with: {{ with .File }}{{ .Path }}{{ end }}")
- return
-}
-func (z zeroFile) Section() (o0 string) {
- z.log.Warnln(".File.Section on zero object. Wrap it in if or with: {{ with .File }}{{ .Section }}{{ end }}")
- return
-}
-func (z zeroFile) Lang() (o0 string) {
- z.log.Warnln(".File.Lang on zero object. Wrap it in if or with: {{ with .File }}{{ .Lang }}{{ end }}")
- return
-}
-func (z zeroFile) Filename() (o0 string) {
- z.log.Warnln(".File.Filename on zero object. Wrap it in if or with: {{ with .File }}{{ .Filename }}{{ end }}")
- return
-}
-func (z zeroFile) Dir() (o0 string) {
- z.log.Warnln(".File.Dir on zero object. Wrap it in if or with: {{ with .File }}{{ .Dir }}{{ end }}")
- return
-}
-func (z zeroFile) Extension() (o0 string) {
- z.log.Warnln(".File.Extension on zero object. Wrap it in if or with: {{ with .File }}{{ .Extension }}{{ end }}")
- return
-}
-func (z zeroFile) Ext() (o0 string) {
- z.log.Warnln(".File.Ext on zero object. Wrap it in if or with: {{ with .File }}{{ .Ext }}{{ end }}")
- return
-}
-func (z zeroFile) LogicalName() (o0 string) {
- z.log.Warnln(".File.LogicalName on zero object. Wrap it in if or with: {{ with .File }}{{ .LogicalName }}{{ end }}")
- return
-}
-func (z zeroFile) BaseFileName() (o0 string) {
- z.log.Warnln(".File.BaseFileName on zero object. Wrap it in if or with: {{ with .File }}{{ .BaseFileName }}{{ end }}")
- return
-}
-func (z zeroFile) TranslationBaseName() (o0 string) {
- z.log.Warnln(".File.TranslationBaseName on zero object. Wrap it in if or with: {{ with .File }}{{ .TranslationBaseName }}{{ end }}")
- return
-}
-func (z zeroFile) ContentBaseName() (o0 string) {
- z.log.Warnln(".File.ContentBaseName on zero object. Wrap it in if or with: {{ with .File }}{{ .ContentBaseName }}{{ end }}")
- return
-}
-func (z zeroFile) UniqueID() (o0 string) {
- z.log.Warnln(".File.UniqueID on zero object. Wrap it in if or with: {{ with .File }}{{ .UniqueID }}{{ end }}")
- return
-}
-func (z zeroFile) FileInfo() (o0 hugofs.FileMetaInfo) {
- z.log.Warnln(".File.FileInfo on zero object. Wrap it in if or with: {{ with .File }}{{ .FileInfo }}{{ end }}")
- return
-}