search package - github.com/blevesearch/bleve/v2/search


const (
	MaxGeoBufPoolSize = 24 * 1024
	MinGeoBufPoolSize = 24
)

MaxGeoBufPoolSize and MinGeoBufPoolSize set the largest buffer in the pool to 24KB and the smallest to 24 bytes. The pools are used to read sequences of vertices, which are always 24 bytes each.
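To illustrate how such size bounds might be used, here is a minimal sketch of a tiered buffer scheme that doubles from the minimum size up to the maximum. The doubling scheme and the `bufferFor` function are assumptions for illustration, not bleve's actual pool implementation; only the two size constants come from the package.

```go
package main

import "fmt"

// Sizes taken from the package constants; the doubling scheme
// below is an illustrative assumption, not bleve's exact code.
const (
	MaxGeoBufPoolSize = 24 * 1024
	MinGeoBufPoolSize = 24
)

// bufferFor returns a buffer sized by doubling from the minimum
// up to the maximum. Requests larger than the maximum get the
// maximum-sized buffer (a caller would then read in chunks).
func bufferFor(n int) []byte {
	size := MinGeoBufPoolSize
	for size < n && size < MaxGeoBufPoolSize {
		size *= 2
	}
	return make([]byte, size)
}

func main() {
	// A single vertex is 24 bytes, so the smallest buffer fits one.
	fmt.Println(len(bufferFor(24)))  // 24
	fmt.Println(len(bufferFor(100))) // 192, the next doubled size
}
```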


const (
	KnnPreSearchDataKey     = "_knn_pre_search_data_key"
	SynonymPreSearchDataKey = "_synonym_pre_search_data_key"
	BM25PreSearchDataKey    = "_bm25_pre_search_data_key"
)

The *PreSearchDataKey constants are used to store data gathered during the presearch phase, which is then used in the actual search phase.

BM25 specific multipliers which control the scoring of a document.

BM25_b controls the extent to which a document's field length normalizes the term-frequency part of the score. BM25_k1 controls the saturation of the score due to term frequency. The default values match Elasticsearch's implementation.
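To make the effect of the two multipliers concrete, here is a sketch of the standard BM25 term-score formula with the Elasticsearch defaults (k1 = 1.2, b = 0.75). This is textbook BM25, not bleve's exact scoring code; the function name and parameters are illustrative.

```go
package main

import (
	"fmt"
	"math"
)

// Defaults matching Elasticsearch, as the package comment notes.
const (
	k1 = 1.2  // term-frequency saturation
	b  = 0.75 // field-length normalization
)

// bm25TermScore computes the BM25 contribution of a single term:
// idf * tf*(k1+1) / (tf + k1*(1 - b + b*fieldLen/avgFieldLen)).
// Raising b penalizes long fields more; raising k1 delays the
// point at which repeated terms stop adding to the score.
func bm25TermScore(tf, fieldLen, avgFieldLen, docCount, docFreq float64) float64 {
	idf := math.Log(1 + (docCount-docFreq+0.5)/(docFreq+0.5))
	norm := k1 * (1 - b + b*fieldLen/avgFieldLen)
	return idf * (tf * (k1 + 1)) / (tf + norm)
}

func main() {
	fmt.Println(bm25TermScore(3, 120, 100, 1000, 50))
}
```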


var MakeDocumentMatchHandlerKey = MakeDocumentMatchHandlerKeyType(
	"MakeDocumentMatchHandlerKey")


var MakeKNNDocumentMatchHandlerKey = MakeDocumentMatchHandlerKeyType(
	"MakeKNNDocumentMatchHandlerKey")

LevenshteinDistanceMax is the same as LevenshteinDistance, but attempts to bail out early once it knows the distance will be greater than max. In that case the first return value will be max, and the second will be true, indicating max was exceeded.

func (ap ArrayPositions) Compare(other ArrayPositions) int
type BM25Stats struct {
	DocCount         float64        `json:"doc_count"`
	FieldCardinality map[string]int `json:"field_cardinality"`
}
func (p BytesSlice) Swap(i, j int)

ContextKey is the type used for keys that this package stores in a context.Context.

const (
	SearchIncrementalCostKey ContextKey = "_search_incremental_cost_key"
	QueryTypeKey             ContextKey = "_query_type_key"
	FuzzyMatchPhraseKey      ContextKey = "_fuzzy_match_phrase_key"
	IncludeScoreBreakdownKey ContextKey = "_include_score_breakdown_key"

	
	
	PreSearchKey ContextKey = "_presearch_key"

	
	
	GetScoringModelCallbackKey ContextKey = "_get_scoring_model"

	
	SearchIOStatsCallbackKey ContextKey = "_search_io_stats_callback_key"

	
	GeoBufferPoolCallbackKey ContextKey = "_geo_buffer_pool_callback_key"

	
	
	
	
	
	
	
	SearchTypeKey ContextKey = "_search_type_key"

	
	
	SearcherStartCallbackKey ContextKey = "_searcher_start_callback_key"
	SearcherEndCallbackKey   ContextKey = "_searcher_end_callback_key"

	
	
	FieldTermSynonymMapKey ContextKey = "_field_term_synonym_map_key"

	
	
	BM25StatsKey ContextKey = "_bm25_stats_key"

	
	
	ScoreFusionKey ContextKey = "_fusion_rescoring_key"
)
type DateRangeFacet struct {
	Name  string  `json:"name"`
	Start *string `json:"start,omitempty"`
	End   *string `json:"end,omitempty"`
	Count int     `json:"count"`
}
type DateRangeFacets []*DateRangeFacet
func (drf DateRangeFacets) Add(dateRangeFacet *DateRangeFacet) DateRangeFacets
func (drf DateRangeFacets) Swap(i, j int)
type DocumentMatch struct {
	Index           string                `json:"index,omitempty"`
	ID              string                `json:"id"`
	IndexInternalID index.IndexInternalID `json:"-"`
	Score           float64               `json:"score"`
	Expl            *Explanation          `json:"explanation,omitempty"`
	Locations       FieldTermLocationMap  `json:"locations,omitempty"`
	Fragments       FieldFragmentMap      `json:"fragments,omitempty"`
	Sort            []string              `json:"sort,omitempty"`
	DecodedSort     []string              `json:"decoded_sort,omitempty"`

	
	
	
	Fields map[string]interface{} `json:"fields,omitempty"`

	
	HitNumber uint64 `json:"-"`

	
	
	
	
	FieldTermLocations []FieldTermLocation `json:"-"`

	
	
	
	
	
	
	ScoreBreakdown map[int]float64 `json:"score_breakdown,omitempty"`

	
	
	
	
	
	
	IndexNames []string `json:"index_names,omitempty"`
}
func (dm *DocumentMatch) AddFieldValue(name string, value interface{})
func (dm *DocumentMatch) Complete(prealloc []Location) []Location

Complete performs final preparation & transformation of the DocumentMatch at the end of search processing, also allowing the caller to provide an optional preallocated locations slice

func (dm *DocumentMatch) Reset() *DocumentMatch

Reset allows an already allocated DocumentMatch to be reused

func (dm *DocumentMatch) Size() int
type DocumentMatchCollection []*DocumentMatch

type DocumentMatchHandler func(hit *DocumentMatch) error

DocumentMatchHandler is the type of document match callback bleve will invoke during the search. Eventually, bleve will indicate the completion of an ongoing search by passing a nil value to the document match callback. The application should take a copy of the hit/DocumentMatch if it wishes to own it or needs prolonged access to it.
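The copy requirement above is the key point: bleve may reuse the *DocumentMatch after the callback returns. The sketch below models it with a minimal local DocumentMatch (only ID and Score) and a hypothetical newCollectingHandler helper; both are stand-ins for illustration, not part of the package API.

```go
package main

import "fmt"

// Minimal stand-in for the package type, so the sketch is
// self-contained; only ID and Score are modeled here.
type DocumentMatch struct {
	ID    string
	Score float64
}

type DocumentMatchHandler func(hit *DocumentMatch) error

// newCollectingHandler returns a handler that copies each hit,
// since bleve may reuse the *DocumentMatch after the callback
// returns. A nil hit signals that the search is complete.
func newCollectingHandler(out *[]DocumentMatch, done *bool) DocumentMatchHandler {
	return func(hit *DocumentMatch) error {
		if hit == nil {
			*done = true
			return nil
		}
		*out = append(*out, *hit) // copy; do not retain the pointer
		return nil
	}
}

func main() {
	var hits []DocumentMatch
	var done bool
	h := newCollectingHandler(&hits, &done)

	shared := &DocumentMatch{ID: "a", Score: 1.5}
	h(shared)
	shared.ID = "b" // simulates bleve reusing the instance
	h(shared)
	h(nil) // end of search

	fmt.Println(len(hits), hits[0].ID, hits[1].ID, done) // 2 a b true
}
```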

type DocumentMatchPool struct {
	TooSmall DocumentMatchPoolTooSmall
	
}

DocumentMatchPool manages the use/reuse of DocumentMatch instances. It pre-allocates space from a single large block, sized for the expected number of instances. It is not thread-safe, as currently all aspects of search take place in a single goroutine.

func NewDocumentMatchPool(size, sortsize int) *DocumentMatchPool

NewDocumentMatchPool will build a DocumentMatchPool with memory pre-allocated to accommodate the requested number of DocumentMatch instances

func (p *DocumentMatchPool) Get() *DocumentMatch

Get returns an available DocumentMatch from the pool. If the pool was not allocated with sufficient size, an allocation will occur to satisfy this request; as a side effect, this will grow the size of the pool.

func (p *DocumentMatchPool) Put(d *DocumentMatch)

Put returns a DocumentMatch to the pool

type DocumentMatchPoolTooSmall func(p *DocumentMatchPool) *DocumentMatch

DocumentMatchPoolTooSmall is a callback function that is executed when the DocumentMatchPool does not have sufficient capacity. By default we just perform just-in-time allocation, but you could log a message, panic, etc.
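A simplified model of the pool and its TooSmall hook is sketched below. The real pool pre-allocates from a single contiguous block; this sketch uses a plain slice of pointers, and the internal fields are assumptions made only so the example runs.

```go
package main

import "fmt"

// Minimal stand-in; only the Sort slice is modeled.
type DocumentMatch struct {
	Sort []string
}

type DocumentMatchPoolTooSmall func(p *DocumentMatchPool) *DocumentMatch

// DocumentMatchPool sketch: shows when TooSmall fires, not how
// bleve lays out its single pre-allocated block.
type DocumentMatchPool struct {
	avail    []*DocumentMatch
	TooSmall DocumentMatchPoolTooSmall
}

func NewDocumentMatchPool(size, sortSize int) *DocumentMatchPool {
	p := &DocumentMatchPool{
		// Just-in-time allocation is the default TooSmall behavior.
		TooSmall: func(p *DocumentMatchPool) *DocumentMatch {
			return &DocumentMatch{Sort: make([]string, 0, sortSize)}
		},
	}
	for i := 0; i < size; i++ {
		p.avail = append(p.avail, &DocumentMatch{Sort: make([]string, 0, sortSize)})
	}
	return p
}

// Get hands out a pooled instance, falling back to TooSmall
// when the pool is exhausted.
func (p *DocumentMatchPool) Get() *DocumentMatch {
	if len(p.avail) == 0 {
		return p.TooSmall(p)
	}
	d := p.avail[len(p.avail)-1]
	p.avail = p.avail[:len(p.avail)-1]
	return d
}

// Put returns an instance for reuse.
func (p *DocumentMatchPool) Put(d *DocumentMatch) {
	p.avail = append(p.avail, d)
}

func main() {
	pool := NewDocumentMatchPool(1, 4)
	a := pool.Get() // from the pool
	b := pool.Get() // pool exhausted: TooSmall allocates
	pool.Put(a)
	pool.Put(b)
	fmt.Println(a != nil, b != nil) // true true
}
```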

type Explanation struct {
	Value        float64        `json:"value"`
	Message      string         `json:"message"`
	PartialMatch bool           `json:"partial_match,omitempty"`
	Children     []*Explanation `json:"children,omitempty"`
}
func (expl *Explanation) Size() int
type FacetBuilder interface {
	StartDoc()
	UpdateVisitor(term []byte)
	EndDoc()

	Result() *FacetResult
	Field() string

	Size() int
}
type FacetResult struct {
	Field         string             `json:"field"`
	Total         int                `json:"total"`
	Missing       int                `json:"missing"`
	Other         int                `json:"other"`
	Terms         *TermFacets        `json:"terms,omitempty"`
	NumericRanges NumericRangeFacets `json:"numeric_ranges,omitempty"`
	DateRanges    DateRangeFacets    `json:"date_ranges,omitempty"`
}
func (fr *FacetResult) Fixup(size int)
func (fr *FacetResult) Merge(other *FacetResult)
func (fr *FacetResult) Size() int
func (fr FacetResults) Merge(other FacetResults)
type FacetsBuilder struct {
	
}
func (fb *FacetsBuilder) Add(name string, facetBuilder FacetBuilder)
func (fb *FacetsBuilder) EndDoc()
func (fb *FacetsBuilder) RequiredFields() []string
func (fb *FacetsBuilder) Results() FacetResults
func (fb *FacetsBuilder) Size() int
func (fb *FacetsBuilder) StartDoc()
type FieldTermLocation struct {
	Field    string
	Term     string
	Location Location
}
func MergeFieldTermLocations(dest []FieldTermLocation, matches []*DocumentMatch) []FieldTermLocation
func MergeLocations(locations []FieldTermLocationMap) FieldTermLocationMap

FieldTermSynonymMap maps field -> term -> synonyms.

func (f FieldTermSynonymMap) MergeWith(fts FieldTermSynonymMap)
type GetScoringModelCallbackFn func() string
type Location struct {
	
	Pos uint64 `json:"pos"`

	
	Start uint64 `json:"start"`
	End   uint64 `json:"end"`

	
	ArrayPositions ArrayPositions `json:"array_positions"`
}
func (l *Location) Size() int
type Locations []*Location
func (p Locations) Dedupe() Locations
func (p Locations) Swap(i, j int)

type MakeDocumentMatchHandler func(ctx *SearchContext) (
	callback DocumentMatchHandler, loadID bool, err error)

MakeDocumentMatchHandler is an optional DocumentMatchHandler builder function which applications can pass to bleve. The builder gives a DocumentMatchHandler function to bleve, which it will invoke on every document match.

type NumericRangeFacet struct {
	Name  string   `json:"name"`
	Min   *float64 `json:"min,omitempty"`
	Max   *float64 `json:"max,omitempty"`
	Count int      `json:"count"`
}
type NumericRangeFacets []*NumericRangeFacet
func (nrf NumericRangeFacets) Add(numericRangeFacet *NumericRangeFacet) NumericRangeFacets
func (nrf NumericRangeFacets) Swap(i, j int)
type ScoreExplCorrectionCallbackFunc func(queryMatch *DocumentMatch, knnMatch *DocumentMatch) (float64, *Explanation)

SearchContext represents the context around a single search

func (sc *SearchContext) Size() int
type SearchIOStatsCallbackFunc func(uint64)

An implementation of SearchIncrementalCostCallbackFn should handle the following messages:

  • add: increment the cost of a search operation (which can be specific to a query type as well)
  • abort: the query was aborted, e.g. due to cancellation of the search's context; this can be handled differently as well
  • done: indicates that the search is complete and the tracked cost can be handled safely by the implementation.
type SearchIncrementalCostCallbackMsg uint
type SearchQueryType uint
func ParseSearchSortObj(input map[string]interface{}) (SearchSort, error)
func ParseSearchSortString(input string) SearchSort
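Sort strings conventionally use a leading "-" for descending order and the special names "_score" and "_id" for score- and ID-based sorts, with any other value treated as a stored field name. The parser below is a sketch of that convention (the sortSpec type is invented for the example); confirm the exact grammar against ParseSearchSortString itself.

```go
package main

import (
	"fmt"
	"strings"
)

// sortSpec is an illustrative result type, not a package type.
type sortSpec struct {
	By         string // "score", "id", or "field"
	Field      string
	Descending bool
}

// parseSortString sketches the sort-string convention: leading
// "-" means descending; "_score" and "_id" are special names.
func parseSortString(in string) sortSpec {
	s := sortSpec{}
	if strings.HasPrefix(in, "-") {
		s.Descending = true
		in = in[1:]
	}
	switch in {
	case "_score":
		s.By = "score"
	case "_id":
		s.By = "id"
	default:
		s.By = "field"
		s.Field = in
	}
	return s
}

func main() {
	fmt.Println(parseSortString("-_score")) // descending score sort
	fmt.Println(parseSortString("name"))    // ascending field sort
}
```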
type SearcherOptions struct {
	Explain            bool
	IncludeTermVectors bool
	Score              string
}
type SortDocID struct {
	Desc bool
}

SortDocID will sort results by the document identifier

func (s *SortDocID) Copy() SearchSort
func (s *SortDocID) Descending() bool

Descending determines the order of the sort

func (s *SortDocID) RequiresDocID() bool

RequiresDocID says this SearchSort does require the DocID be loaded

func (s *SortDocID) RequiresFields() []string

RequiresFields says this SearchSort does not require any stored fields

func (s *SortDocID) RequiresScoring() bool

RequiresScoring says this SearchSort does not require scoring

func (s *SortDocID) Reverse()

UpdateVisitor is a no-op for SortDocID as its value is not dependent on any field terms

Value returns the sort value of the DocumentMatch

SortField will sort results by the value of a stored field

Field is the name of the field
Descending reverses the sort order (default false)
Type allows forcing of string/number/date behavior (default auto)
Mode controls behavior for multi-valued fields (default first)
Missing controls behavior for missing values (default last)
func (s *SortField) Copy() SearchSort
func (s *SortField) Descending() bool

Descending determines the order of the sort

func (s *SortField) RequiresDocID() bool

RequiresDocID says this SearchSort does not require the DocID be loaded

func (s *SortField) RequiresFields() []string

RequiresFields says this SearchSort requires the specified stored field

func (s *SortField) RequiresScoring() bool

RequiresScoring says this SearchSort does not require scoring

func (s *SortField) Reverse()

UpdateVisitor notifies this sort field that in this document this field has the specified term

Value returns the sort value of the DocumentMatch. It also resets the state of this SortField for processing the next document.

type SortFieldMissing int

SortFieldMissing controls where documents missing a field value should be sorted

const (
	
	SortFieldMissingLast SortFieldMissing = iota

	
	SortFieldMissingFirst
)

SortFieldMode describes the behavior if the field has multiple values

const (
	
	SortFieldDefault SortFieldMode = iota 
	
	SortFieldMin
	
	SortFieldMax
)

SortFieldType lets you control some internal sort behavior; normally, leaving this at the zero value, SortFieldAuto, is fine.

const (
	
	SortFieldAuto SortFieldType = iota
	
	SortFieldAsString
	
	SortFieldAsNumber
	
	SortFieldAsDate
)

SortGeoDistance will sort results by the distance of an indexed geo point, from the provided location.

Field is the name of the field
Descending reverses the sort order (default false)

NewSortGeoDistance creates SearchSort instance for sorting documents by their distance from the specified point.

func (s *SortGeoDistance) Copy() SearchSort
func (s *SortGeoDistance) Descending() bool

Descending determines the order of the sort

func (s *SortGeoDistance) RequiresDocID() bool

RequiresDocID says this SearchSort does not require the DocID be loaded

RequiresFields says this SearchSort requires the specified stored field

func (s *SortGeoDistance) RequiresScoring() bool

RequiresScoring says this SearchSort does not require scoring

func (s *SortGeoDistance) Reverse()

UpdateVisitor notifies this sort field that in this document this field has the specified term

Value returns the sort value of the DocumentMatch. It also resets the state of this SortGeoDistance for processing the next document.

type SortOrder []SearchSort
func ParseSortOrderStrings(in []string) SortOrder
func (so SortOrder) CacheDescending() []bool
func (so SortOrder) CacheIsScore() []bool
func (so SortOrder) Compare(cachedScoring, cachedDesc []bool, i, j *DocumentMatch) int

Compare will compare two document matches using the specified sort order. If both values are numbers, we avoid converting back to the term.

func (so SortOrder) Copy() SortOrder
func (so SortOrder) RequiredFields() []string
func (so SortOrder) RequiresDocID() bool
func (so SortOrder) RequiresScore() bool
func (so SortOrder) Reverse()
func (so SortOrder) Value(doc *DocumentMatch)
type SortScore struct {
	Desc bool
}

SortScore will sort results by the document match score

func (s *SortScore) Copy() SearchSort
func (s *SortScore) Descending() bool

Descending determines the order of the sort

func (s *SortScore) RequiresDocID() bool

RequiresDocID says this SearchSort does not require the DocID be loaded

func (s *SortScore) RequiresFields() []string

RequiresFields says this SearchSort does not require any stored fields

func (s *SortScore) RequiresScoring() bool

RequiresScoring says this SearchSort does require scoring

func (s *SortScore) Reverse()

UpdateVisitor is a no-op for SortScore as its value is not dependent on any field terms

Value returns the sort value of the DocumentMatch

type TermFacet struct {
	Term  string `json:"term"`
	Count int    `json:"count"`
}
type TermFacets struct {
	
}
func (tf *TermFacets) Add(termFacets ...*TermFacet)
func (tf *TermFacets) Len() int

TermFacets used to be a type alias for []*TermFacet. To maintain backwards compatibility, we have to implement custom JSON marshalling.

func (tf *TermFacets) Swap(i, j int)
func (tf *TermFacets) Terms() []*TermFacet
func (tf *TermFacets) TrimToTopN(n int)
func MergeTermLocationMaps(rv, other TermLocationMap) TermLocationMap
func (t TermLocationMap) AddLocation(term string, location *Location)