
Building a Web Search Engine in Go with Elasticsearch

Nov 05, 2024, 10:33 AM

Web search engines are essential for indexing vast amounts of online information, making it accessible in milliseconds. In this project, I built a search engine in Go (Golang) named RelaxSearch. It combines web scraping, periodic data indexing, and search functionality by integrating with Elasticsearch—a powerful search and analytics engine. In this blog, I’ll walk you through the main components of RelaxSearch, the architecture, and how it efficiently scrapes and indexes data for fast, keyword-based search.

Overview of RelaxSearch

RelaxSearch is built around two primary modules:

  1. RelaxEngine: A web scraper powered by cron jobs, which periodically crawls specified websites, extracts content, and indexes it in Elasticsearch.
  2. RelaxWeb: A RESTful API server that allows users to search the indexed data, providing pagination, filtering, and content highlighting for user-friendly responses.

Project Motivation

Creating a search engine project from scratch is a great way to understand web scraping, data indexing, and efficient search techniques. I wanted to create a simple but functional search engine with fast data retrieval and easy extensibility, utilizing Go’s efficiency and Elasticsearch’s powerful indexing.

Key Features

  • Automated Crawling: Using cron jobs, RelaxEngine can run at regular intervals, scraping data and storing it in Elasticsearch.
  • Full-Text Search: RelaxWeb provides full-text search over the indexed content, making keyword-based retrieval fast.
  • REST API: Accessible through a RESTful API with parameters for pagination, date filtering, and content highlighting.
  • Data Storage: The indexed content is stored in Elasticsearch, allowing for scalable and highly responsive queries.

Architecture of RelaxSearch

1. RelaxEngine (Web Scraper and Indexer)

RelaxEngine is a web scraper written in Go that navigates web pages, extracting and storing content. It runs as a cron job, so it can operate at regular intervals (e.g., every 30 minutes) to keep the index updated with the latest web data. Here’s how it works:

  • Seed URL: RelaxEngine starts scraping from a specified seed URL and then follows links within the site up to a configurable depth.
  • Content Parsing: For each page, it extracts titles, descriptions, and keywords, constructing an informative dataset.
  • Indexing in Elasticsearch: The scraped content is indexed in Elasticsearch, ready for full-text search. Each page's data is stored with a unique identifier, title, description, and other metadata.
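
To make that concrete, below is a rough sketch of the per-page document model in Go. The struct name, field names, and the timestamp field are assumptions inferred from the code excerpts later in this post, not the project's exact schema.

// Hypothetical sketch of the indexed per-page document (names are assumptions).
package crawler

import "time"

type PageData struct {
    URL         string    `json:"url"`
    Title       string    `json:"title"`
    Description string    `json:"description"`
    Content     string    `json:"content"`
    Timestamp   time.Time `json:"timestamp"` // supports dateRangeStart/dateRangeEnd filtering
}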

2. RelaxWeb (Search API)

RelaxWeb provides a RESTful API endpoint, making it easy to query and retrieve data stored in Elasticsearch. The API accepts several parameters, such as keywords, pagination, and date filtering, returning relevant content in JSON format.

  • API Endpoint: /search
  • Query Parameters:
    • keyword: Main search term.
    • from and size: Pagination control.
    • dateRangeStart and dateRangeEnd: Filter results based on the timestamp of data.
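
Putting these parameters together, a request to RelaxWeb might look like the following (the host and port are placeholders for wherever the service is running):

GET http://localhost:8080/search?keyword=golang&from=0&size=10&dateRangeStart=2024-01-01&dateRangeEnd=2024-11-05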


Key Components and Code Snippets

Below are some important components and code excerpts from RelaxSearch to illustrate how it works.

Main Go Code for RelaxEngine

The core functionality is in the main.go file, where RelaxEngine initializes a scheduler using gocron to manage cron jobs, sets up the Elasticsearch client, and begins crawling from the seed URL.

package main

import (
    "time"

    "github.com/go-co-op/gocron"
    // Adjust these import paths to match the project's module layout.
    "github.com/Ravikisha/RelaxSearch/relaxengine/config"
    "github.com/Ravikisha/RelaxSearch/relaxengine/crawler"
)

func main() {
    cfg := config.LoadConfig()
    esClient := crawler.NewElasticsearchClient(cfg.ElasticsearchURL)
    c := crawler.NewCrawler(cfg.DepthLimit, 5)
    seedURL := "https://example.com/" // Replace with starting URL

    // Schedule a crawl every 30 minutes; StartBlocking keeps the scheduler running.
    s := gocron.NewScheduler(time.UTC)
    s.Every(30).Minutes().Do(func() {
        go c.StartCrawling(seedURL, 0, esClient)
    })
    s.StartBlocking()
}
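
The NewElasticsearchClient helper is not shown above. Assuming the widely used olivere/elastic client package (suggested by the *elastic.Client type in the crawler code below), a minimal sketch of that helper could look like this; treat the options as illustrative rather than the project's exact setup.

// Hypothetical sketch of the Elasticsearch client setup, assuming
// github.com/olivere/elastic/v7; adjust to the project's actual client code.
package crawler

import (
    "log"

    elastic "github.com/olivere/elastic/v7"
)

func NewElasticsearchClient(url string) *elastic.Client {
    client, err := elastic.NewClient(
        elastic.SetURL(url),     // Elasticsearch endpoint from the loaded configuration
        elastic.SetSniff(false), // sniffing is typically disabled for single-node/Docker setups
    )
    if err != nil {
        log.Fatalf("failed to create Elasticsearch client: %v", err)
    }
    return client
}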

Crawler and Indexing Logic

The crawler.go file handles web page requests, extracts content, and indexes it. Using the elastic package, each scraped page is stored in Elasticsearch.

// StartCrawling fetches a page, indexes its content, and recursively follows
// links until the configured depth limit is reached.
func (c *Crawler) StartCrawling(pageURL string, depth int, esClient *elastic.Client) {
    // Stop once the depth limit is exceeded or the page has already been crawled.
    if depth > c.DepthLimit || c.isVisited(pageURL) {
        return
    }
    c.markVisited(pageURL)

    links, title, content, description, err := c.fetchAndParsePage(pageURL)
    if err == nil {
        // Index the extracted fields in Elasticsearch.
        pageData := PageData{URL: pageURL, Title: title, Content: content, Description: description}
        IndexPageData(esClient, pageData)
    }

    // Follow outgoing links one level deeper.
    for _, link := range links {
        c.StartCrawling(link, depth+1, esClient)
    }
}
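
IndexPageData is called above but not shown in the excerpt. A minimal sketch, again assuming the olivere/elastic client and an index named "pages" (both assumptions), could look like this:

// Hypothetical sketch of indexing a crawled page; the index name and
// URL-as-ID scheme are illustrative, not the project's exact choices.
package crawler

import (
    "context"
    "log"

    elastic "github.com/olivere/elastic/v7"
)

func IndexPageData(esClient *elastic.Client, page PageData) {
    _, err := esClient.Index().
        Index("pages").           // target index (assumed name)
        Id(page.URL).             // use the page URL as a stable document ID
        BodyJson(page).           // marshal the PageData struct as the document body
        Do(context.Background())
    if err != nil {
        log.Printf("failed to index %s: %v", page.URL, err)
    }
}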

Search API Code in RelaxWeb

In the RelaxWeb service, an API endpoint provides full-text search capabilities. The /search endpoint receives requests, queries Elasticsearch, and returns relevant content based on keywords.

func searchHandler(w http.ResponseWriter, r *http.Request) {
    keyword := r.URL.Query().Get("keyword") // main search term
    results := queryElasticsearch(keyword)

    // Return the matching documents as JSON.
    w.Header().Set("Content-Type", "application/json")
    json.NewEncoder(w).Encode(results)
}
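
queryElasticsearch is also left out of the excerpt. As a simplified sketch of what it might do, the code below combines a multi-match query over the indexed fields with pagination and highlighting, assuming olivere/elastic, a "pages" index, a package-level client, and the field names from the earlier PageData sketch; the real implementation additionally wires in the from, size, and date-range parameters.

// Hypothetical sketch of the query behind /search; index name, client wiring,
// and field names are assumptions rather than the project's exact code.
package relaxweb

import (
    "context"
    "encoding/json"
    "log"

    elastic "github.com/olivere/elastic/v7"
)

var esClient *elastic.Client // assumed to be initialized at startup

// PageData mirrors the document model indexed by RelaxEngine (assumed shape).
type PageData struct {
    URL         string `json:"url"`
    Title       string `json:"title"`
    Description string `json:"description"`
    Content     string `json:"content"`
}

func queryElasticsearch(keyword string) []PageData {
    // Match the keyword against the indexed text fields. A date filter could be
    // added with elastic.NewBoolQuery and elastic.NewRangeQuery("timestamp").
    query := elastic.NewMultiMatchQuery(keyword, "title", "description", "content")

    searchResult, err := esClient.Search().
        Index("pages").   // assumed index name
        Query(query).
        From(0).Size(10). // pagination; in practice driven by the from/size parameters
        Highlight(elastic.NewHighlight().Field("content")). // highlight matched content
        Do(context.Background())
    if err != nil {
        log.Printf("search failed: %v", err)
        return nil
    }

    // Decode each hit's _source back into the PageData shape.
    var results []PageData
    for _, hit := range searchResult.Hits.Hits {
        var page PageData
        if err := json.Unmarshal(hit.Source, &page); err == nil {
            results = append(results, page)
        }
    }
    return results
}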

Setting Up RelaxSearch

  1. Clone the Repository
   git clone https://github.com/Ravikisha/RelaxSearch.git
   cd RelaxSearch
  2. Configuration

    Update .env files for both RelaxEngine and RelaxWeb with Elasticsearch credentials.

  3. Run with Docker

    RelaxSearch uses Docker for easy setup. Simply run:

   docker-compose up --build


Challenges and Improvements

  • Scalability: Elasticsearch scales well, but handling extensive scraping with numerous links requires optimizations for larger-scale deployments.
  • Robust Error Handling: Enhancing error handling and retry mechanisms would increase resilience.

Conclusion

RelaxSearch is an educational and practical demonstration of a basic search engine. While it is still a prototype, this project has been instrumental in understanding the fundamentals of web scraping, full-text search, and efficient data indexing with Go and Elasticsearch. It opens avenues for improvements and real-world application in scalable environments.

Explore the GitHub repository to try out RelaxSearch for yourself!

