
Running local LLM (Ollama) in your nodejs project.

Nov 28, 2024, 06:45 PM

We all love AI. In recent years, the boom in artificial intelligence has changed the world and is taking it into a new era. There is an AI use case for nearly every problem: asking Gemini for a cooking recipe, ChatGPT for assignments, Claude for programming, v0 for front-end design. Developers and students depend on AI so heavily these days that a new AI-powered startup seems to emerge almost every day.


This leads aspiring developers like me to ask: how can I build something like this? The answer is simple: API calls to these models. But they are not cheap, and an unemployed student like me has no means to pay for a subscription. That led to the idea of running the AI locally and then serving it on a port for API calls. This article gives you a step-by-step guide on how to set up Ollama and access its LLMs from your Node.js code.

Installing Ollama

This step is for Windows users. If you are on another operating system, follow the official installation guide instead.

  • Head over to Ollama's website and download the installer.


  • Once downloaded, run the setup and install the application.


  • This installs the client on your machine. You can then head over to the library section of Ollama's official website and pick the model you want to use.


  • Here, I'll be using codellama:7b on my machine.
  • Open CMD or PowerShell and run ollama run codellama:7b. This downloads the model to your machine if it is not already present, then runs it.

Serving LLM on Port

  • Now you have Ollama on your system along with the required LLM, so the next step is to serve it on a port so your Node app can access it.
  • Before proceeding, close Ollama in the background and check whether its default port is free by running ollama serve; if this throws an error, the port is occupied.
  • You'll need to clear that port before proceeding. The default port for Ollama is 11434.
  • Use the following command to see which process is listening on that port: netstat -ano | findstr :11434
  • Note down the PID from the result and clear the port with: taskkill /PID <PID> /F
  • Once done, open a new CMD terminal and run ollama serve again.
  • You'll now see a startup message, which means your LLMs are accessible through API calls.
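With ollama serve running, the model is reachable over plain HTTP on the default port 11434. As a sketch of what those API calls look like (assuming Ollama's documented /api/chat endpoint and its default port), the request body can be built like this:

```typescript
// Sketch: build a request body for Ollama's /api/chat REST endpoint.
// Assumes `ollama serve` is running on the default port 11434.

interface ChatMessage {
    role: 'system' | 'user' | 'assistant';
    content: string;
}

function buildChatRequest(model: string, messages: ChatMessage[]) {
    return {
        model,
        messages,
        stream: false, // ask for a single JSON response instead of a stream
    };
}

// Example usage (requires a running `ollama serve`):
// const res = await fetch('http://localhost:11434/api/chat', {
//     method: 'POST',
//     body: JSON.stringify(buildChatRequest('codellama:7b', [
//         { role: 'user', content: 'What color is the sky?' },
//     ])),
// });
// const data = await res.json();
// console.log(data.message.content);
```

The Node code later in this article wraps exactly this kind of request via the ollama npm package, so you rarely need to call the endpoint by hand; it is just useful to know what travels over the port.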


Using the ollama npm package for request/response handling

  • Start your Node project with the following commands:
npm init -y
npm i typescript ollama
npx tsc --init
  • This scaffolds a project for you to start working in. First, head over to the tsconfig.json file, then uncomment and set these values:
"rootDir": "./src",
"outDir": "./dist",
  • Create a src folder and inside it create the index.ts file.
import ollama from 'ollama';

async function main() {
    const response = await ollama.chat({
        model: 'codellama:7b',
        messages: [
            {
                role: 'user',
                content: 'What color is the sky?',
            },
        ],
    });
    console.log(response.message.content);
}

main();

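Beyond the one-shot chat above, the ollama package can also stream the reply as it is generated by passing stream: true, which makes chat return an async iterable of parts. A minimal sketch of collecting those parts, assuming each one carries a message.content fragment as in the package's streaming response shape:

```typescript
// Sketch: collect streamed chat parts into one full reply string.
// Assumes each part has the shape { message: { content: string } },
// matching the ollama npm package's streaming responses.

interface StreamChunk {
    message: { content: string };
}

async function collectStream(chunks: AsyncIterable<StreamChunk>): Promise<string> {
    let full = '';
    for await (const chunk of chunks) {
        full += chunk.message.content; // append each partial fragment
    }
    return full;
}

// Usage with the package (requires a running `ollama serve`):
// const stream = await ollama.chat({
//     model: 'codellama:7b',
//     messages: [{ role: 'user', content: 'What color is the sky?' }],
//     stream: true,
// });
// const reply = await collectStream(stream);
```

Streaming is worth it for longer generations, since you can print each fragment as it arrives instead of waiting for the full response.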
  • Before running the code, edit the scripts section in package.json:
"scripts": {
    "dev": "tsc -b && node dist/index.js"
  },
  • This compiles the TypeScript code into JavaScript and then runs it.
  • Run the application with npm run dev in the terminal.


  • There you are: you can finally access your local LLM from Node.js.
  • You can read more about the ollama node package here.
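Since everything runs locally, calls fail outright if ollama serve is not up yet or is still loading the model. A small retry helper (a hypothetical utility of my own, not part of the ollama package) is one way to make the app more forgiving:

```typescript
// Hypothetical helper (not part of the ollama package): retry an async
// call a few times with a fixed delay, e.g. while `ollama serve` starts up.

async function withRetry<T>(
    fn: () => Promise<T>,
    attempts = 3,
    delayMs = 500,
): Promise<T> {
    let lastError: unknown;
    for (let i = 0; i < attempts; i++) {
        try {
            return await fn();
        } catch (err) {
            lastError = err; // remember the failure and wait before retrying
            await new Promise((resolve) => setTimeout(resolve, delayMs));
        }
    }
    throw lastError; // all attempts failed; surface the last error
}

// Usage sketch:
// const response = await withRetry(() =>
//     ollama.chat({
//         model: 'codellama:7b',
//         messages: [{ role: 'user', content: 'What color is the sky?' }],
//     }),
// );
```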

Thank you for reading. I hope this article helped you, and if it did, feel free to connect with me on my socials!

Linkedin | Github


