Farfalle

[Screenshot of Farfalle]

AI search engine - self-host with local or cloud LLMs

Overview:

Farfalle is an open-source, AI-powered search engine that can run local models or call cloud models to answer search queries. It supports multiple search providers and lets users ask questions of both cloud and local models, enhancing the search experience.

Features:

  • Multiple Search Providers: Use search capabilities from providers such as Tavily, SearXNG, Serper, and Bing.
  • Answer Questions with Models: Answer questions with cloud models such as OpenAI gpt-4o, OpenAI gpt-3.5-turbo, and Groq Llama 3, or with local models such as llama3, mistral, gemma, and phi3.
  • Custom LLM Support: Users can bring their own large language models (LLMs) through LiteLLM for tailored search experiences.
  • Agent-assisted Search: Conduct searches with an agent that plans and executes the search for more accurate results.

Getting Started Locally:

  1. Prerequisites:

    • Docker
    • Ollama (if using local models)
    • Download and start any supported local models: llama3, mistral, gemma, phi3
    • Start the ollama server: ollama serve
    • Get API keys for optional providers like Tavily, Serper, OpenAI, Bing, Groq
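If you are using local models, the Ollama steps above can be sketched as follows (llama3 is shown as an example; any of the supported models works the same way):

```shell
# Pull one of the supported local models (llama3 shown here)
ollama pull llama3

# Start the Ollama server so Farfalle can reach it
# (listens on port 11434 by default)
ollama serve
```

With the server running, Farfalle can route questions to the local model instead of a cloud provider, and no API keys are required.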
  2. Quick Start:

    • Modify the .env file with API keys (optional for Ollama)
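A minimal .env might look like the sketch below. The variable names here are illustrative assumptions based on the providers listed above; check the repository's own .env example for the authoritative names:

```shell
# Hypothetical .env sketch -- variable names are assumptions, not confirmed
OPENAI_API_KEY=sk-...        # optional: cloud models via OpenAI
TAVILY_API_KEY=tvly-...      # optional: Tavily as the search provider
# No API keys are needed when running fully locally with Ollama
```

Since the keys are optional for Ollama, a fully local setup can leave the file with no provider keys at all.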
Tech Stack:

  • Next.js: A React-based web framework that enables server-side rendering, static site generation, and other powerful features for building modern web applications.
  • React: A widely used JavaScript library for building user interfaces and single-page applications. It follows a component-based architecture and uses a virtual DOM to efficiently update and render UI components.
  • Tailwind CSS: A utility-first CSS framework that provides pre-defined classes for building responsive and customizable user interfaces.