Fractalize Prompt
Fractalize Prompt is a tool that uses a dynamic, fractal network of agents instead of a
DecryptPrompt
This fork of Bolt.new allows you to choose the LLM that you use for each prompt! Currently, you can use OpenAI, Anthropic, Ollama, OpenRouter, Gemini, or Groq models - and it is easily extended to use any other model supported by the Vercel AI SDK! See the instructions below for running this locally and extending it to include more models.
I've authored an e-book called **"The Art of ChatGPT Prompting: A Guide to
This fork of Bolt.new (oTToDev) allows you to choose the LLM that you use for each prompt! Currently, you can use OpenAI, Anthropic, Ollama, OpenRouter, Gemini, LMStudio, Mistral, xAI, HuggingFace, DeepSeek, or Groq models - and it is easily extended to use any other model supported by the Vercel AI SDK! See the instructions below for running this locally and extending it to include more models.
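Choosing the LLM per prompt comes down to a registry that maps a provider name to a client, so adding a model is one new entry. A minimal Python sketch of that idea — the registry and the stub handlers below are hypothetical illustrations, not oTToDev's actual code (which uses the Vercel AI SDK in TypeScript):

```python
# Hypothetical sketch of per-prompt provider selection (not oTToDev's real code).
from typing import Callable, Dict

# Each factory would normally wrap a real SDK client; string stubs stand in here.
PROVIDERS: Dict[str, Callable[[str], str]] = {
    "openai": lambda prompt: f"[openai] {prompt}",
    "anthropic": lambda prompt: f"[anthropic] {prompt}",
    "ollama": lambda prompt: f"[ollama] {prompt}",
    "groq": lambda prompt: f"[groq] {prompt}",
}

def run_prompt(provider: str, prompt: str) -> str:
    """Dispatch a single prompt to whichever provider was chosen for it."""
    try:
        handler = PROVIDERS[provider]
    except KeyError:
        raise ValueError(f"unknown provider: {provider!r}")
    return handler(prompt)

# Extending to a new model is one registry entry -- mirroring how the fork
# is "easily extended" to any model the Vercel AI SDK supports.
PROVIDERS["mistral"] = lambda prompt: f"[mistral] {prompt}"
```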
**Status:** Pre‑alpha | **Architecture:** Agentic Build Server Leveraging Open & Proprietary LLMs
A Python application with LLM integration via the GROQ API: a personal nutrition advisor.
This is the repository associated with the Master’s Degree Project, [Automated Reproducible Malware Analysis: A Standardized Testbed for Prompt-Driven LLMs](https://www.diva-portal.org/smash/get/diva2:1973561/FULLTEXT01.pdf) (120 ECTS) in Informatics with a Specialization in Privacy, Information and Cyber Security, created during the Spring term of 2025. The entire project is released under the Apache 2.0 license and is free to use in further research or other development. Citing the original degree project is not required, but citations are greatly appreciated if the material is used.
This project implements an autonomous AI agent that scans real-time market data via APIs, identifies high-momentum stocks from gainers lists, parses financial news using an LLM (LLaMA), and classifies opportunities using custom prompts. The agent outputs actionable investment signals and risk flags with full automation, serving as a front-end screener for forecasting pipelines. It is integrated with the Pushover app for on-the-fly mobile phone notifications.
This Streamlit web app lets you upload `.eml` email files and use Anthropic Claude (via AWS Bedrock) to:
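Before an uploaded file ever reaches Claude, the `.eml` bytes have to be parsed; Python's standard `email` module handles that. A minimal sketch of the parsing step, with the Bedrock call itself omitted (the `extract_email` helper and the sample message are illustrations, not the app's actual code):

```python
# Parse raw .eml bytes with the standard library; the extracted text is what
# would then be sent to Anthropic Claude via AWS Bedrock (call omitted here).
from email import message_from_bytes
from email.policy import default

def extract_email(raw: bytes) -> dict:
    """Return subject, sender, and plain-text body from raw .eml bytes."""
    msg = message_from_bytes(raw, policy=default)
    body_part = msg.get_body(preferencelist=("plain",))
    body = body_part.get_content() if body_part else ""
    return {"subject": msg["Subject"], "from": msg["From"], "body": body}

# A tiny hand-built .eml sample for demonstration.
sample = (
    b"From: alice@example.com\r\n"
    b"To: bob@example.com\r\n"
    b"Subject: Quarterly report\r\n"
    b"Content-Type: text/plain\r\n\r\n"
    b"Numbers attached.\r\n"
)
parsed = extract_email(sample)
```

In a Streamlit app the raw bytes would come from `st.file_uploader(...).read()` instead of a literal.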
$ cargo install code2prompt