STORM: How to Install and Use This Revolutionary Content Generation Tool
Friday, Jan 3, 2025 | 9 minute read
Transform your content creation journey with an innovative tool that efficiently organizes ideas, conducts research, and generates complete articles effortlessly. It's designed to boost creativity and save precious time!
"In the age of information overload, learning how to efficiently generate high-quality content has become a must for every creator."
In this rapidly accelerating wave of artificial intelligence development, content generation tools have sprouted up like mushrooms after rain, greatly easing our creative journeys. STORM stands out as a leader in this field, opening a brand-new window to unleash the magic of writing in a simpler, more efficient way. The tool organizes your ideas systematically, freeing you to be creative while saving a significant amount of precious time and energy!
1. What is STORM? Unveiling the Revolutionary Writing Tool
STORM is a cutting-edge system designed to generate structured, Wikipedia-like articles from scratch, offering exceptional information generation capabilities and automated writing features. It quickly conducts research via the internet, collects reference materials, organizes an outline, and ultimately transforms that outline into a complete article, dramatically increasing content creation efficiency!
STORM's core features include rapid information retrieval and integration, helping users produce systematic knowledge outputs. It is widely applied across fields such as education, research, and content creation, making it an excellent little helper!
2. The Sharp Tools of STORM: A Detailed Analysis of Unique Features
The writing process with STORM is divided into two main stages: a pre-writing stage and a writing stage. In the pre-writing stage, the system collects information through online research and generates an outline; in the writing stage, it produces a complete article based on that structured outline, making the entire process simple and efficient!
Another impressive product, Co-STORM, is an enhanced version of STORM that introduces the concept of collaborative research, allowing users to observe and guide conversations between different AI agents, which facilitates knowledge organization and understanding. Simply amazing!
Furthermore, STORM features a perspective-guided question generation mechanism, and this is not just talk! By researching the existing literature, it generates multi-faceted questions that enhance the richness, depth, and breadth of the content!
3. Why Do Developers Choose STORM? Unveiling Its Potential Value
STORM's efficient knowledge organization and deduction capabilities significantly enhance the quality of information processing and content creation, enabling developers to better structure their thoughts, especially in complex projects or research!
Compared to traditional content creation tools, STORM offers more advanced features that enable automated generation of structured knowledge content, going beyond mere editing and formatting. This can save users a considerable amount of time and effort!
Developers report that after adopting STORM, they saw significant improvements in both writing speed and content quality. Many users praise the accuracy and relevance of the content STORM generates.
With STORM, we are witnessing a transformation in information generation technology and a tremendous leap in creative efficiency, making the writing journey for every developer much easier and more effective!
4. Installing Knowledge Storm Directly
To get started with Knowledge Storm, installing the library is the first step! Simply run the following command using pip:
pip install knowledge-storm
- This command fetches the latest version of the library from the Python Package Index (PyPI) and installs it into your Python environment. Super easy!
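If you want to confirm the installation succeeded before moving on, a quick check like the following (a minimal sketch, not part of the library itself) tells you whether the package is importable:

```python
import importlib.util

def is_installed(package: str) -> bool:
    """Return True if `package` can be found by the import system."""
    return importlib.util.find_spec(package) is not None

if is_installed("knowledge_storm"):
    print("knowledge-storm is ready to use")
else:
    print("knowledge-storm not found; try: pip install knowledge-storm")
```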
Clone the Source Code for Customization
If you want to dive deeper into customizing Knowledge Storm, start by cloning the repository from GitHub and installing the necessary dependencies:
- Clone the repository:
git clone https://github.com/stanford-oval/storm.git
cd storm
- Running these commands copies the repository to your local machine and switches into the cloned directory. Give it a try!
- Install the required packages:
conda create -n storm python=3.11
conda activate storm
pip install -r requirements.txt
- The first command (`conda create`) creates a new conda environment named 'storm' using Python 3.11.
- The second command (`conda activate storm`) activates the newly created environment, getting everything ready.
- Finally, the last command installs all the Python packages listed in the `requirements.txt` file so you can run all of Knowledge Storm's features!
5. API Overview
Knowledge Storm supports various language model components and retrieval modules; here are some examples:
- Language Models: `OpenAIModel`, `AzureOpenAIModel`, `ClaudeModel`, etc.
- Retrieval Modules: `YouRM`, `BingSearch`, `VectorRM`, etc.
Example of STORM
Here's an example using the You.com search engine with the `STORMWikiRunner` class combined with an OpenAI model:
import os
from knowledge_storm import STORMWikiRunnerArguments, STORMWikiRunner, STORMWikiLMConfigs
from knowledge_storm.lm import OpenAIModel
from knowledge_storm.rm import YouRM
- The above import statements load the required classes and functions from the `knowledge_storm` library; `STORMWikiRunner`, `STORMWikiRunnerArguments`, and `STORMWikiLMConfigs` are the fundamental classes for building the runner. Let's see how to use them!
Next, configure the language model and retrieval settings:
lm_configs = STORMWikiLMConfigs() # Create configurations for the language model
openai_kwargs = {
'api_key': os.getenv("OPENAI_API_KEY"), # Retrieve OpenAI API key from environment variables
'temperature': 1.0, # Controls the randomness of the model's output; 1.0 is the API default
'top_p': 0.9, # Core sampling to control diversity; 0.9 means considering the top 90% of the probability mass
}
- Here, `STORMWikiLMConfigs()` initializes a new language model configuration object.
- The `openai_kwargs` dictionary holds parameters such as the API key, temperature, and top_p that directly influence the characteristics of the text the OpenAI model generates.
Now, initialize the language models:
gpt_35 = OpenAIModel(model='gpt-3.5-turbo', max_tokens=500, **openai_kwargs)
gpt_4 = OpenAIModel(model='gpt-4', max_tokens=3000, **openai_kwargs)
- This creates two instances of `OpenAIModel`, specifying the 'gpt-3.5-turbo' and 'gpt-4' models along with their maximum token limits. Let's get started!
Next, set the language models into our configuration:
lm_configs.set_conv_simulator_lm(gpt_35) # Set conversation simulator model
lm_configs.set_question_asker_lm(gpt_35) # Set question model
lm_configs.set_outline_gen_lm(gpt_4) # Set outline generation model
lm_configs.set_article_gen_lm(gpt_4) # Set article generation model
lm_configs.set_article_polish_lm(gpt_4) # Set article polishing model
- Each method call assigns a language model to its respective task in the pipeline, so each stage of generation uses a model suited to it.
Initialize the retrieval module and the runner:
engine_args = STORMWikiRunnerArguments(...) # Set the parameters for running the STORM engine
rm = YouRM(ydc_api_key=os.getenv('YDC_API_KEY'), k=engine_args.search_top_k) # Initialize the retrieval module
runner = STORMWikiRunner(engine_args, lm_configs, rm) # Create an instance of STORMWikiRunner
- `STORMWikiRunnerArguments` configures the operation of the STORM engine.
- The `YouRM` object is initialized with the You.com search API key and the number of top search results to retrieve.
- Finally, the engine parameters, language model configurations, and retrieval module are combined into a powerful `STORMWikiRunner` instance!
Now, let's call the `STORMWikiRunner`:
topic = input('Topic: ') # Get user-inputted topic
runner.run(
topic=topic,
do_research=True, # Research based on the topic
do_generate_outline=True, # Generate outline based on research results
do_generate_article=True, # Generate article using the outline
do_polish_article=True, # Polish the generated article
)
runner.post_run() # Handle follow-up tasks after running
runner.summary() # Display a summary of the execution process
- Here, we prompt the user for a topic, then call `runner.run()` with options for research, outline generation, article generation, and polishing. The `post_run()` method handles follow-up tasks after the run, and `summary()` displays a recap of the execution.
Example of Co-STORM
Co-STORM employs a collaborative approach to generate articles; here's an example integrating Bing search and OpenAI models:
import os
from knowledge_storm.collaborative_storm.engine import CollaborativeStormLMConfigs, RunnerArgument, CoStormRunner
from knowledge_storm.lm import OpenAIModel
from knowledge_storm.logging_wrapper import LoggingWrapper
from knowledge_storm.rm import BingSearch
- As usual, we import the relevant classes for collaborative storm operations; `CoStormRunner` drives the collaborative content generation, harnessing the language models and a search engine together. Ready to use!
Next, set up language model configurations:
lm_config = CollaborativeStormLMConfigs() # Initialize collaborative language model configurations
openai_kwargs = {
"api_key": os.getenv("OPENAI_API_KEY"), # Access OpenAI API key from environment variable
"api_provider": "openai", # Specify the API provider
"temperature": 1.0, # Set the creativity level of the model
"top_p": 0.9, # Strategy for core sampling
}
- Here, the standard configurations are adapted for collaborative execution, with an additional `api_provider` entry. Ready to run!
Initialize the language models for Q&A and discourse management:
question_answering_lm = OpenAIModel(model='gpt-4', max_tokens=1000, **openai_kwargs) # Q&A model
discourse_manage_lm = OpenAIModel(model='gpt-4', max_tokens=500, **openai_kwargs) # Discourse model
- We set up two different OpenAI models, one for question answering and one for managing the discourse.
Next, assign the models to the launcher settings:
lm_config.set_question_answering_lm(question_answering_lm) # Assign Q&A model in the configuration
lm_config.set_discourse_manage_lm(discourse_manage_lm) # Assign discourse management model in the configuration
# ... (more configurations)
- Each model is assigned to its role within the collaborative setup.
Prepare to gather user topic input and organize the actions:
topic = input('Topic: ') # Gather user-inputted topic
runner_argument = RunnerArgument(topic=topic, ...) # Structure the running arguments
logging_wrapper = LoggingWrapper(lm_config) # Enable logging functionality
bing_rm = BingSearch(bing_search_api_key=os.environ.get("BING_SEARCH_API_KEY"), k=runner_argument.retrieve_top_k) # Instantiate Bing search retrieval
costorm_runner = CoStormRunner(lm_config=lm_config, runner_argument=runner_argument, logging_wrapper=logging_wrapper, rm=bing_rm) # Create a Co-STORM running instance
- With all components assembled, everything is ready for the collaborative content journey!
Invoke the `CoStormRunner`:
costorm_runner.warm_start() # Prepare the runner for operation
conv_turn = costorm_runner.step() # Process dialogue step
costorm_runner.step(user_utterance="YOUR UTTERANCE HERE") # Input user utterance for dialogue processing
costorm_runner.knowledge_base.reorganize() # Update the knowledge base
article = costorm_runner.generate_report() # Generate the final article from the collaborative process
print(article) # Output the generated article
- Use `warm_start()` to prepare the runner, step through the dialogue (optionally passing a user utterance), reorganize the knowledge base, and finally generate and print the article in one go!
6. Quick Start Example Scripts
Want to experience the charm of STORM and Co-STORM? You can find various example scripts in the `examples` folder! First, ensure your API keys are set in the `secrets.toml` file:
OPENAI_API_KEY="your_openai_api_key"
YDC_API_KEY="your_youcom_api_key"
BING_SEARCH_API_KEY="your_bing_api_key"
Run STORM Using the GPT Model
Ready to execute STORM with the GPT model? Run the following command:
python examples/storm_examples/run_storm_wiki_gpt.py \
--output-dir $OUTPUT_DIR \
--retriever you \
--do-research \
--do-generate-outline \
--do-generate-article \
--do-polish-article
- Simply run this command to execute the `run_storm_wiki_gpt.py` script, with flags specifying which stages to run. Easy peasy!
Run Co-STORM Using the GPT Model
To run Co-STORM, follow these steps:
- Ensure you've updated the `secrets.toml` file with your personal API keys.
- Execute:
python examples/costorm_examples/run_costorm_gpt.py \
--output-dir $OUTPUT_DIR \
--retriever bing
- These steps use Bing for information retrieval and write the results to your output directory.
That's the magic of STORM! With it, your creative journey will be smoother than ever, so let's embark on this creative adventure together!