Graphrag integration #4612
base: main
Conversation
Hi @lspinheiro - this is exciting. It's marked as DRAFT in the subject line but not marked as such in the PR, so I'm marking it as a draft - please set it back by clicking Ready for review when you are ready.
Exciting to see this!! I love the tool idea. The tool itself can also be stateful and shared by multiple agents.
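A minimal sketch of that sharing pattern, assuming the `AssistantAgent` API from `autogen-agentchat` 0.4 and the `from_config` constructor used in the test script below; the endpoint, deployment name, key, and index path are placeholders:

```python
from autogen_agentchat.agents import AssistantAgent
from autogen_ext.models.openai import AzureOpenAIChatCompletionClient
from autogen_ext.tools.graphrag import GlobalDataConfig, GlobalSearchTool

model_client = AzureOpenAIChatCompletionClient(
    model="gpt-4o-mini",
    azure_endpoint="https://<resource-name>.openai.azure.com",
    azure_deployment="gpt-4o-mini",
    api_version="2024-08-01-preview",
    api_key="<api-key>",
)

# Build the tool once from an existing GraphRAG index output directory...
shared_tool = GlobalSearchTool.from_config(
    openai_client=model_client,
    data_config=GlobalDataConfig(input_dir="./ragtest/output"),
)

# ...and hand the same instance to several agents, so whatever state the tool
# holds (loaded index artifacts, context builders, caches) is shared by both.
researcher = AssistantAgent("researcher", model_client=model_client, tools=[shared_tool])
reviewer = AssistantAgent("reviewer", model_client=model_client, tools=[shared_tool])
```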
Thanks @ekzhu and @rysweet. This should be ready for review now. Still needs improvements as mentioned in the description, but the tools can be used. I used the following test script.

```python
import asyncio

from autogen_core import CancellationToken
from autogen_ext.models.openai import AzureOpenAIChatCompletionClient
from autogen_ext.tools.graphrag import (
    GlobalSearchTool,
    LocalSearchTool,
    GlobalDataConfig,
    LocalDataConfig,
    EmbeddingConfig,
)
from azure.identity import DefaultAzureCredential, get_bearer_token_provider


async def main():
    openai_client = AzureOpenAIChatCompletionClient(
        model="gpt-4o-mini",
        azure_endpoint="https://<resource-name>.openai.azure.com",
        azure_deployment="gpt-4o-mini",
        api_version="2024-08-01-preview",
        azure_ad_token_provider=get_bearer_token_provider(
            DefaultAzureCredential(), "https://cognitiveservices.azure.com/.default"
        ),
    )

    # Global search example
    global_config = GlobalDataConfig(
        input_dir="./autogen-test/ragtest/output"
    )
    global_tool = GlobalSearchTool.from_config(
        openai_client=openai_client,
        data_config=global_config
    )

    global_args = {
        "query": "What does the station-master says about Dr. Becher?"
    }
    global_result = await global_tool.run_json(global_args, CancellationToken())
    print("\nGlobal Search Result:")
    print(global_result)

    # Local search example
    local_config = LocalDataConfig(
        input_dir="./autogen-test/ragtest/output"
    )
    embedding_config = EmbeddingConfig(
        model="text-embedding-3-small",
        api_base="https://<resource-name>.openai.azure.com",
        deployment_name="text-embedding-3-small",
        api_version="2023-05-15",
        api_type="azure",
        azure_ad_token_provider=get_bearer_token_provider(
            DefaultAzureCredential(), "https://cognitiveservices.azure.com/.default"
        ),
        max_retries=10,
        request_timeout=180.0,
    )
    local_tool = LocalSearchTool.from_config(
        openai_client=openai_client,
        data_config=local_config,
        embedding_config=embedding_config
    )

    local_args = {
        "query": "What does the station-master says about Dr. Becher?"
    }
    local_result = await local_tool.run_json(local_args, CancellationToken())
    print("\nLocal Search Result:")
    print(local_result)


if __name__ == "__main__":
    asyncio.run(main())
```
@jackgerrits, I had to add
Thank you! More documentation would help me review this PR. I would like to be able to build the docs page on this PR and see the example.
Related #4438
@gagb, I added a sample with a readme and some docstrings that should help with the review.
```python
# Set up global search tool
global_tool = GlobalSearchTool.from_settings(settings_path="./settings.yaml")
```
Is there a minimum example for the `settings.yaml` that can work with a few text documents?
We would like a dev to be able to just copy and run the example in a script and have it work out of the box.
So let's also include the installation instructions:
pip install "autogen-agentchat==0.4.0.dev12" "autogen-ext[graphrag]==0.4.0.dev12"
And have a text file downloaded from an online URL, e.g., the README.md file in our repo, with the GraphRAG commands that build an index.
Then show an example of a YAML file for settings.
Then the Python script that runs the agent with the tool.
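For that last step, a minimal sketch of what the agent script could look like, assuming the index has already been built and `./settings.yaml` is the GraphRAG settings file; the agent and message APIs shown are from `autogen-agentchat` 0.4, and the agent name, question, and model are illustrative:

```python
import asyncio

from autogen_agentchat.agents import AssistantAgent
from autogen_agentchat.messages import TextMessage
from autogen_core import CancellationToken
from autogen_ext.models.openai import OpenAIChatCompletionClient
from autogen_ext.tools.graphrag import GlobalSearchTool


async def main() -> None:
    # Assumes GraphRAG indexing has already run and settings.yaml points at its output.
    global_tool = GlobalSearchTool.from_settings(settings_path="./settings.yaml")

    model_client = OpenAIChatCompletionClient(model="gpt-4o-mini")
    agent = AssistantAgent(
        name="graphrag_assistant",
        model_client=model_client,
        tools=[global_tool],
        system_message="Use the global search tool to answer questions about the indexed documents.",
    )

    response = await agent.on_messages(
        [TextMessage(content="What are the main themes in the documents?", source="user")],
        CancellationToken(),
    )
    print(response.chat_message.content)


if __name__ == "__main__":
    asyncio.run(main())
```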
> Is there a minimum example for the `settings.yaml` that can work with a few text documents?
@ekzhu I intentionally wanted to avoid bringing the graphrag setup steps over. The setup process is reasonably complex and requires initialization (which generates the settings), setting up the data, tuning prompts, and then indexing. My thought was that it would be best to direct users to the graphrag docs to understand those initial steps; once indexing is done, they can easily integrate with autogen through the tools.
The reasoning was that it is quite easy for things to go wrong in those steps, and I think users should look for solutions to problems they may encounter during setup in the graphrag repo and docs; adding steps that may not work out of the box may encourage them to ask for help in autogen's issues instead. What do you think? Maybe we can expand the note to make this clearer?
Why are these changes needed?
This PR adds initial integration between graphrag and autogen by exposing local and global search as tools that can be used in autogen-agentchat. To be followed up with a user guide/cookbook. I added no tests because the test data I used was fairly large and I'm not sure we have an established way to add tests for these more complex integrations, but there is a script in the comments that I used. The indexing needs to be done in graphrag first; the goal is to illustrate the e2e steps in a notebook. Would appreciate some initial feedback, hoping to gradually extend with more flexible configuration, integration of drift search, and examples.
Related issue number
Checks