Bedrock Deep Research is a Streamlit-based application that uses Amazon Bedrock, LangGraph, and the LangChain AWS libraries to automate article and report generation through AI-powered research, content writing, and image generation. It combines web research, structured content generation, and human feedback to produce comprehensive, well-researched articles with accompanying header images (generated by [Amazon Bedrock Nova Canvas](https://docs.aws.amazon.com/nova/latest/userguide/what-is-nova.html)). This repo is inspired by LangChain's [Deep Researcher](https://github.com/langchain-ai/open_deep_research/tree/main).

## Features

- **Automated Research**: Performs targeted web searches to gather relevant information
- **Interactive Feedback Loop**: Incorporates human feedback to refine article outlines
- **AI-Generated Imagery**: Produces relevant header images for visual appeal

### Repository Structure

```
bedrock_deep_research/
├── bedrock_deep_research.py # Main Streamlit application entry point
└── ...
```
## Workflow
The application follows a sequential workflow from topic input to final article generation, with feedback loops for refinement.

*(Workflow graph image)*

Key components of the graph (an illustrative wiring sketch follows this list):

1. **Initial Researcher**: Performs initial web searches to gather context
2. **Article Outline Generator**: Creates a structured outline using the research data
3. **Human Feedback Provider**: Incorporates human feedback on the outline
4. **Section Writer**: A subgraph that researches the web and writes the content for each section
5. **Compilation**: Combines all sections into a cohesive article
6. **Final Section Generation**: Generates the overview and the closing section based on the other sections

## Setup

The setup is meant to be used locally with [AWS authentication](https://docs.aws.amazon.com/cli/v1/userguide/cli-authentication-short-term.html), as well as within Amazon SageMaker, either in a [JupyterLab](https://docs.aws.amazon.com/sagemaker/latest/dg/studio-updated-jl.html) or a [Code Editor](https://docs.aws.amazon.com/sagemaker/latest/dg/code-editor.html) instance.

Note: The current setup assumes the `us-east-1` region (as defined in the `env.tmp` file).

## Prerequisites

- **Python 3.12** (to install, visit https://www.python.org/downloads/). Check your Python version using `python --version`. If your global Python isn't set to 3.12, follow the steps at https://python-poetry.org/docs/managing-environments/ to point Poetry at the right interpreter.
- **Poetry** for dependency management
- Make sure you have **enabled model access** via [AWS Bedrock access](https://docs.aws.amazon.com/bedrock/latest/userguide/model-access-modify.html) in the **`us-east-1`** region. The supported models are listed in the `SUPPORTED_MODELS` variable in `./bedrock_deep_research/config.py` (a quick sanity check is sketched after this list).
- A **Tavily API** key for web research capabilities
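
As an optional sanity check, the snippet below (a sketch, not a script shipped with this repo) uses boto3 to list the Bedrock foundation models offered in `us-east-1`, so you can compare them against `SUPPORTED_MODELS` and confirm your AWS credentials are picked up:

```python
# Optional sanity check: verify AWS credentials and list the Bedrock foundation
# models offered in us-east-1. Compare the output with SUPPORTED_MODELS in
# ./bedrock_deep_research/config.py. Note that model *access* must still be
# enabled in the Bedrock console even for models that appear in this list.
import boto3

bedrock = boto3.client("bedrock", region_name="us-east-1")

for summary in bedrock.list_foundation_models()["modelSummaries"]:
    print(summary["modelId"])
```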
### 1. Installation
```bash
# Clone the repository
git clone <repository-url>
cd <repository-directory>

# Install dependencies
poetry install
```
### 2. Create and set your Tavily API key

Go to https://app.tavily.com/home and create a free API key. Copy the API key and paste it into the `env.tmp` file.
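
If you want to confirm the key works before launching the app, here is a small optional sketch using the `tavily-python` client (the package and the `TAVILY_API_KEY` variable name are assumptions; check `env.tmp` for the exact variable name the app expects):

```python
# Optional: quick check that the Tavily key is valid (not part of the repo).
import os

from tavily import TavilyClient

client = TavilyClient(api_key=os.environ["TAVILY_API_KEY"])  # variable name assumed
response = client.search("Amazon Bedrock", max_results=3)

for result in response["results"]:
    print(result["title"], "-", result["url"])
```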

### 3. Set up the environment variables

```bash
cp env.tmp .env
```

Then run the application:

```bash
streamlit run bedrock_deep_research.py
```
## Using the Application

| Step | Description | Visual Reference |
|------|-------------|------------------|
| **1. Enter Article Details** | • Enter your article topic in the main input field<br>• Add specific writing guidelines in the text area provided<br>• Adjust search parameters using the configuration panel<br>• Click the "Generate Outline" button to start the process | |
| **2. Review and Refine the Outline** | • Review the AI-generated article outline<br>• Provide specific feedback in the feedback field to improve the structure<br>• Use the editing tools to make direct modifications if needed<br>• Click "Accept Outline" when you're satisfied with the structure | |
| **3. Generate the Complete Article** | • Review the fully researched and written article with its custom header image<br>• Use the formatting tools to make any final adjustments<br>• Click "Copy to Clipboard" to export your article<br>• Or select "New Article" to start the process again | |

### Configuration details

To run the application with debug logging:

```bash
export LOG_LEVEL=DEBUG
streamlit run bedrock_deep_research.py
```
### Contributing
Contributions are welcome! Please open an issue or submit a pull request if you have any improvements or bug fixes. Read CONTRIBUTING.md for more details.