feat: support AWS bedrock base models #25
Conversation
Hey @nirga, quick question about integrating Stability AI models in our Hub. I'm looking at AWS Bedrock's Stable Diffusion 3.5 integration (from their model catalog), and I'm not sure about the best format to implement this:

Would appreciate your thoughts on this. Thanks!
I think it should be in a new API, @detunjiSamuel.
Hey @nirga, I have completed this PR and would appreciate your review.
Please let me know if any additional information or changes are needed. Thank you!
… into support_aws_bedrock_as_provider

# Conflicts:
#	Cargo.toml
#	src/providers/bedrock/logs/amazon_titan_embed_text_v2_0_embeddings.json
#	src/providers/bedrock/logs/anthropic_claude_3_haiku_20240307_v1_0_chat_completion.json
#	src/providers/bedrock/logs/us_amazon_nova_lite_v1_0_chat_completion.json
#	src/providers/bedrock/mod.rs
#	src/providers/bedrock/models.rs
#	src/providers/bedrock/test.rs
Hey @detunjiSamuel - I want to merge this PR, can you sign the CLA?
Ping @detunjiSamuel - can you sign the CLA?
AWS Bedrock Provider Integration
Added support for AWS Bedrock as a new LLM provider:
Key Changes
Added Bedrock provider implementation with model-specific handlers:
Testing Notes
All tests pass using AWS credentials in the us-east-1 and us-east-2 regions
Verified error handling for invalid credentials/models
Tested non-streaming responses (Bedrock models don't seem to expose streaming response types)
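To illustrate the non-streaming chat-completion path mentioned above, here is a minimal sketch of the JSON body a Bedrock `InvokeModel` call expects for an Anthropic Claude model. The field names (`anthropic_version`, `max_tokens`, `messages`) follow Anthropic's Messages API as used on Bedrock; this is an assumption about the request shape, not this PR's exact serialization code (a real handler would use serde_json rather than hand-built strings):

```rust
// Sketch: build a minimal non-streaming request body for an Anthropic
// Claude model on Bedrock. Field names are assumed from Anthropic's
// Messages API; the "anthropic_version" value is the Bedrock-specific one.
fn claude_chat_body(prompt: &str, max_tokens: u32) -> String {
    // Minimal escaping of quotes/backslashes; real code would use serde_json.
    let escaped = prompt.replace('\\', "\\\\").replace('"', "\\\"");
    format!(
        concat!(
            "{{\"anthropic_version\":\"bedrock-2023-05-31\",",
            "\"max_tokens\":{},",
            "\"messages\":[{{\"role\":\"user\",\"content\":\"{}\"}}]}}"
        ),
        max_tokens, escaped
    )
}

fn main() {
    let body = claude_chat_body("Hello", 256);
    // The body is sent as-is; Bedrock routes it to the model's native API.
    assert!(body.contains("\"anthropic_version\":\"bedrock-2023-05-31\""));
    println!("{body}");
}
```

Other model families (Titan embeddings, Nova) take different body shapes, which is why the provider needs model-specific handlers.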
Review notes
The model ID from the AWS link does not work consistently. Instead, use the inference profile ARN or the inference profile ID from the cross-region reference tab as your model_id.

Issue: #20
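The note above can be sketched as a small check. The prefixes and ARN shape below are assumptions based on AWS Bedrock's cross-region inference profile naming (a geo prefix such as `us.` prepended to the base model ID, or a full `inference-profile` ARN); they are not taken from this repo's code:

```rust
// Sketch: distinguish an inference profile ARN/ID from a base model ID.
// Assumption: cross-region profile IDs carry a geo prefix, e.g.
// "us.amazon.nova-lite-v1:0" versus the base "amazon.nova-lite-v1:0".
fn is_inference_profile(model_id: &str) -> bool {
    // Full inference profile ARN, e.g.
    // arn:aws:bedrock:us-east-1:123456789012:inference-profile/us.amazon.nova-lite-v1:0
    model_id.starts_with("arn:aws:bedrock:")
        || ["us.", "eu.", "apac."]
            .iter()
            .any(|p| model_id.starts_with(p))
}

fn main() {
    assert!(is_inference_profile("us.amazon.nova-lite-v1:0"));
    assert!(!is_inference_profile("amazon.nova-lite-v1:0"));
    println!("ok");
}
```

A check like this could surface a clearer error when a caller passes a plain catalog model ID that only works through an inference profile.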
/claim #20