Issue OpenAIException

#1
by prenes - opened

💥 Error
Error in generating model output:

```
litellm.ContextWindowExceededError: litellm.BadRequestError: ContextWindowExceededError: OpenAIException - Error code: 400 - {'error': {'message': "This model's maximum context length is 128000 tokens. However, your messages resulted in 223850 tokens. Please reduce the length of the messages.", 'type': 'invalid_request_error', 'param': 'messages', 'code': 'context_length_exceeded'}}
```
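The error means the accumulated messages total ~223k tokens, well past the model's 128k context window, so the request is rejected before generation starts. A minimal sketch of one workaround is to drop the oldest non-system messages until the history fits a token budget. This uses a rough 4-characters-per-token heuristic for illustration only; a real fix would count tokens with the model's actual tokenizer (e.g. tiktoken for OpenAI models):

```python
def estimate_tokens(text: str) -> int:
    # Crude heuristic: roughly 4 characters per token for English text.
    # Replace with a real tokenizer count in production.
    return max(1, len(text) // 4)


def trim_messages(messages: list[dict], max_tokens: int) -> list[dict]:
    """Drop the oldest non-system messages until the estimated
    token count fits within max_tokens. System messages are kept."""
    system = [m for m in messages if m["role"] == "system"]
    rest = [m for m in messages if m["role"] != "system"]

    def total(msgs: list[dict]) -> int:
        return sum(estimate_tokens(m["content"]) for m in msgs)

    while rest and total(system + rest) > max_tokens:
        rest.pop(0)  # discard the oldest conversational turn
    return system + rest


msgs = [
    {"role": "system", "content": "You are helpful."},
    {"role": "user", "content": "x" * 8000},   # oversized old turn
    {"role": "user", "content": "short question"},
]
trimmed = trim_messages(msgs, max_tokens=1000)
```

Here the oversized old turn is dropped and only the system prompt plus the latest question are sent. litellm itself raises `ContextWindowExceededError` (as in the traceback above), so the trimming can also be done reactively in an `except` block and the call retried.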

prenes changed discussion title from Issue to Issue OpenAIException
