
🐛 PermissionDeniedError: Cloudflare Blocking API Access #1344

Open
hherpa opened this issue Nov 3, 2024 · 0 comments
Labels
Priority: Normal Minor issue impacting one or more users Type: Bug Something isn't working

Comments

@hherpa

hherpa commented Nov 3, 2024

I then moved on to investigating the Groq API and ran the example code from the documentation. The error, and the code that produced it, can be seen in this notebook.

Steps to Reproduce

Run the linked notebook.
OS: Windows 11

Relevant Logs/Tracebacks

-----------------------------------------------------------
PermissionDeniedError     Traceback (most recent call last)
Cell In[21], line 1
----> 1 response = llm.complete("Explain the importance of low latency LLMs")
      2 print(response)

File ~\AppData\Roaming\Python\Python311\site-packages\llama_index\core\instrumentation\dispatcher.py:311, in Dispatcher.span.<locals>.wrapper(func, instance, args, kwargs)
    308             _logger.debug(f"Failed to reset active_span_id: {e}")
    310 try:
--> 311     result = func(*args, **kwargs)
    312     if isinstance(result, asyncio.Future):
    313         # If the result is a Future, wrap it
    314         new_future = asyncio.ensure_future(result)

File ~\AppData\Roaming\Python\Python311\site-packages\llama_index\llms\openai_like\base.py:99, in OpenAILike.complete(self, prompt, formatted, **kwargs)
     96 if not formatted:
     97     prompt = self.completion_to_prompt(prompt)
---> 99 return super().complete(prompt, **kwargs)

File ~\AppData\Roaming\Python\Python311\site-packages\llama_index\core\instrumentation\dispatcher.py:311, in Dispatcher.span.<locals>.wrapper(func, instance, args, kwargs)
    308             _logger.debug(f"Failed to reset active_span_id: {e}")
    310 try:
--> 311     result = func(*args, **kwargs)
    312     if isinstance(result, asyncio.Future):
    313         # If the result is a Future, wrap it
    314         new_future = asyncio.ensure_future(result)

File ~\AppData\Roaming\Python\Python311\site-packages\llama_index\core\llms\callbacks.py:431, in llm_completion_callback.<locals>.wrap.<locals>.wrapped_llm_predict(_self, *args, **kwargs)
    422 event_id = callback_manager.on_event_start(
    423     CBEventType.LLM,
    424     payload={
   (...)
    428     },
    429 )
    430 try:
--> 431     f_return_val = f(_self, *args, **kwargs)
    432 except BaseException as e:
    433     callback_manager.on_event_end(
    434         CBEventType.LLM,
    435         payload={EventPayload.EXCEPTION: e},
    436         event_id=event_id,
    437     )

File ~\AppData\Roaming\Python\Python311\site-packages\llama_index\llms\openai\base.py:375, in OpenAI.complete(self, prompt, formatted, **kwargs)
    373 else:
    374     complete_fn = self._complete
--> 375 return complete_fn(prompt, **kwargs)

File ~\AppData\Roaming\Python\Python311\site-packages\llama_index\core\base\llms\generic_utils.py:173, in chat_to_completion_decorator.<locals>.wrapper(prompt, **kwargs)
    170 def wrapper(prompt: str, **kwargs: Any) -> CompletionResponse:
    171     # normalize input
    172     messages = prompt_to_messages(prompt)
--> 173     chat_response = func(messages, **kwargs)
    174     # normalize output
    175     return chat_response_to_completion_response(chat_response)

File ~\AppData\Roaming\Python\Python311\site-packages\llama_index\llms\openai\base.py:106, in llm_retry_decorator.<locals>.wrapper(self, *args, **kwargs)
     97     return f(self, *args, **kwargs)
     99 retry = create_retry_decorator(
    100     max_retries=max_retries,
    101     random_exponential=True,
   (...)
    104     max_seconds=20,
    105 )
--> 106 return retry(f)(self, *args, **kwargs)

File ~\AppData\Roaming\Python\Python311\site-packages\tenacity\__init__.py:289, in BaseRetrying.wraps.<locals>.wrapped_f(*args, **kw)
    287 @functools.wraps(f)
    288 def wrapped_f(*args: t.Any, **kw: t.Any) -> t.Any:
--> 289     return self(f, *args, **kw)

File ~\AppData\Roaming\Python\Python311\site-packages\tenacity\__init__.py:379, in Retrying.__call__(self, fn, *args, **kwargs)
    377 retry_state = RetryCallState(retry_object=self, fn=fn, args=args, kwargs=kwargs)
    378 while True:
--> 379     do = self.iter(retry_state=retry_state)
    380     if isinstance(do, DoAttempt):
    381         try:

File ~\AppData\Roaming\Python\Python311\site-packages\tenacity\__init__.py:314, in BaseRetrying.iter(self, retry_state)
    312 is_explicit_retry = fut.failed and isinstance(fut.exception(), TryAgain)
    313 if not (is_explicit_retry or self.retry(retry_state)):
--> 314     return fut.result()
    316 if self.after is not None:
    317     self.after(retry_state)

File C:\Program Files\Python311\Lib\concurrent\futures\_base.py:449, in Future.result(self, timeout)
    447     raise CancelledError()
    448 elif self._state == FINISHED:
--> 449     return self.__get_result()
    451 self._condition.wait(timeout)
    453 if self._state in [CANCELLED, CANCELLED_AND_NOTIFIED]:

File C:\Program Files\Python311\Lib\concurrent\futures\_base.py:401, in Future.__get_result(self)
    399 if self._exception:
    400     try:
--> 401         raise self._exception
    402     finally:
    403         # Break a reference cycle with the exception in self._exception
    404         self = None

File ~\AppData\Roaming\Python\Python311\site-packages\tenacity\__init__.py:382, in Retrying.__call__(self, fn, *args, **kwargs)
    380 if isinstance(do, DoAttempt):
    381     try:
--> 382         result = fn(*args, **kwargs)
    383     except BaseException:  # noqa: B902
    384         retry_state.set_exception(sys.exc_info())  # type: ignore[arg-type]

File ~\AppData\Roaming\Python\Python311\site-packages\llama_index\llms\openai\base.py:429, in OpenAI._chat(self, messages, **kwargs)
    426 message_dicts = to_openai_message_dicts(messages, model=self.model)
    428 if self.reuse_client:
--> 429     response = client.chat.completions.create(
    430         messages=message_dicts,
    431         stream=False,
    432         **self._get_model_kwargs(**kwargs),
    433     )
    434 else:
    435     with client:

File ~\AppData\Roaming\Python\Python311\site-packages\openai\_utils\_utils.py:274, in required_args.<locals>.inner.<locals>.wrapper(*args, **kwargs)
    272             msg = f"Missing required argument: {quote(missing[0])}"
    273     raise TypeError(msg)
--> 274 return func(*args, **kwargs)

File ~\AppData\Roaming\Python\Python311\site-packages\openai\resources\chat\completions.py:815, in Completions.create(self, messages, model, audio, frequency_penalty, function_call, functions, logit_bias, logprobs, max_completion_tokens, max_tokens, metadata, modalities, n, parallel_tool_calls, presence_penalty, response_format, seed, service_tier, stop, store, stream, stream_options, temperature, tool_choice, tools, top_logprobs, top_p, user, extra_headers, extra_query, extra_body, timeout)
    775 @required_args(["messages", "model"], ["messages", "model", "stream"])
    776 def create(
    777     self,
   (...)
    812     timeout: float | httpx.Timeout | None | NotGiven = NOT_GIVEN,
    813 ) -> ChatCompletion | Stream[ChatCompletionChunk]:
    814     validate_response_format(response_format)
--> 815     return self._post(
    816         "/chat/completions",
    817         body=maybe_transform(
    818             {
    819                 "messages": messages,
    820                 "model": model,
    821                 "audio": audio,
    822                 "frequency_penalty": frequency_penalty,
    823                 "function_call": function_call,
    824                 "functions": functions,
    825                 "logit_bias": logit_bias,
    826                 "logprobs": logprobs,
    827                 "max_completion_tokens": max_completion_tokens,
    828                 "max_tokens": max_tokens,
    829                 "metadata": metadata,
    830                 "modalities": modalities,
    831                 "n": n,
    832                 "parallel_tool_calls": parallel_tool_calls,
    833                 "presence_penalty": presence_penalty,
    834                 "response_format": response_format,
    835                 "seed": seed,
    836                 "service_tier": service_tier,
    837                 "stop": stop,
    838                 "store": store,
    839                 "stream": stream,
    840                 "stream_options": stream_options,
    841                 "temperature": temperature,
    842                 "tool_choice": tool_choice,
    843                 "tools": tools,
    844                 "top_logprobs": top_logprobs,
    845                 "top_p": top_p,
    846                 "user": user,
    847             },
    848             completion_create_params.CompletionCreateParams,
    849         ),
    850         options=make_request_options(
    851             extra_headers=extra_headers, extra_query=extra_query, extra_body=extra_body, timeout=timeout
    852         ),
    853         cast_to=ChatCompletion,
    854         stream=stream or False,
    855         stream_cls=Stream[ChatCompletionChunk],
    856     )

File ~\AppData\Roaming\Python\Python311\site-packages\openai\_base_client.py:1277, in SyncAPIClient.post(self, path, cast_to, body, options, files, stream, stream_cls)
   1263 def post(
   1264     self,
   1265     path: str,
   (...)
   1272     stream_cls: type[_StreamT] | None = None,
   1273 ) -> ResponseT | _StreamT:
   1274     opts = FinalRequestOptions.construct(
   1275         method="post", url=path, json_data=body, files=to_httpx_files(files), **options
   1276     )
-> 1277     return cast(ResponseT, self.request(cast_to, opts, stream=stream, stream_cls=stream_cls))

File ~\AppData\Roaming\Python\Python311\site-packages\openai\_base_client.py:954, in SyncAPIClient.request(self, cast_to, options, remaining_retries, stream, stream_cls)
    951 else:
    952     retries_taken = 0
--> 954 return self._request(
    955     cast_to=cast_to,
    956     options=options,
    957     stream=stream,
    958     stream_cls=stream_cls,
    959     retries_taken=retries_taken,
    960 )

File ~\AppData\Roaming\Python\Python311\site-packages\openai\_base_client.py:1058, in SyncAPIClient._request(self, cast_to, options, retries_taken, stream, stream_cls)
   1055         err.response.read()
   1057     log.debug("Re-raising status error")
-> 1058     raise self._make_status_error_from_response(err.response) from None
   1060 return self._process_response(
   1061     cast_to=cast_to,
   1062     options=options,
   (...)
   1066     retries_taken=retries_taken,
   1067 )

PermissionDeniedError: <!DOCTYPE html>
<!--[if lt IE 7]> <html class="no-js ie6 oldie" lang="en-US"> <![endif]-->
<!--[if IE 7]>    <html class="no-js ie7 oldie" lang="en-US"> <![endif]-->
<!--[if IE 8]>    <html class="no-js ie8 oldie" lang="en-US"> <![endif]-->
<!--[if gt IE 8]><!--> <html class="no-js" lang="en-US"> <!--<![endif]-->
<head>
<title>Attention Required! | Cloudflare</title>
<meta charset="UTF-8" />
<meta http-equiv="Content-Type" content="text/html; charset=UTF-8" />
<meta http-equiv="X-UA-Compatible" content="IE=Edge" />
<meta name="robots" content="noindex, nofollow" />
<meta name="viewport" content="width=device-width,initial-scale=1" />
<link rel="stylesheet" id="cf_styles-css" href="/cdn-cgi/styles/cf.errors.css" />
<!--[if lt IE 9]><link rel="stylesheet" id='cf_styles-ie-css' href="/cdn-cgi/styles/cf.errors.ie.css" /><![endif]-->
<style>body{margin:0;padding:0}</style>


<!--[if gte IE 10]><!-->
<script>
  if (!navigator.cookieEnabled) {
    window.addEventListener('DOMContentLoaded', function () {
      var cookieEl = document.getElementById('cookie-alert');
      cookieEl.style.display = 'block';
    })
  }
</script>
<!--<![endif]-->


</head>
<body>
  <div id="cf-wrapper">
    <div class="cf-alert cf-alert-error cf-cookie-error" id="cookie-alert" data-translate="enable_cookies">Please enable cookies.</div>
    <div id="cf-error-details" class="cf-error-details-wrapper">
      <div class="cf-wrapper cf-header cf-error-overview">
        <h1 data-translate="block_headline">Sorry, you have been blocked</h1>
        <h2 class="cf-subheadline"><span data-translate="unable_to_access">You are unable to access</span> groq.com</h2>
      </div><!-- /.header -->

      <div class="cf-section cf-highlight">
        <div class="cf-wrapper">
          <div class="cf-screenshot-container cf-screenshot-full">
            
              <span class="cf-no-screenshot error"></span>
            
          </div>
        </div>
      </div><!-- /.captcha-container -->

      <div class="cf-section cf-wrapper">
        <div class="cf-columns two">
          <div class="cf-column">
            <h2 data-translate="blocked_why_headline">Why have I been blocked?</h2>

            <p data-translate="blocked_why_detail">This website is using a security service to protect itself from online attacks. The action you just performed triggered the security solution. There are several actions that could trigger this block including submitting a certain word or phrase, a SQL command or malformed data.</p>
          </div>

          <div class="cf-column">
            <h2 data-translate="blocked_resolve_headline">What can I do to resolve this?</h2>

            <p data-translate="blocked_resolve_detail">You can email the site owner to let them know you were blocked. Please include what you were doing when this page came up and the Cloudflare Ray ID found at the bottom of this page.</p>
          </div>
        </div>
      </div><!-- /.section -->

      <div class="cf-error-footer cf-wrapper w-240 lg:w-full py-10 sm:py-4 sm:px-8 mx-auto text-center sm:text-left border-solid border-0 border-t border-gray-300">
  <p class="text-13">
    <span class="cf-footer-item sm:block sm:mb-1">Cloudflare Ray ID: <strong class="font-semibold">8dc8bf45ddde9d9d</strong></span>
    <span class="cf-footer-separator sm:hidden">&bull;</span>
    <span id="cf-footer-item-ip" class="cf-footer-item hidden sm:block sm:mb-1">
      Your IP:
      <button type="button" id="cf-footer-ip-reveal" class="cf-footer-ip-reveal-btn">Click to reveal</button>
      <span class="hidden" id="cf-footer-ip">213.151.4.51</span>
      <span class="cf-footer-separator sm:hidden">&bull;</span>
    </span>
    <span class="cf-footer-item sm:block sm:mb-1"><span>Performance &amp; security by</span> <a rel="noopener noreferrer" href="https://www.cloudflare.com/5xx-error-landing" id="brand_link" target="_blank">Cloudflare</a></span>
    
  </p>
  <script>(function(){function d(){var b=a.getElementById("cf-footer-item-ip"),c=a.getElementById("cf-footer-ip-reveal");b&&"classList"in b&&(b.classList.remove("hidden"),c.addEventListener("click",function(){c.classList.add("hidden");a.getElementById("cf-footer-ip").classList.remove("hidden")}))}var a=document;document.addEventListener&&a.addEventListener("DOMContentLoaded",d)})();</script>
</div><!-- /.error-footer -->


    </div><!-- /#cf-error-details -->
  </div><!-- /#cf-wrapper -->

  <script>
  window._cf_translation = {};
  
  
</script>

</body>
</html>
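The body of the `PermissionDeniedError` above is a Cloudflare "Sorry, you have been blocked" page, not a Groq API error payload, which suggests the request is being blocked at the edge (e.g. by IP or region) before it ever reaches the API. A minimal sketch for telling the two cases apart when handling the exception — the helper name `is_cloudflare_block` is hypothetical, not part of llama-index or the openai client:

```python
def is_cloudflare_block(error_body: str) -> bool:
    """Return True if an error body looks like a Cloudflare block page
    rather than a JSON error from the upstream API."""
    markers = (
        "Attention Required! | Cloudflare",
        "Sorry, you have been blocked",
        "cf-error-details",
    )
    return any(marker in error_body for marker in markers)


# Sketch of usage around the failing call (llm is the configured Groq LLM):
# try:
#     response = llm.complete("Explain the importance of low latency LLMs")
# except PermissionDeniedError as e:
#     if is_cloudflare_block(str(e)):
#         print("Blocked by Cloudflare before reaching the API "
#               "(check IP/region/VPN), not an API-key problem.")
#     else:
#         raise
```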
@hherpa hherpa added Priority: Normal Minor issue impacting one or more users Type: Bug Something isn't working labels Nov 3, 2024