
Network error #8

Closed
coderFrankenstain opened this issue Apr 14, 2023 · 30 comments

Comments

@coderFrankenstain

I've set up my Chatflow and saved it, but when I start talking to the bot it returns a Network error.

Apple M1
macOS 13.2.1

[screenshots attached]

@JsBeta

JsBeta commented Apr 14, 2023

Same here.
[screenshots attached]

@HenryHengZJ
Contributor

This is most likely due to a slow network, or the OpenAI/SERP requests returning a 500 response.

A few things to check:
1.) Does your OpenAI API key work correctly?
2.) Are you able to get a successful response from the OpenAI API endpoints: https://platform.openai.com/docs/api-reference/completions/create?
3.) Are you able to get a successful response from the SERP API endpoint?
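
A quick way to check 1.) and 2.) outside of Flowise is to call the chat completions endpoint directly from Node. This is a minimal sketch, not part of Flowise; it assumes Node 18+ (for the built-in fetch) and that OPENAI_API_KEY is set in your environment:

```js
// check-openai.js: verify the key and basic connectivity outside Flowise.
const body = {
  model: "gpt-3.5-turbo",
  messages: [{ role: "user", content: "ping" }],
};

fetch("https://api.openai.com/v1/chat/completions", {
  method: "POST",
  headers: {
    "Content-Type": "application/json",
    Authorization: `Bearer ${process.env.OPENAI_API_KEY}`,
  },
  body: JSON.stringify(body),
})
  .then(async (res) => {
    console.log("HTTP status:", res.status); // anything other than 200 points at the key or the network
    console.log(await res.json());
  })
  .catch((err) => console.error("Request failed:", err.message));
```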

@coderFrankenstain
Author

I used curl to test both OpenAI and SERP, and both of them work, but the agent still returns the network error.

[screenshots attached]

@HenryHengZJ
Contributor

@coderFrankenstain thanks for trying those curl commands. Does this occur only with Agents, or does it work for Chains?
Curious to see whether it works if you set up a separate Node repo that executes the same thing as the flow:
https://js.langchain.com/docs/modules/agents/agents/examples/llm_mrkl
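
For reference, a standalone version of that MRKL agent example looks roughly like this. It is a sketch based on the LangChain JS docs from that period; the exact import paths and agent options depend on your langchain version, and OPENAI_API_KEY / SERPAPI_API_KEY are assumed to be set in the environment:

```js
// mrkl-check.mjs: run the same kind of agent outside Flowise to isolate network issues.
import { OpenAI } from "langchain/llms/openai";
import { initializeAgentExecutorWithOptions } from "langchain/agents";
import { SerpAPI } from "langchain/tools";
import { Calculator } from "langchain/tools/calculator";

const model = new OpenAI({ temperature: 0 });
const tools = [new SerpAPI(process.env.SERPAPI_API_KEY), new Calculator()];

const executor = await initializeAgentExecutorWithOptions(tools, model, {
  agentType: "zero-shot-react-description",
  verbose: true,
});

const result = await executor.call({
  input: "What is 2 to the power of 10?",
});
console.log(result.output);
```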

@coderFrankenstain
Author

@HenryHengZJ I tried to run this demo (https://js.langchain.com/docs/modules/agents/agents/examples/llm_mrkl), and it returns an error like this:

[screenshot attached]

@wjhtinger

Any update on this issue? I'm running into the same problem.

@barom

barom commented Jun 2, 2023

@wjhtinger Running this from mainland China requires a proxy; you can use a tunnel proxy.
See: openai/openai-quickstart-node#79

@wjhtinger

@wjhtinger Running this from mainland China requires a proxy; you can use a tunnel proxy. See: openai/openai-quickstart-node#79

I installed it on Windows. Windows has a proxy configured and can access external sites; does that still not work?

@droc12

droc12 commented Jun 11, 2023

@wjhtinger Running this from mainland China requires a proxy; you can use a tunnel proxy. See: openai/openai-quickstart-node#79

Do you know which file specifically needs to be modified? Thank you very much.

@Bitnut

Bitnut commented Jun 27, 2023

@wjhtinger Running this from mainland China requires a proxy; you can use a tunnel proxy. See: openai/openai-quickstart-node#79

I installed it on Windows. Windows has a proxy configured and can access external sites; does that still not work?

Axios, which is used by Flowise, does not respect your local proxy configuration, so you have to modify the Flowise code to use your proxy.
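
For illustration, pointing axios at a proxy usually means creating the client with an explicit proxy agent. This is a minimal sketch rather than the actual Flowise code; the https-proxy-agent package (v7, named export) and the proxy URL are assumptions you would adapt to your setup:

```js
// Sketch: force an axios client through a local HTTP(S) proxy.
const axios = require("axios");
const { HttpsProxyAgent } = require("https-proxy-agent"); // assumes https-proxy-agent v7

const agent = new HttpsProxyAgent("http://127.0.0.1:7890"); // placeholder proxy URL

const client = axios.create({
  httpsAgent: agent,
  proxy: false, // disable axios' built-in proxy handling so the agent is used for HTTPS
});

// Example request routed through the proxy.
client
  .get("https://api.openai.com/v1/models", {
    headers: { Authorization: `Bearer ${process.env.OPENAI_API_KEY}` },
  })
  .then((res) => console.log(res.status))
  .catch((err) => console.error(err.message));
```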

@xShuisheng

@wjhtinger Running this from mainland China requires a proxy; you can use a tunnel proxy. See: openai/openai-quickstart-node#79

I installed it on Windows. Windows has a proxy configured and can access external sites; does that still not work?

Axios, which is used by Flowise, does not respect your local proxy configuration, so you have to modify the Flowise code to use your proxy.

Do you know which file's code should be modified to set the proxy?

@cuizheng0520

cuizheng0520 commented Jun 30, 2023 via email

I'd like to know that too.

@xShuisheng

I'd like to know that too.

I tried it, and it seems some earlier versions don't have this problem, so it's probably a bug in the code. You could try an older version.

@cuizheng0520

Which version doesn't have this problem? I've been struggling with it for days.

@xShuisheng

Which version doesn't have this problem? I've been struggling with it for days.

I tried the [email protected] version myself and it works fine. I haven't tried later versions; you can give it a try.

@carlye

carlye commented Jul 7, 2023

@wjhtinger Running this from mainland China requires a proxy; you can use a tunnel proxy. See: openai/openai-quickstart-node#79

I installed it on Windows. Windows has a proxy configured and can access external sites; does that still not work?

Axios, which is used by Flowise, does not respect your local proxy configuration, so you have to modify the Flowise code to use your proxy.

Do you know which file's code should be modified to set the proxy?

Did you find out which file needs to be modified? Whether in the Docker image or after yarn install, I couldn't find the corresponding openai.js file or where to make the change.

@carlye

carlye commented Jul 7, 2023

Which version doesn't have this problem? I've been struggling with it for days.

I tried the [email protected] version myself and it works fine. I haven't tried later versions; you can give it a try.

I also downloaded the Flowise 1.2.1 source. After yarn install, yarn build, and yarn start, requests still fail with a network error. I have a VPN/proxy, and accessing the OpenAI API directly works.

@carlye

carlye commented Jul 10, 2023

Found a solution; it's resolved now.

@LoganJinDev

Found a solution; it's resolved now.

Hi, could you explain how you solved it? Thanks. I tried adding a proxy to axios.create, but it had no effect.

@carlye

carlye commented Jul 18, 2023

Found a solution; it's resolved now.

Hi, could you explain how you solved it? Thanks. I tried adding a proxy to axios.create, but it had no effect.

When running from source, you need to modify base.js under node_modules/openai/dist/ after npm install:
export const BASE_PATH = "https://api.openai.com/v1".replace(/\/+$/, "");
Find this line and set BASE_PATH to your proxy/mirror address.

For the Docker setup, go into the Docker container and find the file at the same location under lib/node_modules. The change is the same.
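
For example, the edited line in node_modules/openai/dist/base.js would look roughly like this. Depending on the build you have, the line may read export const BASE_PATH or exports.BASE_PATH; the mirror URL below is a placeholder, not a recommendation:

```js
// node_modules/openai/dist/base.js (compiled OpenAI SDK)
// Before:
//   exports.BASE_PATH = "https://api.openai.com/v1".replace(/\/+$/, "");
// After (placeholder proxy/mirror endpoint, substitute your own):
exports.BASE_PATH = "https://your-openai-proxy.example.com/v1".replace(/\/+$/, "");
```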

@LoganJinDev

Found a solution; it's resolved now.

Hi, could you explain how you solved it? Thanks. I tried adding a proxy to axios.create, but it had no effect.

When running from source, you need to modify base.js under node_modules/openai/dist/ after npm install: find the line export const BASE_PATH = "https://api.openai.com/v1".replace(/\/+$/, ""); and set BASE_PATH to your proxy/mirror address.

For the Docker setup, go into the Docker container and find the file at the same location under lib/node_modules. The change is the same.

OK, thanks a lot.

@ionescofung

What do you mean by a proxy/mirror address?

@carlye

carlye commented Jul 24, 2023

What do you mean by a proxy/mirror address?

See this for reference:
https://github.com/chatanywhere/GPT_API_free

@kirad1231

Found a solution; it's resolved now.

Hi, could you explain how you solved it? Thanks. I tried adding a proxy to axios.create, but it had no effect.

When running from source, you need to modify base.js under node_modules/openai/dist/ after npm install: find the line export const BASE_PATH = "https://api.openai.com/v1".replace(/\/+$/, ""); and set BASE_PATH to your proxy/mirror address.

For the Docker setup, go into the Docker container and find the file at the same location under lib/node_modules. The change is the same.

Could you spell out the steps in a bit more detail?
I ran into two problems. First, I couldn't find exactly the same path or base.js file; mine is at \AppData\Roaming\npm\node_modules\flowise\node_modules\openai\dist,
and the BASE_PATH line looks slightly different. It reads:
const axios_1 = require("axios");
exports.BASE_PATH = "https://api.openai.com/v1".replace(/\/+$/, "");

Second, for the proxy/mirror address I tried replacing https://api.openai.com/v1 with https://api.chatanywhere.com.cn/v1,
but I get a 401 error.

So I'd like to know the exact steps and how the proxy/mirror address should be set.

@carlye

carlye commented Jul 26, 2023

Found a solution; it's resolved now.

Hi, could you explain how you solved it? Thanks. I tried adding a proxy to axios.create, but it had no effect.

When running from source, you need to modify base.js under node_modules/openai/dist/ after npm install: find the line export const BASE_PATH = "https://api.openai.com/v1".replace(/\/+$/, ""); and set BASE_PATH to your proxy/mirror address.
For the Docker setup, go into the Docker container and find the file at the same location under lib/node_modules. The change is the same.

Could you spell out the steps in a bit more detail? I ran into two problems. First, I couldn't find exactly the same path or base.js file; mine is at \AppData\Roaming\npm\node_modules\flowise\node_modules\openai\dist, and the BASE_PATH line looks slightly different. It reads: const axios_1 = require("axios"); exports.BASE_PATH = "https://api.openai.com/v1".replace(/\/+$/, "");

Second, for the proxy/mirror address I tried replacing https://api.openai.com/v1 with https://api.chatanywhere.com.cn/v1, but I get a 401 error.

So I'd like to know the exact steps and how the proxy/mirror address should be set.

I'm using Flowise 1.2.14, started with yarn start.
After yarn install, the node_modules directory is under the project directory.
[screenshot attached]

@FrenchGithubUser

I have been trying to figure out how to set a proxy with Flowise but couldn't get it to work.

From what I understand, Flowise does not directly make HTTP requests to the internet; another library does. In my test cases, it seemed to be either langchain or openai.

I tried to set a proxy on the global axios object (axios_1) in those libs, but it didn't seem to be taken into account.

Also, langchain seems to have an option to set a proxy in its Python SDK, which is not the case for its JS/TS lib.

If someone has more information on what to change, I would be very happy to make the necessary changes, as cleanly as possible, and open a PR.

@CoderYellow

langchain-ai/langchainjs#2339
It seems LangChain JS already supports an HTTP proxy; can we add an environment variable like http_proxy to support it?
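
To illustrate the idea (a sketch of the proposal only, not existing Flowise behaviour; the module name and the https-proxy-agent dependency are assumptions), the environment variable would be read once and turned into an agent that the HTTP clients can share:

```js
// proxy-agent.js: build a shared proxy agent from an http_proxy-style environment variable.
const { HttpsProxyAgent } = require("https-proxy-agent"); // assumes https-proxy-agent v7

const proxyUrl =
  process.env.https_proxy || process.env.HTTPS_PROXY ||
  process.env.http_proxy || process.env.HTTP_PROXY;

// undefined when no proxy is configured, so callers can pass it straight through.
const proxyAgent = proxyUrl ? new HttpsProxyAgent(proxyUrl) : undefined;

module.exports = { proxyAgent };
```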

@CoderYellow

CoderYellow commented Nov 5, 2023

@chz8494
Contributor

chz8494 commented Nov 5, 2023

I have successfully coded it to use a proxy. In my case, I had two major issues with my network.

  1. Our network allows all websites except OpenAI-related sites, meaning any chat-AI-related action needs to go through the proxy. So in all the Azure/chat-AI-related TS files, I just needed to add the proxy in the call.
  2. Other calls like SERP or HTTP GET work on my network without a proxy, but somehow the default fetch function used in langchain causes trouble: it picks up my local proxy setup from the environment, but the response cannot be resolved. I had to modify it to use node-fetch instead, and because langchain is a dependency, I had to change code in the module's dist, which is quite annoying.

So the first problem is easy to fix: we can add a proxy per chat/agent module. But I can't figure out a permanent fix for the second problem.
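
For reference, the node-fetch workaround described in point 2 amounts to something like the following. This is only a sketch of the idea, not the actual patch; it assumes node-fetch v2 (CommonJS) and https-proxy-agent v7, and the proxy URL is a placeholder:

```js
// proxied-fetch.js: a drop-in fetch that goes through an explicit proxy agent.
const fetch = require("node-fetch"); // node-fetch v2 (CommonJS build)
const { HttpsProxyAgent } = require("https-proxy-agent");

const agent = new HttpsProxyAgent(process.env.HTTPS_PROXY || "http://127.0.0.1:7890");

function proxiedFetch(url, options = {}) {
  // Route every request through the proxy unless the caller supplies its own agent.
  return fetch(url, { agent, ...options });
}

module.exports = { proxiedFetch };
```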

hemati pushed a commit to hemati/Flowise that referenced this issue Dec 27, 2023
@HenryHengZJ
Contributor

We have added a proxy parameter to OpenAI: #3153
