
Deploying pages #2

Open
Ahmad-2213 opened this issue Dec 17, 2024 · 29 comments
Labels
stale This issue will be closed within 7 days.

Comments

@Ahmad-2213

Ahmad-2213 commented Dec 17, 2024

For Pages deployment, even though I see the "hello world" message, the link isn't working.
I have also bound a subdomain, and the configuration is generated as expected. However, when it is tested on Xray-based clients, it fails, while the script works flawlessly on Workers.

@Ahmad-2213 Ahmad-2213 changed the title pages Deploying pages Dec 17, 2024
@arkanpay

Same here. Deploying on Cloudflare Pages is desirable because it currently has no rate limit.

@vrnobody
Owner

vrnobody commented Dec 18, 2024

Unfortunately, I agree with #1. It seems that Pages does not support gRPC.

@Ahmad-2213
Author

Ahmad-2213 commented Dec 18, 2024

Unfortunately, I agree with #1. It seems that Pages does not support gRPC.

Hmm, I see!
Is a gRPC header necessary to deploy xhttp?
Maybe another mode of this protocol could be implemented? (stream-up?)
If that's not possible, we could try to decrease the number of requests with longer keep-alives and reuse the same connection for other requests (xmux). What are the best xmux settings for this objective?
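
For reference, a sketch of what an xmux block inside an Xray xhttp outbound might look like. The field names (maxConcurrency, maxConnections, cMaxReuseTimes, cMaxLifetimeMs) are from Xray's XHTTP/SplitHTTP documentation as I understand it; the values are purely illustrative guesses for a fewer-requests objective, not tested recommendations, so verify them against your Xray version:

```json
{
  "streamSettings": {
    "network": "xhttp",
    "xhttpSettings": {
      "xmux": {
        "maxConcurrency": 4,
        "maxConnections": 0,
        "cMaxReuseTimes": 128,
        "cMaxLifetimeMs": 0
      }
    }
  }
}
```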

@vrnobody
Owner

Adding a gRPC header is like exploiting a vulnerability in Cloudflare. Strictly speaking, stream-one mode is not a regular HTTP POST. As @yomnxkcs pointed out in issue 443, the other xhttp modes will not work.
I am not familiar with xmux settings. Perhaps you can try different configuration combinations and share your findings.

@Ahmad-2213
Author

Ahmad-2213 commented Dec 18, 2024

@RPRX
What do you think about this issue?

@RPRX

RPRX commented Dec 22, 2024

I just saw that you've already gotten this working.

I haven't studied the differences between Workers and Pages, but I see that the response type is application/grpc. Maybe try changing it to text/event-stream, or leaving it out.

BTW, the response seems to be missing X-Padding; that needs to be added @vrnobody
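
A minimal sketch of the header change RPRX suggests, assuming a Workers-style fetch handler; the function name and padding value are illustrative, not the repository's actual code:

```javascript
// Hypothetical helper: build the download-side response with the
// Content-Type swapped from "application/grpc" to "text/event-stream",
// plus an X-Padding header like the one vrnobody later added.
function makeDownloadResponse(downloadStream) {
  return new Response(downloadStream, {
    headers: {
      "Content-Type": "text/event-stream", // was: "application/grpc"
      "X-Padding": "X".repeat(64), // illustrative padding length
    },
  });
}

const resp = makeDownloadResponse(null);
console.log(resp.headers.get("Content-Type")); // text/event-stream
```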

@vrnobody
Owner

Just added the padding.

The Pages network is different from the Workers network. When I tested last time, my impression was that the follow-up upload packets can't get into the Pages server, rather than the Pages packets failing to come out. Also, Pages has no logging, which is very unfriendly for debugging. Let's wait for the right person to try this.

@Ahmad-2213
Author

You can access logs for Pages via this path:
Deployments tab >> under the Production/All Deployments section, select View Details >> Functions

There you can scroll down to find Real-time Logs and begin the log stream.

@vrnobody

@Ahmad-2213
Author

Here is a demonstration of the logs:
(screenshot: IMG_20241223_081042)

@vrnobody
Owner

You can access logs for Pages via this path: Deployments tab >> under the Production/All Deployments section, select View Details >> Functions

There you can scroll down to find Real-time Logs and begin the log stream.

@vrnobody

Thanks for the tips. I just tried to deploy xhttp on Pages again. I tested the server-sent-events header, chunked encoding, and padding. Sadly, none of those tricks work.

@Ahmad-2213
Author

Ahmad-2213 commented Dec 23, 2024

You can access logs for Pages via this path: Deployments tab >> under the Production/All Deployments section, select View Details >> Functions
There you can scroll down to find Real-time Logs and begin the log stream.
@vrnobody

Thanks for the tips. I just tried to deploy xhttp on Pages again. I tested the server-sent-events header, chunked encoding, and padding. Sadly, none of those tricks work.

But it seems the requests are being caught by the server, because the logs confirm this.
So the problem might lie in the server-sent headers.
I suspect that if we simulated the WS method of sending data, it would probably work?!
However, you mentioned that you've tried various header types.

@vrnobody
Owner

There is a slight difference between stream-one mode and a regular HTTP POST. Stream-one mode needs to send multiple upload packets separately within one request. From the logs I observed, only the first packet makes its way into the Pages server; the other packets are dropped, or something else happens to them. I still think the problem lies on the upload side. Those tricks only work on the download side. I'm out of ideas.
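
To make the failure mode concrete: stream-one relies on the server being able to keep reading the request body as later packets arrive. A self-contained sketch in plain JavaScript with web streams, where a ReadableStream stands in for request.body in a Workers fetch handler (names are illustrative, not the script's actual code):

```javascript
// Read every chunk of a streamed body, the way a stream-one server must.
async function readAllChunks(stream) {
  const reader = stream.getReader();
  const chunks = [];
  while (true) {
    const { done, value } = await reader.read();
    if (done) break;
    chunks.push(value); // each value is one uploaded packet
  }
  return chunks;
}

// Simulate a client sending three separate upload packets in one request.
const body = new ReadableStream({
  start(controller) {
    for (const packet of ["p1", "p2", "p3"]) {
      controller.enqueue(new TextEncoder().encode(packet));
    }
    controller.close();
  },
});

readAllChunks(body).then((chunks) => {
  // On Workers all three packets arrive; the behaviour reported above is
  // that on Pages only the first one does.
  console.log(chunks.length); // 3
});
```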

@gtnttot

This comment was marked as off-topic.

@Ahmad-2213
Author

There is a slight difference between stream-one mode and a regular HTTP POST. Stream-one mode needs to send multiple upload packets separately within one request. From the logs I observed, only the first packet makes its way into the Pages server; the other packets are dropped, or something else happens to them. I still think the problem lies on the upload side. Those tricks only work on the download side. I'm out of ideas.

@RPRX what do you think?

@vrnobody

This comment was marked as off-topic.

@RPRX

RPRX commented Dec 24, 2024

Maybe Pages just doesn't support streaming uplink, or only supports half of it, e.g. cutting off the streaming uplink after sending the response headers. In that case stream-up would still have a chance.

By the way, the free tier of Workers does have a request limit, but as I recall it's 100,000 per day? That should be more than enough, right?

@Ahmad-2213
Author

Maybe Pages just doesn't support streaming uplink, or only supports half of it, e.g. cutting off the streaming uplink after sending the response headers. In that case stream-up would still have a chance.

The free tier of Workers does have a request limit, but as I recall it's 100,000 per day? That should be enough, right?

I don't think so.
If you share your link with 2-3 users, it would hit the limit; and for users with heavy usage it is not nearly adequate.

@gtnttot

gtnttot commented Dec 24, 2024

When I was using the ws protocol, 100,000 a day was enough; after switching to xhttp packet-up it no longer is. For my personal use alone it only lasts a few hours, and it would be even less sufficient if shared with others.

@gtnttot

This comment was marked as off-topic.

@vrnobody
Owner

Maybe Pages just doesn't support streaming uplink, or only supports half of it, e.g. cutting off the streaming uplink after sending the response headers. In that case stream-up would still have a chance.

Workers are stateless; how would stream-up be implemented? The client and the target website need to do a TLS handshake first, so there is still data exchange after the response. I can't think of a solution. Let's wait for the right person.

@RPRX

RPRX commented Dec 24, 2024

When I was using the ws protocol, 100,000 a day was enough; after switching to xhttp packet-up it no longer is. For my personal use alone it only lasts a few hours, and it would be even less sufficient if shared with others.

stream-one uses the same number of requests as ws. Don't think about reverse-proxying packet-up through Workers; in that case, why not just go through a regular CDN?

Workers are stateless; how would stream-up be implemented?

It might be possible to tie things together with KV plus something else. But I was talking about Pages, not Workers, and Pages may not support KV and the like.

@vrnobody

This comment was marked as off-topic.

@vrnobody
Owner

It might be possible to tie things together with KV plus something else. But I was talking about Pages, not Workers, and Pages may not support KV and the like.

KV can only store data; what needs to be stored here is the connection object to the target website. The feature for that is called Cloudflare Durable Objects, and it's quite expensive. People using this script probably won't consider it.

@gtnttot

This comment was marked as off-topic.

@Ahmad-2213
Author

Ahmad-2213 commented Jan 9, 2025

When I was using the ws protocol, 100,000 a day was enough; after switching to xhttp packet-up it no longer is. For my personal use alone it only lasts a few hours, and it would be even less sufficient if shared with others.

stream-one uses the same number of requests as ws. Don't think about reverse-proxying packet-up through Workers; in that case, why not just go through a regular CDN?

Workers are stateless; how would stream-up be implemented?

It might be possible to tie things together with KV plus something else. But I was talking about Pages, not Workers, and Pages may not support KV and the like.

Pages does support KV, and the free plan also offers the D1 database; these options are located in the Bindings section of the Pages settings.
@vrnobody Have you tried them?

@vrnobody
Owner

Those databases can only store data, such as numbers and text, so integrating them can only achieve sharing data among requests. To implement stream-up/packet-up/packet-down/packet-mode, we would have to share the remote website's connection handle, which is basically a memory pointer, among requests. Databases are not helpful in this scenario.
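
A tiny illustration of the distinction (hypothetical objects, not the script's actual data): KV/D1 can persist anything serializable, but a live connection handle is lost in serialization, so it cannot be reconstructed in another stateless request:

```javascript
// Plain data survives a round-trip through a KV-style store.
const record = { user: "a", bytes: 1024 };
const restoredRecord = JSON.parse(JSON.stringify(record));
console.log(restoredRecord.bytes); // 1024

// A "connection" holding live handles (functions, sockets) does not:
// JSON.stringify silently drops them, so nothing usable comes back.
const conn = { host: "example.com", write: (buf) => buf.length };
const restoredConn = JSON.parse(JSON.stringify(conn));
console.log(typeof restoredConn.write); // undefined
```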

@vrnobody
Owner

I forgot to mention: according to the pricing docs, Pages and Workers share the same quota. There is no difference whether you deploy this script on Pages or Workers.

@Ahmad-2213
Author

I forgot to mention: according to the pricing docs, Pages and Workers share the same quota. There is no difference whether you deploy this script on Pages or Workers.

Maybe the terms have changed recently.
But when I was using Pages, at least 200,000 requests were recorded and they were not rate-limited.

@Ahmad-2213
Author

Those databases can only store data, such as numbers and text, so integrating them can only achieve sharing data among requests. To implement stream-up/packet-up/packet-down/packet-mode, we would have to share the remote website's connection handle, which is basically a memory pointer, among requests. Databases are not helpful in this scenario.

@RPRX What do you think about this issue?

@github-actions github-actions bot added the stale This issue will be closed within 7 days. label Feb 10, 2025