
Cloudflare R2 allows developers to store large amounts of unstructured data and access everything they need with zero egress fees. R2's S3-compatible API lets developers perform seamless migrations and build advanced integrations.
R2 can be used in several scenarios, such as cloud-based application storage, cloud storage for web content and podcast episodes, data lakes for analytics and big data, and output from large batch processes, such as machine learning model artifacts or data sets.
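Because R2 speaks the S3 API, existing S3 tooling can be pointed at it simply by overriding the endpoint URL. As a rough sketch (the profile name, bucket name, and file are placeholder assumptions; the Access Key ID and Secret Access Key come from the token steps below), uploading an object with the AWS CLI could look like this:

# Store the R2 Access Key ID and Secret Access Key in a named profile first,
# e.g. with: aws configure --profile r2
aws s3 cp ./app-data.json s3://<BUCKET_NAME>/app-data.json \
  --profile r2 \
  --endpoint-url https://<ACCOUNT_ID>.r2.cloudflarestorage.com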
Create Bucket and API Tokens
Logs can be sent to R2 directly from the Cloudflare dashboard or via API using Cloudflare Logpush.
Create R2 Bucket
- Go to R2 and select 'Create Bucket'.
- Enter the bucket name and click 'Create Bucket'.
Create R2 API Token
- Open R2 and select 'Manage R2 API Tokens'.
- Click 'Create API token'.
- Select 'Edit permissions' for your token under 'Permission'.
- Make a copy of the Secret Access Key and Access Key ID.
Make sure that you have the following permissions: R2 Write and Logshare Edit. Alternatively, you can create a Cloudflare API token with the following permissions instead: Zone scope with Logs Edit permission, and Account scope with R2 Write permission.
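If you use a Cloudflare API token rather than the Global API Key, requests to the Logpush API are authenticated with a single Authorization: Bearer header instead of the X-Auth-Key/X-Auth-Email pair shown in the example further down. A minimal sketch (the token and zone ID are placeholders), listing the existing Logpush jobs for a zone:

curl -s 'https://api.cloudflare.com/client/v4/zones/<ZONE_ID>/logpush/jobs' \
  -H 'Authorization: Bearer <API_TOKEN>' | jq .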
Manage your account via the Cloudflare dashboard
Now, enable Logpush to R2 using the dashboard.
- Log in to your Cloudflare account.
- Select the domain or Enterprise account you want to use with Logpush.
- Next, go to Analytics & Logs and click Logs.
- Select Add Logpush job; a modal window will open.
- Choose the dataset to send to a storage service.
- Now choose the data fields you want to include in your logs. You can add or remove fields later by adjusting your Logs > Logpush settings.
- Then select R2 and enter the following destination details: bucket path, R2 Access Key ID, and Secret Access Key.
- Click Validate access.
- To finish enabling the Logpush job, click Save and Start Pushing.
Manage by API
Let us create a job by sending a POST request with the following fields to the Logpush jobs endpoint:
- name: you can use your domain name as the job name (optional).
- destination_conf: the destination for your logs, containing the bucket path, account ID, R2 Access Key ID, and R2 Secret Access Key.
To separate your logs into daily subfolders, we recommend adding the {DATE} parameter to the destination_conf (see the example just after this list):
r2://<BUCKET_PATH>/{DATE}?account-id=<ACCOUNT_ID>&access-key-id=<R2_ACCESS_KEY_ID>&secret-access-key=<R2_SECRET_ACCESS_KEY>
- dataset: the log category you want to receive.
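For illustration only (the bucket path is the same placeholder as above, and this assumes Logpush substitutes {DATE} with the day's date in YYYYMMDD form), files pushed on 1 March 2024 would land under a prefix such as:

r2://<BUCKET_PATH>/20240301/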
Example of a cURL request:
curl -X POST 'https://api.cloudflare.com/client/v4/zones/<ZONE_ID>/logpush/jobs' \
-H 'X-Auth-Key: <API_KEY>' \
-H 'X-Auth-Email: <EMAIL>' \
-H 'Content-Type: application/json' \
-d '{
  "name": "<DOMAIN_NAME>",
  "logpull_options": "fields=ClientIP,ClientRequestHost,ClientRequestMethod,ClientRequestURI,EdgeEndTimestamp,EdgeResponseBytes,EdgeResponseStatus,EdgeStartTimestamp,RayID&timestamps=rfc3339",
  "destination_conf": "r2://<BUCKET_PATH>/{DATE}?account-id=<ACCOUNT_ID>&access-key-id=<R2_ACCESS_KEY_ID>&secret-access-key=<R2_SECRET_ACCESS_KEY>",
  "dataset": "http_requests",
  "enabled": true
}' | jq .
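Once the job is saved and pushing, one way to confirm that log files are arriving is to list the bucket over R2's S3-compatible API. A minimal sketch, reusing the hypothetical profile from the earlier example (bucket name, profile, and account ID are placeholders):

aws s3 ls s3://<BUCKET_NAME>/ --recursive \
  --profile r2 \
  --endpoint-url https://<ACCOUNT_ID>.r2.cloudflarestorage.com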
If you need any assistance, feel free to Get Support.
Also check: Fix Cloudflare Error 524