Cloudflare R2 allows
developers to store large amounts of unstructured data and access everything
they need with zero egress fees. The S3-compatible API of R2 Storage allows
developers to create seamless migrations and advanced integrations.
R2 can be used in
several scenarios, such as cloud-based application storage, cloud storage for
web content and podcast episodes, data lakes for analytics and big data, and
storage for the output of large batch processes, such as machine learning model
artifacts or data sets.
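Because R2 speaks the S3 API, any S3 client can talk to it through the account-level endpoint. The snippet below is only a sketch of how that endpoint is formed; the account ID shown is a made-up placeholder, not a real account.

```shell
# Hypothetical account ID -- replace with your own from the Cloudflare dashboard
ACCOUNT_ID="0123456789abcdef0123456789abcdef"

# Every R2 account exposes one S3-compatible endpoint in this form
R2_ENDPOINT="https://${ACCOUNT_ID}.r2.cloudflarestorage.com"
echo "$R2_ENDPOINT"

# Any S3 client can then be pointed at it, for example the AWS CLI:
#   aws s3 ls --endpoint-url "$R2_ENDPOINT"
```

This is what makes "seamless migrations" practical: existing S3 tooling only needs the endpoint URL swapped.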
Create bucket and API
Logs can be sent to R2
directly from the Cloudflare dashboard or via the API using Cloudflare Logpush.
Create R2 Bucket
- Go to R2 and select ‘Create Bucket’.
- Enter the bucket name and click ‘Create Bucket’.
Create R2 API Token
- Open R2 and select ‘Manage R2 API Tokens’.
- Click ‘Create API Token’.
- Select ‘Edit permissions’ for your token under ‘Permission’.
- Make a copy of the Secret Access Key and Access Key ID.
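The Secret Access Key is only shown once, so it helps to keep both values in environment variables for the commands later in this guide. The values below are placeholders, not real credentials.

```shell
# Placeholders -- paste the values shown when the token was created
export R2_ACCESS_KEY_ID="<your Access Key ID>"
export R2_SECRET_ACCESS_KEY="<your Secret Access Key>"

# Later commands (curl, S3 clients) can read these variables instead of
# having secrets typed inline
echo "credentials set: ${R2_ACCESS_KEY_ID:+yes}"
```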
Make sure you have
the following permissions: R2 Write and Logshare Edit. Alternatively, a
Cloudflare API token can be created with the following permissions: Zone
scope with Logs Edit permission, and Account scope with R2 Write permission.
Manage your account via
the Cloudflare dashboard
Now you want to enable
Logpush to R2 by using the dashboard.
- Log in to your Cloudflare account.
- Choose the domain or Enterprise account that you wish to
use with Logpush.
- Next, go to Analytics & Logs and click Logs.
- A modal box will open after selecting Add Logpush job.
- Choose the dataset to send to a storage service.
- Now choose the data fields you want to include in your
logs. You can add or remove fields later by adjusting your Logs >
Logpush settings.
- Then select R2 and enter the following destination
details: Bucket path, R2 Access Key ID, and Secret Access Key.
- Click on Validate access.
- To finish enabling the Logpush job, click Save and Start.
Manage via API
Let us create a job by
sending a POST request with the following fields to the Logpush jobs endpoint:
- name: you can use
your domain name as the job name (optional).
- destination_conf: the destination for your logs, containing
a bucket path, account ID, R2 Access Key ID, and R2 Secret Access Key.
To segregate your logs into daily subfolders, we
recommend adding the ‘{DATE}’ parameter to the destination_conf.
- dataset: the log
category you want to receive.
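The destination_conf described above is a single r2:// URL, and the literal {DATE} token is what splits logs into daily subfolders. The sketch below assembles it from placeholder values; the bucket name, account ID, and keys are illustrative, not real.

```shell
# All values are placeholders for illustration
BUCKET_PATH="my-logs-bucket"
ACCOUNT_ID="0123456789abcdef0123456789abcdef"
R2_ACCESS_KEY_ID="<R2_ACCESS_KEY_ID>"
R2_SECRET_ACCESS_KEY="<R2_SECRET_ACCESS_KEY>"

# {DATE} stays literal here -- Logpush expands it to the current date,
# so each day's logs land in their own subfolder
DEST_CONF="r2://${BUCKET_PATH}/{DATE}?account-id=${ACCOUNT_ID}&access-key-id=${R2_ACCESS_KEY_ID}&secret-access-key=${R2_SECRET_ACCESS_KEY}"
echo "$DEST_CONF"
```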
Example of a cURL request (replace the angle-bracket placeholders with your own values):
curl -X POST 'https://api.cloudflare.com/client/v4/zones/<ZONE_ID>/logpush/jobs' \
-H 'X-Auth-Email: <EMAIL>' \
-H 'X-Auth-Key: <API_KEY>' \
-H 'Content-Type: application/json' \
-d '{
  "name": "<DOMAIN_NAME>",
  "dataset": "http_requests",
  "destination_conf": "r2://<BUCKET_PATH>/{DATE}?account-id=<ACCOUNT_ID>&access-key-id=<R2_ACCESS_KEY_ID>&secret-access-key=<R2_SECRET_ACCESS_KEY>",
  "enabled": true
}' | jq .
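The API answers with JSON, and jq (already piped in above) makes it easy to pull out the fields worth checking after creating a job. The response below is a trimmed, made-up sample used only to show the shape, not real API output.

```shell
# Trimmed, illustrative sample of a Logpush jobs response
RESPONSE='{"success":true,"result":{"id":123456,"dataset":"http_requests","enabled":true}}'

# 'success' tells you whether the job was created
echo "$RESPONSE" | jq -r '.success'

# the job id is needed later to update or delete the job
echo "$RESPONSE" | jq -r '.result.id'
```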
If you need any assistance, feel free to Get Support.
Also check: Fix Cloudflare Error 524