This blog is a WIP and I will keep updating it. The GitHub code for outline 1 and outline 2 is done; outline 3 is not done yet.

GitHub link: https://github.com/GrahamQuan/nextjs-file-upload
blog outline
- simple file upload & download (single file and multiple files)
- multipart large file upload
- production-ready code (TODO: new blog)
bucket setup (Cloudflare R2)
custom domain setup
- the domain will look like: https://assets.grahamquan.com
Cloudflare dashboard: DNS -> Records

| Type  | Name   | Target           | Proxy status | TTL  |
| ----- | ------ | ---------------- | ------------ | ---- |
| CNAME | assets | <your_id>.r2.dev | Proxied      | Auto |

Then go to R2 storage -> Overview -> Settings -> Public access and connect the domain, e.g. https://assets.grahamquan.com
cors setup
Cloudflare dashboard: R2 storage -> Overview -> Settings -> CORS policy

```json
[
{
"AllowedOrigins": [
"http://localhost:3000",
"http://localhost:3001",
"https://assets.grahamquan.com"
],
"AllowedMethods": [
"GET",
"PUT",
"POST",
"HEAD",
"DELETE"
],
"AllowedHeaders": [
"*"
],
"ExposeHeaders": [
"ETag" <--- this for multi part large file upload
],
"MaxAgeSeconds": 3600
}
]
```
env setup
Cloudflare dashboard: TODO

.env

```bash
BUCKET_ACCESS_KEY_ID=
BUCKET_SECRET_ACCESS_KEY=
BUCKET_ENDPOINT=
BUCKET_NAME=
BUCKET_REGION=
# if you have a custom domain, you can change the `BUCKET_PUBLIC_URL`
BUCKET_PUBLIC_URL=
```
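For Cloudflare R2 the values typically look like the following; the access key pair comes from an R2 API token in the dashboard (illustrative values, not the real ones):

```bash
# illustrative R2 values
BUCKET_ACCESS_KEY_ID=<r2_access_key_id>
BUCKET_SECRET_ACCESS_KEY=<r2_secret_access_key>
# R2 exposes an S3-compatible endpoint per account
BUCKET_ENDPOINT=https://<account_id>.r2.cloudflarestorage.com
BUCKET_NAME=<your_bucket_name>
# R2 does not use regions, "auto" is the expected value
BUCKET_REGION=auto
# the custom domain connected above (or the bucket's public r2.dev URL)
BUCKET_PUBLIC_URL=https://assets.grahamquan.com
```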
simple single file upload with React

backend

bucket client

- kept in a separate file so we can reuse the same setup everywhere
- placed under `lib/server-only/index.ts` so it can only be imported on the server side
lib/server-only/bucket-client.ts

```ts
import { S3Client } from '@aws-sdk/client-s3';
const BucketClient = new S3Client({
region: process.env.BUCKET_REGION,
endpoint: process.env.BUCKET_ENDPOINT,
credentials: {
accessKeyId: process.env.BUCKET_ACCESS_KEY_ID || '',
secretAccessKey: process.env.BUCKET_SECRET_ACCESS_KEY || '',
},
});
export default BucketClient;
```
create presigned url function

- under `lib/server-only/index.ts` so we can import it only on the server side
- uses `nanoid` to generate a unique id; the key looks like `2025/03/01/1234567890.png`, grouped by date, which is good for organization
- the parameter is `mimeType`, which tells the bucket what type of file it is, e.g. `image/png`
- returns `fileUrl` for file preview and `presignedUrl` for the file upload

lib/server-only/create-presigned-url.ts

```ts
import "...";
export default async function createPresignedUrl(
mimeType: string // be like: image/png
): Promise<{ fileUrl: string; presignedUrl: string }> {
// key be like: 2025/03/01/1234567890.png
const key = `${createDateFolderPath()}/${nanoid()}.${getFileExtensionByMimeType(
mimeType
)}`;
const command = new PutObjectCommand({
Bucket: process.env.BUCKET_NAME as string,
Key: key,
ContentType: mimeType,
});
const presignedUrl = await getSignedUrl(BucketClient, command, {
expiresIn: 3600,
});
const fileUrl = `${process.env.BUCKET_PUBLIC_URL}/${key}`;
return { fileUrl, presignedUrl };
}
```
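The `import "...";` line elides the imports: roughly `PutObjectCommand` from `@aws-sdk/client-s3`, `getSignedUrl` from `@aws-sdk/s3-request-presigner`, `nanoid`, the `BucketClient` above, plus two small helpers from the repo. A minimal sketch of what those helpers might look like (hypothetical, the repo has the real implementations):

```ts
// hypothetical sketches of the helpers used above

// e.g. returns "2025/03/01" for March 1st, 2025
export function createDateFolderPath(): string {
  const now = new Date();
  const year = now.getFullYear();
  const month = String(now.getMonth() + 1).padStart(2, '0');
  const day = String(now.getDate()).padStart(2, '0');
  return `${year}/${month}/${day}`;
}

// e.g. "image/png" -> "png" (naive mapping, enough for the demo)
export function getFileExtensionByMimeType(mimeType: string): string {
  return mimeType.split('/')[1] ?? 'bin';
}
```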
API

- under `app/api/presigned-url/route.ts`
- uses `createPresignedUrl` to create the presigned URL
- the `files` type is `{ mimeType: string; fileSize: number }[]`. Why an array? Because you can generate multiple presigned URLs in one request
- returns `fileUrl` for file preview and `presignedUrl` for the file upload
- you should do authentication here (I am not doing it for this demo)

app/api/presigned-url/route.ts

```ts
export async function POST(request: NextRequest) {
// ...
const results = await Promise.all(
files.map(async (file) => {
const { fileUrl, presignedUrl } = await createPresignedUrl(
file.mimeType
);
return {
fileUrl,
presignedUrl,
};
})
);
// ...
}
```
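For reference, a minimal sketch of what the elided parts of this handler could look like; the body parsing, `data` wrapper, and import path are assumptions, and real code should authenticate the user first:

```ts
// app/api/presigned-url/route.ts (sketch, not the repo's exact code)
import { NextRequest, NextResponse } from 'next/server';
import createPresignedUrl from '@/lib/server-only/create-presigned-url';

export async function POST(request: NextRequest) {
  // TODO: authenticate the user before handing out presigned URLs
  const { files } = (await request.json()) as {
    files: { mimeType: string; fileSize: number }[];
  };

  const results = await Promise.all(
    files.map(async (file) => {
      const { fileUrl, presignedUrl } = await createPresignedUrl(file.mimeType);
      return { fileUrl, presignedUrl };
    })
  );

  return NextResponse.json({ data: results });
}
```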
frontend

upload with form

- use `<form>` to upload the file
  - why `<form>`? Because with React uncontrolled components you get good performance: changing the file won't trigger a React re-render
  - if you need to preview the file, you can use `<input type="file" id="..." onChange={handleFileChange} />`
- there are two ways to style `<input type="file" id="...">`
  - (1) use `<label htmlFor="...">` to style it; this is how `mui` does it
  - (2) use a `<button>` to style it, but then you need something like `inputRef.current?.click()` to trigger the file select; this is how `antd` does it
- if you want to upload multiple files
  - change `multiple={false}` to `multiple={true}`
  - change the file to `File[]`, like `const formFileList = formData.getAll('file') as File[];`
  - for more detail check my code `components/multi-files-upload.tsx` in the GitHub repo

components/single-file-upload.tsx

```tsx
const handleSubmit = async (e: React.FormEvent<HTMLFormElement>) => {
e.preventDefault();
const formData = new FormData(e.currentTarget);
const formFile = formData.get('file') as File;
// ...
const res = await fetch('/api/presigned-url', {
method: 'POST',
headers: {
'Content-Type': 'application/json',
},
body: JSON.stringify({
files: [
{
mimeType: formFile?.type || '',
fileSize: formFile?.size || 0,
},
],
}),
});
// ...
}
<form onSubmit={...}>
<div className='flex gap-3'>
<label
htmlFor='file' // match with <input> id
className='size-16 flex justify-center items-center rounded-lg bg-sky-500 hover:cursor-pointer'
>
<Upload className='size-8' />
<input
type='file'
name='file' // required for form submission
id='file' // required for <label>
multiple={false} // (1)true for multiple files (2)false for single file
onChange={handleFileChange}
className='hidden'
accept='image/*' // only allow image files
/>
</label>
</div>
</form>
```
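The `// ...` in `handleSubmit` skips the actual upload. Once the API responds, the upload itself is just a `PUT` of the raw file to the presigned URL; a sketch, assuming the `{ data: [...] }` response shape from the route sketch above and a `setPreviewUrl` state setter like the one the multipart component uses:

```ts
// inside handleSubmit, after the /api/presigned-url request succeeds
const { data } = (await res.json()) as {
  data: { fileUrl: string; presignedUrl: string }[];
};
const { fileUrl, presignedUrl } = data[0];

// upload the raw file bytes to the presigned URL
const uploadRes = await fetch(presignedUrl, {
  method: 'PUT',
  body: formFile,
  headers: { 'Content-Type': formFile.type },
});
if (!uploadRes.ok) {
  throw new Error(`Upload failed: ${uploadRes.status}`);
}

// fileUrl now points at the uploaded object, e.g. for an <img> preview
setPreviewUrl(fileUrl);
```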
download file

- add a timestamp to the URL to bypass the browser cache and avoid CORS issues with cached responses

hooks/use-file-download.ts

```ts
import { getFileNameFromUrl } from '@/lib/file-utils';
import { useCallback, useState } from 'react';
export default function useFileDownload() {
const [isDownloading, setIsDownloading] = useState(false);
const fileDownload = useCallback(
async (url: string) => {
try {
setIsDownloading(true);
// add timestamp to url to avoid browser cache and CORS
const response = await fetch(`${url}?t=${Date.now()}`);
if (!response.ok) {
throw new Error(`HTTP error! Status: ${response.status}`);
}
const blob = await response.blob();
const link = document.createElement('a');
link.href = URL.createObjectURL(blob);
link.download = getFileNameFromUrl(url);
document.body.appendChild(link);
link.click();
document.body.removeChild(link);
URL.revokeObjectURL(link.href);
setIsDownloading(false);
} catch (error) {
console.error('download file error:', error);
} finally {
setIsDownloading(false);
}
},
[setIsDownloading]
);
return { isDownloading, fileDownload };
}
```
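Using the hook is straightforward; a small usage sketch (the import alias and button markup are placeholders):

```tsx
'use client';

import useFileDownload from '@/hooks/use-file-download';

export default function DownloadButton({ url }: { url: string }) {
  const { isDownloading, fileDownload } = useFileDownload();

  return (
    <button onClick={() => fileDownload(url)} disabled={isDownloading}>
      {isDownloading ? 'Downloading...' : 'Download'}
    </button>
  );
}
```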
multipart upload for large files

simple steps

- get a presigned URL for each slice
- upload each sliced part to the bucket and read the `ETag` from the response header
- tell the server all slices are uploaded and get the file URL so you can preview it

PS: you need to set up the Cloudflare CORS policy to expose the `ETag` header, see the config above.

API (two APIs are needed to handle the process)

- create a presigned URL for each file part
- complete the upload once all file parts are uploaded

(1) create a presigned URL for each file part

- `partSize` is the size of each part; the default is 5MB, the minimum part size required by the R2/S3 bucket
- `partNumber` starts from `1`, not `0`; this is an AWS S3 API requirement
- `uploadId` is the id of the multipart upload, used to identify it across requests

lib/server-only/create-multi-parts-presigned-url.ts

```ts
import "...";
/**
* Creates a presigned URL for a multipart upload.
*
* @param {Object} options - The options for creating the presigned URL.
* @param {string} options.mimeType - The MIME type of the file, e.g., 'image/png'.
* @param {number} options.fileSize - The size of the file in bytes (5 MB ~ 5 GB).
* @param {number} [options.partSize=5242880] - The size of each part in bytes. Defaults to 5MB, the minimum part size requirement for R2/S3 Bucket.
*
*/
export default async function createMultiPartsPresignedUrl({
mimeType,
fileSize,
partSize = 5 * 1024 * 1024,
}: {
mimeType: string;
fileSize: number;
partSize?: number;
}) {
const bucketName = process.env.BUCKET_NAME || '';
// key be like: 2025/03/01/1234567890.png
const key = `${createDateFolderPath()}/${nanoid()}.${getFileExtensionByMimeType(
mimeType
)}`;
// Calculate the part size and count
const partCount = Math.ceil(fileSize / partSize);
// Initialize the multipart upload
const multipartUpload = await BucketClient.send(
new CreateMultipartUploadCommand({
Bucket: bucketName,
Key: key,
ContentType: mimeType,
})
);
const uploadId = multipartUpload.UploadId;
if (!uploadId) {
throw new Error('Failed to initialize multipart upload');
}
// Generate a presigned URL for each part
const presignedUrlList = [];
// partNumber start from 1
// because AWS S3 API PartNumber start from 1, max is 10000
for (let partNumber = 1; partNumber <= partCount; partNumber++) {
const command = new UploadPartCommand({
Bucket: bucketName,
Key: key,
UploadId: uploadId,
PartNumber: partNumber,
});
const presignedUrl = await getSignedUrl(BucketClient, command, {
expiresIn: 3600,
});
presignedUrlList.push({
presignedUrl,
partNumber,
});
}
return {
key,
uploadId,
presignedUrlList,
};
}
```
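The frontend below calls `/api/multi-parts-presigned-url`, which is a thin route around this function. A minimal sketch, assuming the same `files` request body as the simple-upload API and a `data` wrapper in the response (the repo has the real handler):

```ts
// app/api/multi-parts-presigned-url/route.ts (sketch)
import { NextRequest, NextResponse } from 'next/server';
import createMultiPartsPresignedUrl from '@/lib/server-only/create-multi-parts-presigned-url';

export async function POST(request: NextRequest) {
  // TODO: authenticate the user here as well
  const { files } = (await request.json()) as {
    files: { mimeType: string; fileSize: number }[];
  };

  // the demo component uploads one file at a time
  const { mimeType, fileSize } = files[0];
  const data = await createMultiPartsPresignedUrl({ mimeType, fileSize });

  // data = { key, uploadId, presignedUrlList }
  return NextResponse.json({ data });
}
```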
(2) complete the upload once all file parts are uploaded

- the request body params look like:

```ts
{
key: string;
uploadId: string;
parts: {
partNumber: number;
etag: string;
}[];
};
```

- returns `fileUrl` for file preview

app/api/completed-multi-part-upload/route.ts

```ts
import "...";
export async function POST(request: NextRequest) {
try {
// ...
// Prepare part information for CompleteMultipartUpload, ensuring it's sorted by PartNumber
const completedParts = parts
.sort((a, b) => a.partNumber - b.partNumber)
.map((part) => ({
PartNumber: part.partNumber,
ETag: part.etag,
}));
const completeCommand = new CompleteMultipartUploadCommand({
Bucket: process.env.BUCKET_NAME,
Key: key,
UploadId: uploadId,
MultipartUpload: {
Parts: completedParts,
},
});
try {
const result = await BucketClient.send(completeCommand);
console.log('Multipart upload completed successfully:', result);
} catch (error) {
console.error('S3 CompleteMultipartUpload error details:', error);
throw error;
}
const fileUrl = `${process.env.BUCKET_PUBLIC_URL}/${key}`;
// ...
}
```
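One caveat: if some parts fail and the upload is never completed, the already-uploaded parts keep taking up space in the bucket until they are aborted. A hedged sketch of cleaning up with `AbortMultipartUploadCommand`, which is not part of the demo code:

```ts
import { AbortMultipartUploadCommand } from '@aws-sdk/client-s3';
import BucketClient from '@/lib/server-only/bucket-client';

// call this when the client reports a failed or cancelled multipart upload
export async function abortMultipartUpload(key: string, uploadId: string) {
  await BucketClient.send(
    new AbortMultipartUploadCommand({
      Bucket: process.env.BUCKET_NAME,
      Key: key,
      UploadId: uploadId,
    })
  );
}
```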
frontend

steps

- create a presigned URL for each file part
- upload each sliced part and read the `ETag` from the response header when the part uploads successfully
- tell the server all slices are uploaded and get the file URL

components/multi-parts-large-file-upload.tsx

```tsx
'use client';
export default function MultipartsLargeFileUpload() {
// ...
const handleSubmit = async (e: React.FormEvent<HTMLFormElement>) => {
e.preventDefault();
const formData = new FormData(e.currentTarget);
const formFile = formData.get('file') as File;
try {
// (1) create presigned url
const presignedUrlResponse = await fetch(
'/api/multi-parts-presigned-url',
{
method: 'POST',
headers: {
'Content-Type': 'application/json',
},
body: JSON.stringify({
files: [
{
mimeType: formFile?.type || '',
fileSize: formFile?.size || 0,
},
],
}),
}
);
const presignedUrlJson =
(await presignedUrlResponse.json()) as MultiPartsPresignedUrlResponse;
// ...
const { key, uploadId, presignedUrlList } = presignedUrlJson.data;
const slicedFileList = sliceFileToMultipart(formFile);
// (2) upload sliced file
const uploadResponse = await Promise.all(
presignedUrlList.map(async (el, idx) => {
if (!el.presignedUrl) {
throw new Error('No presigned url found for upload');
}
const sliceResponse = await fetch(el.presignedUrl, {
method: 'PUT',
body: slicedFileList[idx],
headers: {
'Content-Type': formFile.type,
},
});
if (!sliceResponse.ok) {
console.error('Upload failed with status:', sliceResponse.status);
throw new Error(`Upload failed: ${sliceResponse.status}`);
}
const etag =
sliceResponse.headers.get('Etag') ||
sliceResponse.headers.get('etag') ||
sliceResponse.headers.get('ETag') ||
'';
console.log('Part uploaded with ETag:', etag);
return {
etag,
partNumber: el.partNumber,
};
})
);
if (!uploadResponse.length) {
console.log('upload failed');
return;
}
const params: CompletedMultiPartUploadRequestBody = {
key,
uploadId,
parts: uploadResponse.map((el) => ({
partNumber: el.partNumber,
etag: el.etag,
})),
};
// (3) tell server all slices is uploaded and get the file url
const completedRes = await fetch('/api/completed-multi-part-upload', {
method: 'POST',
body: JSON.stringify(params),
});
const completedJson = (await completedRes.json()) as {
fileUrl: string;
};
setPreviewUrl(completedJson.fileUrl);
} catch (error: any) {
console.log(error);
} finally {
setLoading(false);
}
};
return ...
}
```
- file utils (note: `file.slice(start, end)` clamps `end` to the file size, so the last part can be smaller than `partSize`)

lib/file-utils.ts

```ts
export function sliceFileToMultipart(
file: File,
sliceSize: number = 5 * 1024 * 1024
): Blob[] {
const totalSize = file.size;
const totalSlices = Math.ceil(totalSize / sliceSize);
const slices = [];
for (let i = 0; i < totalSlices; i++) {
const start = i * sliceSize;
const end = start + sliceSize;
const slice = file.slice(start, end);
slices.push(slice);
}
return slices;
}
```
production-ready code (soon)

tech stack

- nextjs
- react
- typescript
- zod
- shadcn ui
- react-hook-form
- react drag and drop

steps

- TODO