Router cache issue with NextJS and Vercel AI SDK #1465

Open
aneequrrehman opened this issue Apr 29, 2024 · 0 comments
aneequrrehman commented Apr 29, 2024

Description

First of all, thank you for this library - it's great.

Secondly, this might not be a bug and could just be something I'm missing. I'm using the Vercel AI SDK with Next.js and running into a problem with client-side caching: when a user sends a message, the AI response is streamed to the browser, but if the user navigates away and then comes back to the same page, the AI response (the last message) is not shown because of the client-side Router cache.

Implementation

I'm loading the messages via a server action, using the conversationId from the URL. This data sets the initial AI and UI state through the initialAIState and initialUIState props on the provider returned by the createAI function.


export default async function ChatPage({ params: { id } }) {
    const aiConversation = await getAiConversation(id)

    if (aiConversation === null) {
        return <div>Chat not found...</div>
    }

    return (
        <ChatAiProvider
            initialAIState={{
                aiConversationId: aiConversation.id,
                messages: aiConversation.messages.map((m) => ({
                    role: m.from === 'ai' ? 'assistant' : 'user',
                    content: m.content,
                })),
            }}
            initialUIState={aiConversation.messages.map((m) => ({
                id: m.id,
                display: m.from === 'ai' ? (
                    <AiMessage>
                        <Markdown>{m.content}</Markdown>
                    </AiMessage>
                ) : (
                    <UserMessage>{m.content}</UserMessage>
                ),
            }))}
        >
            <Chat />
        </ChatAiProvider>
    )
}

I do trigger revalidatePath to invalidate the client cache when the user sends a message, but the issue arises after the AI responds, since that response is streamed to the browser.
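For reference, the revalidatePath call currently lives in the server action that handles a new user message, roughly like this (a simplified sketch; the action name and route path are illustrative):

'use server'

import { revalidatePath } from 'next/cache'

// Simplified sketch: the real action also persists the message and
// kicks off the streamed AI response.
export async function submitUserMessage(aiConversationId, content) {
    // ...save the user message to the conversation...

    // Invalidate the Router cache entry for this chat page so the
    // newly sent user message isn't served from a stale cache on return.
    revalidatePath(`/chat/${aiConversationId}`)
}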

I tried moving this to the client side, but createAI seems to work only on the server.
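For context, ChatAiProvider is defined on the server roughly like this (a sketch; the action name and file layout are illustrative):

import { createAI } from 'ai/rsc'

import { submitUserMessage } from './actions'

export const ChatAiProvider = createAI({
    actions: {
        submitUserMessage,
    },
    // Per-conversation initialAIState / initialUIState are passed as props
    // on the provider in ChatPage above.
})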

Update:

I could return an InvalidateCache component, but that feels like a hack. Besides, if the user navigates away before the stream finishes, the client-side caching problem persists. (A rough sketch of that component is included after the snippet below.)

...
const ui = render({
    model: 'gpt-3.5-turbo',
    provider: openai,
    messages: aiState.get().messages,
    text: ({ content, done }) => {
        if (done) {
            aiState.done({
                ...aiState.get(),
                messages: [
                    ...aiState.get().messages,
                    {
                        role: 'assistant',
                        content,
                    },
                ],
            })
        }

        return (
            <>
                <AiMessage>
                    <Markdown>{content}</Markdown>
                </AiMessage>
                {/* this component can invalidate the client cache using router.refresh() */}
                <InvalidateCache />
            </>
        )
    },
})

return {
    id: Date.now().toString(),
    display: ui,
}
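And InvalidateCache itself is just a small client component that calls router.refresh() on mount, roughly:

'use client'

import { useEffect } from 'react'
import { useRouter } from 'next/navigation'

// Refreshes the router once the streamed message has rendered on the client,
// which invalidates the client-side Router cache for the current route.
export function InvalidateCache() {
    const router = useRouter()

    useEffect(() => {
        router.refresh()
    }, [router])

    return null
}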

So, I was wondering: is there a plan to implement something for these scenarios, or am I doing something wrong here?
