
NextJS 13 with Edge runtime error: exports is not defined #400

Closed

adhorodyski opened this issue Sep 1, 2023 · 6 comments
@adhorodyski

Summary

Hi! 👋

I've been using the library for a while now, following the docs under https://github.com/hashicorp/next-mdx-remote#react-server-components-rsc--nextjs-app-directory-support and it worked perfectly fine :)

Versions:

"next": "13.4.19",
"next-mdx-remote": "4.4.1"

Recently I've encountered the error below when trying to migrate a page to use export const runtime = "edge" with NextJS 13. Commenting out this line solves the issue.
The same applies to using next-mdx-remote/serialize with the edge runtime.

Attempted import error: 'MDXRemote' is not exported from 'next-mdx-remote/rsc' (imported as 'MDXRemote').

[Screenshot 2023-09-01 at 13:43:18]

What I've tried

Reproduction

https://codesandbox.io/p/sandbox/muddy-cache-nkwsqw

or

  • set up a new Next.js application using create-next-app
  • add next-mdx-remote as a dependency
  • try to use it on a page with export const runtime = "edge" (a minimal sketch follows)
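
A minimal page that reproduces it (file path and MDX string are just illustrative):

// app/edge-mdx/page.tsx (hypothetical path): minimal reproduction page
import { MDXRemote } from "next-mdx-remote/rsc"

// Opting this segment into the Edge runtime is what triggers the error;
// removing the line (i.e. staying on the default Node.js runtime) makes the page build again.
export const runtime = "edge"

export default function Page() {
  return <MDXRemote source={"# Hello from next-mdx-remote"} />
}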
@SimplyMerlin

Running into the same issue.

@yongyi520

I also encountered this issue

@mcexit

mcexit commented Oct 31, 2023

I believe this is because the next-mdx-remote package imports os. See here under dist/rsc.js & dist/serialize.js, but I could be wrong. I found a similar bug report here. Basically, anything that relies on a non-web API is bound to cause issues with the Edge Runtime.

Next.js already has an experimental URL import feature that you can use for remote files, and for local files you can just use regular imports. I don't understand why Next.js even references this package in its MDX docs, because the functionality this package provides should be a first-class citizen in Next.js. It sort of is, but they don't make it obvious: the @next/mdx README.md has essential information that is easy to overlook if you rely only on the Next.js docs.

You also have to avoid certain experimental feature pitfalls with Next.js. The urlImports setting is experimental but works reliably. However, things like mdxRs and Turbopack can cause unexpected issues that aren't obvious.
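
For reference, a minimal next.config.js sketch of the URL-import setup (the allowed prefix is only an example; every remote host you import from has to be listed):

// next.config.js: hypothetical config enabling experimental URL imports
/** @type {import('next').NextConfig} */
const nextConfig = {
  experimental: {
    // Only modules whose URL starts with one of these prefixes may be imported.
    urlImports: ["https://example.com/"],
  },
}

module.exports = nextConfig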

@Nitin2804

I encountered the same issue. I'd like to know if it has been resolved.

@amazingnerd

amazingnerd commented Feb 21, 2024

I got the same issue. Everything works fine on localhost until I use the "edge" runtime for a Vercel function. Vercel's hobby plan only allows serverless functions a maximum of 10 seconds, and when I call the OpenAI and Replicate AI APIs it takes around 12 seconds to get a response, so my site always gets a 504 timeout error in production. That's why I'm trying to use the edge runtime, but when I do, it shows this error:

./node_modules/dotenv/lib/main.js:3:11
Module not found: Can't resolve 'os'
https://nextjs.org/docs/messages/module-not-found
Import trace for requested module:
./app/api/chat/[chatId]/route.ts

This is my route.ts:
import dotenv from "dotenv"
import { StreamingTextResponse, LangChainStream } from "ai"
import { Replicate } from "langchain/llms/replicate"
import { CallbackManager } from "langchain/callbacks"
import { NextResponse } from "next/server"
import { currentUser } from "@/lib/auth"
import { MemoryManager } from "@/lib/memory"
import { rateLimit } from "@/lib/rate-limit"
import { db } from "@/lib/db"

dotenv.config({ path: ".env" })

export const runtime = "edge"
export async function POST(
  request: Request,
  { params }: { params: { chatId: string } }
) {
  try {
    const { prompt } = await request.json()
    const user = await currentUser()

    if (!user || !user.name || !user.id) {
      return new NextResponse("Unauthorized", { status: 401 })
    }

    const identifier = request.url + "-" + user.id
    const { success } = await rateLimit(identifier)

    if (!success) {
      return new NextResponse("Rate limit exceeded", { status: 429 })
    }

    const companion = await db.companion.update({
      where: {
        id: params.chatId,
      },
      data: {
        messages: {
          create: {
            content: prompt,
            role: "user",
            userId: user.id,
          },
        },
      },
    })

    if (!companion) {
      return new NextResponse("Companion not found", { status: 404 })
    }

    const name = companion.id
    const companion_file_name = name + ".txt"

    const companionKey = {
      companionName: name!,
      userId: user.id,
      modelName: "mistral-7b-instruct-v0.2",
    }
    const memoryManager = await MemoryManager.getInstance()

    const records = await memoryManager.readLatestHistory(companionKey)
    if (records.length === 0) {
      await memoryManager.seedChatHistory(companion.seed, "\n\n", companionKey)
    }
    await memoryManager.writeToHistory("User: " + prompt + "\n", companionKey)

    // Query Pinecone

    const recentChatHistory = await memoryManager.readLatestHistory(
      companionKey
    )

    // Right now the preamble is included in the similarity search, but that
    // shouldn't be an issue

    const similarDocs = await memoryManager.vectorSearch(
      recentChatHistory,
      companion_file_name
    )

    let relevantHistory = ""
    if (!!similarDocs && similarDocs.length !== 0) {
      relevantHistory = similarDocs.map((doc) => doc.pageContent).join("\n")
    }
    const { handlers } = LangChainStream()
    // Call Replicate for inference
    const model = new Replicate({
      model:
        "mistralai/mistral-7b-instruct-v0.2:79052a3adbba8116ebc6697dcba67ad0d58feff23e7aeb2f103fc9aa545f9269",
      input: {
        max_length: 625,
      },
      apiKey: process.env.REPLICATE_API_TOKEN,
      callbackManager: CallbackManager.fromHandlers(handlers),
    })

    // Turn verbose on for debugging
    model.verbose = true

    const resp = String(
      await model
        .call(
          `
    ONLY generate plain sentences without prefix of who is speaking. DO NOT use ${companion.name}: prefix. 

    ${companion.instructions}

    Below are relevant details about ${companion.name}'s past and the conversation you are in.
    ${relevantHistory}


    ${recentChatHistory}\n${companion.name}:`
        )
        .catch(console.error)
    )

    const cleaned = resp.replaceAll(",", "")
    const chunks = cleaned.split("\n")
    const response = chunks[0]

    await memoryManager.writeToHistory("" + response.trim(), companionKey)
    var Readable = require("stream").Readable

    let s = new Readable()
    s.push(response)
    s.push(null)
    if (response !== undefined && response.length > 1) {
      memoryManager.writeToHistory("" + response.trim(), companionKey)

      await db.companion.update({
        where: {
          id: params.chatId,
        },
        data: {
          messages: {
            create: {
              content: response.trim(),
              role: "system",
              userId: user.id,
            },
          },
        },
      })
    }

    return new StreamingTextResponse(s)

  } catch (error) {
    return new NextResponse("Internal Error", { status: 500 })
  }
}
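
The Edge runtime has no Node os module, and it is the dotenv import that pulls it in here; Next.js loads .env files into process.env by itself, so a minimal sketch without dotenv (the route path is hypothetical, the variable name is taken from the handler above) would be:

// app/api/env-check/route.ts (hypothetical route): reads an env var on the
// Edge runtime without dotenv, so nothing depends on the Node "os" module.
import { NextResponse } from "next/server"

export const runtime = "edge"

export async function GET() {
  // Next.js injects values from .env files (or Vercel project settings) into
  // process.env, so no dotenv.config() call is needed.
  const hasToken = Boolean(process.env.REPLICATE_API_TOKEN)
  return NextResponse.json({ hasToken })
}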

@dstaley
Collaborator

dstaley commented Mar 21, 2024

@mdx-js/mdx relies on eval to execute compiled code, which is not supported by the Vercel Edge Runtime.
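
If the Edge runtime isn't a hard requirement for the MDX page, a workaround consistent with that explanation is to keep the segment on the Node.js runtime (a sketch; "nodejs" is already the default, so the line only matters when a parent layout sets "edge"):

// Segment config for the MDX page: stay on the Node.js runtime,
// where the eval-based execution used by @mdx-js/mdx is allowed.
export const runtime = "nodejs"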

@dstaley dstaley closed this as completed Mar 21, 2024