
Following https://github.com/kyriesent/node-rtsp-stream and How to display IP camera feed from an RTSP url onto reactjs app page?, I was trying to display the RTSP stream from a CCTV, but it gives me an error:

ReferenceError: document is not defined at scripts\jsmpeg.min.js (1:701) @ eval

I haven't found a single implementation of this module in Next.js, so I might be doing something wrong, but I can't tell what. I also didn't find any better solution for Next.js.

There wasn't anything that got me unstuck in https://github.com/phoboslab/jsmpeg either, but I might be using it wrong here.

The rabbit hole started from this: How can I display an RTSP video stream in a web page?, but the suggestions there are either outdated, don't apply, or I couldn't figure them out.

The actual question:

How can I fix the error I get? Is there an alternative to this in Next.js? I don't care how; all I need is to stream the RTSP feed from a CCTV.

Folder Structure:

components
   -layout
      -Stream.js
pages
   -api
   -stream
       -[streamId].js
       -app.js
   -index.js
scripts
    -jsmpeg.min.js

Stream.js is a component used in stream/app.js, and stream/app.js is used in stream/[streamId].js.

Client-side: Stream.js

import { Fragment } from "react";
import JSMpeg from "../../scripts/jsmpeg.min.js";

const Stream = (props) => {
  const player = new JSMpeg.Player("ws://localhost:9999", {
    canvas: document.getElementById("video-canvas"), // Canvas should be a canvas DOM element
  });

  return (
    <Fragment>
      <canvas
        id="video-canvas"
        className={classes.canvas}
        onMouseDown={onMouseDownHandler}
      ></canvas>
    </Fragment>
  );
};

export default Stream;

Server-side: [streamId].js

export async function getStaticProps(context) {
  const StreamCCTV = require("node-rtsp-stream");
  const streamCCTV = new StreamCCTV({
    ffmpegPath: "C:\\Program Files\\ffmpeg\\bin\\ffmpeg.exe", //! remove on Ubuntu
    name: "name",
    streamUrl: "rtsp://someuser:somepassword@1.1.1.1",
    wsPort: 9999,
    ffmpegOptions: {
      // optional ffmpeg flags
      "-stats": "", // an option with no necessary value uses a blank string
      "-r": 30, // options with required values specify the value after the key
    },
  });

  return { props: {} }; // getStaticProps must return a props object
}

Edit:

I have also tried with https://www.npmjs.com/package/jsmpeg, where I changed Stream.js to:

import { Fragment } from "react";
import jsmpeg from "jsmpeg";

const Stream = (props) => {
  const client = new WebSocket("ws://localhost:9999");
  const player = new jsmpeg(client, {
    canvas: document.getElementById("video-canvas"), // Canvas should be a canvas DOM element
  });

  return (
    <Fragment>
      <canvas
        id="video-canvas"
        className={classes.canvas}
        onMouseDown={onMouseDownHandler}
      ></canvas>
    </Fragment>
  );
};

export default Stream;

Now the error is: ReferenceError: window is not defined

Vlad Crehul
    Does this answer your question? [Window is not defined in Next.js React app](https://stackoverflow.com/questions/55151041/window-is-not-defined-in-next-js-react-app) – juliomalves Dec 03 '21 at 22:55
  • @juliomalves I'd say it might but I can't afford to test it anymore. My solution now was to give up on node for RTSP handling and made another server in Flask and handled the stream there. – Vlad Crehul Dec 06 '21 at 11:52
  • I am trying to have this solution in python as well. May I use your flask solution? – YSLdev Sep 15 '22 at 21:20
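(For reference, the usual remedy in the question linked above is to keep all window/document access out of the server render, for example by loading the component on the client only with next/dynamic; a minimal sketch, with the import path assumed:)

// pages/stream/app.js -- load Stream on the client only, so jsmpeg never runs during SSR
import dynamic from "next/dynamic";

const Stream = dynamic(() => import("../../components/layout/Stream"), {
  ssr: false,
});

export default function App(props) {
  return <Stream {...props} />;
}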

2 Answers

2

I managed to make this work with two solutions:

  1. Download jsmpeg.min.js from here and add it to the same directory as your component (or to any other desired directory).

  2. Change the first part of the file from var JSMpeg... to export const JSMpeg... (a rough sketch of this change is shown after the component below).

  3. Add a component with the following content:

import { useEffect, useRef } from 'react'

const StreamPlayer = () => {

    const streamRef = useRef(null)

    useEffect(() => {
        // require here so jsmpeg is only loaded in the browser, after document/window exist
        const { JSMpeg } = require('./jsmpeg.min.js')
        const player = new JSMpeg.Player('ws://localhost:9999', {
            canvas: streamRef.current
        })
    }, [])

    return <canvas ref={streamRef} id="stream-canvas"></canvas>
}

export default StreamPlayer
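For step 2, the change at the top of jsmpeg.min.js would look roughly like this (the rest of the minified bundle stays as it is):

// before: the minified file starts with a plain var declaration
var JSMpeg = { /* ...rest of the minified bundle... */ };

// after: a named export, so the component's require()/import can pick it up
export const JSMpeg = { /* ...rest of the minified bundle... */ };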

NOTE

I am not sure if it is OK to use require inside useEffect, but since JSMpeg uses the document and window objects, it has to be imported only after those exist (i.e. on the client).
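As a commenter below mentions, a dynamic import() inside useEffect is a rough equivalent that avoids require (a sketch, assuming the same file layout and the export change from step 2):

useEffect(() => {
    // import() also runs only in the browser, after the component has mounted
    import('./jsmpeg.min.js').then(({ JSMpeg }) => {
        new JSMpeg.Player('ws://localhost:9999', { canvas: streamRef.current })
    })
}, [])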

The second way to do this is less modular but doesn't involve that potentially bad practice:
  1. Download jsmpeg.min.js from here and add it to the /public directory of your Next.js project.

  2. Add a component with the following content:

import Script from 'next/script'

const StreamPlayer = () => {

    return (
        <>
            <canvas id="stream-canvas"></canvas>
            {/* files in /public are served from the site root */}
            <Script src="/jsmpeg.min.js" id="jsmpeg"></Script>
        </>
    )
}

export default StreamPlayer
  3. Then, at the end of jsmpeg.min.js, I added:

player = new JSMpeg.Player('ws://localhost:9999', {
    canvas: document.getElementById('stream-canvas')
})
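If you would rather not edit the downloaded file at all, the same hookup could probably be done from the component via next/script's onLoad callback (a sketch, assuming the unmodified file still defines a global JSMpeg):

<Script
    src="/jsmpeg.min.js"
    id="jsmpeg"
    onLoad={() => {
        // the stock jsmpeg.min.js exposes a global JSMpeg once the script has loaded
        new window.JSMpeg.Player('ws://localhost:9999', {
            canvas: document.getElementById('stream-canvas'),
        })
    }}
/>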
YSLdev
  • Instead of using next/script, I'm using import() inside useEffect, though. Thanks, it's working. – ImBIOS May 08 '23 at 01:42
0

jsmpeg can only play MPEG-1 video. Please make sure ffmpeg encodes the stream as MPEG-1 in an MPEG-TS container.
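For reference, a rough sketch of the kind of ffmpeg invocation that produces a jsmpeg-compatible stream; the bitrate and the HTTP relay endpoint (e.g. jsmpeg's websocket-relay) are placeholders, and the RTSP URL is the placeholder one from the question:

ffmpeg -rtsp_transport tcp -i rtsp://someuser:somepassword@1.1.1.1 \
  -f mpegts -codec:v mpeg1video -r 30 -b:v 1000k -an \
  http://localhost:8081/supersecret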