
I have been unsuccessful in trying to use the OpenGL driver with ILNumerics visualizations. I am trying to do a basic visualization following the Quick Start guide, but every time I launch the application I get the error message "no compatible hardware accelerated driver could be found or activated", with the inner error "Attempted to read or write protected memory. This is often an indication that other memory is corrupt". The graphics driver falls back to GDI, which is really slow.

I have tried all of the suggested fixes for this problem. I installed the latest Intel HD graphics driver, and the OpenGL Extensions Viewer indicates that OpenGL 4.0 is supported. The ILNumerics documentation states that OpenGL 3.1+ is required, which my system appears to support.
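
In case it helps, this is how I am detecting the fallback at runtime (a minimal sketch; I am assuming the panel's Driver property and the RendererTypes enum from ILNumerics.Drawing here):

    // Minimal sketch: report which renderer the panel actually activated.
    // Assumes ILPanel.Driver / RendererTypes (ILNumerics.Drawing).
    private void ilPanel1_Load(object sender, EventArgs e)
    {
        if (ilPanel1.Driver != RendererTypes.OpenGL)
        {
            // Hardware acceleration failed; the panel fell back to GDI
            MessageBox.Show("Active driver: " + ilPanel1.Driver);
        }
    }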

So I am at a loss here. Is there a way to use hardware rendering with this Intel card, or not?

2 Answers


I have also been trying to use the ILNumerics OpenGL driver, but with an Intel HD 4000. I get the same error, and the debug log shows that ILNumerics crashes at the glDrawElements call.

I found a workaround when initializing an ILPlotCube so that the OpenGL driver will not crash. I am using the Windows Forms ILPanel control and ILNumerics 3.2.2.0 from NuGet.

  • In the ilPanel1_Load event, create an ILPlotCube and set the x-axis scale to logarithmic. Add the plot cube to the scene.
  • Add an ILPoints element to the plot cube. I fill it with random data.
  • For me this runs, and the plot control loads using the OpenGL driver without crashing (see the code below).

    // Keep a reference to the dummy points shape so it can be removed later
    private ILPoints points;

    void ilPanel1_Load(object sender, EventArgs e)
    {
        var pc = new ILPlotCube(twoDMode: false);
        // Set an axis scale to logarithmic so the GL driver will not crash
        pc.ScaleModes.XAxisScale = AxisScale.Logarithmic;

        // Create a new scene hosting the plot cube
        var scene = new ILScene();
        scene.Add(pc);
        this.ilPanel1.Scene = scene;

        // Add points to the scene so the GL driver will not crash
        this.AddPoints();
    }

    /// <summary>
    /// Add an ILPoints object so the GL driver will not crash.
    /// </summary>
    private void AddPoints()
    {
        var pc = ilPanel1.Scene.First<ILPlotCube>();

        // 3 x 1000 random positions; the same values double as RGB colors
        ILArray<float> A = ILMath.tosingle(ILMath.rand(3, 1000));
        var points = new ILPoints
        {
            Positions = A,
            Colors = A,
            Size = 2,
        };

        pc.Add(points);
        this.points = points;
    }
    

If the control loads successfully with the OpenGL driver, then remove the points element from the scene, set the axis scale back as desired, and add the charting element which plots the actual thing you want to plot:

    // Remove the dummy ILPoints shape
    if (this.points != null && ilPanel1.Scene.Contains(points))
    {
        ilPanel1.Scene.Remove(this.points);
        this.points = null;
    }

    // Set the axis scale back to linear
    var pcsm = ilPanel1.Scene.First<ILPlotCube>().ScaleModes;
    pcsm.XAxisScale = AxisScale.Linear;

    // Add actual plots here
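
For example, a minimal sketch of that last step, plotting a sine curve with an ILLinePlot (the data and variable names here are just for illustration):

    // Illustrative only: add a line plot of sin(x) as the "actual" chart element
    ILArray<float> x = ILMath.linspace<float>(0, 10, 200);
    var line = new ILLinePlot(ILMath.sin(x));
    ilPanel1.Scene.First<ILPlotCube>().Add(line);

    // Recompute the scene's buffers and repaint the panel
    ilPanel1.Scene.Configure();
    ilPanel1.Refresh();
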
Decatf
  • Decatf, this is very interesting! It may help to narrow down the bug. I still think it is an issue with the driver rather than ILNumerics, but we will take a look and possibly provide a workaround for the issue. Thanks – Haymo Kutschbach Oct 21 '13 at 09:50
  • After having spent some time on this, I cannot recommend this method. It does indeed seem to work (also on my HD 2500), but it is simply too hacky to serve as a 'solution'. On my machine it is even sufficient to only enable log axis scales on some plot; there is no need for the additional points shape. All this does is modify the rendering order in a (random) way such that the Intel driver does not crash. However, the driver still messes around with the OpenGL uniform blocks, and it might crash at some other point instead. So: no, unfortunately, this is not a solution :/ – Haymo Kutschbach Oct 21 '13 at 13:51
  • If it's dumb but it works, it's not dumb! I can't force all of my users to buy a proper graphics card. My choices are to use GDI rendering only and have all of the graphics be painfully slow, or to at least attempt this hack. And if the hack fails and the graphics driver still crashes, no harm is done: it will fall back to GDI rendering and I'm back where I started. Thanks, @Decatf! – christophocles Jan 08 '14 at 16:52
  • I have to agree with Haymo. I have noticed over time that there are also rendering problems across different driver versions. For example, with some driver versions the axis lines are not rendered, or they flicker when the camera view is changed. Just beware that, although this gets around the initial crash, it is not really a good solution, since users will be running it on all different kinds of driver configurations. – Decatf Jan 13 '14 at 21:08

Intel HD graphics often causes problems with OpenGL. You should file a bug report on the Intel bug tracker and resort to a graphics card which properly supports OpenGL 3.1 - really.

Haymo Kutschbach
  • Sure, I can obtain a discrete graphics card for my own workstation, but the application I am developing will be used by others who most likely have Intel graphics. How can I get more information to attach to this bug report, like which specific OpenGL extension is failing? – christophocles Oct 16 '13 at 15:35