Short answer: Not really, no.
Longer answer: A "full-screen" app uses the display differently than a normal Windows GUI app. Exactly how depends on the application and the OS, but most use the DirectX or OpenGL APIs, which let an application manipulate the memory representing the screen directly and tap the accelerated capabilities of the GPU. These apps are supported by creating a GUI "window" (more like a panel) that covers the entire primary display and is "always on top", so the desktop elements never get painted because they're always behind the full-screen app's window in the Z-order. The application then gets "relaxed" access to the memory representing the display rectangle of that GUI window, so it can manipulate individual pixels without going through the message loop to repaint that area. A minimal sketch of that setup is below.
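To make that concrete, here's a bare-bones Win32 sketch (C++) of the "borderless window covering the whole primary display, kept always on top" arrangement. The class name and window procedure are placeholders and error handling is omitted; a real game would hand the resulting HWND to DirectX or OpenGL rather than paint through GDI.

```cpp
// Sketch of a topmost, borderless window covering the primary display.
#include <windows.h>

LRESULT CALLBACK WndProc(HWND hwnd, UINT msg, WPARAM wp, LPARAM lp)
{
    if (msg == WM_DESTROY) { PostQuitMessage(0); return 0; }
    return DefWindowProc(hwnd, msg, wp, lp);
}

int WINAPI WinMain(HINSTANCE hInst, HINSTANCE, LPSTR, int)
{
    WNDCLASS wc = {};
    wc.lpfnWndProc   = WndProc;
    wc.hInstance     = hInst;
    wc.lpszClassName = TEXT("FullscreenPanel");   // placeholder class name
    RegisterClass(&wc);

    // WS_POPUP: no border or title bar; WS_EX_TOPMOST: keep it above the
    // desktop and all normal windows in the Z-order.
    HWND hwnd = CreateWindowEx(
        WS_EX_TOPMOST, wc.lpszClassName, TEXT(""), WS_POPUP,
        0, 0,
        GetSystemMetrics(SM_CXSCREEN),   // width of the primary display
        GetSystemMetrics(SM_CYSCREEN),   // height of the primary display
        NULL, NULL, hInst, NULL);

    ShowWindow(hwnd, SW_SHOW);
    // A real full-screen app would now hand this HWND to DirectX/OpenGL
    // (e.g. as the swap chain's output window) and draw into it directly.

    MSG msg;
    while (GetMessage(&msg, NULL, 0, 0)) {
        TranslateMessage(&msg);
        DispatchMessage(&msg);
    }
    return 0;
}
```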
In this situation, the display is painted when the application wants (which is virtually always either "as fast as possible" or "synchronized with the monitor's next refresh"), NOT when the Windows GUI thinks it's a good time. So anything that IS painted when Windows thinks it's a good idea will flicker: Windows paints your "always on top" window over the app's "window" because it's higher in the GUI's Z-order, then the app paints back over it by drawing directly onto its rectangle. That causes Windows to invalidate and redraw your window, and the cycle continues.
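If all you really need is to know when that fight would happen (so you can back off and fall back to a sound, a tray balloon, or a taskbar flash), Windows Vista and later expose SHQueryUserNotificationState, which reports whether a full-screen Direct3D app or a presentation currently owns the display. A hedged sketch of how you might use it:

```cpp
// Detect "a full-screen app has the display" so we don't try to draw over it.
#include <windows.h>
#include <shellapi.h>
#include <cstdio>

#pragma comment(lib, "shell32.lib")

bool FullScreenAppIsActive()
{
    QUERY_USER_NOTIFICATION_STATE state;
    if (FAILED(SHQueryUserNotificationState(&state)))
        return false;  // assume "no" if the query itself fails

    // QUNS_RUNNING_D3D_FULL_SCREEN: an exclusive-mode D3D app is in front.
    // QUNS_BUSY and QUNS_PRESENTATION_MODE also mean "don't pop anything up".
    return state == QUNS_RUNNING_D3D_FULL_SCREEN
        || state == QUNS_BUSY
        || state == QUNS_PRESENTATION_MODE;
}

int main()
{
    printf(FullScreenAppIsActive()
        ? "A full-screen app owns the display; don't try to draw over it.\n"
        : "Normal desktop; a topmost window should be visible.\n");
    return 0;
}
```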
The solution is not only to make an "always-on-top" window, but to somehow programmatically "task-switch" from the full-screen app's window to yours (see the sketch below). This may require privileges that most managed runtimes can't or won't grant. If it is possible, then when it happens the full-screen app will minimize (which may well be a HUGE problem for your users; whatever your app is trying to tell me, it will almost certainly NOT be worth minimizing my StarCraft 2 session in the middle of an online melee).
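For completeness, this is roughly what that forced "task switch" looks like in Win32 terms (a sketch, not a recommendation; hMyWindow is a placeholder for your own notification window). Note that SetForegroundWindow is subject to the OS's foreground-lock rules: if your process doesn't already have the right to steal focus, Windows will typically just flash your taskbar button and the game keeps the screen.

```cpp
// Sketch of the "programmatic task switch": try to force our window to the
// foreground; fall back to flashing the taskbar button if the OS refuses.
#include <windows.h>

void TryToTakeForeground(HWND hMyWindow)
{
    // Make sure the window is restored and topmost first.
    ShowWindow(hMyWindow, SW_RESTORE);
    SetWindowPos(hMyWindow, HWND_TOPMOST, 0, 0, 0, 0,
                 SWP_NOMOVE | SWP_NOSIZE | SWP_SHOWWINDOW);

    // This is the call that forces the switch. If it succeeds, an
    // exclusive-mode full-screen app will usually minimize; if the OS
    // refuses (foreground lock), flashing the window is about the best
    // you can do without special privileges.
    if (!SetForegroundWindow(hMyWindow)) {
        FlashWindow(hMyWindow, TRUE);
    }
}
```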