It depends on what you mean by "used in the editing app". If you mean use it as the working colour space, then E.J. is right -- that's a bad move. Always. Whether you have a high-end monitor or a normal one, or multiple monitors of any kind.
But if you mean use it as the output profile for rendering accurate colours to the display, then absolutely you should be using it. Photoshop and Lightroom will do this automatically and not give you any choice, anyway.
The reason it's not a problem comes down to something about monitor ICC profiles that confuses a lot of people. We say "calibration" and "profiling" as if they're one and the same, but they're not. Calibration puts the output device into a standard baseline state: it sets things like black point, white point, colour temperature and luminance. A monitor ICC profile does carry calibration data, but only because Apple (ironically) realized the profile was a handy place to stash it. Normally, calibration data is not stored in an ICC profile at all. Printer profiles, for example, contain no printer calibration data (often called "linearization" info in the case of printers).
The thing that makes use of the calibration data stored in the profile is not actually the editing app like Photoshop. Rather, it's a LUT loader utility. The calibration data is read out of the profile and loaded into a lookup table (LUT): either the one inside a high-end monitor (in which case the LUT loader is a proprietary utility supplied by the monitor manufacturer), or the one in the video card for every other case (in which case the LUT loader is ColorSync on Mac OS, the loader utility installed by your calibration software on most Windows systems, or Windows itself if you've enabled that feature on Win7 or Win8).
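To make that concrete, here's a rough sketch (in Python, with made-up gamma numbers -- a real loader works from full per-channel measurement data, not a single gamma figure) of the kind of correction curve a LUT loader computes from the calibration data and pushes into an 8-bit LUT:

```python
# Sketch of the correction curve a LUT loader might build from
# calibration data. The gamma values below are hypothetical examples,
# not measurements from any particular monitor.

def correction_curve(native_gamma, target_gamma, steps=256):
    """Build an 8-bit ramp that bends the display's measured (native)
    response toward the calibration target."""
    exponent = target_gamma / native_gamma
    return [round(255 * (i / (steps - 1)) ** exponent) for i in range(steps)]

# e.g. a display that measured at gamma 2.4, being calibrated to 2.2:
ramp = correction_curve(native_gamma=2.4, target_gamma=2.2)
# The loader hands a ramp like this (one per channel) to the video card
# LUT -- or, for a high-end monitor, to the monitor's internal LUT.
```

In practice the loader builds three separate curves (red, green, blue) so it can also correct the white point, but the principle is the same.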
The main purpose of the ICC profile is to hold the actual profiling data, which can be thought of as characterizing the colour space of the output device: how big the colour gamut is and what shape it has, translation tables for converting image colour data into and out of the device's colour space, and rules for how out-of-gamut colours are treated (the 4 rendering intents, including Perceptual and Relative Colorimetric). This profiling data is what gets used by your editing software -- Photoshop, Lightroom and everything else. The editing app converts image data from the current working colour space (e.g. Adobe RGB 1998) into the monitor's colour space before sending it down to the video card.
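As a rough illustration of what that conversion looks like under the hood (using the published Adobe RGB and sRGB matrices, with sRGB standing in for the monitor's colour space -- a real monitor profile would carry the display's own measured matrix or full lookup tables):

```python
# Working space -> monitor space, via the CIE XYZ connection space.
# sRGB is a stand-in for the monitor profile here.

ADOBE_RGB_TO_XYZ = [  # published Adobe RGB (1998) matrix, D65 white
    [0.5767309, 0.1855540, 0.1881852],
    [0.2973769, 0.6273491, 0.0752741],
    [0.0270343, 0.0706872, 0.9911085],
]
XYZ_TO_SRGB = [  # published XYZ -> linear sRGB matrix, D65 white
    [ 3.2406, -1.5372, -0.4986],
    [-0.9689,  1.8758,  0.0415],
    [ 0.0557, -0.2040,  1.0570],
]

def mat_vec(m, v):
    return [sum(m[r][c] * v[c] for c in range(3)) for r in range(3)]

def adobe_to_monitor(rgb_linear):
    """Convert linear Adobe RGB values to linear monitor (sRGB) values."""
    return mat_vec(XYZ_TO_SRGB, mat_vec(ADOBE_RGB_TO_XYZ, rgb_linear))

# Adobe RGB's green primary lands outside sRGB (negative red channel) --
# exactly the out-of-gamut case the rendering intent has to resolve.
out_of_gamut = adobe_to_monitor([0.0, 1.0, 0.0])
```

(Real conversions also linearize and re-apply the gamma curves on each side; the matrices are the heart of it.)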
Once the image data hits the video card, it's run through whatever correction curves sit in the video card's LUT, if any. It's then piped out to the monitor, and if the monitor has its own LUT with correction curves loaded, the data is run through those as well. Then pixels appear on the screen. All of this is high level, and somebody can probably point out somewhere I'm on a bit of thin ice in the specifics, but it should be accurate enough to illustrate the point.
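The LUT step itself is trivially simple -- each 8-bit channel value is just looked up in a 256-entry table. A minimal sketch:

```python
def apply_lut(pixels, lut):
    """Run every 8-bit channel value through a 256-entry correction
    table, the way a video card (or monitor) LUT does in hardware."""
    return [tuple(lut[c] for c in px) for px in pixels]

identity = list(range(256))  # a flat LUT: changes nothing
assert apply_lut([(0, 128, 255)], identity) == [(0, 128, 255)]
```

This is also why the precision of the LUT matters: squeezing corrections through an 8-bit table can cost you levels, which is part of the appeal of the high-bit LUTs built into monitors like the NECs mentioned below.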
So that's the long answer. The short answer is, there is no "double profiling" when using a high-end monitor like a NEC with its own built-in LUT.
Unless! Yes, there is an "unless".
If you try to mix 2 monitors on a single video card, and one monitor has a built-in LUT but the other doesn't, then in most cases you will not be able to calibrate both monitors. That's because the only way to calibrate the LUT-less monitor is to load its calibration data into the video card's 8-bit LUT -- and if the card has only one LUT (as most do), the second you load any correction curves into it you've just screwed up the high-end monitor. Even though a NEC has its own built-in LUT, it still receives image data from the video card, and a change to the card's one LUT affects the output to every connected monitor. That's normally a bad thing, and it defeats the purpose of the high-bit LUT in the monitor.
This is the big reason why you can't calibrate 2 (or more) normal monitors on Windows using a single garden-variety video card: the card has only one LUT, so there's nowhere to store the separate correction curves each monitor needs. There are only 3 ways to calibrate multiple displays on Windows that I know of: 1) Use multiple video cards, each connected to a single monitor. 2) Use a high-end video card that actually has multiple LUTs onboard (like an nVidia Quadro card). 3) Use all high-end monitors, each with its own built-in LUT.
Mac OS is different. Since Apple invented the idea of stashing calibration data in the ICC profile, and had the ColorSync utility to handle loading that data into video card LUTs, I believe Apple has spec'ed out video cards with multiple onboard LUTs ever since. That may have changed in the era of Intel-based Macs built from more open components; I'm not entirely sure...