I'm not expecting professional levels of delta-E accuracy, but even just knowing the primaries of your LEDs and their voltage/current/PWM response (depending on driving mode) would be enough to get a rough map of the available colourspace. Once you have even a rough estimate of the colourspace, you can translate between colourspaces, which would greatly ease matching multiple systems to the same colour.
The short (ish) version is:
The range of possible colours an RGB light can produce is determined by the 'primaries' (the colours of the R, G and B sources). You can think of that range as a 'volume', roughly cuboid, whose corners consist of: pure R, pure G, pure B, white (R G and B at max, AKA your 'white point') and black (R G and B at minimum). Put the black point at the 'bottom', the white point at the 'top', and that shape is the 'colourspace'.
If you imagine the most pure red possible, the most pure blue possible, and the most pure green possible, then you can make a space of all possible colours.
This cannot be achieved in reality, because no physical object could produce these theoretically pure colours. We're mapping colour as perceived by the human eye, which doesn't directly correspond to a spectrograph of a given light source.
If you have different coloured primaries, then this cuboid will change shape and move about within that 'full' colourspace.
TV, cinema, desktop monitors, and HDR all have one or more specifically defined colourspaces they are expected to conform to.
sRGB is the common one for desktop monitors (though sometimes you'll find AdobeRGB), Rec.709 for HDTV, DCI-P3 for cinema, Rec.2020 for UHD, Rec.2100 for HDR, etc.
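If you want to see how a set of primaries turns into something you can calculate with, here's a rough Python sketch (numpy only) of the standard derivation of the RGB-to-XYZ matrix: take the chromaticities of the primaries and the white point, and scale each primary so that full R+G+B lands on white. The sRGB numbers are the published ones; the LED primaries (and the assumption that its white mix lands on D65) are just placeholders standing in for whatever you'd actually measure.

```python
import numpy as np

def rgb_to_xyz_matrix(primaries, white):
    # Standard derivation: take the (x, y) chromaticities of R, G, B, then
    # scale each primary so that R=G=B=1 lands exactly on the white point.
    cols = [[x / y, 1.0, (1.0 - x - y) / y] for x, y in primaries]  # order: R, G, B
    M = np.array(cols).T                      # columns = unscaled primary XYZs
    xw, yw = white
    W = np.array([xw / yw, 1.0, (1.0 - xw - yw) / yw])
    S = np.linalg.solve(M, W)                 # how much of each primary makes white
    return M * S                              # scale the columns

# Published sRGB/Rec.709 primaries and the D65 white point
SRGB_PRIMARIES = [(0.64, 0.33), (0.30, 0.60), (0.15, 0.06)]
D65 = (0.3127, 0.3290)

# Placeholder LED primaries -- swap in whatever you measure off your strip
LED_PRIMARIES = [(0.69, 0.30), (0.17, 0.70), (0.13, 0.08)]

M_SRGB = rgb_to_xyz_matrix(SRGB_PRIMARIES, D65)
M_LED = rgb_to_xyz_matrix(LED_PRIMARIES, D65)   # assumes the LED is balanced to D65 too
print(np.round(M_SRGB, 4))   # sanity check: should land close to the textbook sRGB matrix
print(np.round(M_LED, 4))
```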
The problem is, the way colour data is stored is not 'here is the exact colour I want to represent with this pixel', but instead 'here is a mix of R, G and B for this pixel'. Notice how we just handed over three numbers without bothering to define colour primaries, or absolute brightness, or whitepoint? This is why it's so hard to get an image on any given screen to look the same as on another: a colourspace is just assumed, and if you assume the wrong one the image is not displayed as intended.
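To see how much that assumption matters, here's a quick sketch that pushes the same three numbers through two different (published) sets of primaries, sRGB and AdobeRGB, and gets two noticeably different colours out. The AdobeRGB decode is approximated here as a plain 2.2 gamma, which is close enough for illustration.

```python
import numpy as np

# Published RGB->XYZ matrices (D65, rounded): same three input numbers, different primaries
M_SRGB = np.array([[0.4124, 0.3576, 0.1805],
                   [0.2126, 0.7152, 0.0722],
                   [0.0193, 0.1192, 0.9505]])
M_ADOBE = np.array([[0.5767, 0.1856, 0.1882],
                    [0.2974, 0.6273, 0.0753],
                    [0.0270, 0.0707, 0.9911]])

def srgb_decode(c):
    # sRGB transfer function: 8-bit value -> linear light
    c = np.asarray(c) / 255.0
    return np.where(c <= 0.04045, c / 12.92, ((c + 0.055) / 1.055) ** 2.4)

def adobe_decode(c):
    # AdobeRGB approximated as a plain 2.2 gamma for this illustration
    return (np.asarray(c) / 255.0) ** 2.2

rgb = np.array([30, 200, 90])                  # "a nice green", allegedly
xyz_if_srgb = M_SRGB @ srgb_decode(rgb)
xyz_if_adobe = M_ADOBE @ adobe_decode(rgb)
print(xyz_if_srgb, xyz_if_adobe)               # same three numbers, not the same colour
```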
Some displays even deliberately use a 'wrong' colourspace to get punchy, saturated colours, or are simply unable to match any specific colourspace due to a poor choice of backlight and/or colour filters and/or phosphors etc.
The generally accepted standard for describing the 'space of all possible colours' is the Lab colour space. If you know how your system fits into Lab space, then you can use that as a go-between to know how it maps to every other colour space.
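The XYZ-to-Lab step itself is just the standard CIE formulas, so if you want to experiment, something like this sketch will do (D65 reference white assumed; the example XYZ value is full-power sRGB green).

```python
import numpy as np

D65_WHITE = np.array([0.95047, 1.00000, 1.08883])   # D65 reference white in XYZ

def xyz_to_lab(xyz, white=D65_WHITE):
    # Standard CIE L*a*b* formulas
    t = np.asarray(xyz, dtype=float) / white
    delta = 6.0 / 29.0
    f = np.where(t > delta ** 3, np.cbrt(t), t / (3 * delta ** 2) + 4.0 / 29.0)
    L = 116.0 * f[1] - 16.0
    a = 500.0 * (f[0] - f[1])
    b = 200.0 * (f[1] - f[2])
    return np.array([L, a, b])

print(xyz_to_lab([0.3576, 0.7152, 0.1192]))   # XYZ of full-power sRGB green
```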
If you know what colourspace a colour is stored in (for PC use, assuming sRGB is almost always correct), and you know what colourspace your light source can achieve, you can map one onto the other. The actual mapping can either be a simple intersection (i.e. limited to choosing colours both spaces have in common), or you can clip out any colours that fall outside your achievable colour volume, or even 'squeeze' in colours that fall near and outside the edge of the volume. The last option seems initially attractive because it means no sharp transitions if you're trying to do a fade, but overall it means trying to actually match any given colour is nearly impossible. Think of it like trying to solve the problem of being too tall to fit your entire view in a mirror by bending the mirror and viewing a funhouse reflection: it technically solves the problem of fitting everything in, but gives a useless result.
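As a rough sketch of the 'clip' option: take the sRGB value to XYZ, push it through the inverse of the LED's matrix to get drive levels, and clamp anything outside 0..1. The LED primaries below are placeholders again; the 'squeeze' option would replace the hard clamp with some kind of soft compression towards the gamut edge.

```python
import numpy as np

def rgb_to_xyz_matrix(primaries, white):
    # Same derivation as the earlier sketch
    cols = [[x / y, 1.0, (1.0 - x - y) / y] for x, y in primaries]
    M = np.array(cols).T
    xw, yw = white
    S = np.linalg.solve(M, [xw / yw, 1.0, (1.0 - xw - yw) / yw])
    return M * S

D65 = (0.3127, 0.3290)
M_SRGB = rgb_to_xyz_matrix([(0.64, 0.33), (0.30, 0.60), (0.15, 0.06)], D65)
M_LED = rgb_to_xyz_matrix([(0.69, 0.30), (0.17, 0.70), (0.13, 0.08)], D65)  # placeholder primaries

def srgb_to_led_clipped(rgb8):
    # sRGB 8-bit -> linear -> XYZ -> linear LED drive levels, then hard clip
    c = np.asarray(rgb8) / 255.0
    lin = np.where(c <= 0.04045, c / 12.92, ((c + 0.055) / 1.055) ** 2.4)
    xyz = M_SRGB @ lin
    led = np.linalg.solve(M_LED, xyz)        # what the LED would need to output
    return np.clip(led, 0.0, 1.0)            # the 'clip' option: out-of-gamut values just saturate

print(srgb_to_led_clipped([0, 255, 128]))
```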
Now, a full Lab representation of an RGB lighting system would be massive overkill. For the foreseeable future (any use of non-sRGB colour is still a clusterfuck on PC, let alone HDR), even a rough translation between sRGB and what your LEDs can achieve would put it well beyond the capabilities of any RGB lighting system.
As for how to measure this, my first idea would be to grab the cheapest tristimulus colorimeter* (or spectrophotometer* if one turns up) off eBay; it will be sold as a monitor calibrator. Pair this with DisplayCal. Because we're only trying to get a relationship with sRGB, we can ignore most of DisplayCal's capabilities, and just pretend the LED is a single-pixel monitor.
The way DisplayCal works is to draw a box of a certain colour on your screen. You stick the calibrator over that box, and DisplayCal reads the measurements to tell how far off your display is from the colour it should be showing. The 'hacky' way to make this work for calibrating an LED would be to stick the calibrator on the LED, 'sniff' that box for the on-screen colour, then set the LED to that colour for the sensor to read. DisplayCal will iterate through a bunch of different colours, then spit out a LUT (Look Up Table) that tells you how to get a desired colour from the actual range the LED can show. Depending on how the LED driver is set up, you could even correct how it drives the LED and recalibrate to get a more refined LUT.
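Whatever form the correction data ends up in (DisplayCal would normally hand you an ICC profile or calibration file rather than a bare table), applying it conceptually is just a per-channel lookup with interpolation. Here's a sketch with made-up curve points standing in for real measurements.

```python
import numpy as np

# Hypothetical per-channel correction curves: 'to get this linear output level,
# drive the LED at this level'. In practice these points come from the measurement run.
LUT_IN = np.array([0.0, 0.25, 0.5, 0.75, 1.0])            # desired linear level
LUT_OUT = {                                               # drive level needed (made up)
    'r': np.array([0.0, 0.21, 0.47, 0.74, 1.0]),
    'g': np.array([0.0, 0.28, 0.55, 0.79, 1.0]),
    'b': np.array([0.0, 0.24, 0.52, 0.77, 1.0]),
}

def correct(rgb_linear):
    # Linear interpolation between the measured points, one channel at a time
    r, g, b = rgb_linear
    return (np.interp(r, LUT_IN, LUT_OUT['r']),
            np.interp(g, LUT_IN, LUT_OUT['g']),
            np.interp(b, LUT_IN, LUT_OUT['b']))

print(correct((0.6, 0.3, 0.9)))   # drive levels to actually send to the LED driver
```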
*Tristimulus colorimeter = looks for how much red, green and blue light it can see, using filters. Only gives truly correct results for a single colourspace, and only if the display primaries match those of that colourspace. Great for calibrating an sRGB display to sRGB; won't give quite the right answers in other cases.
*Spectrophotometer = a compact spectrograph. Measures the spectrum of a light source. More broadly applicable than a tristimulus colorimeter, but generally less accurate.