Receiver sensitivity is a measure of a receiver's ability to demodulate and extract information from a signal. It is most commonly quantified as the lowest signal power level at which useful information can still be recovered.
Since the above definition hinges on "demodulate", it should be clear that this is a meaningless specification for a pure radio: the base design performs no demodulation, it simply passes samples around. The actual receiver sensitivity depends on channel bandwidth, temperature, modulation scheme, and how robust the demodulator is, none of which the radio itself controls (that is all up to you).
What is specified, and measured, is noise:
These noise numbers can be used to calculate the minimum received power required to decode a signal.
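As a sketch of that calculation: the standard link-budget formula combines the thermal noise floor (-174 dBm/Hz at 290 K), the channel bandwidth, the receiver's noise figure, and the SNR your particular demodulator needs. The function name and example values below are illustrative, not part of any specified API.

```python
import math

def min_detectable_signal_dbm(noise_figure_db, bandwidth_hz, required_snr_db):
    """Estimate the weakest decodable signal power in dBm.

    -174 dBm/Hz is the thermal noise density kT at the standard
    temperature of 290 K; 10*log10(B) scales it to the channel bandwidth.
    """
    noise_floor_dbm = -174 + 10 * math.log10(bandwidth_hz)
    return noise_floor_dbm + noise_figure_db + required_snr_db

# Example: 5 dB noise figure, 1 MHz channel, demodulator needing 10 dB SNR
print(min_detectable_signal_dbm(5.0, 1e6, 10.0))  # -> -99.0 (dBm)
```

Note that `required_snr_db` is exactly the part the radio cannot specify for you: it depends entirely on your modulation and demodulator implementation.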
The IEEE Standard definition of noise figure states:
The noise factor, at a specified input frequency, is defined as the ratio of:
(1) the total noise power per unit bandwidth available at the output port when noise temperature of the input termination is standard (290 K) to (2) that portion of (1) engendered at the input frequency by the input termination.
This is measured using test equipment, as described in Agilent's Fundamentals of RF and Microwave Noise Figure Measurements.
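The core technique in that application note is the Y-factor method: a calibrated noise source with a known excess noise ratio (ENR) is switched on ("hot") and off ("cold") at the receiver input, and the ratio of the two measured output noise powers yields the noise factor. A minimal sketch, with illustrative function and parameter names:

```python
import math

def noise_figure_y_factor(enr_db, p_hot_watts, p_cold_watts):
    """Compute noise figure (dB) via the Y-factor method.

    enr_db:       excess noise ratio of the calibrated noise source, in dB
    p_hot_watts:  measured output noise power with the source on
    p_cold_watts: measured output noise power with the source off
    """
    y = p_hot_watts / p_cold_watts          # Y-factor (linear)
    enr_linear = 10 ** (enr_db / 10)
    f = enr_linear / (y - 1)                # noise factor F = ENR / (Y - 1)
    return 10 * math.log10(f)               # noise figure in dB

# Example: 15 dB ENR source, hot/cold power ratio of 4 (Y = 6 dB)
print(round(noise_figure_y_factor(15.0, 4e-9, 1e-9), 2))  # -> 10.23 (dB)
```

Real measurements additionally correct for the instrument's own noise contribution and for mismatch; the app note covers those refinements.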