When a user inputs a range, say 3-5, the script should generate a random integer within that range. With the values hard-coded, this works correctly:
length = Math.floor(Math.random() * (5 - 3 + 1)) + 3;
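For reference, this is the general pattern I believe I'm applying (just a sketch; randomInRange, min, and max are illustrative names, not part of my actual code):
// Returns a random integer between min and max, inclusive.
// Assumes min and max are numbers with min <= max.
function randomInRange(min, max) {
    return Math.floor(Math.random() * (max - min + 1)) + min;
}
randomInRange(3, 5); // always 3, 4, or 5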
However, when I try to extract the min and max values programmatically and apply the same logic, the results become nonsensical:
// Split into elements
var range = lengthval.split("-"),
    minlen = range[0],
    maxlen = range[1];
if (!isNaN(minlen) && !isNaN(maxlen)) {
    // Pick a number from the range
    length = Math.floor(Math.random() * (maxlen - minlen + 1)) + minlen;
}
The strange thing is that this "hybrid" snippet works fine:
// Split into elements
var range = lengthval.split("-"),
    minlen = range[0],
    maxlen = range[1];
if (!isNaN(minlen) && !isNaN(maxlen)) {
    // Pick a number from the range
    length = Math.floor(Math.random() * (maxlen - minlen + 1)) + 3;
}
All of this assumes the supplied range is 3-5.
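For completeness, lengthval holds the user's raw input, so in this case the split looks roughly like this (values are illustrative):
var lengthval = "3-5";
var range = lengthval.split("-"); // range is ["3", "5"]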
Can anyone help a novice JavaScript developer solve this puzzle? :)