On the server side, I have implemented a loop that takes 16-bit integers ranging from 0 to 639 and splits each one into two 8-bit characters to populate a 1280-byte buffer. The buffer is then sent to the client via TCP/IP.
.c
unsigned int data2[1000];          /* source values, each in the range 0..639 */
unsigned char char_out[2];         /* high byte and low byte of one value */
char *p;
int j, len;
/* buf (and pcb below) are set up elsewhere. */
len = generate_http_header(buf, "js", 1280);
p = buf + len;
for (j = 0; j < 640; j++)
{
    char_out[1] = (unsigned char)(data2[j] & 0x00FF);         /* low byte  */
    char_out[0] = (unsigned char)((data2[j] >> 8) & 0x00FF);  /* high byte */
    *p = char_out[0];
    p = p + 1;
    *p = char_out[1];
    p = p + 1;
}
....
tcp_write(pcb, buf, len, 1);
tcp_output(pcb);
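To make the expected byte layout concrete, here is a minimal sketch in JavaScript (the function name is my own, not part of either snippet) of how each value should split and recombine; 639, for instance, should come apart into 0x02 and 0x7F:
.js
// Hypothetical helper mirroring the C split above: value -> [high byte, low byte].
function splitWord(value) {
    var hi = (value >> 8) & 0xFF;   // corresponds to char_out[0] on the server
    var lo = value & 0xFF;          // corresponds to char_out[1] on the server
    return [hi, lo];
}
// splitWord(639) gives [2, 127] (0x02 and 0x7F), and (2 << 8) | 127 gives 639 back.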
Meanwhile, on the client side, I am attempting to extract the 16-bit integers from the JSON object. Despite my efforts, I am having trouble obtaining all of the values in the range 0 to 639.
.js
var bin = o.responseText;
var fin = "";                       // accumulates the decoded values as text
var a, b, c;
for (var i = 0; i < 1000; i = i + 2)
{
    a = bin[i].charCodeAt();
    b = bin[i + 1].charCodeAt();
    // Get binary representation.
    a = parseInt(a).toString(2);
    a = parseInt(a);
    b = parseInt(b).toString(2);
    b = parseInt(b);
    // Pad with zeros on the left (pad() is defined elsewhere).
    a = pad(a, 8);
    b = pad(b, 8);
    // Concatenate and convert to string.
    a = a.toString();
    b = b.toString();
    c = a + b;
    // Convert the concatenated bit string back to decimal.
    c = parseInt(c, 2);
    fin = fin + c.toString();
}
alert('FINAL NUMBER ' + fin);
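For comparison, here is a minimal sketch of the bit-shift decoding that I believe should be equivalent to the string concatenation above (the function name is my own, and it assumes the byte pairs arrive intact in responseText):
.js
// Assumed simplification: combine each pair of char codes with bit shifts.
function decodeWords(text) {
    var values = [];
    for (var k = 0; k + 1 < text.length; k = k + 2) {
        var hi = text.charCodeAt(k);      // first byte sent (high byte)
        var lo = text.charCodeAt(k + 1);  // second byte sent (low byte)
        values.push((hi << 8) | lo);      // should reproduce the 0..639 value
    }
    return values;
}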
I decided to use Firebug to inspect the HTTP response from the server:
<missing_characters_here>