I've run into a particular issue: the JavaScript library I am developing uses JSONP cross-domain requests to fetch data from a backend powered by Ruby on Rails:
function getData() {
    $.ajaxSetup({
        'beforeSend': function (xhr) {
            xhr.setRequestHeader("Accept", "text/html,application/xhtml+xml,application/xml;q=0.9,*/*;q=0.8");
        }
    });
    $.ajax({
        url: backend_server + '?callback=parseData&op=516',
        contentType: "application/json; charset=utf-8",
        dataType: 'jsonp',
        success: function (xml) {
            //console.log('success');
        }
    });
}
The database that RoR accesses internally uses latin1, but as far as I remember, these JSON requests can only be made using UTF-8.
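One thing worth noting: since JSONP is delivered via an injected &lt;script&gt; tag, the beforeSend/Accept header above is never actually sent, and the browser decodes the script using the page's charset (ISO-8859-1 here) unless told otherwise. jQuery's scriptCharset option exists for exactly this case. A minimal sketch of the same request with it added (backend_server and the callback/op parameters are taken from my code above; this is a configuration sketch, not something I have verified against the backend):

```javascript
function getData() {
    $.ajax({
        url: backend_server + '?callback=parseData&op=516',
        dataType: 'jsonp',
        // Force the injected <script> to be decoded as UTF-8,
        // even though the host page is declared as ISO-8859-1.
        scriptCharset: 'utf-8',
        success: function (data) {
            // data arrives here already parsed, not as raw text
        }
    });
}
```

scriptCharset only tells the browser how to decode what arrives, so the Rails side still has to actually emit UTF-8 bytes in the response.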
The header of the webpage indicates:
Content-Type text/html; charset=ISO-8859-1
Additionally, the meta tag of the page is also set to ISO-8859-1:
<meta http-equiv="Content-Type" content="text/html;charset=iso-8859-1"/>
When the response arrives, my JavaScript library parses it and displays it inside a specific div on the page (using the latest version of jQuery). However, all the accented Latin characters come out garbled.
I noticed that different browsers handled these characters differently: some displayed them correctly, others did not. To work around this, I adapted a utf8_decode function I found here (skipping the processing for Safari, IE, and Opera based on user-agent sniffing), yet I still could not correctly display uppercase accented characters such as "É", "Ç", "À", "Á", "Â", or "Ã".
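As a possibly more robust alternative to a hand-rolled utf8_decode, the classic one-liner for repairing UTF-8 text that was decoded as Latin-1 is decodeURIComponent(escape(str)): escape turns each 8-bit code unit into a %XX byte escape, and decodeURIComponent reassembles that byte sequence as UTF-8. A sketch (fixMojibake is my own name, not part of any library):

```javascript
// Repair a string whose UTF-8 bytes were misread as Latin-1,
// e.g. "É" (UTF-8 bytes 0xC3 0x89) showing up as "Ã" + U+0089.
function fixMojibake(str) {
    // escape() emits %XX for every code unit <= 0xFF;
    // decodeURIComponent() then re-decodes those bytes as UTF-8.
    return decodeURIComponent(escape(str));
}
```

For example, fixMojibake("\u00C3\u0089") returns "É". Caveats: escape/unescape are deprecated, and decodeURIComponent throws a URIError if the input is not a valid UTF-8 byte sequence, so a try/catch fallback to the original string may be wise.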
Any suggestions? I find myself quite perplexed and would appreciate any guidance. Thank you in advance, J.
PS: The top comment on the function's website was posted by me as well.
Edit1: I also tried

unescape(encodeURIComponent(str_data))

but it did not produce the desired result.
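For what it's worth, unescape(encodeURIComponent(str)) goes in the opposite direction: it encodes a string into its UTF-8 byte sequence (with each byte exposed as a Latin-1 character), which may be why it didn't help; the inverse composition is the decoder. A quick sketch showing both directions (the function names are mine):

```javascript
// Encode: each Unicode character -> its UTF-8 bytes as Latin-1 chars.
function toUtf8Bytes(str) {
    return unescape(encodeURIComponent(str));
}

// Decode: the inverse -- reinterpret Latin-1-misread bytes as UTF-8.
function fromUtf8Bytes(str) {
    return decodeURIComponent(escape(str));
}
```

So toUtf8Bytes("É") yields the two "byte" characters "\u00C3\u0089", and fromUtf8Bytes undoes that.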