I am curious about the efficiency differences, in terms of browser CPU and memory, between serializing with JSON.stringify and writing an object directly to an object store in indexedDB.
The context for this question is optimizing the process of writing an object from a database to a client's hard drive. Typically, the object is retrieved with a 'get' statement and then converted to a string with JSON.stringify before being written to disk. Upon retrieval, the string is converted back to an object with JSON.parse and stored in the database again. This appears to involve unnecessary steps of deserialization and reserialization.
My first question is whether it is possible to directly retrieve the serialized form of an object from the database, eliminating the need for stringify/parse operations when saving and retrieving data.
The second part of my inquiry focuses on using indexedDB without indexing or querying by value. In such cases, storing the stringified object instead of the object itself might offer advantages in reducing intermediary steps during read/write processes.
In scenarios where only out-of-line unique keys are used, avoiding stringify/parse could streamline disk operations, although modifying the data would require temporary conversion to an object format.
Comparing the JSON functions to indexedDB's own serialization, I am interested in the potential performance differences. Storing the pre-stringified form may look more efficient, but handling updates could pose challenges, since each modification would require parsing back to an object and stringifying again.
I aim to conduct experiments to observe any distinctions, yet I seek insights into the underlying mechanisms behind these two serialization methods.
EDIT
I recently learned about the structured clone algorithm used by indexedDB, but I wonder whether further serialization occurs before the data reaches storage. JSON.stringify, by contrast, produces a string without creating a duplicate object in memory. These nuances have raised questions for me about whether the explicit stringify/parse steps are necessary at all.
After delving deeper into this topic, I realize that some aspects may be beyond my current understanding, leading me to reconsider whether my original inquiries hold significance.
Thank you.
EXAMPLE:
@Sam152 Incidentally, have you noticed any performance variations between using stringify versus direct object insertion in your database transactions? Please see the code snippet below for reference.
// var o = A large object.
var d;
write.x = 0;
write.y = [];
DB_open().then( inter );
function inter() { d = setInterval( write, 1000 ); }
function write()
{
let T = DB_open.base.transaction( [ 'os_name' ], 'readwrite' ),
q = T.objectStore( 'os_name' ),
start = Date.now();
T.oncomplete = function() { write.y.push( Date.now() - start ); };
if ( write.x < 50 )
write.x = write.x + 1;
else
{
clearInterval( d );
console.log( 'done' );
return; // Stop once 50 records have been written.
}
o.key = write.x;
q.put( o );
// OR
q.put( { 'key' : write.x, 'd' : JSON.stringify( o ) } );
} // close write
// When complete.
let total = 0;
write.y.forEach( ( v ) => { total = total + v; } );
console.log( 'average ms per transaction: ' + ( total / write.y.length ) );
Or, with multiple put statements within the same transaction:
function write()
{
let T = DB_open.base.transaction( [ 'os_name' ], 'readwrite' ),
q = T.objectStore( 'os_name' ),
start = Date.now(),
i;
T.oncomplete = function()
{ console.log( 'Completed : ' + ( Date.now() - start ) ); }
for ( i = 1; i < 51; i++ )
{
o.key = i;
q.put( o );
// OR
q.put( { 'key' : i, 'd' : JSON.stringify( o ) } );
}
} // close write