We have a data model consisting of 600 boolean values per entity, and the entire dataset needs to be transmitted from a Node.js backend to an Angular frontend as JSON.
Since this is an internal API (not public), the primary focus is on performance and bandwidth optimization rather than strict adherence to best practices.
I'm seeking input on potential optimization strategies. Here are the options under consideration (a rough sketch of each follows the list):
- Converting the data into a bitfield and using a single, massive (600-bit) BigInt, though there are concerns about the performance of BigInt arithmetic.
- Partitioning the 600 bits into an array of plain numbers inside the JSON payload. (One correction here: JS numbers are not 64-bit integers; they only represent integers safely up to 53 bits, and the bitwise operators truncate to 32 bits, so this works out to roughly 19 × 32-bit chunks rather than 10 × 64-bit ones.)
- Base64-encoding the binary blob (decoded on the client side into a Uint8Array).
- Using Protobuf, though the implementation complexity may outweigh its benefits given time constraints and a reluctance to make significant architecture changes.
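For the BigInt option, here's a minimal sketch, assuming the flags are available as a boolean[] and that the value is shipped as a hex string (JSON has no BigInt literal, so it has to travel as a string anyway):

```ts
// Pack 600 booleans into one BigInt and serialize it as hex (~150 chars).
function packToBigInt(flags: boolean[]): string {
  let bits = 0n;
  for (let i = 0; i < flags.length; i++) {
    if (flags[i]) bits |= 1n << BigInt(i);
  }
  return bits.toString(16); // goes into the JSON payload as a string
}

// Angular side: parse the hex string and test individual bits.
function unpackFromBigInt(hex: string, count: number): boolean[] {
  const bits = BigInt("0x" + (hex || "0"));
  const flags: boolean[] = [];
  for (let i = 0; i < count; i++) {
    flags.push(((bits >> BigInt(i)) & 1n) === 1n);
  }
  return flags;
}
```

The per-bit BigInt shifts are where the performance concern would bite; the wire cost itself is only about 150 hex characters per entity.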
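For the integer-array option, a sketch using 32-bit chunks (the natural unit for JS bitwise operators, as noted above; 19 of them cover 600 bits):

```ts
// Pack flags into 32-bit chunks; 19 numbers cover 600 bits.
function packToNumbers(flags: boolean[]): number[] {
  const chunks = new Array<number>(Math.ceil(flags.length / 32)).fill(0);
  for (let i = 0; i < flags.length; i++) {
    if (flags[i]) chunks[i >> 5] |= 1 << (i & 31);
  }
  // >>> 0 gives an unsigned view so bit 31 doesn't serialize as a negative number
  return chunks.map((c) => c >>> 0);
}

function unpackFromNumbers(chunks: number[], count: number): boolean[] {
  const flags: boolean[] = [];
  for (let i = 0; i < count; i++) {
    flags.push(((chunks[i >> 5] >>> (i & 31)) & 1) === 1);
  }
  return flags;
}
```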
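For the Base64 option, a sketch assuming Buffer on the Node side and atob in the browser; 600 bits pack into 75 bytes, which Base64-encode to exactly 100 characters:

```ts
// Pack flags into bytes: 600 bits -> 75 bytes.
function packToBytes(flags: boolean[]): Uint8Array {
  const bytes = new Uint8Array(Math.ceil(flags.length / 8));
  for (let i = 0; i < flags.length; i++) {
    if (flags[i]) bytes[i >> 3] |= 1 << (i & 7);
  }
  return bytes;
}

// Node.js backend: embed the bytes in the JSON as a Base64 string.
function toBase64(bytes: Uint8Array): string {
  return Buffer.from(bytes).toString("base64");
}

// Browser/Angular side: Base64 string back to a Uint8Array.
function fromBase64(b64: string): Uint8Array {
  const binary = atob(b64);
  const bytes = new Uint8Array(binary.length);
  for (let i = 0; i < binary.length; i++) bytes[i] = binary.charCodeAt(i);
  return bytes;
}

function unpackFromBytes(bytes: Uint8Array, count: number): boolean[] {
  const flags: boolean[] = [];
  for (let i = 0; i < count; i++) {
    flags.push((bytes[i >> 3] & (1 << (i & 7))) !== 0);
  }
  return flags;
}
```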
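For completeness, a minimal Protobuf sketch, assuming the protobufjs package with the message defined in code (a .proto file would work the same way); the names Entity and flags are illustrative:

```ts
import * as protobuf from "protobufjs";

// A single "bytes" field carrying the packed 75-byte bitfield.
const Entity = new protobuf.Type("Entity").add(
  new protobuf.Field("flags", 1, "bytes")
);
new protobuf.Root().define("demo").add(Entity); // register the type

// Backend: encode to the binary wire format (sent as the raw response
// body rather than inside JSON).
const wire = Entity.encode(Entity.create({ flags: new Uint8Array(75) })).finish();

// Frontend: decode the received bytes.
const decoded = Entity.decode(wire) as unknown as { flags: Uint8Array };
```

For a single bitfield this buys little over the plain Base64 approach, which matches the concern that the complexity may outweigh the benefit.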
Note: server-side compression is unavailable due to infrastructure limitations, so the payload size has to be tackled at the data level.
Thank you!