
Automatically split up messages that are too big into smaller messages #212

Open
h0jeZvgoxFepBQ2C opened this issue Jan 23, 2021 · 1 comment

Labels
bug (Something isn't working. It's clear that this does need to be fixed.) · enhancement (New feature or improved functionality.)

Comments

h0jeZvgoxFepBQ2C commented Jan 23, 2021

Right now we only get an exception, so we built a splitting mechanism ourselves, but it would be great if your lib adjusted the message size automatically. Also, we sometimes still hit the limits (even though we calculate all the bytes ourselves), so maybe you have a better solution?

This is our solution (we rebuild the message on the client side afterwards):

max_message_size = 190_000  # Limit is 256 kB; more than 190_000 bytes sometimes fails and we don't know why

# Walk through the JSON-encoded message character by character, starting a new
# chunk whenever adding the next character would exceed max_message_size bytes.
chars = message.to_json.chars
message_chunks = []
while chars.any?
  chunks = []
  chunksize = 0
  while chars.any? && chunksize < max_message_size
    break if (chunksize + chars[0].bytesize) > max_message_size

    next_char = chars.shift
    chunksize += next_char.bytesize
    chunks << next_char
  end
  message_chunks << chunks.join
end

client = Ably::Rest.new(ENV["ABLY_PRIVATE_KEY"])
channel = client.channel(channel_name)
message_id = SecureRandom.uuid

message_chunks.each_with_index do |chunk, index|
  data = {
    utf: "✓",
    message_id: message_id,
    message_index: index,
    message_count: message_chunks.count,
    data: chunk
  }

  channel.publish(event, data)
end
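One guess as to why 190_000 bytes sometimes still trips the limit: each chunk is a slice of already JSON-encoded text, so when the wrapper hash above is serialized again for publishing, the quotes and backslashes inside the chunk get escaped a second time and the metadata fields add a few more bytes. Measuring the full serialized payload rather than just the chunk would confirm this; below is a minimal sketch (illustrative only), assuming the same wrapper hash as in the publish loop above:

require "json"

# Illustrative check only: measure the bytes of the complete payload that gets
# published (metadata plus re-escaped chunk), not just the raw chunk itself.
largest = message_chunks.each_with_index.map do |chunk, index|
  {
    utf: "✓",
    message_id: message_id,
    message_index: index,
    message_count: message_chunks.count,
    data: chunk
  }.to_json.bytesize
end.max

puts "largest serialized payload: #{largest} bytes"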

Rebuilding on the client side:

let incomingMessages = {}

const handleMessageChunk = (type, messageChunk) => {
  // Ignore anything that doesn't carry the chunk metadata.
  if (messageChunk.message_id && messageChunk.message_count && (messageChunk.message_index || messageChunk.message_index === 0)) {

    if (incomingMessages[messageChunk.message_id] === undefined) {
      incomingMessages[messageChunk.message_id] = {}
    }

    incomingMessages[messageChunk.message_id][messageChunk.message_index] = messageChunk

    // The message is complete once every chunk has arrived.
    const isMessageComplete = Object.keys(incomingMessages[messageChunk.message_id]).length === messageChunk.message_count

    if (isMessageComplete) {
      let completeMessage = ""
      for (let i = 0; i < messageChunk.message_count; i++) {
        // "data" matches the field name used on the publishing side
        completeMessage += incomingMessages[messageChunk.message_id][i].data
      }

      delete incomingMessages[messageChunk.message_id]
      console.log("message completed", JSON.parse(completeMessage), incomingMessages)
      dispatch({ type: type, payload: JSON.parse(completeMessage) })
    }
  }
}
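For completeness, the same reassembly could look roughly like this on a Ruby subscriber. This is only a sketch, assuming an Ably::Realtime channel subscription and the field names from the data hash above; handle is a hypothetical callback for whatever processes the finished message:

require "json"

incoming = Hash.new { |hash, id| hash[id] = {} }

channel.subscribe do |message|
  chunk = message.data
  id = chunk["message_id"]
  incoming[id][chunk["message_index"]] = chunk["data"]

  # Once every chunk has arrived, stitch them back together in index order.
  if incoming[id].size == chunk["message_count"]
    complete = incoming.delete(id).sort.map { |_index, part| part }.join
    handle(JSON.parse(complete)) # handle is a hypothetical callback
  end
end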


@jamienewcomb added the enhancement (New feature or improved functionality.) label on Feb 8, 2021
@lawrence-forooghian added the bug (Something isn't working. It's clear that this does need to be fixed.) label on Apr 22, 2024

lawrence-forooghian (Collaborator) commented
Hi @h0jeZvgoxFepBQ2C, thanks for raising this issue. Automatic message splitting isn't on our roadmap, as far as I’m aware. However, I’m marking this issue as a bug so that we can investigate why you’re unable to send messages of the theoretical maximum size.
