
Sending large message over TCP


Question

Hello, I have a little client-server application using TCP sockets. On both sides I use a buffer size of 4096, and I run all my tests with the client and server on the same machine, using address 127.0.0.1. Everything was going well, except that now I suddenly need to send much larger messages; a message could now easily be 12 MB in size. What happens is that the transmission is incomplete: on the receiving end, I only get a chunk of the message instead of the whole thing.

I thought about simply increasing the buffer sizes on both ends to something larger than my maximum message size, but that doesn't seem like a reliable approach even if it works, and I'm not sure it would work at all (whether TCP supports it, whether I'd run out of memory, etc.).

I also thought about splitting the message, and in fact my sending code already does that automatically: it keeps sending until all the data has been sent. However, on the receiving end, how do I know where a message ends? Do I need to implement a protocol for that? I thought TCP already took care of that.

Anyway, I'm a bit lost here; thanks for any tips.

https://www.neowin.net/forum/topic/809920-sending-large-message-over-tcp/

9 answers to this question



The main way to do this (which you may have realized) is to send header data with each packet. About 5 years back, when I got into remote desktop applications (using VB6 and Winsock), I'd send something like 0|DATA: each time something was sent, an integer was prefixed to the data, with the character | as the splitter. Alternatively, if you aren't sure what characters may be sent through, you can use char(0) instead (invisible if displayed, but detectable in code). I'd have constants/an enumeration defining exactly what each prefixed integer means (something like const FIRST_DOWNLOAD_PACKET = 0; const NEXT_DOWNLOAD_PACKET = 1; const LAST_DOWNLOAD_PACKET = 2;) and so forth. Whenever the client application received a packet, I'd split it in two: header info and data. The header told me what to do with the information, while the data contained the actual payload.

You could create a class to handle something like this too (something like a packet class) for re-using in other apps using sockets.
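A minimal sketch of that tag-prefixed framing, in Python for brevity. The tag names and values are the hypothetical ones from the post above; | is assumed not to appear in the tag itself, since everything after the first | is treated as payload.

```python
# Hypothetical message tags, mirroring the constants described above.
FIRST_DOWNLOAD_PACKET = 0
NEXT_DOWNLOAD_PACKET = 1
LAST_DOWNLOAD_PACKET = 2

def pack(tag, data):
    # Prefix the payload with an integer tag and the '|' splitter, e.g. b"0|DATA".
    return str(tag).encode() + b"|" + data

def unpack(packet):
    # Split at the FIRST '|' only, so the payload may itself contain '|'.
    header, _, data = packet.partition(b"|")
    return int(header), data
```

Splitting at the first separator only is what makes arbitrary payload bytes safe here; using a char(0) separator instead would be the same idea with `b"\x00"` in place of `b"|"`.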

  dlegend said:
The main way to doing this (which you may have realized) is to send header data with each packet sent. [...] Header information told me what to do with the information while the data contained the actual information.

Yes, that's pretty much what I did, though simpler. My problem was basically how to send variable-length messages: the receiving end has no idea where a message begins and where it ends. Fixed-size messages are fine, since I can just buffer the input and split it every LENGTH bytes (where LENGTH is, of course, a constant).

So I solved it by prefixing each message I send with a fixed-size header containing the length of the message. Another way I could have done it is to begin and end each message with a special character, but I was worried that would break too easily.

And yup, this is all encapsulated in a nice, single-purpose, reusable class. :p
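A minimal sketch of that length-prefix scheme in Python. The 4-byte big-endian header is an assumed format (the post doesn't specify one); the key point is `recv_exact`, which loops because TCP is a byte stream and a single recv() may return fewer bytes than requested.

```python
import socket
import struct

HEADER = struct.Struct("!I")  # assumed: 4-byte big-endian length prefix

def send_msg(sock, payload):
    # Prefix every message with its length so the receiver knows where it ends.
    sock.sendall(HEADER.pack(len(payload)) + payload)

def recv_exact(sock, n):
    # recv() may return fewer than n bytes; keep reading until we have them all.
    buf = b""
    while len(buf) < n:
        chunk = sock.recv(n - len(buf))
        if not chunk:
            raise ConnectionError("peer closed mid-message")
        buf += chunk
    return buf

def recv_msg(sock):
    # Read the fixed-size header first, then exactly that many payload bytes.
    (length,) = HEADER.unpack(recv_exact(sock, HEADER.size))
    return recv_exact(sock, length)
```

This is exactly the kind of pair that fits naturally in a small reusable class, as described above.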


The TCP protocol will ensure the data arrives in the correct order if you just break it up into chunks and send them off one after the other.

The most stable/reliable method I've come across is to send a packet and not send any more until the client responds with some form of acknowledgement:

Server sends 4 KB

Client responds "ok"

Server sends 4 KB

Client responds "ok"

~ repeat

The problem with that is that speed is now completely dependent on latency, and you can't take advantage of burst techniques. However, if you're sending over a low-latency network, you shouldn't have many issues.

It's always good to have various approaches for various network types to get the most out of them :D
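A minimal Python sketch of that lock-step exchange. The per-chunk length prefix and the zero-length terminator are assumptions added here to make chunk boundaries unambiguous (a bare recv of 4096 bytes can return a partial chunk, which would desynchronize the acks).

```python
import socket
import struct

CHUNK = 4096                 # 4 KB per round trip, as in the exchange above
LEN = struct.Struct("!I")    # assumed per-chunk length prefix

def _recv_exact(sock, n):
    # TCP is a stream: loop until exactly n bytes have arrived.
    buf = b""
    while len(buf) < n:
        part = sock.recv(n - len(buf))
        if not part:
            raise ConnectionError("peer closed")
        buf += part
    return buf

def send_lockstep(sock, data):
    # One chunk per round trip; block until the peer acks before sending more.
    for i in range(0, len(data), CHUNK):
        chunk = data[i:i + CHUNK]
        sock.sendall(LEN.pack(len(chunk)) + chunk)
        _recv_exact(sock, 1)             # wait for the peer's "ok"
    sock.sendall(LEN.pack(0))            # zero-length chunk = end of message

def recv_lockstep(sock):
    parts = []
    while True:
        (n,) = LEN.unpack(_recv_exact(sock, LEN.size))
        if n == 0:
            return b"".join(parts)
        parts.append(_recv_exact(sock, n))
        sock.sendall(b"k")               # acknowledge this chunk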

  DDStriker said:
the most stable/reliable method i've come across is where you send a packet and don't send anymore until the client responds with some form of acknowledgement [...]

TCP handles acknowledgement and throttling for you automatically. If you want to send data to a remote endpoint as fast as possible, you might as well send it in chunks of 4096 bytes and let TCP handle them. If congestion occurs, send() will simply block on the local side for a while until it has been alleviated.


One note on sizes: the 4096 bytes you pass to each send() call don't correspond to packets on the wire. TCP is a stream protocol, so the kernel re-segments whatever you write into segments no larger than the path MSS, typically 1460 bytes on ordinary Ethernet (1500-byte MTU minus a 20-byte IP header and a 20-byte TCP header). Your application buffer size therefore doesn't cause IP fragmentation; pick whatever size is convenient.

This topic is now closed to further replies.