
[C#] Binary Serialize - Deserialize in another app?


Question

Here's the situation: I developed a Serializable class in one application, and I binary-serialize its data to a file. Using the exact same class in another one of my applications, I cannot deserialize the file because of some security measures in the .NET framework.
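Here's a minimal sketch of what I'm doing on each side (the class name, fields, and file name are just stand-ins for my real ones):

using System;
using System.IO;
using System.Runtime.Serialization.Formatters.Binary;

[Serializable]
public class GadgetData  // stand-in for my real class
{
    public string Name = "";
    public int Count = 0;
}

// In the first app (the 'GDB Tester App'):
BinaryFormatter formatter = new BinaryFormatter();
FileStream outFile = new FileStream("gadgets.bin", FileMode.Create);
formatter.Serialize(outFile, new GadgetData());
outFile.Close();

// In the second app, with the identical class compiled in:
BinaryFormatter formatter2 = new BinaryFormatter();
FileStream inFile = new FileStream("gadgets.bin", FileMode.Open);
GadgetData data = (GadgetData)formatter2.Deserialize(inFile);  // throws SerializationException
inFile.Close();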

Here's the error I get when deserializing in the other app:

SerializationException:
Unable to find assembly 'GDB Tester App, Version=1.0.0.0, Culture=neutral, PublicKeyToken=null'.

Where 'GDB Tester App' is the name of the app that was used for serialization.

How can I work around these darn security measures?

Thanks in advance.

11 answers to this question

Recommended Posts

  • 0
  Express said:
That's not a security exception.


Oh, the MSDN documentation said it was - but you're right. I don't see how it would be very secure anyway.

  Quote
If you're serializing custom objects, you should define the custom class in a shared assembly that is referenced by both the application that serializes and the application that deserializes.

Is there any way I could deserialize with anything (i.e. not just a shared assembly)?

  • 0
  Express said:
You can use Xml Serialization instead.


Crap. I knew it would all come back to XML. There is a utility (xsd.exe) to convert XML Schemas to C# classes, but is there any utility to write XML schemas? Writing them by hand is a tedious and often error-prone process.

If not, then I guess I'll have to start crankin' out the verbose and ugly XML by hand.

  • 0
  VB Guy said:
Crap. I knew it would all come back to XML. There is a utility (xsd.exe) to convert XML Schemas to C# classes, but is there any utility to write XML schemas? Writing them by hand is a tedious and often error-prone process.

If not, then I guess I'll have to start crankin' out the verbose and ugly XML by hand.


It's not very complicated. You can just tag all your public fields with [XmlAttribute].
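
For example, something like this (a rough sketch; the Person class here is just an illustration):

using System;
using System.IO;
using System.Xml.Serialization;

public class Person
{
    [XmlAttribute] public string FirstName = "";
    [XmlAttribute] public string LastName = "";
    [XmlAttribute] public int Age = 0;
}

// Writing -- no schema needed, the serializer works from the class itself
XmlSerializer serializer = new XmlSerializer(typeof(Person));
StreamWriter writer = new StreamWriter("person.xml");
serializer.Serialize(writer, new Person());
writer.Close();

// Reading it back in any app that has this class compiled in
StreamReader reader = new StreamReader("person.xml");
Person p = (Person)serializer.Deserialize(reader);
reader.Close();

Any app that has the same class definition can read the file back - the XML records only the data, not the assembly it came from.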

  • 0

Personally, I would suggest writing your own binary serialization for the classes that you want to convert into bytes. That's what I normally do. It is more work, but it's easier to maintain (you don't get the problems you're having), and it's faster to read/write the data.

  • 0
  dannysmurf said:
Personally, I would suggest writing your own binary serialization for the classes that you want to convert into bytes. That's what I normally do. It is more work, but it's easier to maintain (you don't get the problems you're having), and it's faster to read/write the data.


Could you give me a small example of how to do that?

  • 0

Sure. Just convert the objects in the class to their byte equivalents and write the bytes to a file. Say you have a class that represents a person:

using System;
using System.IO;
using System.Text;

public class Person
{
    public String FirstName = "";
    public String LastName = "";
    public Int32 Age = 0;

    public Person() { }

    public void Serialize(String filepath)
    {
        // Convert the class's data to bytes for writing
        byte[] firstNameBytes = Encoding.UTF8.GetBytes(FirstName);
        byte[] lastNameBytes = Encoding.UTF8.GetBytes(LastName);
        byte[] ageBytes = BitConverter.GetBytes(Age);

        // You'll need to store the lengths of any strings in the file, since
        // strings can be any length. Integers and other numbers have fixed
        // lengths: integers are four bytes, doubles are eight bytes, etc.
        byte[] fnbLength = BitConverter.GetBytes(firstNameBytes.Length);
        byte[] lnbLength = BitConverter.GetBytes(lastNameBytes.Length);

        FileStream fs = new FileStream(filepath, FileMode.Create, FileAccess.Write);
        // Write any file-identification data you want to here

        // Write the class data into the file
        fs.Write(fnbLength, 0, 4);
        fs.Write(firstNameBytes, 0, firstNameBytes.Length);
        fs.Write(lnbLength, 0, 4);
        fs.Write(lastNameBytes, 0, lastNameBytes.Length);
        fs.Write(ageBytes, 0, 4);
        fs.Close();
    }

    public void Deserialize(String filepath)
    {
        // Create byte arrays for all the known-length entities in the file
        byte[] fnbLength = new byte[4];
        byte[] lnbLength = new byte[4];
        byte[] ageBytes = new byte[4];

        FileStream fs = new FileStream(filepath, FileMode.Open, FileAccess.Read);
        // Read back the file-identification data, if any

        // Read the class data from the file; each length prefix tells us how
        // big the string that follows it is (for brevity, this ignores the
        // return value of Read)
        fs.Read(fnbLength, 0, 4);
        byte[] firstNameBytes = new byte[BitConverter.ToInt32(fnbLength, 0)];
        fs.Read(firstNameBytes, 0, firstNameBytes.Length);
        fs.Read(lnbLength, 0, 4);
        byte[] lastNameBytes = new byte[BitConverter.ToInt32(lnbLength, 0)];
        fs.Read(lastNameBytes, 0, lastNameBytes.Length);
        fs.Read(ageBytes, 0, 4);
        fs.Close();

        // Convert the byte arrays back into useful data
        FirstName = Encoding.UTF8.GetString(firstNameBytes);
        LastName = Encoding.UTF8.GetString(lastNameBytes);
        Age = BitConverter.ToInt32(ageBytes, 0);
    }
}

This is a fairly simple, somewhat incomplete example, and not a model of good programming style, but it should get you started. As I said, this technique is a lot more work, because serialization and deserialization of each class becomes a unique routine that you have to program specifically, but it is much faster than .NET binary serialization, and you don't have to guess about what's going on behind the scenes.

There are two problems with this example. First, it's not as fast as it could be; a chunk-based format would be a lot faster. Second, this technique does not handle large amounts of data well. Once you get into tens of megabytes, it slows down quite a bit. At that point .NET binary serialization actually becomes faster, but also much, much more memory-intensive, since it can't grab just one object out of the file the way this technique can. In any case, a chunk-based format (almost) solves the speed problem for large amounts of data as well.

Once you're a bit more familiar with binary file reading and writing, you might also want to look at a pseudo-database format: a small index file full of little entries, each referencing a single complete object in the main file. When I need to handle large amounts of data, I use an indexed, chunk-based format, which pushes the point where speed becomes a problem even further out (with that technique, speed and memory only become issues at around 3,000 objects in the program at a time). If you need to handle more data than that, you should look into a full database.

If you want to look into chunk-based files, take a look at the PNG specification, or I can provide you with a real-world C# example from an app I'm writing. Email me for that.
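
To make the chunk idea a bit more concrete, here's the rough shape of it (just a sketch; the tag value is made up, and in a real format these methods would live in your file-format class):

using System;
using System.IO;

// Each chunk is: a 4-byte type tag, a 4-byte payload length, then the payload.
// A reader can skip chunks it doesn't recognize by seeking past the payload,
// and an index file can store (tag, offset) pairs so you can jump straight
// to one object without reading the whole file.
const int PersonChunk = 0x0001;  // made-up tag

static void WriteChunk(Stream s, int tag, byte[] payload)
{
    s.Write(BitConverter.GetBytes(tag), 0, 4);
    s.Write(BitConverter.GetBytes(payload.Length), 0, 4);
    s.Write(payload, 0, payload.Length);
}

static byte[] ReadChunk(Stream s, out int tag)
{
    byte[] header = new byte[8];
    s.Read(header, 0, 8);  // ignoring short reads for brevity
    tag = BitConverter.ToInt32(header, 0);
    byte[] payload = new byte[BitConverter.ToInt32(header, 4)];
    s.Read(payload, 0, payload.Length);
    return payload;
}

PNG chunks follow the same pattern - a length, a type code, the data, plus a CRC - which is why I suggested the spec as a reference.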

Edited by dannysmurf
  • 0

Thank you so much. That solution taught me more than the last three C# books I read.

While I was eagerly awaiting your response, I began googling like crazy trying to find any solution that bordered on helpful. The only thing that came close was an article on CodeProject.com that benchmarked some custom serialization and deserialization. The conclusions were similar to yours.

Anyway, I just thought I'd mention that for posterity.

Thank you again for your help.

(P.S. e-mail sent :D )

  • 0

There is a solution to the above problem without serializing to XML.

https://forums.microsoft.com/msdn/showpost....=0&pageid=1

Credit goes to MM for the code below, from the link above. It works beautifully.

// (Requires: using System; using System.Reflection;)
// "MyLoader" is a placeholder -- the original post didn't name the class.
// Put the static constructor in whatever class performs the deserialization.
static MyLoader() {
    AppDomain.CurrentDomain.AssemblyResolve += new ResolveEventHandler(CurrentDomain_AssemblyResolve);
}

static Assembly CurrentDomain_AssemblyResolve(object sender, ResolveEventArgs args) {
    // Compare only the short assembly name (ignoring version, culture, and
    // public key token) and return the already-loaded assembly that matches
    Assembly ayResult = null;
    string sShortAssemblyName = args.Name.Split(',')[0];
    Assembly[] ayAssemblies = AppDomain.CurrentDomain.GetAssemblies();
    foreach (Assembly ayAssembly in ayAssemblies) {
        if (sShortAssemblyName == ayAssembly.FullName.Split(',')[0]) {
            ayResult = ayAssembly;
            break;
        }
    }
    return ayResult;
}
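
With the handler hooked up, the usual deserialization call then resolves the type from the assemblies already loaded in the app. Roughly like this (the file name and class name are just illustrations):

// (Requires: using System.IO; using System.Runtime.Serialization.Formatters.Binary;)
BinaryFormatter formatter = new BinaryFormatter();
FileStream fs = new FileStream("data.bin", FileMode.Open);
MyClass obj = (MyClass)formatter.Deserialize(fs);  // no more 'Unable to find assembly'
fs.Close();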

  • 0
  dannysmurf said:
Sure. Just convert the objects in the class to their byte equivalents and write the bytes to a file. [...] If you want to look into chunk-based files, take a look at the PNG specification, or I can provide you with a real-world C# example from an app I'm writing. Email me for that.

I need to exchange data between a Pocket PC and a server on a PC. The data must also be saved to the server PC's hard disk. Since BinaryFormatter doesn't exist on the Compact Framework, and the XML serializer takes too much memory to process our application's data, we need something else. Custom serialization is much faster and more robust. I was thinking of doing it the way you said (MemoryStream, etc.), but is that a good idea?
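
Here is roughly what I had in mind (just a sketch; the Record class and its fields are made up for illustration):

using System;
using System.IO;
using System.Text;

class Record
{
    public string Name = "";
    public int Value = 0;

    // Build the bytes in a MemoryStream so the same array can be sent
    // over the network to the server or written to its hard disk
    public byte[] ToBytes()
    {
        MemoryStream ms = new MemoryStream();
        byte[] nameBytes = Encoding.UTF8.GetBytes(Name);
        ms.Write(BitConverter.GetBytes(nameBytes.Length), 0, 4);
        ms.Write(nameBytes, 0, nameBytes.Length);
        ms.Write(BitConverter.GetBytes(Value), 0, 4);
        return ms.ToArray();
    }
}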

Also, could you give me more details on chunk-based formats, please? I'm really interested in any suggestions.

Thanks

This topic is now closed to further replies.