• 0

[C#] Binary Serialize - Deserialize in another app?


Question

Here's the situation. I developed a Serializable class in one application, and then I binary-serialize its data to a file. Using the exact same class in another one of my applications, I cannot deserialize it because of some security measures in the .NET Framework.

Here's the error I get when deserializing in the other app:

SerializationException:
Unable to find assembly 'GDB Tester App, Version=1.0.0.0, Culture=neutral, PublicKeyToken=null'.

Where 'GDB Tester App' is the name of the app that was used for serialization.

How can I work around these darn security measures?

Thanks in advance.

11 answers to this question

Recommended Posts

  • 0
  Express said:
That's not a security exception.


Oh, the MSDN documentation said it was, but you're right. I don't see how it could be very secure.

  Quote
If you're serializing custom objects, then you should define the custom class in a shared assembly that is referenced by both the code that serializes and the code that deserializes.

Is there any way I could deserialize with anything (i.e. not just a shared assembly)?

  • 0
  Express said:
You can use Xml Serialization instead.


Crap. I knew it would all come back to XML. There is a utility (xsd.exe) to convert XML Schemas to C# classes, but is there any utility to write XML schemas? It's a very tedious and often error-prone process.

If not, then I guess I'll have to start crankin' out the verbose and ugly XML by hand.

  • 0
  VB Guy said:
Crap. I knew it would all come back to XML. There is a utility (xsd.exe) to convert XML Schemas to C# classes, but is there any utility to write XML schemas? It's a very tedious and often error-prone process.

If not, then I guess I'll have to start crankin' out the verbose and ugly XML by hand.


It's not very complicated. You can just tag all your public fields with [XmlAttribute].
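
To illustrate, here is a minimal sketch of that approach (the Person class, field values, and file name are hypothetical, just for illustration):

using System;
using System.IO;
using System.Xml.Serialization;

public class Person
{
    [XmlAttribute] public string FirstName = "";
    [XmlAttribute] public string LastName = "";
    [XmlAttribute] public int Age = 0;

    public Person() {}
}

public class Demo
{
    public static void Main()
    {
        XmlSerializer serializer = new XmlSerializer(typeof(Person));

        // Serialize to a file. The XML carries no assembly-qualified type
        // information, so no shared assembly is needed.
        Person p = new Person();
        p.FirstName = "Jane";
        p.LastName = "Doe";
        p.Age = 30;
        using (StreamWriter writer = new StreamWriter("person.xml"))
        {
            serializer.Serialize(writer, p);
        }

        // Deserialize in any app that defines a compatible Person class.
        using (StreamReader reader = new StreamReader("person.xml"))
        {
            Person q = (Person)serializer.Deserialize(reader);
            Console.WriteLine(q.FirstName);
        }
    }
}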

  • 0

Personally, I would suggest writing your own binary serialization for the classes that you want to convert into bytes. That's what I normally do. It is more work, but it's easier to maintain (you don't get the problems you're having), and it's faster to read/write the data.

  • 0
  dannysmurf said:
Personally, I would suggest writing your own binary serialization for the classes that you want to convert into bytes. That's what I normally do. It is more work, but it's easier to maintain (you don't get the problems you're having), and it's faster to read/write the data.


Could you give me a small example on how to do that?

  • 0

Sure. Just convert the objects in the class to their byte equivalents and write the bytes to a file. Say you have a class that represents a person:

using System;
using System.IO;
using System.Text;

public class Person
{
    public String FirstName = "";
    public String LastName = "";
    public Int32 Age = 0;

    public Person() {}

    public void Serialize(String Filepath)
    {
        // Convert the class' data to bytes for writing
        byte[] firstNameBytes = Encoding.UTF8.GetBytes(FirstName);
        byte[] lastNameBytes = Encoding.UTF8.GetBytes(LastName);
        byte[] ageBytes = BitConverter.GetBytes(Age);

        // You'll need to store the lengths of any strings in the file, since
        // strings can be any length. Integers and other numbers have fixed
        // lengths: integers are four bytes, doubles are eight bytes, etc.
        byte[] fnbLength = BitConverter.GetBytes(firstNameBytes.Length);
        byte[] lnbLength = BitConverter.GetBytes(lastNameBytes.Length);

        FileStream fs = new FileStream(Filepath, FileMode.Create, FileAccess.Write);
        // Write any file-identification data you want to here

        // Write the class data into the file.
        fs.Write(fnbLength, 0, 4);
        fs.Write(firstNameBytes, 0, firstNameBytes.Length);
        fs.Write(lnbLength, 0, 4);
        fs.Write(lastNameBytes, 0, lastNameBytes.Length);
        fs.Write(ageBytes, 0, 4);
        fs.Close();
    }

    public void Deserialize(String Filepath)
    {
        // Create byte array variables for all the known-length entities in the file
        byte[] fnbLength = new byte[4];
        byte[] lnbLength = new byte[4];
        byte[] ageBytes = new byte[4];

        FileStream fs = new FileStream(Filepath, FileMode.Open, FileAccess.Read);
        // Read back the file identification data, if any

        // Read the class data from the file
        fs.Read(fnbLength, 0, 4);
        byte[] firstNameBytes = new byte[BitConverter.ToInt32(fnbLength, 0)];
        fs.Read(firstNameBytes, 0, firstNameBytes.Length);
        fs.Read(lnbLength, 0, 4);
        byte[] lastNameBytes = new byte[BitConverter.ToInt32(lnbLength, 0)];
        fs.Read(lastNameBytes, 0, lastNameBytes.Length);
        fs.Read(ageBytes, 0, 4);
        fs.Close();

        // Convert the byte arrays back into useful data
        FirstName = Encoding.UTF8.GetString(firstNameBytes);
        LastName = Encoding.UTF8.GetString(lastNameBytes);
        Age = BitConverter.ToInt32(ageBytes, 0);
    }
}

This is a fairly simple, somewhat incomplete example, and not the best example of good programming style, but it should get you started. This technique, as I said, is a lot more work, because the de/serialization of each class becomes a unique routine that you have to program specifically, but it is much faster than .NET binary serialization, and you don't have to make any guesses about what's going on behind the scenes.

There are two problems with this example. First, it's not as fast as it could be. If you use a chunk-based format, it will be a lot faster. And second, this technique does not handle large amounts of data well. Once you get into tens of megabytes, it slows down quite a bit. At that point, .NET binary serialization actually becomes faster, but also much MUCH more memory-intensive since you can't grab just one object out of the file, as you can with this technique. In any case, a chunk-based format (almost) solves the speed problem for large amounts of data as well.

Once you're a bit more familiar with binary file reading/writing, you might also want to take a look at a pseudo-database format (basically, a small index file with lots of little entries, each one referencing a single complete object in the main file). If I need to handle large amounts of data, I use an indexed, chunk-based data format, which pushes the point at which speed becomes a problem even further away (with that technique, speed and memory only become issues again at around 3,000 objects in the program at a time). If you're looking to handle more data than that, you should be looking into a full database.

If you want to look into chunk-based files, take a look at the PNG specification, or I can provide you with a C# example (a real-world example, that I'm using in an app I'm writing). Email me for that.
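
To make the chunk idea concrete, here is a minimal sketch of a chunk reader/writer. The 4-byte-tag plus 4-byte-length layout is loosely modeled on PNG's chunk structure; the ChunkFile helper class itself is hypothetical, not from the PNG spec:

using System;
using System.IO;
using System.Text;

public class ChunkFile
{
    // Write one chunk: a 4-character ASCII tag, a 4-byte length prefix,
    // then the payload itself.
    public static void WriteChunk(Stream s, string tag, byte[] payload)
    {
        byte[] tagBytes = Encoding.ASCII.GetBytes(tag); // tag must be 4 chars
        s.Write(tagBytes, 0, 4);
        s.Write(BitConverter.GetBytes(payload.Length), 0, 4);
        s.Write(payload, 0, payload.Length);
    }

    // Read the next chunk header and payload. A reader can skip chunks it
    // doesn't recognize by seeking past the payload, which is what makes
    // the format fast for partial reads. (A robust version would loop
    // until Read has returned all the requested bytes.)
    public static bool ReadChunk(Stream s, out string tag, out byte[] payload)
    {
        byte[] header = new byte[8];
        if (s.Read(header, 0, 8) < 8)
        {
            tag = null;
            payload = null;
            return false; // end of file
        }
        tag = Encoding.ASCII.GetString(header, 0, 4);
        int length = BitConverter.ToInt32(header, 4);
        payload = new byte[length];
        s.Read(payload, 0, length);
        return true;
    }
}

Each record then becomes one tagged chunk (say, a hypothetical "PERS" chunk holding the bytes that Person.Serialize produces), and a reader can scan the file chunk by chunk, decoding only the ones it needs.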

Edited by dannysmurf
  • 0

Thank you so much. That solution taught me more than the last three C# books I read.

While I was eagerly awaiting your response, I began googling like crazy trying to find any solution that bordered on helpful. The only thing that came close was an article on CodeProject.com that benchmarked some custom serialization and deserialization. The conclusions were similar to yours.

Anyway, I just thought I'd mention that for posterity.

Thank you again for your help.

(P.S. e-mail sent :D )

  • 0

There is a solution to the above problem without serializing to XML.

https://forums.microsoft.com/msdn/showpost....=0&pageid=1

Credit goes to MM for the code below, from the link above. It works beautifully.

// Requires: using System; using System.Reflection;
// Place this inside the class that performs the deserialization. A static
// constructor takes the name of its class; "Deserializer" is a placeholder.
static Deserializer() {
    AppDomain.CurrentDomain.AssemblyResolve += new ResolveEventHandler(CurrentDomain_AssemblyResolve);
}

// When the formatter can't find the serializing app's assembly, return the
// already-loaded assembly whose simple name matches, if any.
static Assembly CurrentDomain_AssemblyResolve(object sender, ResolveEventArgs args) {
    Assembly ayResult = null;
    string sShortAssemblyName = args.Name.Split(',')[0];
    Assembly[] ayAssemblies = AppDomain.CurrentDomain.GetAssemblies();
    foreach (Assembly ayAssembly in ayAssemblies) {
        if (sShortAssemblyName == ayAssembly.FullName.Split(',')[0]) {
            ayResult = ayAssembly;
            break;
        }
    }
    return ayResult;
}
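
A hypothetical usage sketch (the file name and the Person type are placeholders): hook the handler once, then deserialize as usual. For the lookup above to succeed, the deserializing app must define the same type, with the same namespace and class name, in one of its loaded assemblies.

using System;
using System.IO;
using System.Runtime.Serialization.Formatters.Binary;

public class Demo
{
    public static void Main()
    {
        // Hook the resolver once, before any deserialization.
        // (CurrentDomain_AssemblyResolve is the handler from the post above.)
        AppDomain.CurrentDomain.AssemblyResolve += new ResolveEventHandler(CurrentDomain_AssemblyResolve);

        // 'person.bin' and Person are placeholders; Person must be declared
        // [Serializable] with the same namespace and name in both apps.
        FileStream fs = new FileStream("person.bin", FileMode.Open, FileAccess.Read);
        BinaryFormatter formatter = new BinaryFormatter();
        Person p = (Person)formatter.Deserialize(fs);
        fs.Close();
    }
}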

  • 0
  dannysmurf said:
Sure. Just convert the objects in the class to their byte equivalents and write the bytes to a file. [...] If you want to look into chunk-based files, take a look at the PNG specification, or I can provide you with a C# example (a real-world example, that I'm using in an app I'm writing). Email me for that.

I need to exchange data between a Pocket PC and a server on a PC. The data must also be saved to the server PC's hard disk. As BinaryFormatter doesn't exist on the Compact Framework, and the XML serializer takes too much memory to process our application's data, we need something else. Custom serialization is much faster and more robust. I was thinking of doing it the way you said (MemoryStream, etc...), but is that a good idea?

Also, could you give me more details on chunk-based formats, please? I'm really interested in any suggestions.

Thanks
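
(For reference, a minimal sketch of the MemoryStream variant being asked about, reusing the hypothetical Person fields from earlier in the thread; the returned byte array can be sent to the server or written to disk, and these APIs are available on the Compact Framework.)

using System;
using System.IO;
using System.Text;

public class PersonBuffer
{
    // Same length-prefixed layout as the file example above, but written
    // to a MemoryStream so the bytes can be transmitted instead of saved.
    public static byte[] ToBytes(string firstName, string lastName, int age)
    {
        byte[] firstNameBytes = Encoding.UTF8.GetBytes(firstName);
        byte[] lastNameBytes = Encoding.UTF8.GetBytes(lastName);

        MemoryStream ms = new MemoryStream();
        ms.Write(BitConverter.GetBytes(firstNameBytes.Length), 0, 4);
        ms.Write(firstNameBytes, 0, firstNameBytes.Length);
        ms.Write(BitConverter.GetBytes(lastNameBytes.Length), 0, 4);
        ms.Write(lastNameBytes, 0, lastNameBytes.Length);
        ms.Write(BitConverter.GetBytes(age), 0, 4);
        return ms.ToArray();
    }
}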

This topic is now closed to further replies.