Thursday, June 19, 2008

If you call your .NET application from a batch file...

If you deploy your .NET applications with a batch file, be aware that the way you call the application can affect how it behaves.

The facts are as follows (for our application, anyway):

When the application is called using the following line in the batch file:

start C:\Progra~1\ApplicationFolder\Application.exe

the application fails.

If you call the application through Windows Explorer in the batch file:

explorer C:\Program Files\ApplicationFolder\Application.exe

it works fine, but it shows a security warning that you're downloading a file from an unknown publisher... Not good for the end user, not good.

If you call the application with the start command from a path that doesn't contain spaces, it works fine:

start C:\ApplicationFolder\Application.exe

(This option would be good if we didn't use Program Files for the deployment, but most people do)

And, finally, if you use the start command but replace the 8.3 DOS path with the quoted full path, it works!

start " " "c:\Program Files\ApplicationFolder\Application.exe"


I checked for assembly binding errors, reviewed the code access security for every assembly in the application, debugged the application, etc., without results.

Changing the old DOS path did the trick!!!

Why? I still have to get a hold of the script guy to ask him :-p At least the empty quotes make sense: start treats its first quoted argument as a window title, so without them the quoted path itself would be taken as the title.
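
Whatever the reason, a quick way to see what the CLR actually receives under each launch method is to log the paths at startup. A minimal sketch (Console.WriteLine standing in for whatever logging your application uses):

using System;
using System.Reflection;

static class LaunchDiagnostics
{
    // Logs what the process actually received, so runs launched via
    // start, explorer, an 8.3 short path or a quoted long path can be compared.
    public static void LogLaunchInfo()
    {
        // The raw command line, exactly as the process got it
        Console.WriteLine("CommandLine:   " + Environment.CommandLine);
        // The directory the AppDomain probes for assemblies
        Console.WriteLine("BaseDirectory: " + AppDomain.CurrentDomain.BaseDirectory);
        // The entry assembly's path as the loader resolved it
        Console.WriteLine("Location:      " + Assembly.GetEntryAssembly().Location);
        // The working directory, which start can set differently
        Console.WriteLine("CurrentDir:    " + Environment.CurrentDirectory);
    }
}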

Happy coding!


How to determine the assembly evidence at runtime

I ran into a problem the other day: one of the .NET applications we had deployed started crashing on some screens for no apparent reason. The application worked fine in the development environment, and on the test machine when it was run from the C: drive.
We tested for binding problems with the SDK tool fuslogvw.exe but didn't see any binding errors. The other SDK tool we tried was the .NET Framework configuration tool, Mscorcfg.msc: no luck there either. Since the SDK doesn't ship with the Framework 2.0 redistributable, we had trouble running these tests on every single production machine... let alone that the configuration tool doesn't show the run-time evidence for a given assembly.
Maybe the enterprise security policy had changed for the different zones and a policy .msi had been pushed without us knowing...
Mscorcfg.msc said otherwise...

This code, plus extra logging in our application, did the trick in determining the evidence passed to the CLR:


// Requires: using System; using System.Security.Policy; (Zone, Url, Hash, Site)
private static void LogEvidence()
{
    string strEvidence;

    log(" ===================== Assembly Evidence: ========================= ");

    // Assembly.Evidence enumerates both the host evidence and the assembly evidence
    foreach (object myEvidence in System.Reflection.Assembly.GetExecutingAssembly().Evidence)
    {
        strEvidence = myEvidence.GetType().ToString();

        switch (strEvidence)
        {
            case "System.Security.Policy.Zone":
                strEvidence += ": " + ((Zone)myEvidence).SecurityZone.ToString();
                break;
            case "System.Security.Policy.Url":
                strEvidence += ": " + ((Url)myEvidence).Value;
                break;
            case "System.Security.Policy.Hash":
                // The SHA1 hash, logged as a readable hex string
                strEvidence += ": " + BitConverter.ToString(((Hash)myEvidence).SHA1);
                break;
            case "System.Security.Policy.Site":
                strEvidence += ": " + ((Site)myEvidence).Name;
                break;
        }

        log(strEvidence);   // log() is our application's logging helper
    }

    log(" ===================== End of Assembly Evidence: ==================== ");
}
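
We simply call it once, early in Main, so the evidence lands in the log before anything security-sensitive runs. Something like:

static void Main(string[] args)
{
    LogEvidence();   // record the CLR evidence before anything else runs
    // ... rest of the application startup
}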


Wednesday, June 11, 2008

Encoding troubles, wait, your ANSI file is not the same as my ANSI file

Last week we made a utility for the release team to convert all the T-SQL script files from any encoding to ANSI. (Now we convert any encoding to Unicode, but the original request was for ANSI.)


The .NET code we used basically opens the file with a StreamReader that detects the encoding, opens a StreamWriter to a new file with Encoding.Default (now Encoding.Unicode), and writes out the content read by the StreamReader.
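
Stripped down, the idea looks like this (a sketch, not our exact utility; method and parameter names are made up):

using System.IO;
using System.Text;

static class ScriptEncodingConverter
{
    // Reads the source file, letting StreamReader detect the encoding
    // from the byte order mark, and re-writes it as Unicode (UTF-16 LE).
    public static void ConvertToUnicode(string sourcePath, string targetPath)
    {
        string contents;
        using (StreamReader reader = new StreamReader(sourcePath, true))
        {
            contents = reader.ReadToEnd();
        }
        using (StreamWriter writer = new StreamWriter(targetPath, false, Encoding.Unicode))
        {
            writer.Write(contents);
        }
    }
}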

The problem started when some developers submitted files saved with ANSI encoding. The tool always detected the encoding as US-ASCII, which uses only 7 bits per character, while the files had accented letters that were lost in the conversion.

I was blaming StreamReader for not detecting the encoding properly until I found the article below at http://weblogs.asp.net/ahoffman/archive/2004/01/19/60094.aspx



A question posted on the Australian DOTNET Developer Mailing List ...

I'm having a character encoding problem that surprises me. In my C# code I have a string "© 2004" (that's a copyright/space/2/0/0/4). When I convert this string to bytes using the ASCIIEncoding.GetBytes method I get (in hex):

3F 20 32 30 30 34

The first character (the copyright) is converted into a literal '?' question mark. I need to get the result 0xA92032303034, which has 0xA9 for the copyright, just as happens when the text is saved in Notepad.

An ASCII encoding provides for 7-bit characters and therefore only supports the first 128 Unicode characters. All characters outside that range will display as an unknown symbol - typically a "?" (0x3f) or "|" (0x7f).

That explains the first byte returned using ASCIIEncoding.GetBytes()...

> 3F 20 32 30 30 34

What you're trying to achieve is an ANSI encoding of the string. To get an ANSI encoding you need to specify a "code page", which prescribes the characters from 128 on up. For example, the following code will produce the result you expect...

string s = "© 2004";
Encoding targetEncoding = Encoding.GetEncoding(1252);
foreach (byte b in targetEncoding.GetBytes(s))
    Console.Write("{0:x} ", b);

> a9 20 32 30 30 34

1252 is the code page for Western European (Windows), which is probably what you're using (Encoding.Default.EncodingName). Specifying a different code page, say Simplified Chinese (54936), will produce a different result.

Ideally you should use the code page actually in use on the system as follows...

string s = "© 2004";
Encoding targetEncoding = Encoding.Default;
foreach (byte b in targetEncoding.GetBytes(s))
    Console.Write("{0:x} ", b);

> (can depend on where you are!)

All this is particularly important if your application uses streams to write to disk. Unless care is taken, someone in another country (represented by a different code page) could write text to disk via a Stream within your application and get unexpected results when reading back the text.

In short, always specify an encoding when creating a StreamReader or StreamWriter - for example...
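
(The example itself didn't survive in my copy of the quote, but it would be something along these lines - a fragment like the ones below, with hypothetical paths and the code page stated explicitly:)

// Explicit encoding on both ends - no guessing at read or write time
StreamReader reader = new StreamReader(@"C:\in.txt", Encoding.GetEncoding(1252));
StreamWriter writer = new StreamWriter(@"C:\out.txt", false, Encoding.GetEncoding(1252));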



Our code was initially as follows:

// true: detect the encoding from the byte order marks, if any
StreamReader SR = new StreamReader(myfile, true);
String Contents = SR.ReadToEnd();
SR.Close();

The StreamReader always detected US-ASCII as the encoding when the file was saved as ANSI, so the text lost all of its accented characters as soon as it was read. Detection worked fine whenever the encoding was anything other than ANSI. This might be because ANSI files carry no byte order mark, and the actual characters depend on which code page wrote them...

We changed the code not to rely on the StreamReader's ability to detect the ANSI code page:

// Sniff the encoding ourselves instead of trusting the default detection
Encoding e = GetFileEncoding(myfile);
StreamReader SR = new StreamReader(myfile, e, true);
String Contents = SR.ReadToEnd();
SR.Close();

where GetFileEncoding was published in this post.

Note that in the code above, any ANSI-encoded file falls back to the local ANSI code page (Encoding.Default). If the file was saved on a machine with an ANSI code page different from the one where the program runs, you might still get unexpected results.
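
For reference, a minimal BOM-sniffing helper might look like the sketch below (the linked GetFileEncoding is more thorough; this one only handles UTF-8 and UTF-16 and otherwise assumes the local ANSI code page):

using System.IO;
using System.Text;

static class EncodingSniffer
{
    // Returns the encoding indicated by the file's byte order mark,
    // falling back to the system ANSI code page when there is none.
    public static Encoding GetFileEncoding(string path)
    {
        byte[] bom = new byte[4];
        using (FileStream fs = File.OpenRead(path))
        {
            fs.Read(bom, 0, 4);
        }

        if (bom[0] == 0xEF && bom[1] == 0xBB && bom[2] == 0xBF)
            return Encoding.UTF8;                 // UTF-8
        if (bom[0] == 0xFF && bom[1] == 0xFE)
            return Encoding.Unicode;              // UTF-16 little-endian
        if (bom[0] == 0xFE && bom[1] == 0xFF)
            return Encoding.BigEndianUnicode;     // UTF-16 big-endian

        return Encoding.Default;                  // no BOM: assume local ANSI
    }
}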


Tuesday, June 10, 2008

Compressed snapshot check makes the merge fail on one subscriber

As this blog holds my personal breadcrumbs, I thought I'd better record this.

We had problems with our merge replication topology (SQL Server 2005 9.0.3228 Publisher and Distributor, SQL Server Express 9.0.3228 Subscribers, pull subscriptions) the other day when we switched to compressed snapshots. I posted the problem in the forums and searched the books about it without luck...

Here's the problem:

We noticed one subscriber downloaded the compressed dynamic snapshot, but was unable to apply it:

The trace at the subscriber showed the following error:

Partitioned snapshot validation failed for this Subscriber. The snapshot validation token stored in the specified partitioned snapshot location does not match the value ''b422311'' used by the Merge Agent when evaluating the parameterized filter function. If specifying the location of the partitioned snapshot (using -DynamicSnapshotLocation), you must ensure that the snapshot files in that directory belong to the correct partition or allow the Merge Agent to automatically detect the location.',1,1,N'
SQL Merge Agent encountered an error.


The merge agent log with -OutputVerboseLevel 3 showed the following:

2008-06-03 18:07:48.222 [0%] [978 sec remaining] Snapshot will be applied from a compressed cabinet file
2008-06-03 18:07:48.238 OLE DB Distributor 'TORDISTRIBUTOR': {call sys.sp_MSadd_merge_history90 (?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?)}
2008-06-03 18:07:48.254 OLE DB Subscriber 'TORSUBSCRIBER': {call sys.sp_MSadd_merge_history90 (?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?)}
2008-06-03 18:07:48.285 [0%] [978 sec remaining] A dynamic snapshot will be applied from '\\TORDISTRIBUTOR\repldata\unc\TORPUBLISHER_PUBLISHER_ PUBLICATION\B422311_1\'
2008-06-03 18:07:48.285 OLE DB Distributor 'TORDISTRIBUTOR': {call sys.sp_MSadd_merge_history90 (?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?)}
2008-06-03 18:07:48.300 OLE DB Subscriber 'TORSUBSCRIBER': {call sys.sp_MSadd_merge_history90 (?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?)}
2008-06-03 18:07:48.316 [0%] [732 sec remaining] Validating dynamic snapshot
2008-06-03 18:07:48.316 OLE DB Distributor 'TORDISTRIBUTOR': {call sys.sp_MSadd_merge_history90 (?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?)}
2008-06-03 18:07:48.332 OLE DB Subscriber 'TORSUBSCRIBER': {call sys.sp_MSadd_merge_history90 (?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?)}
2008-06-03 18:07:48.347 [0%] [732 sec remaining] Extracting snapshot file 'dynsnapvalidation.tok' from cabinet file
2008-06-03 18:07:48.363 OLE DB Distributor 'TORDISTRIBUTOR': {call sys.sp_MSadd_merge_history90 (?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?)}
2008-06-03 18:07:48.534 OLE DB Subscriber 'TORSUBSCRIBER': {call sys.sp_MSadd_merge_history90 (?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?)}
2008-06-03 18:07:48.566 [0%] [732 sec remaining] Extracted file 'dynsnapvalidation.tok'
2008-06-03 18:07:48.566 OLE DB Distributor 'TORDISTRIBUTOR': {call sys.sp_MSadd_merge_history90 (?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?)}
2008-06-03 18:07:48.566 OLE DB Subscriber 'TORSUBSCRIBER': sys.sp_MSregisterdynsnapseqno @snapshot_session_token = N'\\TORDISTRIBUTOR\ReplData\unc\TORPUBLISHER_PUBLISHER_ PUBLICATION\20080527113713\dynsnap', @dynsnapseqno = 'E69C5BB3-1095-429C-92BC-46747E49A155'
2008-06-03 18:07:48.675 OLE DB Subscriber 'TORSUBSCRIBER': sp_MSreleasesnapshotdeliverysessionlock
2008-06-03 18:07:48.753 The merge process was unable to deliver the snapshot to the Subscriber. If using Web synchronization, the merge process may have been unable to create or write to the message file. When troubleshooting, restart the synchronization with verbose history logging and specify an output file to which to write.
2008-06-03 18:07:48.753 OLE DB Subscriber 'TORSUBSCRIBER': {call sys.sp_MSadd_merge_history90 (?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?)}
2008-06-03 18:07:48.862 The merge process was unable to deliver the snapshot to the Subscriber. If using Web synchronization, the merge process may have been unable to create or write to the message file. When troubleshooting, restart the synchronization with verbose history logging and specify an output file to which to write.

It seems that when compressed snapshots are used and subscriptions are filtered on HOST_NAME(), the merge agent performs an extra check: it reads a token file, dynsnapvalidation.tok, inside the snapshot .cab file, which in our case contains the HOST_NAME() value for that partition.

The HOST_NAME() our subscriber passed to the agent was lower-cased ('b422311'), while the partition folder, the token file, and the sync agent job all had it upper-cased ('B422311') - exactly the mismatch the validation error above complains about.


Here's the solution:

We solved the issue by changing the reference data to lower case, deleting the snapshot folder, and deleting the sync agent job at the distributor for that partition. Then we dropped the subscription and created it again, which recreated the sync agent job at the distributor with the properly lower-cased HOST_NAME().

We haven't been able to reproduce it again, but time permitting, we will :-p

I tried to reproduce the scenario using a publisher and subscriber on the same server (SQL Server 2005 Developer build 9.00.3228) but couldn't get the error. In our real-life scenario we use SQL Server Express as the subscriber and RMO to trigger synchronization.
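
For what it's worth, the RMO code that kicks off the merge on the subscriber looks roughly like this (a sketch with made-up database names, not our production code; the point being that the HostName must match the partition's casing exactly):

using Microsoft.SqlServer.Management.Common;
using Microsoft.SqlServer.Replication;

static class SyncRunner
{
    public static void RunMerge()
    {
        ServerConnection conn = new ServerConnection("TORSUBSCRIBER");
        MergePullSubscription subscription = new MergePullSubscription();
        subscription.ConnectionContext = conn;
        subscription.DatabaseName = "SubscriberDB";       // hypothetical names
        subscription.PublisherName = "TORPUBLISHER";
        subscription.PublicationDBName = "PublisherDB";
        subscription.PublicationName = "PUBLICATION";

        if (subscription.LoadProperties())
        {
            // Case matters here: 'b422311' and 'B422311' map to
            // different partitioned snapshot folders.
            subscription.SynchronizationAgent.HostName = "b422311";
            subscription.SynchronizationAgent.Synchronize();
        }
    }
}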


Monday, June 02, 2008

Web Services Contract First doesn't always guarantee interop

I found this very good article, which answered part of the questions I had in my previous post.
The enlightening part is how WSCF, despite being arguably less productive than code-first, might still offer better interop in the long run:

What? Code-first is interoperable?

Well sure! You generate schema from code; proxies consume schema and generate a representation of the schema for the caller's platform. Of course the serialized type has to be meaningful to the caller, so a DataSet in .NET, though serializable, makes a horrible candidate for cross-platform interoperability. Aside from this, one of the biggest interoperability issues between platforms has to do with runtime schema support. Schema is a vast standard with much more capability than many parsers support. Thus, interoperability can be an issue whether you begin with schema (contract-first) or generate schema from objects (code-first).


OK, OK, the article is dated 2005, but don't be too hard on someone who has been doing Windows Forms and CAB for the past year :-p
Designing Service Contracts with WCF
