If you are an MCP or MCT and live near Seattle… Casting call for Certification Video

Casting call for Certification Video
Payment: No Pay
Description:
A Production Coordinator is looking for well-spoken, real Microsoft Certified individuals (MCP and MCT), aged 21 or older, of all ethnicities and types, to appear as Microsoft testimonials. Most testimonial days last between 2 and 4 hours and are filmed in the greater Seattle, WA area. Applicants accepted for shooting will be provided with lunch. A pre-interview is required. (The pre-interview will take place the week of October 27th.)
Testimonial applicants will have had a positive experience with Microsoft Certifications. They will then tell us why they chose to get certified and how being certified impacted their career journey.
Send picture, current job position and a brief description of your Microsoft Certification experience to:
Regines@microsoft.com
Electronic submissions only.

If you delete a data partition on a merge publication, make sure you do this additional cleanup

We wanted to delete an old data partition on our merge publication. Management Studio has a nice UI to view the data partitions, generate the dynamic snapshot per partition, and delete the partitions you no longer need.

Be warned that this nice UI delete button won’t delete the partition folder at the distributor, nor will it delete the dynamic snapshot job there. If you then try to add a partition with the same filter value, it will fail.

In order to have a fresh start for that partition value you should:

1. Verify that no subscriptions still use that data partition (see the sketch after this list).
2. Delete the data partition using the Publication Properties UI.
3. Delete the data partition using the sp_dropmergepartition stored procedure:

use [Published_Database]
exec sp_dropmergepartition @publication = N'Publication_Name', @suser_sname = N'', @host_name = N'the string we use to filter the data part'
GO

4. Delete the replication partition folder manually at \\distributor\repldata
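
For step 1, this is roughly what I check at the publisher before dropping anything. It is only a sketch: Publication_Name is a placeholder, and I’m assuming a publication filtered with a parameterized (HOST_NAME) filter, like ours:

use [Published_Database]
-- list the data partitions the publication still knows about
exec sp_helpmergepartition @publication = N'Publication_Name'
-- list the subscriptions, so we can confirm none of them still uses
-- the filter value of the partition we are about to drop
exec sp_helpmergesubscription @publication = N'Publication_Name'
GO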

Happy data merging!

PS: SQL Server version 9.00.3228

More on merge replication with subscription expiring after a period of time

We have a merge replication topology with pull subscriptions, and a setting to expire the subscriptions that haven’t synchronized in the past X days. This setting is mainly an optimization: when you don’t expire your subscriptions, the amount of metadata that has to be kept grows and grows, and your merge process suffers.

The drawback of this setting is that it also makes the snapshot obsolete after X days for any new or reinitialized subscription.
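
In case it helps, this is more or less how you can look at and change that expiration window. It is a sketch only: Publication_Name and the 14-day value are placeholders, and depending on the change SQL Server may insist on @force_reinit_subscription = 1.

use [Published_Database]
-- show the publication settings, including retention (the "expire after X days" value)
exec sp_helpmergepublication @publication = N'Publication_Name'
-- change the retention period to 14 days
exec sp_changemergepublication
    @publication = N'Publication_Name',
    @property = N'retention',
    @value = N'14'
GO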

When you add a new subscription to your publication and the snapshot was generated X-1 days ago, you will get the following error:

The snapshot for this publication has become obsolete. The snapshot agent needs to be run again before the subscription can be synchronized.

At first we wondered why we got this error when the rest of the subscriptions were working just fine, and we also wondered what impact regenerating the snapshot would have on the existing subscriptions.

The answer is: no impact.

We got the explanation from one of Hilary Cotter's replies:

The snapshot contains all the data, schema and metadata required to create all the objects on the subscriber. After one day in your case the metadata on the publisher is purged and what is in the snapshot will not be sufficient to sync with the publisher, hence you need a new snapshot.

You want to set a small number so that not a lot of data goes across the wire, but a big enough number so that the majority of your subscribers will sync within the expiration time. If you set it to an infinite amount (never expires), a lot of data will have to go to the subscriber to get it back in sync.

And another reply with further clarifications:

The answer lies in the MSmerge_contents and MSmerge_genhistory tables. These two tables hold the list of data changes that happened for the past x days, x being the subscription expiration days. After x days the record of the data change expires from the MSmerge_contents table. The implication of that is that existing subscriptions that have not synchronised for the past x days will then not be able to merge that change anymore. The same holds true for creating new subscriptions with an old snapshot: remember the snapshot also contains data. If the snapshot was created x-2 days ago you will be missing two days of data changes that have already expired from the MSmerge_contents table.

and a thread on MSDN:
http://forums.microsoft.com/MSDN/ShowPost.aspx?PostID=945801&SiteID=1
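
If you want to see how far back the change metadata on your publisher actually goes, here is a quick sketch, assuming (as I understand it) that coldate in MSmerge_genhistory records when each generation was added:

use [Published_Database]
-- oldest and newest generations still kept on the publisher;
-- anything older has already been purged, which is exactly why
-- an old snapshot cannot catch up anymore
select min(coldate) as oldest_generation,
       max(coldate) as newest_generation,
       count(*) as generations_kept
from dbo.MSmerge_genhistory
GO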

Argh! The telnet client is disabled by default in Vista

I’m trying to fix my dad’s website after his gallery suffered a SQL injection attack and his database is no longer with us. I needed to check whether port 2077 was open on the server so I could map a folder as a drive in Windows Explorer, after a few problems with my FTP connection timing out.
To make a long story short, here is how to enable the Telnet client on Vista:
Go to Control Panel.
Go to Programs.
Go to Programs and Features.
Click Turn Windows features on or off.
Check the Telnet Client box and click OK.

Adding an NHibernate collection to your QuickWatch in Visual Studio loads the entire collection even if it is marked with lazy loading.

I spent most of the day trying to improve performance in an application. We use NHibernate 2.0 and try to lazy load most of the collections we keep in memory.
I found these two links very useful for troubleshooting my performance issue:
http://djeeg.blogspot.com/2006/08/nhibernateutilisinitialized.html
and the NHibernate reference:
http://www.hibernate.org/hib_docs/nhibernate/html/performance.html

The curious thing is that I got really confused when testing my NHibernateUtil.IsInitialized(_myobject.MyCollection) line: it returned false, but I saw a query retrieving the collection in SQL Profiler…

until I realized that adding the collection to the Watch window in Visual Studio initializes the collection. The same happens if you open the collection in QuickWatch. :-p

The tools I used for troubleshooting performance were ANTS Profiler and SQL Server Profiler.

This merge replication error: The merge process could not connect to the Publisher ‘Server:database’ is so misleading

We have a merge topology in place with pull subscriptions, that is, the merge agent runs at the subscribers.
One of our subscriptions had an error saying the merge process couldn’t connect to the server. The server was there and the ping was fine, and Replication Monitor was able to register the error with the X mark.

The details of the error are as follows:

Command attempted:

{call sp_MSensure_single_instance (N'Merge Agent Name', 4)}

Error messages:

The merge process could not connect to the Publisher 'Server:database'. Check to ensure that the server is running. (Source: MSSQL_REPL, Error number: MSSQL_REPL-2147199368)
Get help: http://help/MSSQL_REPL-2147199368

Another merge agent for the subscription(s) is running or the server is working on a previous request by the same agent. (Source: MSSQLServer, Error number: 21036)
Get help: http://help/21036

Our error was due to the second cause. It seems the subscriber had lost power while replicating, and every replication attempt after that could not acquire the lock for the merge agent. We restarted the subscriber machine and reinitialized the subscription, with no luck. Only when we dropped the subscription and recreated it was the subscriber able to run the merge agent again. Just a curious note for the future, as this error is not well documented. The only MSDN forum thread that deals with it is still unanswered here…
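
For the record, next time I hit this I would check whether a stale application lock is still being held before dropping anything. This is only a sketch based on my understanding that sp_MSensure_single_instance guards the merge agent with an application lock named after the agent:

-- run where the merge agent connects (publisher side)
select resource_description,   -- should contain the merge agent name
       request_session_id,
       request_mode,
       request_status
from sys.dm_tran_locks
where resource_type = 'APPLICATION'
GO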

My wish list for SSRS 2005 and 2008

Right now I’m swamped making reports in SSRS 2005. Even though that might be considered a junior task, I find it interesting. The last time I created report templates was 10 years ago, in the late 1990s, with Quick Reports and VB6 :-p

My team recommended SSRS over Crystal Reports and VSTO mainly because we had a previous bad experience with Word templates for report generation, and we really liked the idea of having the report engine accessible through a web service. Also, the fact that RDL files are XML with a documented schema, instead of the proprietary Crystal Reports format, made us believe SSRS might go further in the long run. Price was also a consideration for ruling out Crystal Reports. One of the big points in favor of this decision was the Report Manager in SSRS. Deploying our reports independently of the application where they will be viewed allows us to deploy and test the reports in parallel; there is no need to wait until the main application goes to QA. Report subscriptions were another plus.

The purpose of this short post is not to compare those three technologies though. You can see a good comparison of Crystal Reports vs SSRS here.

Our initial hesitation about putting most of our eggs in the SSRS basket is almost gone now, but not without a wish list of things that would make the maintenance of our solution easier:

- Widow and orphan control. There’s a hack where you use a rectangle control to group elements; the KeepTogether property for a table doesn’t work.
- Fully justified text. No workaround for this AFAIK.
- Reuse datasets in the reports that belong to the same project.
- Reuse images.
- Reuse custom code, and make it visible somewhere other than the Report Properties -> Code tab.
- Be able to reuse headers and footers in all the reports in the project.
- Be able to use more than one ReportItem in an expression for an element in the header or footer. You have to do lots of hacks to achieve this. I have an example of doing it with a dataset and an internal report parameter, after I gave up on using ReportItems for hiding or showing header elements.
- Barcode printing in PDF. Some barcode fonts get distorted when the report is rendered as a PDF, so you have to rely on third-party components such as Aspose.Barcode. The Microsoft SSRS team fixed some fonts in SP2 for SQL Server 2005, but unfortunately the font we use is not fixed yet.
- More alignment with the Visual Studio project layout: being able to group reports in subfolders, to see the shared datasets and the custom code in project folders, and to see referenced external assemblies the same way references are shown in Visual Studio web and Windows projects.
- Closer alignment between the HTML rendering and the PDF rendering. I know that might be too much to ask, but it would be really nice if the report rendered in HTML looked similar to its PDF counterpart. Right now you cannot rely on the HTML view at all when your final renderer is PDF.
- Rendering RTF text out of the database (I still have to explore this in 2008).
- Using CSS to apply styles to your textboxes.

Hope this helps if you have to make a technology decision. I’m sure I’ll add to the wish list soon…

Let's rectify this error (was: Compressed snapshot check makes it fail on one subscriber)


I published a post with the wrong title. It was: Compressed snapshot check makes it fail on one subscriber.

However, the dynsnapvalidation.tok file appears inside the partitioned snapshot whether you select snapshot compression or not. This .tok file shows up whenever you create data partitions for a parameterized filter.

If you call your .NET application from a batch file…

If you deploy your .NET applications with a batch file, be aware that the way you call your .NET application might affect how it behaves.

The facts are as follows (for our application, anyway):

When the application is called using the following line in the batch file:

start C:\Progra~1\ApplicationFolder\Application.exe

the application fails.

If you call the application using Windows Explorer in the batch file:

explorer C:\Program Files\ApplicationFolder\Application.exe

it works fine, but it shows a security warning that you’re downloading files and that the publisher is unknown… Not good for the end user, not good.

If you call the application from a path that doesn’t have spaces using the start command, it works fine:

start C:\ApplicationFolder\Application.exe

(This option would be good if we didn’t use Program Files for the deployment, but most people do.)

and, finally, if you use the start command but drop the short DOS path in favor of the quoted full path, it works! (start treats the first quoted argument as the window title, hence the extra quoted blank before the path):

start " " "c:\Program Files\ApplicationFolder\Application.exe"

I checked the assembly binding errors and the code access security for every assembly in the application, debugged the application, and so on, without results.

Changing the old DOS path did the trick!!!

Why? I still have to get hold of the script guy to ask him why :-p

Happy coding!