5. October 2010 08:03
After years of deploying web site files using custom FTP scripts, Remote Desktop file transfers, and the like, we decided to give the VS 2010 Publish Web FTP feature a try.
It worked flawlessly the first couple of times until we started receiving errors on the staging/deployment server that were not reproducible on the development machines.
After doing some digging around, we found out that the web publish process was the culprit.
It turns out some files were not being replaced/updated when their versions changed on the development machines.
In our case, the affected files were web server controls sitting in one of the web site's subfolders.
For some reason the publish process did not detect the changes, even though earlier versions of the same files had been deployed correctly. It appears there is a bug in the file change detection algorithm inside the FTP publish process.
To solve the problem we had to force the publish web process to detect changes by deleting all the old files before deploying a new version.
To do this, make sure the “Delete All Existing Files Prior to Publish” option is checked in the Publish Web dialog's FTP publish options.
This makes deployment slower, but at least you know you are getting the latest version of every file each time you publish.
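We can only guess at the actual detection logic, but a timestamp-style comparison would explain what we saw. The minimal Python sketch below (hypothetical file names and timestamps, nothing from the real publish process) contrasts an incremental, timestamp-based upload plan with the clean-deploy plan the checkbox enforces: a changed file whose timestamp never advances past the server's copy gets skipped by the first strategy but always ships with the second.

```python
def incremental_plan(local, remote):
    """Timestamp-based sync (the kind of check we suspect misfires):
    upload only files that are new or have a newer local timestamp."""
    return sorted(path for path, ts in local.items()
                  if path not in remote or ts > remote[path])

def clean_deploy_plan(local, remote):
    """'Delete All Existing Files Prior to Publish': wipe the remote
    tree, then upload every local file; nothing can be skipped."""
    return sorted(remote), sorted(local)

# Hypothetical repro: MyControl.ascx changed on disk, but its timestamp
# did not advance past the copy already on the server.
local  = {"default.aspx": 200, "controls/MyControl.ascx": 100}
remote = {"default.aspx": 150, "controls/MyControl.ascx": 100}

print(incremental_plan(local, remote))    # the control is silently skipped
print(clean_deploy_plan(local, remote))   # everything deleted, then re-uploaded
```

The trade-off is exactly the one described above: the clean plan transfers every file on every publish, which is slower, but it removes any dependence on change detection being right.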