This is an update to my original post Dynamic Robots with ASP.NET 2.0, which was written for IIS6. IIS7 makes this process even easier! Use this approach to deliver different robots.txt files depending on whether your site is being crawled over HTTP or HTTPS (or just to present your robots.txt file programmatically).
Let’s take a look…
Why do you care? So. Many. Reasons.
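The idea above can be sketched as a managed handler mapped to robots.txt. This is a minimal, hypothetical sketch (the class and the disallow rules are illustrative, not taken from the original post); under IIS7's integrated pipeline a managed handler can be registered for the robots.txt path in web.config.

```csharp
using System.Web;

// Hypothetical handler: serve different robots.txt content
// depending on whether the request arrived over HTTPS.
public class RobotsHandler : IHttpHandler
{
    public bool IsReusable { get { return true; } }

    public void ProcessRequest(HttpContext context)
    {
        context.Response.ContentType = "text/plain";

        if (context.Request.IsSecureConnection)
        {
            // Keep crawlers out of the HTTPS mirror of the site
            // so only the HTTP version gets indexed.
            context.Response.Write("User-agent: *\nDisallow: /\n");
        }
        else
        {
            // Normal rules for the plain-HTTP site (example paths).
            context.Response.Write("User-agent: *\nDisallow: /admin/\n");
        }
    }
}
```

The handler would then be wired up in web.config under `system.webServer/handlers` with `path="robots.txt"`, which is exactly the kind of mapping that IIS7's integrated mode makes straightforward.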
Today, I was tasked with making some Asp:Login and Asp:CreateUserWizard controls work. Specifically, the Enter key was misbehaving in a way peculiar to Asp.net: it was submitting the search button instead of the login form. After playing with various schemes, I settled upon the following for the Login control: <asp:Panel ID="pnlLogin" runat="server" DefaultButton="lgMain$LoginButton"> <asp:Login runat="server" ID="lgMain" /> </asp:Panel> I [...]
I am not anti "Web 2.0". I have enjoyed seeing the evolution of web applications under its banner, though I find the concept – as intangible as it can be – amusing at times. I have to say, though, that there are certainly instances when I know it has gone too far. I am finding more and more overly AJAX-ified sites which seem increasingly focused on how many items on the Web 2.0 checklist they can cross off, and less on the usability principles which will give their visitors the smooth and comfortable experience they deserve.
This article really only touches on one example that I have experienced recently: onBlur commits (AJAXified, of course).