I Don’t GET It

Safer Deletes Through POST Links in ASP.NET MVC

Besides providing an elegant .NET-based web application platform, ASP.NET MVC (as packaged in Visual Studio 2010) can help you crank out a decent CRUD interface before your coffee goes cold. Well, almost.

For reasons that aren’t quite crystal to your humble narrator, the scaffolding leaves the “D” as a lonely exercise for the MVC novice, and munsoning this implementation can open a loophole in your architecture that hangs your data out to dry.

What I’m circumlocuting here is the peril of implementing your Delete as an HTTP GET action, rather than as an HTTP POST. On the surface, it makes sense; a Delete call looks much more like a Read call than a Create or Update does, since the only argument is the item’s key. The danger in this implementation, however, is clarified by Stephen Walther in number 46 of his ASP.NET MVC Tips series:

In theory, someone could send an email to you that contains an image. The image could be embedded in the message with the following tag:

<img src="http://www.theApp.com/Home/Delete/23" />

Notice that the src attribute points at the Delete() method of the Home controller class. Opening the email (and allowing images in your email client) will delete record 23 without warning. This is bad. This is a security hole.
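For the record, the vulnerable flavor is a plain GET action that deletes on sight. A sketch, with a hypothetical repository call standing in for the real work:

//
// GET: /Thing/Delete/5 (the hole in question)
[Authorize]
public ActionResult Delete(int id)
{
    // Any GET request that reaches this URL (an <img> tag, a link
    // prefetcher, a crawler) deletes the record without ceremony.
    repository.DeleteThing(id);   // hypothetical data-access call

    return RedirectToAction("Index");
}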

Walther goes on to demonstrate solving this problem from several different angles; I have boiled it down to one straightforward jQuery-based approach for your consumption. First, go server-side and enforce that your Controller’s Delete method only responds to POST requests using an AcceptVerbsAttribute:

//
// POST: /Thing/Delete/5
[Authorize]
[AcceptVerbs(HttpVerbs.Post)]
public ActionResult Delete(int id)
{
    // TODO: Delete the thing!

    // go back to the main view
    return RedirectToAction("Index");
}
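
If your project is on ASP.NET MVC 2, the terser [HttpPost] attribute does the same job as the AcceptVerbs attribute here.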

Next, include this jQuery function in your Master Page:

$(document).ready(function () {

    // force .post links to POST
    $("a.post").live('click', function (e) {

        // all of your default behavior are belong to us
        e.preventDefault();

        // POST and reload on callback
        $.post(this.href, null, function () { location.reload(); });
        
    });
    
});

This function will attach itself to any anchor tag bearing the “post” class and intercept its natural event behavior, which is to GET the given URL. It will then asynchronously POST to said URL, and refresh the page when that operation completes. For extra points, you can do something far more intriguing than calling location.reload here. I mean, it’s 2k9. I think the time has come for a custom animation in which our deleted record grows legs and walks off the screen, exuding a sort of romantic sadness. Don’t you?

Meanwhile, back in your view, apply the “post” class to any link which engages in dangerous and questionable data-bending behavior:

<a href="/thing/delete/<%= thing.ID.ToString() %>" class="post">Delete</a>
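If you’d rather not hand-roll the anchor, Html.ActionLink can carry the class along in its htmlAttributes parameter; a sketch, assuming the same thing variable and controller naming as above:

<%= Html.ActionLink("Delete", "Delete", new { id = thing.ID }, new { @class = "post" }) %>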

Now, should some click-aggressive philistine navigate to this URL from outside the cozy boundaries of your application, they will receive an HTTP 404. This will likely be followed up with a terse call to the helpdesk announcing that OMG THE WEB SITE IS DOWN. But unlike the hapless web dude fielding that call, your data will be safe and sound.


Putting Down Routes

Boost Your Site’s SEO Friendliness using ASP.NET 4.0

The Dark Art of Search Engine Optimization involves a number of mysterious rituals and incantations used to divine the favor of the major search engines’ crawler robots for your particular corner of the Web. If your site makes liberal use of query string arguments in its URLs, however, it can introduce snags and snares for these benevolent bots to stumble over, causing them to turn tail on some of the more important portions of your online presence. Non-ASCII characters, spaces, and other URI misfits that your browser will put up with can all be problematic for your SEO profile. The common fix is to streamline your URLs using routing.

For those of you using ASP.NET MVC, you can rest easy knowing that you already have this capability baked into your site, and you can go about living your awesome, exciting lives on the bleeding edge and flying home in your jetpack. Many of us, however, are still wrangling ye old-fashioned ASP.NET Web Forms, only daring to dream of rocking sleek and modern SEO-friendly URLs.

Gu to the Rescue

In the eighth of a series of posts on the forthcoming ASP.NET 4.0, our hero at Microsoft, Scott Guthrie, breaks down for us how SEO-cooperative URL-ing works, and how we can easily retrofit this technique onto our dusty old Web Forms apps that aren’t so hot at gussying themselves up for the bots:

For example, the URL for a traditional page that displays product categories might look like below:

http://www.mysite.com/products.aspx?category=software

Using the URL routing engine in ASP.NET 4 you can now configure the application to accept the following URL instead to render the same information:

http://www.mysite.com/products/software

With ASP.NET 4.0, URLs like above can now be mapped to both ASP.NET MVC Controller classes, as well as ASP.NET Web Forms based pages.

Guthrie goes on to provide an example of how this is done in his post.

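Rather than reproduce his walkthrough wholesale, here is a minimal sketch of that wiring in ASP.NET 4.0, using MapPageRoute in Global.asax; the route name, URL pattern, and page path are illustrative:

void Application_Start(object sender, EventArgs e)
{
    // Map the friendly URL /products/software onto the existing Web Form
    System.Web.Routing.RouteTable.Routes.MapPageRoute(
        "ProductsByCategory",       // route name
        "products/{category}",      // SEO-friendly URL pattern
        "~/Products.aspx");         // the page that used to read the query string
}

Inside Products.aspx, the category then comes from Page.RouteData.Values["category"] rather than Request.QueryString["category"].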

This is a great example of low-hanging fruit available upon upgrading to Visual Studio 2010/ASP.NET 4.0 that can provide a real-world boost to your existing web applications, without the unseemly chicken sacrifice. Also, it’s shiny.

URL Routing with ASP.NET 4 Web Forms (VS 2010 and .NET 4.0 Series) [ScottGu’s Blog]

Thank You For Sharing

Automated Deployment for SharePoint 2007 with Team Server and VSeWSS

So you’ve built a custom Web Part. VSeWSS has been your tool of choice, especially since you lassoed that pkg folder into source control. It deftly dispatches the drudgery of creating WSP and XML files from your content and code. It even writes you an adorable little batch file to provide a smooth install with just a double-click. But you and I both know that’s not enough. It’s the twenty-first flippin’ century, and we want Continuous Integration.

I’m going to go ahead and assume that you have checked your VSeWSS project into Team Foundation Server, and created a basic Team Build that is kicked off on a check-in. If you haven’t, by all means, go set that up. The rest of the class and I are waiting.

Ps I Love You

The SharePoint administration command-line tool stsadm is designed in such a way that it only operates on the SharePoint server instance on the local machine. This is all fine and dandy for the small shop whose Team and SharePoint Servers (as well as their Exchange, SQL, and warez servers) reside on the same put-upon little machine. For the rest of us, however, this means that we have to bribe a lonely sysadmin to log on to the SharePoint box and copy/execute the aforementioned batch file. Or, we could whisper a prayer and have it answered by a miraculous man known only as Mark Russinovich.

Several years back, Mr. Russinovich decided that it was about time we had a decent tool to run commands on a remote Windows machine. He was right, and today we have PsExec. Step one, for us, is to download and install PsTools on our Team Server host.

If You MSBuild It…

Step two is slightly more developer-intensive. And by developer-intensive, I of course mean that you will have to copy and paste the MSBuild code from this post into your TFSBuild.proj file, pausing only briefly to modify some configuratory values on your way to SharePointy continuously integrational bliss. You probably will not need to alter the Executables values from the listing below except for the psexec credentials, but surely you must have a better name for your SharePoint solution.

<!--Figure the First, in which our constants are defined -->
<PropertyGroup>

  <!--Target Server (defined up front, since MSBuild evaluates properties in order and $(Psexec) below refers to it)-->
  <TargetServer>development.sharepoint.apterasoftware.com</TargetServer>

  <!--Executables-->
  <Devenv>"%ProgramFiles%\Microsoft Visual Studio 9.0\Common7\IDE\devenv"</Devenv>
  <Psexec>psexec /i /accepteula \\$(TargetServer) /u spAdminUsername /p s01337itHz</Psexec>

  <!--Paths-->
  <SolutionDir>$(SourceDir)\BestPortalEver</SolutionDir>
  <ProjectDir>$(SolutionDir)\MahWebParts</ProjectDir>
  <TempDropFolder>%TEMP%\$(BuildNumber)</TempDropFolder>

  <!--Etc.-->
  <Configuration>Release</Configuration>
  <SolutionFileName>BestPortalEver.sln</SolutionFileName>
  <WSPName>Aptera.Blog.BestPortalEver.MahWebParts.wsp</WSPName>

</PropertyGroup>

The third step is to add an AfterCompile Target, in which we will spin up a second instance of Visual Studio to build the solution with the /Package switch. This will (oddly enough) fill out the pkg folder with the WSP and setup.bat files needed to deploy our solution.

<!--Figure the Second, in which we do it again the right way.-->
<Target Name="AfterCompile">
    
    <!-- Build using /package switch  -->
    <Exec Command="$(Devenv) "$(SolutionDir)\$(SolutionFileName)" /Deploy $(Configuration) /Package" />
    
    <!--Copy Package to the Drop Location-->
    <Exec Command="xcopy /y /e /i "$(ProjectDir)\bin\$(Configuration)\*.*" "$(DropLocation)\$(BuildNumber)\$(Configuration)\"" />

</Target>

This seems simple and straightforward enough; but keep your tin foil hat on, this is pre-2010 SharePoint development. The oddity here is that the pkg folder (being, from Team Server’s point of view, more source code than resultant binary files) defaults to Read-Only in the workspace. If you would be so bold as to launch your build at this point, it would trip on this step and fail with the following edifying pronouncement:

------ Validate solution and resolve deployment conflict(s) ------
Validating solution ...
EXEC : error : System.UnauthorizedAccessException
    Access to the path 'C:\Documents and Settings\tfsBuildUser\Local Settings\Temp\Aptera.Blog.BestPortalEver\Check-in Build\Sources\Aptera.Blog.BestPortalEver\MahWebParts\pkg\solution.xml' is denied.

The fix for this flounder is to remove the Read-Only flag from our favorite little output directory, just before the call to devenv:

<!-- Make the pkg folder and its contents writable -->
<Exec Command="attrib -R "$(ProjectDir)\pkg\*.*" /S /D" />

What Can PsExec Do For You

Now that we have a viable pkg folder full of deliverable SharePoint goodness, it is time to deliver the goods. To do this, we must call upon PsExec. Remember PsExec?

Back in step two, to simplify things, we added a property that sets up a call to PsExec with all of the necessary parameters: the remote host name, user credentials, and so on. Let’s see those again in slow-motion, for the people in the back:

<Psexec>psexec /i /accepteula \\$(TargetServer) /u spAdminUsername /p s01337itHz</Psexec>

Our fourth and final step is to create another Target named AfterDropBuild. In it we will use PsExec to copy the build’s outputs over to our SharePoint host, and execute the setup script:

<!--Figure the Third, in which the magic happens.-->
<Target Name="AfterDropBuild">
    
    <!--Copy Files to Target Server-->
    <Exec Command="$(Psexec) xcopy /y /e "$(DropLocation)\$(BuildNumber)\$(Configuration)\" "$(TempDropFolder)\"" />
    
    <!--Deactivate and Uninstall previous version of the Solution-->
    <Exec Command="$(Psexec) echo . | "$(TempDropFolder)\setup.bat" /uninstall " />
    
    <!--Install and Activate the Solution-->
    <Exec Command="$(Psexec) echo . | "$(TempDropFolder)\setup.bat" /install " />
    
</Target>

Thanks to John W Powell for providing the basis for this approach.

P.S. – A quick sidebar on calling VSeWSS’s setup.bat

Presumably under great duress and/or substance abuse, the decision was made on the VSeWSS team that when the generated setup script had finished its business, it would say so and wait more patiently than you for someone to hit the Enter key before it exited completely. This is highly convenient and even polite in the context of our lonely, lonely sysadmin running said batch file on our behalf, but it gets old pretty quickly when your Team Build is waiting just as patiently for the script to return control. Echoing a period and piping it into a script is a common technique for simulating the required keystroke at the script’s prompt.

In case that was keeping you up at night.