WCF Data Services and Web API with OData; choices, choices.

by Matt Milner 2. April 2013 15:13

Back in 2010, I wrote a course for Pluralsight on OData which covers the protocol in general and introduces the viewer to the client and server programming models. This year, Microsoft released updates to the ASP.NET Web API which include support for OData in controllers.  Since this latest release, I’ve received several questions about the differences between these two toolsets for building OData services, along with requests for guidance on which to use.  This is my attempt to answer those queries. 

 

OData

OData is a protocol developed by Microsoft and others for data access over the web, built on HTTP and using formats such as AtomPub and JSON. One of the benefits of OData is a consistent query experience, defined in the protocol, that enables rich querying using URI query string parameters. This consistent query syntax, much like ANSI SQL, provides a platform-neutral API for working with data.

This means I might be able to write a query like this:

http://pluralsight.com/odata/Categories?$filter=Name eq 'OData' 

 

There are a variety of query string options ($filter, $orderby, $top, $skip, $select, and more) you can use to filter and identify the resource(s) you want to read or update. I can use this same pattern to apply filters to other OData services using their entity properties.
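For instance, against a hypothetical service exposing Categories and Courses entity sets (the URLs below are illustrative, not real endpoints), queries like these combine those options:

http://pluralsight.com/odata/Categories?$orderby=Name&$top=5

http://pluralsight.com/odata/Courses?$filter=Category eq 'OData'&$select=Title,Author

http://pluralsight.com/odata/Courses?$skip=20&$top=10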

 

WCF Data Services

WCF Data Services is Microsoft’s library for building OData services or writing OData clients.  On the server side, the framework provides a very quick, simple model for exposing all or part of an Entity Framework model as an OData-compatible service with little or no code. This service, scaffolded in minutes, supports read, insert, update, and delete if configured to allow them. 
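Here is a minimal sketch of what that looks like, assuming WCF Data Services 5.x and an Entity Framework context called PluralsightEntities with Courses and Categories entity sets; all of the names are illustrative, not from a real project.

using System.Data.Services;
using System.Data.Services.Common;

// Hosted as a .svc file; the entire service is this one class.
public class CoursesDataService : DataService<PluralsightEntities>
{
    public static void InitializeService(DataServiceConfiguration config)
    {
        // Expose only what you intend to: read access to every entity set,
        // full read/write access to Courses alone.
        config.SetEntitySetAccessRule("*", EntitySetRights.AllRead);
        config.SetEntitySetAccessRule("Courses", EntitySetRights.All);
        config.DataServiceBehavior.MaxProtocolVersion = DataServiceProtocolVersion.V3;
    }
}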

If you don’t have an Entity Framework model, you can expose a simple .NET object with IQueryable properties for read-only access, or implement the IUpdatable interface to support update, insert, and delete on any collection. 

This framework provides the quickest way to get a service up and running when the data model is the primary focus of your application. You also have the ability to extend the service with service operations: functions exposed over the HTTP API. For example, at Pluralsight we could have a method to return the top 10 courses. This might be a convenience to save the client from having to compute this themselves, or it might be because the data needed to make that distinction isn’t exposed in the service, so the client couldn’t compute or filter to get the same results. 
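A sketch of that kind of service operation might look like the following; it would live in the data service class shown earlier, and the Course entity, its ViewCount property, and the operation name are all assumptions for illustration.

// Enable the operation in InitializeService with:
// config.SetServiceOperationAccessRule("TopCourses", ServiceOperationRights.AllRead);
[WebGet]
public IQueryable<Course> TopCourses()
{
    // Clients call this at ~/TopCourses; the result is still an OData feed.
    return CurrentDataSource.Courses
        .OrderByDescending(c => c.ViewCount)
        .Take(10);
}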

On the client side, the WCF Data Services library provides a .NET interface over the OData protocol and exposes the query semantics as a LINQ provider.  This enables .NET developers to access the data in an OData service as they would any other data source.
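A minimal client sketch, assuming an “Add Service Reference” (or DataSvcUtil) generated context named PluralsightEntities; the context name, entity set, and URL are illustrative only.

using System;
using System.Linq;

class Program
{
    static void Main()
    {
        var context = new PluralsightEntities(new Uri("http://pluralsight.com/odata/"));

        // The LINQ provider translates this query into
        // /Categories?$filter=Name eq 'OData' on the wire.
        var odataCategories = from c in context.Categories
                              where c.Name == "OData"
                              select c;

        foreach (var category in odataCategories)
        {
            Console.WriteLine(category.Name);
        }
    }
}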

Microsoft has been moving some OData features into the OData Library to enable reuse in many different scenarios.  This means you don’t have to accept the default WCF Data Services model, especially if you don’t have an EDM for your data source. 

You can, obviously, use the client and service independently.  That is, even if you develop your service using another framework, perhaps not even Microsoft, you can use the client library to access it.

 

ASP.NET Web API

The ASP.NET Web API was introduced last year as a framework for building HTTP services; that is, services that expose their functionality over HTTP (these may or may not be REST services). You build these services using controllers, much like ASP.NET MVC development for web applications.  The services are most often focused on exposing certain resources and enabling various actions on those resources.

One of the features of ASP.NET Web API is content negotiation. This enables a client to request a resource, a Course for example, and indicate (using the HTTP Accept header) that they would like the response in JSON, XML, or any other format. If the server can support the response type, it does so, serializing the data appropriately. 

It is only natural that customers would want OData’s JSON or AtomPub formats for exposing their resources, and would request support for the query syntax. The beauty of OData is that you don’t have to write umpteen methods for querying (GetCustomer, GetCustomersByCity, GetCustomersByRegion, etc.). So, using pieces of ODataLib, the Web API team enabled support for OData query syntax on API controller methods, and enabled update semantics as well. 
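A sketch of what that looks like, assuming the ASP.NET Web API OData package is installed; the attribute name, the Course model, and the PluralsightEntities context are assumptions worth verifying against your own project.

using System.Linq;
using System.Web.Http;

public class CoursesController : ApiController
{
    private readonly PluralsightEntities db = new PluralsightEntities();

    // [Queryable] applies $filter, $orderby, $top, $skip, etc. to the IQueryable
    // you return, e.g. GET /api/courses?$filter=Category eq 'OData'&$top=5
    [Queryable]
    public IQueryable<Course> Get()
    {
        return db.Courses;
    }
}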

 

Making the decision

Having said all that, I would summarize as follows: WCF Data Services focuses on the data model and limits code, while Web API focuses on the controller/code and enables the formatting and query syntax of OData.

So, if you are looking to expose a data model (EDM or otherwise) quickly and don’t need a lot of code or business logic, WCF Data Services makes that REALLY easy and would be a good starting point. 

If, however, you are building an API and simply want to expose some resources using either OData query syntax or formatting, then ASP.NET Web API is probably the best place to start. 

I hope this is helpful and have fun writing your services no matter what toolset you choose.

Tags:

ASP.Net | WebAPI | Windows Communication Foundation | OData

WebAPI or WCF?

by Matt Milner 28. February 2012 13:44

Updated [2/29/2012]: added more information on why HTTP and thus WebAPI is important.

I’ve been part of several conversations over the past few weeks where someone posited the question: Now that WebAPI is out, how do I (or my customers) decide when to use it or WCF? This question actually has many different flavors:

  • Is WCF done? Does WebAPI replace WCF? Should I stop using WCF HTTP?
  • Why is WebAPI part of ASP.NET? Wasn’t WebAPI originally a WCF framework?
  • If WebAPI is part of ASP.NET, why don’t I just use MVC? What does WebAPI give me over MVC?

 

Is WCF done?

WCF is not done, nor is it going away anytime soon. WCF is the framework to use to build services that are flexible with regard to transport, encoding, and various protocols. This was precisely what WCF was designed for and what it does extremely well. WCF enables me to write service code and contracts which can then be exposed over various bindings (transport, security, etc.). That hasn’t changed and continues to be the case. If you are building a service in your organization and plan to support multiple protocols, or simply use protocols other than HTTP (TCP, named pipes, UDP, etc.), then WCF continues to be your choice.

If you happen to want to expose your service over HTTP with WCF you have two high-level choices: SOAP over HTTP or web HTTP. Obviously SOAP over HTTP is simply a different endpoint/binding choice, again where WCF shines. You can also expose your service using the WCF web HTTP model that has been around since .NET 3.5. This model changes the dispatching to happen based on URI templates and HTTP verbs rather than SOAP actions. The WCF web HTTP model also provides some help with help documentation, surfaces faults in an HTTP-friendly way (think status codes), and returns content in web-friendly formats such as JSON. 
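A sketch of that web HTTP model is below; the Course type, operation names, and URI templates are illustrative, but the attributes are the standard ones from System.ServiceModel.Web.

using System.ServiceModel;
using System.ServiceModel.Web;

// Dispatching is based on the HTTP verb and URI template rather than a SOAP action.
[ServiceContract]
public interface ICourseService
{
    [OperationContract]
    [WebGet(UriTemplate = "courses/{id}", ResponseFormat = WebMessageFormat.Json)]
    Course GetCourse(string id);

    [OperationContract]
    [WebInvoke(Method = "POST", UriTemplate = "courses",
        RequestFormat = WebMessageFormat.Json,
        ResponseFormat = WebMessageFormat.Json)]
    Course AddCourse(Course course);
}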

But, and there had to be a but, WCF was built in a transport-neutral fashion. That’s a selling point, except when you do care about the transport and really want to leverage HTTP, for example.

 

Why is WebAPI part of ASP.NET and not WCF?

Somewhere during development, WCF WebAPI became ASP.NET WebAPI.[1] Knowledge that this occurred is often what leads to the previous questions about the fate or uses of WCF. In my opinion, and this is just that, WCF as the backbone of WebAPI was not the best option because in order to care about HTTP you had to work around a lot of WCF infrastructure. Things like the core Message abstraction were built to embrace any transport and didn’t easily support (note I said “easily”) the various content types that might be negotiated.

When talking with colleagues and looking at what people are doing to build web APIs, the most common choice was overwhelmingly NOT WCF. In fact, the top choices were either an open source platform or using MVC controllers to return JSON results to client pages. The reason, as I see it, is that all these platforms made it easier to get a web API up and running while allowing you close control over HTTP when you care about it. For someone simply trying to return some objects to a client as JSON within their MVC web application, it is really simple to add a method to the existing controller and return that data. No configuration, no bindings, nothing but their models and existing controllers.

HTTP is important

Getting close to HTTP allows you to take advantage of the protocol. This means I can fully leverage features of HTTP such as caching, ETags, status codes and the like. Why is this important? There are a variety of reasons but I’ll focus on a few. Caching GET requests is a huge part of HTTP and of scaling any web site/service. One of SOAP’s big failings is that it relies exclusively on HTTP POST when using HTTP as a transport and so cannot take advantage of caching of requests, even if those requests are returning slowly changing or unchanging data. Getting close to HTTP allows me to set expiration headers easily on the response and control the caching of my content on the client, intermediaries, etc.
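A minimal sketch of what that looks like in a Web API action; the CourseRepository and Course types are hypothetical, but the header APIs are the standard System.Net.Http ones.

using System;
using System.Net;
using System.Net.Http;
using System.Net.Http.Headers;
using System.Web.Http;

public class CoursesController : ApiController
{
    private readonly CourseRepository repository = new CourseRepository();

    public HttpResponseMessage GetCourse(int id)
    {
        Course course = repository.Find(id);
        var response = Request.CreateResponse(HttpStatusCode.OK, course);

        // Allow clients and intermediaries to cache this GET for five minutes.
        response.Headers.CacheControl = new CacheControlHeaderValue
        {
            Public = true,
            MaxAge = TimeSpan.FromMinutes(5)
        };
        return response;
    }
}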

Being able to work easily with ETags enables me to leverage conditional GETs and manage application concerns such as concurrency. Status codes allow me to be explicit when responding to clients about what happened with their request.  As an example, when someone posts a new resource to my service, I want to respond with success (2xx status code), but I also want to provide the right code indicating that the resource was created (201) and provide the Location header so the client knows the exact URL of the resource just created. Being close to HTTP gives me the ability to send the appropriate status code and the appropriate headers so the client can get a richer response, all with the existing HTTP protocol.
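As a sketch, another action on the same hypothetical controller could respond to a POST this way; the "DefaultApi" route name and the repository are assumptions.

public HttpResponseMessage PostCourse(Course course)
{
    Course created = repository.Add(course);

    // 201 Created plus a Location header telling the client exactly where
    // the newly created resource lives.
    var response = Request.CreateResponse(HttpStatusCode.Created, created);
    response.Headers.Location =
        new Uri(Url.Link("DefaultApi", new { id = created.Id }));
    return response;
}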

 

It makes sense, when you care about HTTP, to use MVC . . . but MVC is not the best tool for building services either.

 

What does WebAPI give me over MVC?

ASP.NET MVC provides some great tools that could be leveraged for services, including model binding and routing. For most people building web APIs, however, there are other concerns as well. As a simple example, I’ve always felt a little uncomfortable building services in MVC because of the standard routing model that includes the action in the URI. A little thing, sure, and something I could work around with some MVC extensions of my own. Web API provides me a model for routing based on the HTTP verb rather than a URI that contains an action. This puts me close to the HTTP protocol, simplifies my routing, and seems right to me. In addition, Web API allows me to fully leverage content negotiation to enable returning various representations of my objects/resources. This means I have a pluggable model for allowing the client to tell me what representation they would like (text/xml, application/json, text/calendar) and for choosing the best formatter to create the best-match representation. All this comes with the ability to use routing, dependency resolution, unit testing, and model binding.
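To make the routing point concrete, here is a sketch of the conventional default Web API route (typically registered in WebApiConfig); note there is no {action} segment at all, so the HTTP verb picks the controller method.

using System.Web.Http;

public static class WebApiConfig
{
    public static void Register(HttpConfiguration config)
    {
        // GET /api/courses/5 dispatches to CoursesController.Get(int id),
        // POST /api/courses dispatches to Post(Course course), and so on.
        config.Routes.MapHttpRoute(
            name: "DefaultApi",
            routeTemplate: "api/{controller}/{id}",
            defaults: new { id = RouteParameter.Optional });
    }
}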

In addition, WebAPI allows you to self-host your services a la WCF (and in fact uses a little WCF under the covers to enable this) so you can, if you choose, go outside ASP.NET / IIS as the host of your service and continue to leverage all these great benefits. This enables you to host your HTTP services in any .NET AppDomain and still use the same routes, controllers, etc.
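A minimal self-hosting sketch, assuming a console application with the Web API self-host package installed; the port is arbitrary and the route matches the one above.

using System;
using System.Web.Http;
using System.Web.Http.SelfHost;

class Program
{
    static void Main()
    {
        var config = new HttpSelfHostConfiguration("http://localhost:8080");
        config.Routes.MapHttpRoute(
            "DefaultApi", "api/{controller}/{id}",
            new { id = RouteParameter.Optional });

        using (var server = new HttpSelfHostServer(config))
        {
            // Same controllers and routes as under IIS, just a different host.
            server.OpenAsync().Wait();
            Console.WriteLine("Listening on http://localhost:8080 - press Enter to stop.");
            Console.ReadLine();
        }
    }
}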

 

So . . . ?

WCF remains the framework for building services where you care about transport flexibility. WebAPI is the framework for building services where you care about HTTP.

 

What do YOU think?

 

[1] To be exact, after the 6th preview release of WCF WebApi

Demo from HDC 10

by Matt Milner 14. September 2010 16:02

Last week I had the pleasure of presenting two sessions at the Heartland Developer’s Conference (HDC). I love this show and meeting people from the central region, as well as catching up with colleagues. It’s always a great time and this year was no different.

For those who attended, you can find the samples I used at the links below. Thanks for coming, I hope you enjoyed the show as much as I did.

 

WF 4 from hello world to real world

Choosing service technology

Tags:

Windows Workflow Foundation | Windows Communication Foundation | Presentations

Interested in web services interoperability with WCF?

by Matt Milner 21. June 2010 05:38

Then go take this survey and let Microsoft feel your pain!  Not sure anyone will get zapped by their chair when you send feedback, but the team is actively looking into how to make the interop story better, so be sure to get your voice heard.  Oh, and it’s a short survey so not a huge time commitment.

http://mymfe.microsoft.com/Feedback.aspx?formID=283

Tags:

Windows Communication Foundation

Custom web faults with System.ServiceModel.Web 3.x

by Matt Milner 14. June 2010 10:04

A former student approached me with a problem related to the web programming model using WCF in .NET 3.5.  In short, he was using a custom IErrorHandler to create a custom fault message, but the client was always receiving a generic error.  Even more of a problem was that the custom error was an HTML-formatted message, despite having set the response format on the service to JSON.  This caused big problems for the AJAX client trying to reason over that response.  I knew that the WCF REST Starter Kit and WCF 4 both allowed for custom error messages, so I did some digging to see what might be at the root of the problem.  It turns out that the WebHttpBehavior inserts its own IErrorHandler, and it was getting in the way of the custom handler he was adding.  After pointing this out to Dave, he quickly realized he could create a class deriving from WebHttpBehavior and override the AddServerErrorHandlers method to insert his own error handler.  He also created the requisite BehaviorExtensionElement so the new endpoint behavior could be added in the configuration file. 

 

public class JsonWebHttpBehavior : WebHttpBehavior
{
    protected override void AddServerErrorHandlers(ServiceEndpoint endpoint,
        System.ServiceModel.Dispatcher.EndpointDispatcher endpointDispatcher)
    {
        // Replace the default web error handling with the JSON-aware handler.
        // The base implementation, which would add the standard WebHttpBehavior
        // error handler, is intentionally not called.
        endpointDispatcher.DispatchRuntime.ChannelDispatcher.ErrorHandlers.Add(
            new JsonErrorHandler(
                endpointDispatcher.DispatchRuntime.ChannelDispatcher.IncludeExceptionDetailInFaults));
    }
}

public class JsonWebHttpElement : BehaviorExtensionElement
{
    // Lets the behavior be registered and applied via the configuration file.
    protected override object CreateBehavior()
    {
        return new JsonWebHttpBehavior();
    }

    public override Type BehaviorType
    {
        get { return typeof(JsonWebHttpBehavior); }
    }
}
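As a hedged usage sketch (not part of Dave's original solution), if you are self-hosting with a WebServiceHost you could also attach the derived behavior in code instead of registering the JsonWebHttpElement in configuration; CourseService and ICourseService here are hypothetical service types.

var host = new WebServiceHost(typeof(CourseService),
    new Uri("http://localhost:8081/courses"));
var endpoint = host.AddServiceEndpoint(typeof(ICourseService),
    new WebHttpBinding(), "");

// Adding JsonWebHttpBehavior means its AddServerErrorHandlers override,
// and therefore the JsonErrorHandler, is used for this endpoint.
endpoint.Behaviors.Add(new JsonWebHttpBehavior());
host.Open();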

 

The job of the custom error handler is to create a custom fault class that provides data back to the calling application.  This solution nicely takes into consideration the IncludeExceptionDetailInFaults property to correctly send the details only when configured to do so.  In this case, the status code is always set to 500 to trigger the correct error handling in the client library, but you could also modify this to send more specific HTTP status codes depending on the error caught on the server. 

 

[DataContract]
public class JsonFault
{
    [DataMember]
    public string ExceptionType;

    [DataMember]
    public string Message;

    [DataMember]
    public string StackTrace;
}

public class JsonErrorHandler : IErrorHandler
{
    private readonly bool includeExceptionDetailInFaults;

    public JsonErrorHandler(bool includeExceptionDetailInFaults)
    {
        this.includeExceptionDetailInFaults = includeExceptionDetailInFaults;
    }

    public bool HandleError(Exception error)
    {
        // No logging or session handling here; returning false lets WCF
        // perform its default error processing.
        return false;
    }

    public void ProvideFault(Exception error,
        System.ServiceModel.Channels.MessageVersion version,
        ref System.ServiceModel.Channels.Message fault)
    {
        // Only include exception details when the service is configured to allow it.
        JsonFault jsonFault;
        if (includeExceptionDetailInFaults)
        {
            jsonFault = new JsonFault
            {
                ExceptionType = error.GetType().FullName,
                Message = error.Message,
                StackTrace = error.StackTrace
            };
        }
        else
        {
            jsonFault = new JsonFault
            {
                ExceptionType = typeof(System.Exception).FullName,
                Message =
                    "An error occurred on the server. See server logs for details.",
                StackTrace = null
            };
        }

        // Serialize the fault as JSON and mark the message body format accordingly.
        DataContractJsonSerializer serializer =
            new DataContractJsonSerializer(typeof(JsonFault));

        fault = Message.CreateMessage(version, null, jsonFault, serializer);
        fault.Properties.Add(WebBodyFormatMessageProperty.Name,
            new WebBodyFormatMessageProperty(WebContentFormat.Json));

        // Return a 500 with a JSON content type so the AJAX client can parse the error.
        WebOperationContext.Current.OutgoingResponse.ContentType = "application/json";
        WebOperationContext.Current.OutgoingResponse.StatusCode =
            System.Net.HttpStatusCode.InternalServerError;
    }
}

 

 

You could certainly make modifications to only provide faults for certain types of exceptions (which is what .NET 4 does), log information in the HandleError method, etc.  Many thanks to Dave Grundgeiger for the inspiration to look into this and for the final solution, which he designed and allowed me to share here. 

Tags:

Windows Communication Foundation

Public courses listed for WCF and WF in .NET 4

by Matt Milner 14. May 2010 07:53

In addition to now having WF 4 offered as a private on-site course, we have several upcoming public offerings of our Double Feature course, which has been updated to .NET 4.  I’m excited to be teaching the course at the end of July in Boston and to cover the new features in WCF, including configuration enhancements and REST improvements.  The bigger change, of course, is the entirely new WF 4 programming model.  In these open enrollment classes, we will be covering the new programming model, activity development, and the runtime services such as persistence and tracking.  We’ll also cover the convergence of these two technologies in Workflow Services and the new message correlation capabilities introduced in .NET 4. 

So, if you are interested in an intense week of training in the Boston (July 26) or SoCal (Oct 11) area on these two great frameworks, register or save a seat before the classes fill up!

Tags:

Windows Workflow Foundation | Windows Communication Foundation

Windows Server AppFabric hits beta 2

by Matt Milner 1. March 2010 07:26

You can download the bits and get some great samples in the MSDN development center.  AppFabric is the combination of the distributed caching features previously codenamed “Velocity” and the composite application management features previously codenamed “Dublin”.  These extensions to Windows Server continue to make it a great platform for building applications.  I’m excited about the tooling that AppFabric brings to the management of Workflow Services as it makes it much easier to configure, monitor and take action on deployed services.  In addition, AppFabric adds to the rich tracking story in WF by allowing you to store tracking information in a SQL database so you can do historical analysis. 

This build is based on the Release Candidate of .NET Framework 4, so you’ll want to have that installed before installing AppFabric. 

Tags:

Windows Workflow Foundation | Windows Communication Foundation | AppFabric

Twin Cities Cloud Computing User Group presentation

by Matt Milner 25. January 2010 05:02

I had a great time presenting at the TC Cloud Computing User Group about a week ago.  The group is still kind of small, but growing rapidly and pulling together good content as well as hands-on opportunities for people to get going on cloud technologies.  This particular presentation was on the Azure platform AppFabric (Service Bus and Access Control).  For those of you who attended, the demos I used are in the attachments, as I mentioned.

Tags:

Windows Communication Foundation

New podcast on .NET Service Bus with Jeff Brand

by Matt Milner 4. September 2009 17:18

A couple of weeks ago, I sat down with Jeff Brand to do a podcast on the .NET Service Bus and cloud computing in general.  We had a fun conversation about the cloud and why it is such a big deal, then dove into a more detailed conversation about the .NET Service Bus. Get the link for the audio from Jeff’s post.

I’m really getting excited for the release of Windows Azure and seeing the kinds of applications people build on this new platform.

Tags:

General Musings | Windows Communication Foundation

Beta version of “Dublin” course published to Pluralsight On Demand!

by Matt Milner 12. August 2009 07:17

We’ve recently added a short course on “Dublin” to our subscription-based on-demand library.  This is very much an early adopter course using very early bits released last year at the PDC.  The intention is not to provide detailed information on specifics around the technology, as those will most definitely change; rather, I wanted to provide enough information and detail so developers can figure out “What is Dublin?”. 

The modules provide an overview as well as a look at the tools provided by, or built upon, “Dublin” to deploy and manage your services. 

Hopefully, these early look modules are helpful to you as a developer as you prepare for the next wave of connected systems technologies from Microsoft.  As “Dublin” reaches further milestones such as Beta  and Release Candidate status, I will be updating the course, and adding more content to round it out with deeper detail. 

Enjoy!

Tags:

Windows Workflow Foundation | Windows Communication Foundation