Microsoft Certification 70-487: Azure and Web Services Certification Study Guide
This article is part of my personal wiki where I write notes while I am learning new technologies. These are my personal notes for the 70-487 certification on Microsoft Azure and Web Services. Feel free to use them when preparing for the certification exam!
This is a summary of all the topics that you need to prepare for this exam:
- Module 1
  - Choose Data Access Technologies: ADO.NET vs EF vs WCF Data Services vs Azure Storage
  - Caching
    - ObjectCache
    - HttpContext.Cache
    - Azure Cache
  - Transactions
    - TransactionScope, EntityTransaction, SqlTransaction
  - Azure Storage
    - Blobs
    - Tables
    - Queues
    - CDN
  - WCF Data Services
    - OData
  - XML
    - XmlReader, XmlWriter, XmlDocument
    - Xml namespaces
    - XSLT, XML Transformations
    - LINQ to XML, XDocument, XElement, XAttribute, etc
    - XPath
    - XmlConvert
- Module 2
  - EF
    - LINQ
    - LINQ to Entities
  - ADO.NET
- Module 3
  - WCF
    - Create, host and consume WCF web services
    - ServiceContract, OperationContract, DataContract, FaultContract
    - Endpoints and Bindings
    - WCF extensibility with inspectors and behaviors
    - WCF async operations
    - Create WCF Duplex Contracts (CallbackContract)
    - Configure WCF web services via XML configuration or .NET API
- Module 4
  - WEB API
    - HttpClient, HttpWebRequest, WebClient
    - Setup Authentication (Basic, Forms, etc)
- Module 5
  - Deployment
  - Add-ons
  - Azure Service Bus
    - Use an Azure Service Bus to expose a WCF service
  - SignalR
  - OWIN Basics
And here are some more extended notes:
Accessing Data
Choose data access technologies
Choose ADO.NET as data access technology based on requirements
ADO.NET was designed with support for large loads of data, security, scalability, flexibility and dependability in mind. Most of these are taken care of by the fact that ADO.NET has a bias toward disconnectedness: it is based on opening a connection, executing queries fast and immediately closing the connection. From there you work with a local version of the data until you need to submit changes back to the database.
Benefits of ADO.NET
- Disconnected model. It does not require persistent connections to the underlying data store.
- Cross-platform compatibility with support for many databases
- Lets you map database information to custom objects or to DataSets and DataTables
- Stable technology
- Easy to learn and understand
- Database agnostic. Even though Azure was not around when ADO.NET was built, you can use it against Azure SQL databases simply by modifying the connection string
Keywords: disconnected model, connection pooling, data providers, data adapter, data reader, DataSet, DataTable, DbConnection, DbCommand
TODO:
- READ more about DataAdapters and DataReaders
- EXPERIMENT using DataAdapters and DataReaders
- There’s more practical stuff about using ADO.NET in the second module (Querying and Manipulating Data with Entity Framework, which contains a submodule on querying and manipulating data with ADO.NET)
Choose Entity Framework as data access technology based on requirements
ORMs (Object Relational Mappers) appeared to bridge the gap between how entities are modeled in OOP languages and how they are modeled in relational databases, that is, objects vs tables. This is usually known as the impedance mismatch between OOP and RDBMSs. Entity Framework is the de facto ORM used within .NET.
- Benefits of EF
- Allows developers to work against OO domain models without regard to the underlying data storage solution
- Great tooling meant to increase developer productivity. Can create a database from a conceptual model or vice versa.
- Promotes separation of concerns between the conceptual model (business domain model) and the data store
- The data store can be changed without affecting the application logic
- EF is built to work perfectly together with WCF
Choose WCF Data Services as data access technology based on requirements
WCF Data Services helps you bring data access to distributed systems by easily exposing your data access layer via RESTful web services that use OData and JSON.
- Benefits of WCF Data Services
- It uses EF as foundation, so scenarios where EF is appropriate work well with WCF Data Services when we want to expose the data via web services
- Resources are addressable via URIs with OData and easily accessible via HTTP. This makes WCF Data Services very interoperable
- WCF Data services can serve Atom, JSON or XML
- WCF Data Services and OData allow for very powerful queries with easy semantics
Choose Azure storage as data access technology based on application requirements
See objective 1.4, Implement Data Storage in Azure, a few sections below.
- TODO
- READ more about Azure Storage
- EXPERIMENT with the different Azure Storage options: table, blob, queues, etc
Implement caching
When used effectively caching can improve application responsiveness and provide a much better user experience.
Cache static data
.NET provides many mechanisms for caching data, some of them are:
- ObjectCache
- HttpContext.Cache
- ASP.NET caching mechanisms: ApplicationState, Session, ViewState, OutputCache, Cookies, etc
- Azure shared (co-located) or dedicated caching
  - Shared (co-located), where part of a web or worker role's resources are used for caching purposes
  - Dedicated, where you add a new role type to the project, a Cache worker role, whose only purpose is to manage and host a cache
ObjectCache
You can access the ObjectCache by using the MemoryCache.Default property. It provides an API that allows you to add and remove items to and from the cache and to provide policies regarding how long items should remain within the cache:
ObjectCache cache = MemoryCache.Default;
cache.Add("key", "value", new CacheItemPolicy {AbsoluteExpiration = new DateTimeOffset(DateTime.Now.AddMinutes(30))});
string value = cache.Get("key").ToString();
The ObjectCache provides Add and Set methods in its API. Add adds an object to the cache only if it doesn't already exist, whereas Set adds the object even if it already exists (effectively updating it).
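A minimal sketch of that difference (the key, values and expiration are invented for illustration):

```csharp
using System;
using System.Runtime.Caching;

class ObjectCacheDemo
{
    static void Main()
    {
        ObjectCache cache = MemoryCache.Default;
        var policy = new CacheItemPolicy
        {
            AbsoluteExpiration = DateTimeOffset.Now.AddMinutes(30)
        };

        // Add only inserts when the key is not yet present; it returns
        // false (and leaves the cached value untouched) otherwise.
        cache.Add("greeting", "hello", policy);
        bool added = cache.Add("greeting", "bye", policy); // false, still "hello"

        // Set inserts or overwrites unconditionally.
        cache.Set("greeting", "bye", policy);

        Console.WriteLine(cache.Get("greeting")); // bye
    }
}
```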
- TODO
  - READ more about ObjectCache
  - EXPERIMENT with the ObjectCache
HttpContext.Cache
The HttpContext.Cache is part of ASP.NET and provides similar caching functionality to the ObjectCache. It is created once per AppDomain and remains active as long as the AppDomain is active. It consists of a key/value store that uses string keys and objects as values.
It provides a similar API to the ObjectCache, with a dictionary interface and Add and Insert methods. These provide overloads to directly specify policies for expiration, dependencies and remove/update callbacks.
- TODO
  - READ more about the HttpContext.Cache
  - EXPERIMENT with the HttpContext.Cache
Apply cache policy (including expirations)
Cache policies let you determine how a specific item will be cached. You can use an AbsoluteExpiration, where a cached item is removed after a specific period of time from when it was added to the cache, or a SlidingExpiration, where a cached item is removed a period of time after it was last read.
In addition to the expiration strategy you can also set the CacheItemPriority.
For the ObjectCache you have these options (System.Runtime.Caching.CacheItemPriority):
- Default: the default. There is no special priority for removing this item
- NotRemovable: this item cannot be removed from the cache
For the HttpContext.Cache you have these options (System.Web.Caching.CacheItemPriority): Low, BelowNormal, Normal, AboveNormal, High, NotRemovable, Default (which sets the priority to Normal).
It is interesting to remark that cache policies are represented by a single class, CacheItemPolicy, in the case of the ObjectCache, whereas for the HttpContext.Cache they are passed as separate arguments to the Insert method.
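A sketch contrasting the two APIs with a sliding expiration and an explicit priority (method and key names invented; the second method assumes it runs inside an ASP.NET request):

```csharp
using System;
using System.Runtime.Caching;
using System.Web;

public class CachePolicyExamples
{
    public static void WithObjectCache()
    {
        ObjectCache cache = MemoryCache.Default;

        // Policy object: evicted 20 minutes after the *last* read.
        cache.Set("recent", "value", new CacheItemPolicy
        {
            SlidingExpiration = TimeSpan.FromMinutes(20),
            Priority = System.Runtime.Caching.CacheItemPriority.NotRemovable
        });
    }

    public static void WithHttpContextCache(HttpContext context)
    {
        // Same policies, but passed as separate arguments to Insert.
        context.Cache.Insert(
            "recent", "value",
            null,                                           // no dependencies
            System.Web.Caching.Cache.NoAbsoluteExpiration,
            TimeSpan.FromMinutes(20),                       // sliding expiration
            System.Web.Caching.CacheItemPriority.High,
            null);                                          // no removal callback
    }
}
```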
- TODO
- READ more about cache policies
- EXPERIMENT with cache policies
Use CacheDependency to refresh cache data; query notifications
Cache dependencies with ObjectCache
The CacheItemPolicy has a collection of so-called ChangeMonitors. These allow you to create cache dependencies so that, if any of these dependencies change, the cache is updated. There are several change monitors that provide dependencies to different sources:
- CacheEntryChangeMonitor: monitors cache items for changes
- FileChangeMonitor: monitors files for changes
- HostFileChangeMonitor: monitors directories and file paths for changes
- SqlChangeMonitor: monitors tables for changes
- TODO
  - READ more about cache dependencies via change monitors
  - EXPERIMENT with change monitors
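For instance, a HostFileChangeMonitor can evict a cached item as soon as the underlying file changes on disk; a minimal sketch (path and cache key invented):

```csharp
using System.Collections.Generic;
using System.IO;
using System.Runtime.Caching;

public class FileDependencyExample
{
    public static void CacheFileContents(string path)
    {
        ObjectCache cache = MemoryCache.Default;

        var policy = new CacheItemPolicy();
        // The cached entry is invalidated whenever the watched file changes.
        policy.ChangeMonitors.Add(
            new HostFileChangeMonitor(new List<string> { path }));

        cache.Set("file-contents", File.ReadAllText(path), policy);
    }
}
```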
Cache dependencies with HttpContext.Cache
With the HttpContext.Cache, cache dependencies are represented by the CacheDependency class and its derived classes (SqlCacheDependency). These cache dependencies let you establish relationships between a cached item and another cached item, a file, an array of either, a SQL database or any other CacheDependency object.
- TODO
- READ more about cache dependencies via CacheDependency
- EXPERIMENT with CacheDependency
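The equivalent file-based dependency with the ASP.NET cache looks roughly like this (path and key invented; assumes an ASP.NET request context):

```csharp
using System.IO;
using System.Web;
using System.Web.Caching;

public class CacheDependencyExample
{
    public static void CacheWithFileDependency(HttpContext context, string path)
    {
        // The cached entry is invalidated whenever the file changes.
        var dependency = new CacheDependency(path);

        context.Cache.Insert(
            "file-contents",
            File.ReadAllText(path),
            dependency);
    }
}
```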
Implement transactions
Manage transactions by using the API from System.Transactions namespace
In order to have a transaction we need to meet the ACID constraints: Atomic, Consistent, Isolated and Durable. Transactions have two primary purposes: providing reliable units of work that allow us to recover correctly from failures and keep the database consistent even when a system failure occurs, AND providing isolation between programs accessing a database concurrently.
Transactions let you see interactions with a database as an all-or-nothing operation. Either everything we want to do with the database is executed as we want, or everything is rolled back and the database remains in its previous stable state.
Two interesting concepts related to transactions are:
- Isolation levels determine the conditions by which a transaction is isolated from other transactions
- Simple vs distributed transactions where simple transactions affect one single database and distributed ones affect many databases
The core transaction objects exist in the System.Transactions namespace. The other two important ones exist in System.Data.SqlClient and System.Data.Entity.
The System.Transactions.TransactionScope class lets you define transactions via a using block and its Complete method. The using block determines which operations should be considered part of the transaction and the Complete method marks when the transaction should be committed. If an exception is thrown within the using block, the transaction will be rolled back.
using(var scope = new TransactionScope()){
// do stuff...
// commit
scope.Complete();
}
Even though EF operates automatically with transactions (on SaveChanges), there may be situations in which you'll need to use EntityCommand. In those cases you can use the TransactionScope OR the EntityTransaction class together with an EntityCommand and an EntityConnection. You can create an entity transaction by using the BeginTransaction method of the connection object, pass it as an argument to an EntityCommand, and commit the whole thing via the EntityTransaction.Commit method. (Summary: for EF and transactions, use DbContext.SaveChanges, or TransactionScope, or EntityTransaction.)
When using ADO.NET you also have access to the SqlTransaction class, which works in exactly the same fashion as an EntityTransaction.
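A sketch of the SqlTransaction pattern (the connection string and the Accounts table are placeholders):

```csharp
using System.Data.SqlClient;

public class SqlTransactionExample
{
    public static void Transfer(string connectionString)
    {
        using (var connection = new SqlConnection(connectionString))
        {
            connection.Open();
            SqlTransaction transaction = connection.BeginTransaction();
            try
            {
                var command = connection.CreateCommand();
                command.Transaction = transaction; // enlist the command
                command.CommandText =
                    "UPDATE Accounts SET Balance = Balance - 10 WHERE Id = 1";
                command.ExecuteNonQuery();

                command.CommandText =
                    "UPDATE Accounts SET Balance = Balance + 10 WHERE Id = 2";
                command.ExecuteNonQuery();

                transaction.Commit(); // all or nothing
            }
            catch
            {
                transaction.Rollback();
                throw;
            }
        }
    }
}
```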
- TODO
- READ more about transactions in .NET
- READ more about System.Transactions, TransactionScope, EntityTransaction and SqlTransaction
- READ more about isolation levels and distributed transactions
- EXPERIMENT with transactions
Implement distributed transactions
Using TransactionScope
simple transactions are automatically promoted to distributed transactions if necessary. For instance, if within a transaction scope we query two different databases, the transaction is automatically promoted to a distributed transaction:
using(var scope = new TransactionScope()){
// do distributed stuff!
// like opening two connections to two different databases
// commit
scope.Complete();
}
Specify transaction isolation level
The IsolationLevel enumeration is used to specify the level of isolation between multiple transactions, that is, how transactions interact with one another (locking behavior). It exists both in System.Data and in System.Transactions, and both versions share the same values.
Remarks from MSDN:
The data affected by a transaction is called volatile. When you create a transaction, you can specify the isolation level that applies to the transaction. The isolation level of a transaction determines what level of access other transactions have to volatile data before a transaction completes.
The lowest isolation level, ReadUncommitted, allows many transactions to operate on a data store simultaneously and provides no protection against data corruption due to interruptive transactions. The highest isolation level, Serializable, provides a high degree of protection against interruptive transactions, but requires that each transaction complete before any other transactions are allowed to operate on the data.
The isolation level of a transaction is determined when the transaction is created. By default, the System.Transactions infrastructure creates Serializable transactions. You can determine the isolation level of an existing transaction using the IsolationLevel property of a transaction.
The different isolation levels are:
- Unspecified: A different isolation level than the one specified is being used, but the level cannot be determined. An exception is thrown if this value is set.
- Chaos: The pending changes from more highly isolated transactions cannot be overwritten
- ReadUncommitted: (LOWEST isolation level) volatile data can be read and modified during a transaction
- ReadCommitted: volatile data cannot be read during the transaction, but can be modified. In practice this means dirty reads are prevented, but non-repeatable reads can still occur.
- RepeatableRead: volatile data can be read but not modified during the transaction. New data can be added during the transaction.
- Serializable: (HIGHEST isolation level) volatile data can be read but not modified, and no new data can be added during the transaction.
- Snapshot: Volatile data can be read. Before a transaction modifies data, it verifies if another transaction has changed the data after it was initially read. If the data has been updated, an error is raised. This allows a transaction to get to the previously committed value of the data.
When you try to promote a transaction that was created with this isolation level, an InvalidOperationException is thrown with the error message “Transactions with IsolationLevel Snapshot cannot be promoted”.
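Since System.Transactions defaults to Serializable, you override the isolation level through TransactionOptions; a minimal sketch:

```csharp
using System;
using System.Transactions;

public class IsolationLevelExample
{
    public static void Run()
    {
        var options = new TransactionOptions
        {
            // Override the Serializable default of System.Transactions.
            IsolationLevel = IsolationLevel.ReadCommitted,
            Timeout = TimeSpan.FromSeconds(30)
        };

        using (var scope = new TransactionScope(
            TransactionScopeOption.Required, options))
        {
            // do transactional work here...
            scope.Complete();
        }
    }
}
```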
Implement data storage in Azure
Access data storage in Azure
Azure provides these storage options:
- local storage: per-instance temporary storage that disappears into the ether when a machine is restarted or decommissioned (<20GB-2TB)
- blob storage: durable unstructured storage for large binary objects (audio, video, images, etc) that can nevertheless store any data (200GB per block blob, 1TB per page blob, 100TB per account)
- table storage: tabular or structured data (comparable to super column storage in NoSQL jargon, like Cassandra) (100TB)
- queues: message queues (100TB)
- SQL Azure: a relational database in the cloud (150GB)
For more information take a look at the Azure Storage offering and particularly at the Azure Storage documentation
Choose data storage mechanism in Azure (blobs, tables, queues, SQL Database)
- Blobs are great for unstructured binary and text data
- Tables are great for structure but non-relational data
- Queues are great for storing and delivering messages between components of a distributed application
Blobs
Perfect for storing unstructured binary and text data, blobs have these features:
- let you access information from anywhere over HTTP and HTTPS
- ideally suited for streaming video or audio, images, documents, backups
- structured in accounts > containers > blobs (containers provide a logical and physical grouping for blobs)
- a blob is merely a file. There are two types: block blobs can take up to 200GB and page blobs up to 1TB
- you can access blobs via HTTP/HTTPS or using the storage API (available for .NET and other platforms)
- TODO
  - READ more about Blob Storage and the .NET API
  - READ even more about Blob Storage
  - EXPERIMENT with blob storage
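A sketch of the account > container > blob structure using the classic Microsoft.WindowsAzure.Storage client library (container and blob names invented; the exact method names may differ in other SDK versions):

```csharp
using Microsoft.WindowsAzure.Storage;
using Microsoft.WindowsAzure.Storage.Blob;

public class BlobExample
{
    public static void Upload(string connectionString)
    {
        var account = CloudStorageAccount.Parse(connectionString);
        var client = account.CreateCloudBlobClient();

        // account > container > blob
        var container = client.GetContainerReference("documents");
        container.CreateIfNotExists();

        var blob = container.GetBlockBlobReference("notes.txt");
        blob.UploadText("hello from blob storage");
    }
}
```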
Table Storage
Tables let you store and manage structured data that is non-relational. You can interact with table storage by using so-called TableOperation and TableQuery objects.
- TODO
- READ more about Table Storage and the .NET API
- EXPERIMENT with table storage
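A sketch of a TableOperation insert and a TableQuery filter, again with the classic storage SDK (table, entity and key names invented):

```csharp
using Microsoft.WindowsAzure.Storage;
using Microsoft.WindowsAzure.Storage.Table;

public class Course : TableEntity
{
    public string Teacher { get; set; }
}

public class TableExample
{
    public static void InsertAndQuery(string connectionString)
    {
        var table = CloudStorageAccount.Parse(connectionString)
            .CreateCloudTableClient()
            .GetTableReference("courses");
        table.CreateIfNotExists();

        // TableOperation: a single insert/replace/delete/retrieve.
        var course = new Course { PartitionKey = "math", RowKey = "algebra", Teacher = "Ada" };
        table.Execute(TableOperation.Insert(course));

        // TableQuery: filter over a partition.
        var query = new TableQuery<Course>().Where(
            TableQuery.GenerateFilterCondition(
                "PartitionKey", QueryComparisons.Equal, "math"));
        foreach (var c in table.ExecuteQuery(query)) { /* ... */ }
    }
}
```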
Queues
Queues offer a push-pop mechanism to get messages into and off a queue. When you get a message you don't immediately pop it from the queue; instead, the message becomes invisible for a period of time, and during this period it is guaranteed that nobody else will be able to process the message. After you process the message you delete it from the queue.
- TODO
- READ more about Queues and the .NET API
- EXPERIMENT with queues
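The get-then-delete cycle described above can be sketched like this with the classic storage SDK (queue name and message content invented):

```csharp
using System;
using Microsoft.WindowsAzure.Storage;
using Microsoft.WindowsAzure.Storage.Queue;

public class QueueExample
{
    public static void ProcessOne(string connectionString)
    {
        var queue = CloudStorageAccount.Parse(connectionString)
            .CreateCloudQueueClient()
            .GetQueueReference("orders");
        queue.CreateIfNotExists();

        queue.AddMessage(new CloudQueueMessage("order #1"));

        // The message becomes invisible to other consumers for 1 minute.
        CloudQueueMessage message = queue.GetMessage(TimeSpan.FromMinutes(1));

        // ... process the message, then remove it for good:
        queue.DeleteMessage(message);
    }
}
```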
Distribute data by using the Content delivery network (CDN)
Microsoft Azure CDN lets you cache blobs and static content in specific locations to maximize performance and bandwidth. CDN lets you provide better performance for users who are far from a content source and provides a better handling for higher loads of traffic.
CDNs work in a transparent fashion: when a user makes a request, the CDN will redirect the user to the closest location that can serve that request. Expiration is handled by the TTL (Time To Live) value.
In order for the CDN to work, you need to make your Azure Storage containers publicly available with anonymous access.
- TODO
- READ more about Azure CDN
Handle exceptions by using retries (SQL Database)
Because SQL Azure databases live as separate entities from your Azure applications and may be physically far apart, higher latency is expected and there's a higher chance of experiencing timeouts. Because of that, it is recommended to have a retry policy so that, if a connection doesn't go through, we try again.
Azure offers the transient fault handling framework to manage these retry scenarios. You can define a retry policy by implementing the ITransientErrorDetectionStrategy interface and using it when creating a RetryPolicy.
- TODO:
- READ more about the transient fault handling framework
- EXPERIMENT!
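A sketch, assuming the Transient Fault Handling Application Block ("Topaz") is referenced, which ships a ready-made detection strategy for SQL Database so you don't have to implement ITransientErrorDetectionStrategy yourself (retry counts and intervals invented):

```csharp
using System;
using Microsoft.Practices.EnterpriseLibrary.TransientFaultHandling;

public class RetryExample
{
    public static void Run()
    {
        // Retry up to 3 times, waiting 1s, 3s, 5s between attempts.
        var strategy = new Incremental(
            3, TimeSpan.FromSeconds(1), TimeSpan.FromSeconds(2));
        var retryPolicy =
            new RetryPolicy<SqlDatabaseTransientErrorDetectionStrategy>(strategy);

        retryPolicy.ExecuteAction(() =>
        {
            // open the connection and run the query here; transient
            // failures are detected and retried automatically
        });
    }
}
```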
Manage Azure Caching
Azure caching provides a caching layer for your Azure applications. It can greatly reduce the cost of SQL Azure database transactions. Azure caching uses role-based caching where you can host a cache in any Azure role. This cache can be used by any other roles within the same cloud service deployment. There are different topologies available:
- co-located caching, where a part of the memory within a given role works as a cache. You can set up the threshold via configuration
- dedicated caching, where you define a specific role to host and manage the cache
- there's also the possibility of using Azure shared caching, which lives outside of your roles and is consumed as a service
- TODO
  - READ more about Azure caching
  - EXPERIMENT!
Create and implement a WCF Data Services service
- TODO
- READ more about creating a WCF Data Service
- EXPERIMENT creating a WCF Data Service
Address resources
You can create a WCF Data Service by following these steps:
- Setup EF in your solution
- Add a WCF Data Service
- Specify the service definition
- Enable access to the data service by setting the properties of the DataServiceConfiguration (SetEntitySetAccessRule, SetServiceOperationAccessRule and DataServiceBehavior)
When adding a WCF Data Service you will get a new class with a configuration method called InitializeService:
public class MyService : System.Data.Services.DataService<MyDbContext>{
    // stuff
    public static void InitializeService(DataServiceConfiguration config){
        // use the config to configure access to the service!
    }
}
The most important configurations are:
- SetEntitySetAccessRule: lets you set access rules for specific entity sets
- SetServiceOperationAccessRule: lets you set access rules for specific service operations. These override the entity set access rules above
- DataServiceBehavior.MaxProtocolVersion: can take either V1 or V2 depending on the OData protocol version to use
// setting entity set access rules
config.SetEntitySetAccessRule("Courses", EntitySetRights.AllRead | EntitySetRights.WriteMerge);
// setting service operation access rules
config.SetServiceOperationAccessRule("MyOperation", ServiceOperationRights.All | ServiceOperationRights.OverrideEntitySetRights);
Create a query expression
After you have configured a data service you can query the service using OData, which exposes a RESTful API for all entities. You can access the different resources exposed through URIs:
- http://host/blog.svc/tags: gets all tags
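Beyond fetching a whole entity set, OData query options can be composed in the URI; a few typical shapes, sketched from the OData URI conventions (the entity set and property names are made up):

```
http://host/blog.svc/tags(1)                      -- single entity by key
http://host/blog.svc/tags?$filter=Name eq 'wcf'   -- filtering
http://host/blog.svc/tags?$orderby=Name           -- ordering
http://host/blog.svc/tags?$top=10&$skip=10        -- paging
http://host/blog.svc/tags?$format=json            -- JSON payload
```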
Implement filtering
Access payload formats (including JSON)
Use data service interceptors and service operators
Manipulate XML data structures
Read filter, create, modify XML data structures
Manipulate XML data by using XMLReader, XMLWriter, XMLDocument, XPath, LINQ to XML
Transform XML by using XSLT transformations
Querying and Manipulating Data with Entity Framework
Entity Framework is the ORM of choice in the .NET world. It has three parts:
- A conceptual model known as the CSDL (Conceptual Schema Definition Language)
- The data storage aspect known as the SSDL (Store Schema Definition Language)
- The mapping between the conceptual and the data model, done via the MSL (Mapping Specification Language)
In modern versions of EF these three parts are contained within the *.edmx file. There are three ways you can build an entity data model using EF: code first, model first or database first.
- EF Tools:
- Entity Model Designer: creates the .edmx file and lets you manipulate the entity model
- Entity Data Model Wizard: enables you to set up an entity model in several steps, e.g. when creating your model from an existing database
- Create Database Wizard: lets you create a database from an entity model
- Update Model Wizard: helps you update the model and database after an entity model has been created
When you use either a model or db first, you get an EF edmx file that in turn is composed of several parts:
- Model.Context.tt: the EF context that works as the entry point that your application uses to interact with EF. It is a T4 template that generates a context class for you based on your entity model
- Model.tt: the EF domain model, built of entities mapped to/from the database. It is a T4 template that generates all the domain model classes you'll use within your application
- Model.edmx.diagram: used for saving the layout of your entity model within the designer
- Model.Designer
When you map an OO domain model to a database using EF there are different things you need to take into account like how to model inheritance in your database. EF offers different strategies to handle this:
- TPH (table per hierarchy), where a single table contains all objects within a class hierarchy. It offers the best performance at the cost of denormalization
- TPT (table per type), where tables are created for common fields and an additional table is created per type. It offers worse performance but normalized information
- Table per concrete type and mixed inheritance
You also have the option to store a whole object within another entity’s table. These are known as complex types.
As mentioned above, the core class that you use to interact with EF is the DbContext (in old versions of EF, the ObjectContext), also known as the context. There are several options that you can configure in your EF context:
- LazyLoadingEnabled: enables lazy loading (loading of related entities on traversal). Lazy loading can cause performance problems when many requests are sent to the database as we traverse a collection of related entities. If related data is only accessed seldom, it can be good in the sense that you only load the information you need. Alternatively, you can use eager loading and select which parts of an object graph are loaded into memory by using the Include or the LoadProperty methods.
- ProxyCreationEnabled: creates proxies for use with data objects that are persistence-ignorant, like POCOs
- UseConsistentNullReferenceBehavior: from MSDN, if this flag is set to false, then setting the Value property of the EntityReference for an FK relationship to null when it is already null will have no effect. When this flag is set to true, setting the value to null will always cause the FK to be nulled and the relationship to be deleted, even if the value is currently null. The default value is false when using ObjectContext and true when using DbContext.
- UseCSharpNullComparisonBehavior: if enabled, it adds clauses to your queries that check for NULL values in scenarios where it makes sense (it would be the expected C# behavior).
- UseLegacyPreserveChangesBehavior: in current versions of EF, if you attach an entity to the context that already exists within the context and has properties that differ from those in the context, the properties are marked as modified. Setting this option to true reverts to the old behavior of EF, where the new values are not copied over, that is, the entity within the context keeps the property values it had.
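With DbContext-based code, these kinds of flags live on the context's Configuration object; a minimal sketch (the context and entity classes are invented for the example):

```csharp
using System.Data.Entity;

public class Post
{
    public int Id { get; set; }
    public string Title { get; set; }
}

public class BlogContext : DbContext
{
    public BlogContext()
    {
        // Explicit control over what gets loaded, plus proxy-free
        // entities that are easier to serialize (e.g. over WCF).
        this.Configuration.LazyLoadingEnabled = false;
        this.Configuration.ProxyCreationEnabled = false;
    }

    public DbSet<Post> Posts { get; set; }
}
```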
There have been several options regarding the types of entities that can be used within EF:
- In EF 4 the default entity objects inherited from EntityObject. They are usually generated via T4 templates with the EdmEntityType, Serializable and DataContract attributes. Their properties are decorated with the EdmScalarProperty and DataMember attributes.
- Afterwards, POCOs (plain old CLR objects) and self-tracking entities (POCOs that implement the IObjectWithChangeTracker and INotifyPropertyChanged interfaces) gained more traction.
- TODO
- LEARN more about Entity Framework
- EXPERIMENT creating entity model via code first, model first and database first. Test the different parts of EF tooling.
Query and manipulate data by using the Entity Framework
Query, update, and delete data by using DbContext; build a query that uses deferred execution; implement lazy loading and eager loading; create and run compiled queries; query data by using Entity SQL; perform asynchronous operations using Entity Framework; map a stored procedure
Take a look at these articles:
- Entity Framework DbContext
- LINQ and LINQ to Entities
- Create and run compiled queries with EF
- EntitySQL
- EF Async Operations
- Map stored procedures with EF
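Deferred execution and eager loading from the objective above can be sketched together (the code-first model is minimal and its names are made up):

```csharp
using System.Collections.Generic;
using System.Data.Entity;
using System.Linq;

public class Student
{
    public int Id { get; set; }
    public string Name { get; set; }
    public virtual ICollection<Course> Courses { get; set; }
}
public class Course
{
    public int Id { get; set; }
    public string Title { get; set; }
}
public class SchoolContext : DbContext
{
    public DbSet<Student> Students { get; set; }
}

public class QueryExamples
{
    public static void Run()
    {
        using (var context = new SchoolContext())
        {
            // Deferred execution: building the query sends no SQL...
            IQueryable<Student> query =
                context.Students.Where(s => s.Name.StartsWith("A"));

            // ...the SQL runs only when the query is enumerated:
            var students = query.ToList();

            // Eager loading: fetch related courses in the same round trip.
            var withCourses = context.Students
                .Include(s => s.Courses)
                .ToList();
        }
    }
}
```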
Query and manipulate data by using Data Provider for Entity Framework
Query and manipulate data by using Connection, DataReader, and Command from the System.Data.EntityClient namespace; perform synchronous and asynchronous operations; manage transactions (API); programmatically configure a Data Provider
Take a look at these articles:
Query data by using LINQ to Entities
Query data by using LINQ operators (for example, project, skip, aggregate, filter, and join); log queries and database commands; implement query boundaries (IQueryable vs. IEnumerable); implement async query
Take a look at these articles:
- LINQ and LINQ to Entities
- Logging EF queries and database commands
- LINQ deferred execution and IQueryable vs IEnumerable
- EF Async Operations
Query and manipulate data by using ADO.NET
ADO.NET was designed with support for large loads of data, security, scalability, flexibility and dependability in mind. Most of these are taken care of by the fact that ADO.NET has a bias toward disconnectedness: it is based on opening a connection, executing queries fast and immediately closing the connection. From there you work with a local version of the data until you need to submit changes back to the database.
Benefits of a disconnected model:
- Connections are expensive for an RDBMS to maintain
- Connections can hold locks on data which cause concurrency problems
ADO.NET keeps connections closed as much as possible and also provides connection pooling, that is, it opens a pool of connections and shares them among several requests. This way the cost of managing, opening and closing connections is minimized.
ADO.NET data providers supply a series of core classes that let you manipulate data and have fast, forward-only, read-only access to data. These are:
- DbConnection: helps you manage database connections
- DbCommand: used to interact with a database. Provides support for parameterization
- DbDataReader: high-speed, read-only, forward-only access, similar to a Stream
- DbDataAdapter: used in conjunction with connection and command objects to populate a DataSet or DataTable from the database, or to perform changes in the database
- DataSet: an in-memory copy of the RDBMS or a portion of it. A collection of DataTable objects, their relationships and additional metadata and operations
- DataTable: a specific view of data, analogous to a database table. Partially populated. Tracks the state of the data within the table and can submit changes back to the database
DataAdapter vs DataReader (note that the DataAdapter uses a reader internally):
- Using the DataReader produces faster results for the same data
- The DataReader provides more asynchronous methods than the DataAdapter
- The Fill method of the DataAdapter only lets you populate DataSets and DataTables. If you are using a domain model, you'll need an additional step to parse your DataSet or DataTable into domain objects
- The DataSet provides an API that is more similar to using an RDBMS
- The Fill method of the DataAdapter only returns when all data has been retrieved. As such, you can get the number of records in a given table. By contrast, the only way to know the number of records using a DataReader is to iterate through it until the end and count the results
- You can iterate over a DataReader only once and in a forward-only fashion. You can iterate over a DataTable any number of times
- DataSets can be loaded directly from XML and persisted to XML natively. This means that they are inherently serializable
- Once a DataSet or DataTable is populated, no further interaction with the database is needed unless you want to send changes back
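Both access styles can be sketched side by side (connection string, table and column names are placeholders):

```csharp
using System;
using System.Data;
using System.Data.SqlClient;

public class AdoNetExamples
{
    public static void ReadWithDataReader(string connectionString)
    {
        using (var connection = new SqlConnection(connectionString))
        using (var command = new SqlCommand(
            "SELECT Id, Name FROM Courses WHERE Name LIKE @p", connection))
        {
            command.Parameters.AddWithValue("@p", "%wcf%");
            connection.Open();

            // Forward-only, read-only cursor over the results.
            using (SqlDataReader reader = command.ExecuteReader())
            {
                while (reader.Read())
                {
                    Console.WriteLine("{0}: {1}",
                        reader.GetInt32(0), reader.GetString(1));
                }
            }
        } // the connection returns to the pool here
    }

    public static DataTable ReadWithDataAdapter(string connectionString)
    {
        // Fill opens and closes the connection for us.
        var adapter = new SqlDataAdapter(
            "SELECT Id, Name FROM Courses", connectionString);
        var table = new DataTable();
        adapter.Fill(table);
        return table; // disconnected, iterable any number of times
    }
}
```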
Query and manipulate data by using Connection, DataReader, Command, DataAdapter, DataSet ; Perform synchronous and asynchronous operations; Manage transactions (API)
Take a look at these articles:
- Querying data using ADO.NET and ADO.NET overview
- Async operations with ADO.NET
- Transactions in ADO.NET
Create an Entity Framework data model
Structure the data model using table per type, table per class, table per hierarchy; choose and implement an approach to manage a data model (code first vs. model first vs. database first); implement POCO objects; describe a data model by using conceptual schema definitions, storage schema definition, mapping language (CSDL, SSDL, MSL), and Custom Code First Conventions
- Implementing inheritance in EF
- EF Get Started with Code First, Model First and Database First
- EF Designer
Designing and Implementing WCF Services
Create a WCF service
In order to work effectively with WCF you need to learn some basic SOA (Service Oriented Architecture) terms:
- A Service is a component that can perform a task
- A Service definition is an exhaustive description of what a service can do, and it takes the form of a contract between client and server
- A Binding specifies the transport, encoding and protocols used to communicate between client and server
- Serialization is the mechanism by which data is transformed into a format suitable for transfer between processes
Within SOA, boundaries are explicit, services are autonomous, services share a schema and a contract but not a class, and service compatibility is based on policy.
Create contracts (service, data, message, callback, and fault)
There are several types of contracts you can define in WCF via attributes:
- ServiceContract: the service contract attribute is used to define an interface as a contract
- OperationContract: is used to define a given method as a valid operation within the context of a service contract
- DataContract: used to define data that is part of a service and will be transferred between client and server as part of a service contract. It also informs the runtime that a given type is serializable (using the `DataContractSerializer`). Additionally you can use the `[KnownType]` attribute to provide additional information about types that should be known during serialization; this is useful when using polymorphism in your services (when the contract is expressed in terms of a base class, interface or `Object`). The `DataContract` attribute also lets you use references between objects within the WCF response by setting its `IsReference` property to `true`.
  - DataMember: a data contract usually decorates a class, and the properties within that class that are part of the service contract should be decorated with the `[DataMember]` attribute. It has several properties: `EmitDefaultValue` determines whether or not properties with a default value are sent to a client (it defaults to `true`); `Order` specifies the order in which properties are serialized.
  - EnumMember: used to decorate enum members (the enum itself would be decorated with `[DataContract]`).
- FaultContract: used to define exceptions that can occur within a service. Since programmatic exceptions are platform dependent, fault contracts represent a platform-agnostic way to communicate errors that occur within a service. You can find more info about Fault Contracts in MSDN.
An example contract in WCF can look like this:
[ServiceContract]
public interface IGiftShop
{
    [OperationContract]
    [FaultContract(typeof(DontLikeJaimeException))]
    string SayHello(string name);

    [OperationContract]
    Gift GetGiftFor(string name);
}

[DataContract]
public class Gift
{
    [DataMember]
    public string Name { get; set; }

    [DataMember]
    public string Note { get; set; }
}

public class GiftShop : IGiftShop
{
    public string SayHello(string name)
    {
        if (name == "Jaime")
        {
            throw new FaultException<DontLikeJaimeException>(
                new DontLikeJaimeException(), "Don't like you Jaime! Bugger off! ;)");
        }
        return string.Format("Hello {0}", name);
    }

    public Gift GetGiftFor(string name)
    {
        return new Gift
        {
            Name = "Awesome gift",
            Note = string.Format("with love to {0}", name)
        };
    }
}
You can create a new WCF project in many ways. If you create a WCF Service Library, which is like a class library specially set up to contain a WCF service, you'll get an example service contract, your project will reference `System.ServiceModel` (WCF) and `System.Runtime.Serialization`, and you'll get an initial configuration for your service within an `app.config` file.
If you run the project (`F5` or `CTRL+F5`) Visual Studio will open the WCF Test Client, which will read the service definition for your service and automagically generate a GUI that you can use to test your service.
Once you have created your service and hosted it, the next step is to consume the service within a client. A very straight-forward way to consume a WCF service is by generating a proxy class from its service definition. You can achieve this in an easy manner by using the Add Service Reference option within Visual Studio (similar to Add Reference).
As stated above, the `app.config` contains configuration that enables your service to run. A very important part of the configuration are the endpoints, which tell a client where it can access your service. An endpoint is defined by its Address, Binding and Contract (also known as ABC):
- the Address is a URI where you can find the service
- the Binding describes the transport medium of the communication between client and server
- the Contract describes the service itself
Bindings are objects used to specify the communication details required to connect to an endpoint in WCF. A Binding defines a protocol (which determines the security mechanism being used, reliability, context flow), an encoding (which determines how the message is encoded, f.i. text, binary, MTOM, etc) and the transport (which determines the underlying transport protocol, f.i. TCP, HTTP, etc). A Binding usually consists of an ordered stack of binding elements, each specifying part of the communication information required to connect to a service endpoint. The two lower layers of the stack are required and describe the transport binding and the message encoding specifications.
WCF provides some built-in bindings that are designed to cover most application requirements:
- `BasicHttpBinding`: an HTTP protocol binding suitable for connecting to web services that conform to the WS-I Basic Profile spec.
- `WSHttpBinding`: an interoperable binding suitable for connecting to endpoints that conform to WS-* protocols.
- `NetNamedPipeBinding`: uses the .NET framework to connect to other WCF endpoints on the same machine.
- `NetMsmqBinding`: uses the .NET framework to create queued message connections with other WCF endpoints.
- `NetTcpBinding`: a secure and optimized binding suitable for cross-machine communication between WCF applications.
- `NetHttpBinding`: a binding designed for consuming HTTP or WebSocket services that uses binary encoding by default.
Additionally you can create your own custom bindings, either from pre-existing binding elements via `CustomBinding` or by creating a completely new one deriving from `Binding`.
You can find more information about WCF bindings in MSDN. You can also take a look at this list of all built-in bindings.
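As a sketch of how the ABC comes together in configuration, here is a hypothetical `app.config` fragment exposing the `IGiftShop` contract from the earlier example over `basicHttpBinding` (names and addresses are illustrative):

```xml
<system.serviceModel>
  <services>
    <service name="GiftShop.GiftShop">
      <!-- The ABC of an endpoint: Address, Binding, Contract -->
      <endpoint address=""
                binding="basicHttpBinding"
                contract="GiftShop.IGiftShop" />
      <host>
        <baseAddresses>
          <add baseAddress="http://localhost:8080/GiftShop" />
        </baseAddresses>
      </host>
    </service>
  </services>
</system.serviceModel>
```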
Implement message inspectors
There are several ways in which you can hook into the WCF request pipeline and extend a WCF service:
- Parameter inspectors
- Message formatters
- Message inspectors
Parameter inspectors allow you to inspect the arguments being sent to a service and perform some arbitrary logic. You can implement parameter inspectors by:
- implementing the `IParameterInspector` interface, which has two methods, `AfterCall` and `BeforeCall`. The `BeforeCall` method is called just before the parameters are serialized into a `Message` object, while `AfterCall` occurs after a call has been processed. Adding some validation in the `BeforeCall` method would prevent useless requests from being sent to the service.
- implementing a behavior (`IOperationBehavior`, `IEndpointBehavior`, `IServiceBehavior` or `IContractBehavior`, depending upon the required scope) to add your parameter inspector to either the `ClientOperation.ParameterInspectors` or `DispatchOperation.ParameterInspectors` properties.
You can find more information about parameter inspectors and behaviors at MSDN.
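A minimal parameter inspector might look like the sketch below. The operation name and the validation rule are hypothetical, chosen only to show where the hooks fire:

```csharp
using System;
using System.ServiceModel.Dispatcher;

// Rejects empty names before the call is dispatched to the service.
public class NameValidationInspector : IParameterInspector
{
    public object BeforeCall(string operationName, object[] inputs)
    {
        // Runs just before the parameters are serialized into a Message.
        if (operationName == "SayHello" && string.IsNullOrWhiteSpace(inputs[0] as string))
        {
            throw new ArgumentException("name must not be empty");
        }
        // Whatever is returned here comes back in AfterCall as correlationState.
        return null;
    }

    public void AfterCall(string operationName, object[] outputs,
                          object returnValue, object correlationState)
    {
        // Runs after the call has been processed; inspect outputs or the
        // return value here if needed.
    }
}
```

The inspector would then be added to `ParameterInspectors` from a behavior, as described above.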
Message inspectors let you examine the contents of an inbound or outbound message before it gets processed by WCF or modify it after the message is serialized and returned to the client.
On the server side you can implement message inspectors by using the `IDispatchMessageInspector` interface, which has the `AfterReceiveRequest` and `BeforeSendReply` methods. On the client side you use `IClientMessageInspector`. Both can be added to a service as behaviors.
You can find more information about message inspectors in MSDN.
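As a sketch, here is a server-side message inspector that traces every inbound and outbound message. The buffered-copy dance is needed because reading a `Message` consumes it:

```csharp
using System.ServiceModel;
using System.ServiceModel.Channels;
using System.ServiceModel.Dispatcher;

// Logs every request and reply passing through the dispatcher.
public class LoggingMessageInspector : IDispatchMessageInspector
{
    public object AfterReceiveRequest(ref Message request,
                                      IClientChannel channel,
                                      InstanceContext instanceContext)
    {
        // Inspecting a Message consumes it, so work on a buffered copy
        // and hand a fresh copy back to the pipeline.
        var buffer = request.CreateBufferedCopy(int.MaxValue);
        System.Diagnostics.Trace.WriteLine(buffer.CreateMessage().ToString());
        request = buffer.CreateMessage();
        return null; // correlation state handed to BeforeSendReply
    }

    public void BeforeSendReply(ref Message reply, object correlationState)
    {
        var buffer = reply.CreateBufferedCopy(int.MaxValue);
        System.Diagnostics.Trace.WriteLine(buffer.CreateMessage().ToString());
        reply = buffer.CreateMessage();
    }
}
```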
Implement asynchronous operations in the service
- TODO
- READ more about Async operations in WCF
- EXPERIMENT with WCF async operations. The proxy generates both synchronous and async methods. The async methods end in `Async`.
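On the server side, a task-based async operation is a sketch away: returning `Task<T>` from the operation is enough for WCF to run it asynchronously (the contract and the simulated work below are illustrative):

```csharp
using System.ServiceModel;
using System.Threading.Tasks;

[ServiceContract]
public interface IGiftShopAsync
{
    [OperationContract]
    Task<string> SayHelloAsync(string name);
}

public class GiftShopAsync : IGiftShopAsync
{
    public async Task<string> SayHelloAsync(string name)
    {
        // Stand-in for real async work (I/O, a database call, etc).
        await Task.Delay(100);
        return string.Format("Hello {0}", name);
    }
}
```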
Configure WCF services by using configuration settings
Configure service behaviors; configure service endpoints; configure bindings including WebSocket bindings; specify a service contract; expose service metadata (XSDs, WSDL, and metadata exchange endpoint); configure message compression and encoding
Take a look at these articles:
- Configure WCF services
- Configure WCF via configuration files
- How to Expose WCF service metadata
- How to create Custom Compression Message Encoder
Configure WCF services by using the API
Configure service behaviors; configure service endpoints; configure binding; specify a service contract; expose service metadata (XSDs, WSDL, and metadata exchange); WCF routing and discovery features
Take a look at these articles:
- Configure WCF services
- Configure WCF programmatically
- How to Expose WCF service metadata
- WCF routing
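The same configuration can be expressed entirely in code. This sketch self-hosts the `GiftShop` service from the earlier example, adds an endpoint via the API and exposes metadata (the address is illustrative):

```csharp
using System;
using System.ServiceModel;
using System.ServiceModel.Description;

class Program
{
    static void Main()
    {
        var baseAddress = new Uri("http://localhost:8080/GiftShop");
        using (var host = new ServiceHost(typeof(GiftShop), baseAddress))
        {
            // The ABC again, this time via the API instead of app.config.
            host.AddServiceEndpoint(typeof(IGiftShop), new BasicHttpBinding(), "");

            // Expose metadata (WSDL) so clients can generate proxies.
            host.Description.Behaviors.Add(new ServiceMetadataBehavior
            {
                HttpGetEnabled = true
            });

            host.Open();
            Console.WriteLine("Service running. Press ENTER to stop.");
            Console.ReadLine();
        }
    }
}
```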
Secure a WCF service
Implement message level security, implement transport level security; implement certificates; design and implement multiple authentication modes
Take a look at these articles:
Consume WCF services
Generate proxies by using SvcUtil; generate proxies by creating a service reference; create and implement channel factories
Take a look at these articles:
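Besides generated proxies, you can consume a service through a channel factory when the client shares the contract assembly. A sketch, reusing the `IGiftShop` contract and an illustrative address:

```csharp
using System;
using System.ServiceModel;

class Client
{
    static void Main()
    {
        // Create a typed channel straight from the shared contract.
        var factory = new ChannelFactory<IGiftShop>(
            new BasicHttpBinding(),
            new EndpointAddress("http://localhost:8080/GiftShop"));

        IGiftShop channel = factory.CreateChannel();
        Console.WriteLine(channel.SayHello("John"));

        // Channels created this way also implement IClientChannel.
        ((IClientChannel)channel).Close();
        factory.Close();
    }
}
```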
Version a WCF service
Version different types of contracts (message, service, data); configure address, binding, and routing service versioning
Take a look at these articles:
Create and configure a WCF service on Azure
Create and configure bindings for WCF services (Azure SDK—extensions to WCF); relay bindings to Azure using service bus endpoints; integrate with the Azure service bus relay
Take a look at these articles:
Implement messaging patterns
Implement one way, request/reply, streaming, and duplex communication; implement Azure Service Bus and Azure Queues
Take a look at these:
Host and manage services
Manage services concurrency (single, multiple, reentrant); create service hosts; choose a hosting mechanism; choose an instancing mode (per call, per session, singleton); activate and manage a service by using AppFabric; implement transactional services; host services in an Azure worker role
Take a look at these:
- WCF service concurrency, instancing and sessions
- WCF and transactions
- Host WCF services in a worker role
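Instancing and concurrency are declared on the service implementation with `[ServiceBehavior]`. A sketch, reusing the earlier `IGiftShop` contract, of a single shared instance that processes calls concurrently:

```csharp
using System.ServiceModel;

[ServiceBehavior(
    InstanceContextMode = InstanceContextMode.Single,  // or PerCall, PerSession
    ConcurrencyMode = ConcurrencyMode.Multiple)]       // or Single, Reentrant
public class GiftShopSingleton : IGiftShop
{
    // With Single + Multiple, one instance serves all clients at once,
    // so any shared state here must be protected by you.
    public string SayHello(string name)
    {
        return string.Format("Hello {0}", name);
    }

    public Gift GetGiftFor(string name)
    {
        return new Gift
        {
            Name = "Awesome gift",
            Note = string.Format("with love to {0}", name)
        };
    }
}
```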
Creating and Consuming Web API based Services
Read the very awesome WEB API documentation, and don’t forget new features like attribute routing, CORS and batching requests. You should also take a look at OWIN and SignalR.
- Design a Web API
- Define HTTP resources with HTTP actions
- Plan appropriate URI space, and map URI space using routing
- Choose appropriate HTTP method (get, put, post, delete) to meet requirements
- Choose appropriate format (Web API formats) for responses to meet requirements
- Plan when to make HTTP actions asynchronous; design and implement routes
- Implement a Web API
- Accept data in JSON format (in JavaScript, in an AJAX callback)
- Use content negotiation to deliver different data formats to clients
- Define actions and parameters to handle data binding
- Use HttpMessageHandler to process client requests and server responses
- Implement dependency injection, along with the dependency resolver, to create more flexible applications
- Implement action filters and exception filters to manage controller execution
- Implement asynchronous and synchronous actions
- Implement streaming actions
- Implement SignalR
- Test Web API web services
- Secure a Web API
- Implement HTTPBasic authentication over SSL
- Implement Windows Auth
- Prevent cross-site request forgery (XSRF)
- Design, implement, and extend authorization and authentication filters to control access to the application
- Implement Cross Origin Request Sharing (CORS)
- Implement SSO by using OAuth 2.0
- Configure multiple authentication modes on a single endpoint
- Host and manage Web API
- Host Web API in an ASP.NET app
- Self-host a Web API in your own process (a Windows service) including Open Web Interface for .NET (OWIN)
- Host services in an Azure worker role
- Restrict message size; configure the host server for streaming
- Consume Web API web services
- Consume Web API services by using HttpClient synchronously and asynchronously
- Send and receive requests in different formats (JSON/HTML/etc.)
- Request batching
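The consumption bullets above can be sketched with `HttpClient`; the base address, routes and JSON payload are hypothetical, and the Accept header drives content negotiation:

```csharp
using System;
using System.Net.Http;
using System.Net.Http.Headers;
using System.Threading.Tasks;

class ApiClient
{
    static void Main()
    {
        RunAsync().GetAwaiter().GetResult();
    }

    static async Task RunAsync()
    {
        using (var client = new HttpClient())
        {
            client.BaseAddress = new Uri("http://localhost:8080/");
            // Ask the server for JSON via content negotiation.
            client.DefaultRequestHeaders.Accept.Add(
                new MediaTypeWithQualityHeaderValue("application/json"));

            // GET, asynchronously.
            HttpResponseMessage response = await client.GetAsync("api/gifts/1");
            response.EnsureSuccessStatusCode();
            string json = await response.Content.ReadAsStringAsync();
            Console.WriteLine(json);

            // POST a JSON body.
            var content = new StringContent("{\"name\":\"Awesome gift\"}",
                System.Text.Encoding.UTF8, "application/json");
            await client.PostAsync("api/gifts", content);
        }
    }
}
```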
Deploying Web Applications and Services
Design a deployment strategy
Create an IIS install package; deploy to web farms; deploy a web application by using XCopy; automate a deployment from TFS or Build Server
Take a look at these:
- Choose a deployment strategy for an Azure web application
- Perform an in-place upgrade and VIP swap
- Configure an upgrade domain
- Create and configure input and internal endpoints
- Specify operating system configuration
- Deploy applications using Azure Web Site
- Configure a web application for deployment
- Switch from production/release mode to debug mode
- Use SetParameters to set up an IIS app pool
- Set permissions and passwords
- Enable and monitor ASP.NET App Suspend
- Configure WCF endpoints (including HTTPS protocol mapping), bindings, and behaviors
- Transform web.config by using XSLT (for example, across development, test, and production/release environments)
- Configure Azure configuration settings
- Manage packages by using NuGet
- Create and configure a NuGet package
- Install and update an existing NuGet package
- Connect to a local repository cache for NuGet, set up your own package repository
- Create, configure, and publish a web package
- Create an IIS InstallPackage
- Configure the build process to output a web package
- Apply pre- and post- condition actions to ensure that transformations are correctly applied
- Include appropriate assets (web content, certificates)
- Share assemblies between multiple applications and servers
- Prepare the environment for use of assemblies across multiple servers (interning)
- Sign assemblies by using a strong name
- Deploy assemblies to the global assembly cache
- Implement assembly versioning
- Create an assembly manifest
- Configure assembly binding redirects (for example, from MVC4 to MVC5)
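A binding redirect is a small `web.config` fragment. A sketch for the MVC4-to-MVC5 case (the exact `newVersion` depends on the MVC5 build you target):

```xml
<configuration>
  <runtime>
    <assemblyBinding xmlns="urn:schemas-microsoft-com:asm.v1">
      <dependentAssembly>
        <assemblyIdentity name="System.Web.Mvc"
                          publicKeyToken="31bf3856ad364e35"
                          culture="neutral" />
        <!-- Redirect any MVC4-era reference to the MVC5 assembly -->
        <bindingRedirect oldVersion="0.0.0.0-5.2.3.0" newVersion="5.2.3.0" />
      </dependentAssembly>
    </assemblyBinding>
  </runtime>
</configuration>
```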
TODO Summary
- ADO.NET
- READ more about DataAdapters and DataReaders
- READ more about handling multiple queries within a DataReader using `NextResult`
- EXPERIMENT using DataAdapters and DataReaders
- There’s more practical stuff about using ADO.NET in the second module (Querying and Manipulating Data with Entity Framework, which contains a submodule on querying and manipulating data with ADO.NET)
- EF
- READ more about Entity Framework
- READ more about Entity SQL (Entity SQL Quick Reference)
- READ more about DataAnnotations: `[InverseProperty]` vs `[ForeignKey]`
- READ more about async operations with EF
- READ more about configuring DbContext with the fluent API (remember to take a look at `WithRequiredPrincipal`, `WithRequiredDependent` that I always forget xD)
- READ more about logging queries emitted by EF and other information (`query.ToString`, `dbContext.Database.Log`, etc)
- EXPERIMENT with EF different workflows
- EXPERIMENT with async operations, query and save in EF
- WCF Data Services
- READ more about WCF Data Services
- EXPERIMENT with WCF Data Services and OData
- Azure Storage
- READ more about Azure Storage
- READ more about the Azure storage offering
- READ more about Blob Storage and the .NET API
- READ even more about Blob Storage
- READ more about how to use table storage
- READ more about how to design a scalable partitioning strategy for windows azure table storage
- READ more about Queues and the .NET API
- READ more about Azure caching
- READ more about the transient fault handling framework
- EXPERIMENT with the different Azure Storage options: table, blob, queues, etc
- EXPERIMENT with Azure cache
- EXPERIMENT with the transient fault handling framework
- ObjectCache
- READ more about `ObjectCache`
- READ more about cache policies
- READ more about cache dependencies via change monitors
- EXPERIMENT with the `ObjectCache`
- EXPERIMENT with cache policies
- EXPERIMENT with change monitors
- HttpContext.Cache
- READ more about the `HttpContext.Cache`
- READ more about cache policies
- READ more about cache dependencies via CacheDependency
- Remember `HttpContext.Current.Cache` caches data and `HttpContext.Current.Response.Cache` caches pages (it is the output cache)
- EXPERIMENT with the `HttpContext.Cache`
- EXPERIMENT with cache policies
- EXPERIMENT with CacheDependency and SqlCacheDependency
- Transactions
- READ more about transactions in .NET
- READ more about `System.Transactions`, `TransactionScope`, `EntityTransaction` and `SqlTransaction`
- READ more about isolation levels and distributed transactions
- READ more about Concurrency Series: Basics of transaction isolation levels. Also about the concepts of dirty reads, non-repeatable reads and phantom data.
- EXPERIMENT with transactions
- XML
- READ more about XSLT
- READ more about XML transforms used in web.config, etc
- WCF Services
- READ more about WCF Services in general
- READ more about MTOM encoding (Message Transmission Optimization Mechanism) and how to use it to optimize the payload being sent from a WCF service
- READ more about forward compatible data contracts and the `IExtensibleDataObject` interface
- READ more about Common Security Scenarios in WCF
- ASP.NET WEB API
- READ more about getting started with WEB API
- READ more about Web API Request Batching
- EXPERIMENT with Web API Request Batching
- READ more about `HttpClient`, `WebClient` and `HttpWebRequest` (also look at async methods)
- READ more about allowing CORS using custom headers in web.config
- READ more about hosting WEB API in an Azure Worker Role
- READ more about attribute routing in WEB API (remember constraints and such)
- EXPERIMENT with `HttpClient`
- READ more about supporting multiple authentication schemes with Web API when hosted in IIS
- EXPERIMENT with Web API and authentication
- READ more about implementing Basic Authentication in a Web API
- Deploying Web Applications and Services
- READ more about Building and Packaging Web Applications in .NET
- MSBuild
- READ Web Packaging: Creating web packages using MSBuild
- Other Stuff
- OWIN and authentication using OWIN
- Hosting OWIN in an Azure Worker Role
- SignalR
- Best practices in asynchronous programming
- Review Changes on Certification Curriculum
- …
- Tools
- Use LINQPad to test LINQ stuff
Interesting references
Take a look at these interesting references as well:
- Microsoft Virtual Academy course for preparing the 70-487: Developing Windows Azure And Web Services Jump Start
- A lot of Pluralsight courses:
And that was that :)
Written by Jaime González García , dad, husband, software engineer, ux designer, amateur pixel artist, tinkerer and master of the arcane arts. You can also find him on Twitter jabbering about random stuff.