Skating to blood, tears, exhaustion and civil arrest, listening to Skinny Puppy, and remembering playing Bubble Bobble at the Student Union Building at Central Washington State University circa 1987-1989:
Back to work…
I have been working with the SharePoint and Project Server CSOM (Client-Side Object Model) for some months now. This API is a replacement for the traditional XML/WCF web services both products have exposed for years. The APIs are rational and consistent, and they do not require the usual rigging for a web service (endpoint configuration, proxies, and the various other scaffolding required to consume web services). These are all great improvements.
Note that the CSOMs for both SharePoint and Project Server implement, to date, a subset of the functionality of the traditional web services, and performance work is ongoing in the product groups. Like anything new out of these SDKs, it takes investment to reach parity in feature completeness and performance. That being said, get on it, Microsoft!
I found that CSOM code is repetitive and, while vastly less costly in terms of keystrokes, nonetheless bulky and disjointed. Also, because I tend to use both the SharePoint CSOM and the Project Server CSOM in the same project, I decided I was ready to put together something like mpFx. The term “mpFx” means “Microsoft Project Effects” or, as I prefer, “Microsoft Project Feature eXtensions”, which is more descriptive.
I set out to design a single API for accessing SharePoint and Project Server through a single library, with the ability to request services from Azure such as Cloud Storage, Cloud Services, DocumentDB, Managed Cache Services, and others through a “late-bound factory”, so as not to burden the basic API. In addition to CSOM, the library will expose oData from both SharePoint and Project Server.
This new API is called OSoFx, pronounced OH-SO-FX (Office Server Online Feature eXtensions). Or maybe OSoAFx, as in Office Server Online and Azure Feature eXtensions, pronounced OH-SO-A-FX?
A combined API allows for myriad benefits, but in today’s post I want to demonstrate the differences in syntax and the brevity of the API.
The following operations are performed by both straight CSOM and OSoFx (both servers are Office Online):
1.) Setup credentials
2.) Create a connection to Project Server
3.) Create a connection to SharePoint Server
4.) Present the version numbers for both servers
5.) List the projects
6.) List the sites
7.) Show how long it took to do this
Here is a side-by-side comparison: the first is straight CSOM, the second is OSoFx.
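For flavor, here is roughly what the straight-CSOM half of those seven steps looks like. This is a sketch only: the URLs, account name, and password are placeholders, and it assumes the Microsoft.SharePoint.Client and Microsoft.ProjectServer.Client client libraries are referenced.

```csharp
using System;
using System.Diagnostics;
using System.Security;
using Microsoft.ProjectServer.Client;
using Microsoft.SharePoint.Client;

class SideBySide
{
    static void Main()
    {
        Stopwatch stopwatch = Stopwatch.StartNew();

        // 1) Set up credentials (SharePointOnlineCredentials requires a SecureString).
        SecureString password = new SecureString();
        foreach (char c in "placeholder-password") password.AppendChar(c);
        SharePointOnlineCredentials credentials =
            new SharePointOnlineCredentials("user@tenant.onmicrosoft.com", password);

        // 2) Connect to Project Server (PWA) and 3) connect to SharePoint.
        using (ProjectContext projectContext = new ProjectContext("https://tenant.sharepoint.com/sites/pwa"))
        using (ClientContext clientContext = new ClientContext("https://tenant.sharepoint.com"))
        {
            projectContext.Credentials = credentials;
            clientContext.Credentials = credentials;

            // Queue up the data we want; each context batches until ExecuteQuery().
            projectContext.Load(projectContext.Projects, projects => projects.Include(p => p.Name));
            clientContext.Load(clientContext.Web.Webs, webs => webs.Include(w => w.Title));
            projectContext.ExecuteQuery();
            clientContext.ExecuteQuery();

            // 4) Versions, 5) projects, 6) sites.
            Console.WriteLine("Project Server: {0}", projectContext.ServerVersion);
            Console.WriteLine("SharePoint:     {0}", clientContext.ServerVersion);

            foreach (PublishedProject project in projectContext.Projects)
                Console.WriteLine(project.Name);

            foreach (Web web in clientContext.Web.Webs)
                Console.WriteLine(web.Title);
        }

        // 7) How long did all of that take?
        Console.WriteLine("Elapsed: {0}", stopwatch.Elapsed);
    }
}
```

Note the pattern: two separate contexts, two credential assignments, and a Load/ExecuteQuery pair per context. That repetition is exactly what OSoFx folds away.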
There is a bit of work behind the scenes in OSoFx to make this possible, but the fruits of that labor can be used by others. More on this later!
In 2007, with the advent of Project Server 2007, I started the mpFx project to provide a simplified API for the Project Server Interface (PSI). That work has taken me a long way, including landing me a job in Microsoft Consulting Services and, further down the road, a lead role at forProject Technologies. These days, I am focusing most of my technical effort on SharePoint/Project Server Online and Azure.
In the tradition of mpFx, I have started a new project called OSoFx, pronounced “oh-so F X”, which provides a single interface into SharePoint Online, Project Online (including CSOM, oData, and the PSI), and Azure.
More to come!
Starting with SharePoint 2010, Microsoft introduced a new integration technology called CSOM, or Client-Side Object Model. The API is an alternative to the web services, and it is much cleaner. Not only can you use CSOM in a .NET application such as WinForms or WPF, you can use it in your browser-based apps. And if you prefer REST, you are in luck: it is supported too.
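The defining pattern in CSOM is deferred, batched execution: you queue up Load calls and nothing crosses the wire until ExecuteQuery. A minimal sketch (the site URL is a placeholder):

```csharp
using Microsoft.SharePoint.Client;

using (ClientContext context = new ClientContext("https://server/sites/demo"))
{
    // Queue up work; nothing is sent to the server yet.
    context.Load(context.Web, web => web.Title);
    context.Load(context.Web.Lists, lists => lists.Include(l => l.Title));

    // One round trip executes the whole batch.
    context.ExecuteQuery();

    System.Console.WriteLine(context.Web.Title);
    foreach (List list in context.Web.Lists)
        System.Console.WriteLine(list.Title);
}
```

Compared with the old web services, there are no proxies to generate and no endpoints to configure; you reference the client assemblies and go.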
When Microsoft released Project Server 2013, the product group embraced the CSOM model. It is important to understand what it is capable of and what its limitations are. In Project Server 2013, many of the PSI web services you are accustomed to programming against are still available. Visit this page for the full set of types and examples to get you started.
Project Server 2013 supports both online and on-premises implementations. Using oData is really the best way to get at data that was once stored in the Reporting database. Project Server’s oData reference can be found here.
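The oData feed is just URLs. For example, against a hypothetical PWA at https://server/pwa, the ProjectData service can be queried like this (entity and property names follow the published ProjectData schema):

```
https://server/pwa/_api/ProjectData/Projects
https://server/pwa/_api/ProjectData/Projects?$select=ProjectName,ProjectStartDate&$orderby=ProjectStartDate desc
```

Any oData-aware client (Excel, LINQPad, plain HttpClient) can consume these, which is what makes it such a good replacement for querying the Reporting schema directly.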
One thing you will notice right away is that Project Server 2013 ships with a single database. The Draft, Published, Archive, and Reporting databases are combined, and schemas are used to separate the various content containers.
The introduction of the Project Calculation Service brings the server-side scheduling engine closer to parity with the scheduling engine implemented in Project Professional. This is a great new feature. For an overview of Project Server 2013’s architecture, visit this page.
Project Server 2016, which is heading toward RTM, introduces even more changes. You can download Beta 2 here.
In 2016, Project Server is included in the SharePoint install by default—much like the Visio, Excel, and Access services. You still have to purchase a server license. Install SharePoint and run the configuration wizard just as you normally would.
To enable Project Server, use the SharePoint PowerShell cmdlet Enable-ProjectServerLicense with the key Y2WC2-K7NFX-KWCVC-T4Q8P-4RG9W. This will give you a fully featured 180-day trial.
The PWA site provisioning UI is mostly gone. To create a PWA site, follow the steps below:
Sorry for the bad formatting…
public async Task UploadBlobAsync(string blobContainer, string path, string name, bool zip, CancellationToken cancellationToken, IProgress<BlobTransferProgress> progress)
{
    try
    {
        CloudBlobContainer cloudBlobContainer = GetBlobContainer(blobContainer);
        CloudBlockBlob cloudBlockBlob = cloudBlobContainer.GetBlockBlobReference(name);
        FileInfo fileInfo = new FileInfo(path);
        long fileSize = fileInfo.Length;
        long fileSizeForReporting = fileSize;
        // Small files go up in a single request.
        if (fileSize < Constants.DEFAULT_BUFFER_SIZE)
        {
            using (FileStream fileStream = File.Open(path, FileMode.Open))
            {
                await cloudBlockBlob.UploadFromStreamAsync(fileStream, cancellationToken);
            }
            return;
        }
        int index = 1;
        long startPosition = 0;
        long bytesUploaded = 0;
        List<string> blockIds = new List<string>();
        do
        {
            long bufferSize = Math.Min(Constants.DEFAULT_BUFFER_SIZE, fileSize);
            byte[] buffer = new byte[bufferSize];
            using (FileStream fileStream = new FileStream(path, FileMode.Open))
            {
                fileStream.Position = startPosition;
                fileStream.Read(buffer, 0, (int)bufferSize);
            }
            // Block IDs must be base64-encoded strings of equal length.
            string blockId = Convert.ToBase64String(Encoding.UTF8.GetBytes(index.ToString("d6")));
            blockIds.Add(blockId);
            using (MemoryStream memoryStream = new MemoryStream(buffer))
            {
                await cloudBlockBlob.PutBlockAsync(blockId, memoryStream, null, cancellationToken);
            }
            index++;
            bytesUploaded += bufferSize;
            fileSize -= bufferSize;
            startPosition += bufferSize;
            double percentComplete = 100 * bytesUploaded / fileSizeForReporting;
            progress.Report(new BlobTransferProgress(fileSize, bytesUploaded, percentComplete));
        } while (fileSize > 0);
        // Commit the block list to make the blob visible.
        await cloudBlockBlob.PutBlockListAsync(blockIds, cancellationToken);
    }
    catch (Exception exception)
    {
        throw; // TODO: log before rethrowing
    }
}
And in my unit test, you can see the progress (plus the task wait while I am “doing other stuff”):
Remaining Bytes = 9737856 - Bytes uploaded = 262144 - Percent Complete = 2
Remaining Bytes = 9475712 - Bytes uploaded = 524288 - Percent Complete = 5
Remaining Bytes = 9213568 - Bytes uploaded = 786432 - Percent Complete = 7
Remaining Bytes = 8951424 - Bytes uploaded = 1048576 - Percent Complete = 10
Remaining Bytes = 8689280 - Bytes uploaded = 1310720 - Percent Complete = 13
Remaining Bytes = 8427136 - Bytes uploaded = 1572864 - Percent Complete = 15
Remaining Bytes = 8164992 - Bytes uploaded = 1835008 - Percent Complete = 18
Remaining Bytes = 7902848 - Bytes uploaded = 2097152 - Percent Complete = 20
Remaining Bytes = 7640704 - Bytes uploaded = 2359296 - Percent Complete = 23
Remaining Bytes = 7378560 - Bytes uploaded = 2621440 - Percent Complete = 26
Doing other stuff...
Remaining Bytes = 7116416 - Bytes uploaded = 2883584 - Percent Complete = 28
Remaining Bytes = 6854272 - Bytes uploaded = 3145728 - Percent Complete = 31
Remaining Bytes = 6592128 - Bytes uploaded = 3407872 - Percent Complete = 34
Remaining Bytes = 6329984 - Bytes uploaded = 3670016 - Percent Complete = 36
Remaining Bytes = 6067840 - Bytes uploaded = 3932160 - Percent Complete = 39
Remaining Bytes = 5805696 - Bytes uploaded = 4194304 - Percent Complete = 41
Remaining Bytes = 5543552 - Bytes uploaded = 4456448 - Percent Complete = 44
Remaining Bytes = 5281408 - Bytes uploaded = 4718592 - Percent Complete = 47
Remaining Bytes = 5019264 - Bytes uploaded = 4980736 - Percent Complete = 49
Remaining Bytes = 4757120 - Bytes uploaded = 5242880 - Percent Complete = 52
Remaining Bytes = 4494976 - Bytes uploaded = 5505024 - Percent Complete = 55
Doing other stuff...
Remaining Bytes = 4232832 - Bytes uploaded = 5767168 - Percent Complete = 57
Remaining Bytes = 3970688 - Bytes uploaded = 6029312 - Percent Complete = 60
Remaining Bytes = 3708544 - Bytes uploaded = 6291456 - Percent Complete = 62
Remaining Bytes = 3446400 - Bytes uploaded = 6553600 - Percent Complete = 65
Remaining Bytes = 3184256 - Bytes uploaded = 6815744 - Percent Complete = 68
Remaining Bytes = 2922112 - Bytes uploaded = 7077888 - Percent Complete = 70
Remaining Bytes = 2659968 - Bytes uploaded = 7340032 - Percent Complete = 73
Remaining Bytes = 2397824 - Bytes uploaded = 7602176 - Percent Complete = 76
Remaining Bytes = 2135680 - Bytes uploaded = 7864320 - Percent Complete = 78
Remaining Bytes = 1873536 - Bytes uploaded = 8126464 - Percent Complete = 81
Remaining Bytes = 1611392 - Bytes uploaded = 8388608 - Percent Complete = 83
Remaining Bytes = 1349248 - Bytes uploaded = 8650752 - Percent Complete = 86
Remaining Bytes = 1087104 - Bytes uploaded = 8912896 - Percent Complete = 89
Remaining Bytes = 824960 - Bytes uploaded = 9175040 - Percent Complete = 91
Remaining Bytes = 562816 - Bytes uploaded = 9437184 - Percent Complete = 94
Remaining Bytes = 300672 - Bytes uploaded = 9699328 - Percent Complete = 96
Remaining Bytes = 38528 - Bytes uploaded = 9961472 - Percent Complete = 99
Remaining Bytes = 0 - Bytes uploaded = 10000000 - Percent Complete = 100
I haven’t written much for a long, long time. Work and family are all-consuming, BUT during the day I find myself staring at my monitor as something builds, merges, downloads, etc. Usually I pop over and read CNN for a few minutes, but I figured I could do something more productive, so I am going to start writing again.
First, I am alive :-) Second, you can see what I spend the bulk of my time working on here: http://www.forproject.com/. For our EVMS product, we are currently working on version 3.0 but I have also been spending a lot of time on Integrator for Project Server: http://www.forproject.com/index.php/integrator-forproject/
Nights and weekends (sparingly), I get to play on my pet projects which these days are mainly related to Microsoft’s Azure cloud platform. A new portal for Azure is in preview and if you have a subscription, I recommend taking a look. It is quite an improvement over the previous version.
Here is the home page, which is fully customizable depending on what services you are consuming. Lately I have been working with Azure Web Apps and Storage.
Here is a shot of the management area for my storage account:
The Azure SDK has a TON of amazing tools and technologies to work with, everything from classic .NET technologies to Ruby, Node.js, PHP, and Python. Plus, you can build a virtual machine running Oracle on Linux if you want to! Microsoft has finally broken out of “only our stuff” mode. It’s a lot of fun to play around in Azure.
Azure includes a storage service (distinct from Azure SQL), which is really nice. The system includes blobs, tables, and queues, and all three are pretty amazing. As an example, a queue (which basically stores messages that any app can access from anywhere in the world, with guaranteed availability) holds messages up to 64 KB in size, and you can store as many as you want, up to the 500 TB (that’s terabytes!) capacity of a storage account.
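As a sketch of how little code a queue round trip takes with the .NET storage client library (the connection string and queue name are placeholders):

```csharp
using System;
using Microsoft.WindowsAzure.Storage;
using Microsoft.WindowsAzure.Storage.Queue;

CloudStorageAccount account = CloudStorageAccount.Parse(connectionString);
CloudQueueClient queueClient = account.CreateCloudQueueClient();
CloudQueue queue = queueClient.GetQueueReference("pictures");
queue.CreateIfNotExists();

// Producer: any app, anywhere, can enqueue a message (up to 64 KB).
queue.AddMessage(new CloudQueueMessage("sync-request"));

// Consumer: a retrieved message is hidden from other readers while it is processed.
CloudQueueMessage message = queue.GetMessage();
if (message != null)
{
    Console.WriteLine(message.AsString);
    queue.DeleteMessage(message);
}
```

The visibility-timeout behavior on GetMessage is what makes queues safe for multiple competing consumers.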
The first thing I built with Azure storage was a picture synchronization tool. The idea was that everybody in my family has tons of digital pictures, and we would like to make them available to everyone and also maintain a backup of the images.
Every person who has this installed can sync our pictures from the cloud to their desktop and upload new photos to add to the collection.
In the process of creating the application, I created a wrapper around Azure’s storage API called CloudStorage:
The picture synchronization tool is surprisingly small at just over 600 lines of code, most of it UI related, because the cloud storage wrapper does the bulk of the work. Those of you familiar with mpFx (a wrapper over Microsoft’s Project Server API) will find CloudStorage easy to use.
Of course, Azure Storage is not free, but it is extremely affordable and billed based on usage. That said, I didn’t want to be charged for my test data and the various experiments I was doing. Microsoft has an answer for that: the Storage Emulator, which ships with the Azure SDK. Essentially, it is a local version of the storage service that exposes the exact same API as the cloud version. Incidentally, the storage emulator is written in .NET, so you can decompile it (for instructional purposes only!) using dotPeek to see how it works.
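Switching between the emulator and a real account is just a different connection string; the client library even has a shortcut for the emulator (the account name and key below are placeholders):

```csharp
using Microsoft.WindowsAzure.Storage;

// Local emulator: a shortcut understood by the storage client.
CloudStorageAccount local = CloudStorageAccount.Parse("UseDevelopmentStorage=true");
// Equivalent: CloudStorageAccount.DevelopmentStorageAccount

// Real cloud account:
CloudStorageAccount cloud = CloudStorageAccount.Parse(
    "DefaultEndpointsProtocol=https;AccountName=myaccount;AccountKey=<key>");
```

Because everything downstream takes a CloudStorageAccount, the rest of your code doesn’t care which one it got.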
While developing the picture synchronization tool I accumulated a ton of pictures in the emulator. I thought, well, it would be really cool to be able to synchronize my emulator data to the cloud.
Meet Azure Manager, a work in progress…
The tool allows me to create/update/add/delete blobs, queues, and tables, both in the cloud and in the emulator. Visual Studio ships with this functionality, but you can’t learn an API by using an existing tool, so I reinvented the wheel on purpose. Now, what Visual Studio doesn’t have is the ability to synchronize data from your emulator to the cloud or from the cloud to the emulator. I wanted that ability, so I wrote a sync engine that works both ways.
You can choose the source and target for a sync. So, if you want to update the cloud with the data in your emulator, you choose Development as the source and Cloud as the target. Pick your options and which tables, queues, and blobs you want to sync, and it does all of the work for you.
So, most of the time my Cloud usage is really low until I decide I want to sync to the cloud.
That’s it for now. Next time I will share a fully cloud based product update system (imagine Windows Update) for all of my pet projects.
As I sit here waiting for a build to complete and for a database to restore, I started fiddling with my new Windows Server 2012 instance. Weird: the available memory is super low, so I take a look and see that there is a mystery SQL Server instance running something called the Windows Internal Database. What’s that all about?
I guess I have been living under a rock on this subject. Windows Internal Database (WID) is a version of SQL Server (2005 through 2012, depending on the OS) that Microsoft ships with Windows Server 2008/2008 R2/2012/2012 R2. It is a variant of SQL Server Express designed to be used by Windows services. It cannot be uninstalled, and it is used by a variety of Microsoft products, including WSUS (which is how I discovered it: I was trying to figure out why available memory dropped so dramatically after I installed WSUS), AD RMS, Resource Manager, and a couple of others.
You can connect (only locally, and preferably using the same account used to install Windows) through the named pipe \\.\pipe\MICROSOFT##WID\tsql\query (for 2012). I was able to throttle WID’s memory usage, but expanding databases caused SQL Server Management Studio to hang.
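From code, the same pipe works as an ordinary SqlConnection data source. A sketch (run it locally under an admin account; the query is just an example):

```csharp
using System;
using System.Data.SqlClient;

class WidPeek
{
    static void Main()
    {
        // The np: prefix tells the client to talk over the named pipe directly.
        using (SqlConnection connection = new SqlConnection(
            @"Data Source=np:\\.\pipe\MICROSOFT##WID\tsql\query;Integrated Security=true"))
        using (SqlCommand command = new SqlCommand("SELECT name FROM sys.databases", connection))
        {
            connection.Open();
            using (SqlDataReader reader = command.ExecuteReader())
            {
                // Lists the databases WID is hosting, e.g. SUSDB for WSUS.
                while (reader.Read())
                    Console.WriteLine(reader.GetString(0));
            }
        }
    }
}
```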
Learn something new every day!
I am trying to modernize a bit. My products support Windows Server 2008, Windows Server 2008 R2, SQL Server 2005/2008/2008 R2, and SharePoint 2007/2010. My home network is a 2008 R2 domain. My mission was to introduce Windows Server 2012 R2, SQL Server 2012, and SharePoint 2013 with Project Server 2013.
This hasn't been easy.
I figured that after (depending on which version of which product) x number of years, things would have gotten easier to install, configure, and go live with. Installing Windows Server 2012 R2 was a breeze, of course. Adding the various roles and features I need was easy as well. Until I hit Windows Server Update Services (WSUS).
I like WSUS for my domain because I can pick and choose which updates go out to the house computers. In previous versions of Windows, it was pretty straightforward: simply add the necessary roles and features to your server, talk to Windows Update about the types of updates you want, possibly create some rules to auto-approve the updates, set up a GPO so your machines look at your local WSUS, reboot about 12 times, and you are ready to go.
This time, with Windows Server 2012 R2, it became a serious hassle. Absolutely nothing worked. I couldn’t even get the machine hosting WSUS to talk to Windows Update, even though it was a vanilla install straight from the ISO downloaded from MSDN. The MMC snap-in wouldn’t initialize, the client machines couldn’t see it, and WSUS kept complaining about not being able to connect to the service.
Time to start over.
After many hours today, I finally got the WSUS host to talk to Windows Update. Unfortunately, I don’t know why it started communicating. I just kept plugging away at all of the “answers” on the web (rebooting, turning services on and off, renaming the SoftwareDistribution folder, deleting the crypto directory, and so on) until it finally worked. My intention was to track these modifications and create a tool that would do what Microsoft didn’t: make WSUS work on Windows Server 2012 R2 right out of the box. Unfortunately, I didn’t track the changes because I was ripping through, trying to get it to work.
I wasn’t even trying to spend the day messing with this. I wanted to get my Windows 2012 + SharePoint 2013 + Project Server 2013 instance up so I could see what it would take to deploy my stuff. That will be a topic of another post.
What I can tell you is this: once you get everything working except for clients talking to the WSUS host, you have a big problem. I really can’t explain why Microsoft did this, but the web services WSUS requires have quite the spastic web.configs. None of them have the correct protocols in their respective web.config files. Not only did I have to manually edit each web service’s .config, I also had to take ownership of each file, because the owner is “TrustedInstaller”; even as a domain admin, I could not edit and save the configs to fix the problem.
Everything is working great. Here’s what you need to know:
Your web.configs are wrong. Open IIS Manager and select the following:
Open the web.config in Explorer and take a look:
<!-- Run SOAP Header Filter with ClientWebService -->
Nope, not going to work! There is nothing there for get and post, so you will get something like this.
You really need your <webServices> section to look like this:
<!-- <add name="HttpSoap1.2"/> -->
<!-- <add name="HttpPostLocalhost"/> -->
<!-- Run SOAP Header Filter with ClientWebService -->
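Pieced back together, the section should look roughly like this. Treat it as a sketch: the protocol names follow the standard ASP.NET <webServices> schema, and the exact list varies per WSUS service; the point is that get and post must be enabled rather than commented out.

```xml
<system.web>
  <webServices>
    <protocols>
      <add name="HttpSoap"/>
      <add name="HttpSoap1.2"/>
      <add name="HttpGet"/>
      <add name="HttpPost"/>
      <add name="Documentation"/>
    </protocols>
    <!-- Run SOAP Header Filter with ClientWebService -->
  </webServices>
</system.web>
```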
So do that! Your challenge, though, will be getting Windows to let you. It’s not difficult: just take ownership from TrustedInstaller, grant your logged-on user permissions, edit the file, save, and you are golden.
You have to do something similar for all of the WSUS web services, though some don’t require the SOAP components of web.config. It’s not hard to figure out: simply try to browse to the web service endpoint. If it doesn’t give you something like this:
Then you have some work to do.
Microsoft: this is not okay. It also isn’t documented anywhere that I can find; I figured this out through research. Windows Server 2012 is a derivative of Windows 8, which few like, so it would seem you would pay more attention to your enterprise users. That being said, I am really digging the new UI. I don’t like it on my workstations, but I do like it on the server.