Saturday, June 11, 2016

Win32 Error Browser & Search


Straight from WINERROR.html:


Search includes progressive F3 search.

Small break between SharePoint/Project Online projects.

If you want it, contact me.

Wednesday, May 25, 2016


Skating to blood, tears, exhaustion and civil arrest, listening to Skinny Puppy and remembering playing Bubble Bobble at the Student Union Building at Central Washington State University circa 1987-1989:


Back to work…

Monday, April 11, 2016

A Comparison in Brevity Between Straight CSOM and OSoFx

I have been working with the CSOM (Client-Side Object Model) for both SharePoint and Project Server for some months now.  This API is a replacement for the traditional XML/WCF web services both products have exposed for years.  The APIs are rational, consistent, and do not require the usual rigging for a web service (endpoint configuration, proxies, and the various other scaffolding required to consume them).  These are all great improvements.

Note that the CSOM for both SharePoint and Project Server, to date, implements a subset of the functionality of the traditional web services, and performance work is ongoing in the product groups.  Like anything new out of these SDKs, it takes investment to reach parity in feature completeness and performance.  That being said, get on it, Microsoft!

I have found CSOM repetitive; while vastly less costly in terms of keystrokes, it is nonetheless bulky and disjointed.  Also, because I tend to use both the SharePoint CSOM and the Project Server CSOM in the same project, I decided I was ready to put something together like mpFx.  The term “mpFx” means “Microsoft Project Effects” or, as I prefer, “Microsoft Project Feature eXtensions”, which is more descriptive.

I set out to design a single API for accessing SharePoint and Project Server through a single library, with the ability to request Azure services such as Cloud Storage, Cloud Services, DocumentDB, Managed Cache Services, and others through a “late-bound factory”, so as not to burden the basic API.  In addition to CSOM, the library will expose oData from both SharePoint and Project Server.

This new API is called OSoFx, pronounced OH-SO-FX: Office Server Online Feature eXtensions.  Or maybe OSoAFx, as in Office Server Online and Azure Feature eXtensions, pronounced OH-SO-A-FX?

A combined API allows for myriad benefits, but in today’s post I want to demonstrate the differences in syntax and the brevity of the API. 

The following operations are performed by both straight CSOM and OSoFx (both servers are Office Online):

1.) Setup credentials

2.) Create a connection to Project Server

3.) Create a connection to SharePoint Server

4.) Present the version numbers for both servers

5.) List the projects

6.) List the sites

7.) Show how long it took to do this

Here is a side-by-side comparison; the first is straight CSOM and the second is OSoFx.

Straight CSOM
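For reference, the straight-CSOM side looks roughly like the sketch below.  This is not the exact code from the screenshot; it assumes the SharePoint Online and Project Server client assemblies, and the URLs, user name, and password are placeholders:

```csharp
using System;
using System.Diagnostics;
using System.Security;
using Microsoft.ProjectServer.Client;
using Microsoft.SharePoint.Client;

class StraightCsomSketch
{
    static void Main()
    {
        Stopwatch stopwatch = Stopwatch.StartNew();

        // 1. Set up credentials (SharePointOnlineCredentials requires a SecureString)
        SecureString password = new SecureString();
        foreach (char c in "placeholder-password") password.AppendChar(c);
        var credentials = new SharePointOnlineCredentials("user@tenant.onmicrosoft.com", password);

        // 2. Connect to Project Server; 4/5. get the version and project list
        using (var projectContext = new ProjectContext("https://tenant.sharepoint.com/sites/pwa"))
        {
            projectContext.Credentials = credentials;
            projectContext.Load(projectContext.Projects, ps => ps.Include(p => p.Name));
            projectContext.ExecuteQuery();

            Console.WriteLine("Project Server version: {0}", projectContext.ServerVersion);
            foreach (PublishedProject project in projectContext.Projects)
                Console.WriteLine(project.Name);
        }

        // 3. Connect to SharePoint; 4/6. get the version and sub-sites
        using (var clientContext = new ClientContext("https://tenant.sharepoint.com"))
        {
            clientContext.Credentials = credentials;
            clientContext.Load(clientContext.Web.Webs, ws => ws.Include(w => w.Title));
            clientContext.ExecuteQuery();

            Console.WriteLine("SharePoint version: {0}", clientContext.ServerVersion);
            foreach (Web web in clientContext.Web.Webs)
                Console.WriteLine(web.Title);
        }

        // 7. Show how long all of this took
        Console.WriteLine("Elapsed: {0}", stopwatch.Elapsed);
    }
}
```

Note the repetition: two contexts, two credential assignments, and an explicit Load/ExecuteQuery pair for every batch of data.  That boilerplate is exactly what OSoFx folds away.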




There is a bit of work behind the scenes in OSoFx to make this work, but the fruits of that labor can be used by others.  More on this later!

Chief Technology Architect @ DeltaBahn LLC

I recently moved to DeltaBahn LLC, a Texas-based company specializing in Microsoft Office Online, including Project Server and SharePoint, in addition to on-premises Microsoft Project Server and SharePoint Server.  We also provide solutions to integrate and enrich project data with connectors to external systems, and we offer general software development services.

The majority of our employees are former Microsoft employees with decades of experience working either at Microsoft or with Microsoft partners.  Not only do we have the experience, but we love this industry.

"Your success and satisfaction is what matters most. We listen and understand, we know what works, and we follow time-proven methods that have been adopted as Microsoft best practices"

I encourage you to visit our site and learn more about us. I am excited to be here (there are a lot of smart people working here, which you know I like).

I have some availability right now to provide custom development services if you have a Microsoft-based project. I work primarily in Azure, ASP.NET, SharePoint, Project Server, Windows Apps, Microsoft SQL Server, Exchange, oData, CSOM/JSOM or anything else .NET-related (C# and JavaScript are my primary languages). Let me know if you need any help!

Additional information on LinkedIn.


Monday, February 29, 2016

Continuing the Tradition… Coming Soon OSoFx

In 2007, with the advent of Project Server 2007, I started the mpFx project to provide a simplified API for the Project Server Interface.  That work has taken me a long way, including landing me a job in Microsoft Consulting Services and, further down the road, a lead role at forProject Technologies.  These days, I am focusing most of my technical effort on SharePoint/Project Server Online and Azure.

In the tradition of mpFx, I have started a new project called OSoFx, pronounced “oh-so F X”, which provides a single interface into SharePoint Online, Project Online (including CSOM, oData, and PSI) and Azure.

More to come!

Monday, February 01, 2016

Project Server CSOM, REST, oData, and Deployment (2013 & 2016)

CSOM, REST, and oData Support

Starting with SharePoint 2010, a new integration technology called CSOM, or Client-Side Object Model, was introduced.  The API is an alternative to the web services, and a much cleaner one.  Not only can you use CSOM in a .NET application such as WinForms or WPF, you can use it in your browser-based apps.  If you prefer REST, you are in luck: it too is supported.

When Microsoft released Project Server 2013, the product group embraced the CSOM model.  It is important to understand what it is capable of and its limitations.  In Project Server 2013, many of the PSI web services you are accustomed to programming against are still available.  Visit this page for the full set of types and examples to get you started.

Project Server 2013 supports both online and on-premises implementations.  Using oData is really the best way to get at data that was once stored in the Reporting database.  Project Server’s oData reference can be found here.
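As a sketch of what that looks like from .NET against Project Online (the tenant URL, user name, and password are placeholders), reading reporting data over oData is just an authenticated GET against the ProjectData endpoint:

```csharp
using System;
using System.IO;
using System.Net;
using System.Security;
using Microsoft.SharePoint.Client;

class ODataSketch
{
    static void Main()
    {
        SecureString password = new SecureString();
        foreach (char c in "placeholder-password") password.AppendChar(c);
        var credentials = new SharePointOnlineCredentials("user@tenant.onmicrosoft.com", password);

        // The ProjectData service exposes the content formerly in the Reporting database
        string url = "https://tenant.sharepoint.com/sites/pwa/_api/ProjectData/Projects" +
                     "?$select=ProjectName,ProjectStartDate&$format=json";

        var request = (HttpWebRequest)WebRequest.Create(url);
        request.Credentials = credentials;
        // Tells SharePoint Online to issue an auth challenge instead of the login page
        request.Headers.Add("X-FORMS_BASED_AUTH_ACCEPTED", "f");

        using (WebResponse response = request.GetResponse())
        using (var reader = new StreamReader(response.GetResponseStream()))
        {
            Console.WriteLine(reader.ReadToEnd());
        }
    }
}
```

The same URL works in the browser, which makes oData handy for quick ad hoc queries as well as reporting tools.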

One thing you will notice right away is that Project Server 2013 ships with a single database.  The draft, published, archive, and reporting databases are combined, and schemas are used to separate the various content containers.

The introduction of the Project Calculation Service brings the server-side scheduling engine closer to parity with the scheduling engine implemented in Project Professional.  This is a great new feature.  For an overview of Project Server 2013’s architecture, visit this page.

Project Server 2016, which is heading toward RTM, introduces even more changes.  You can download Beta 2 here.

Project Server 2016 Deployment

In 2016, Project Server is included in the SharePoint install by default, much like the Visio, Excel, and Access services.  You still have to purchase a server license.  Install SharePoint and run the configuration wizard just as you normally would.

To enable Project Server, use the SharePoint PowerShell cmdlet Enable-ProjectServerLicense Y2WC2-K7NFX-KWCVC-T4Q8P-4RG9W.  This will give you a fully featured 180-day trial.

The PWA site provisioning UI is mostly gone.   To create a PWA site, follow the steps below:

  1. Create a site collection for PWA
  2. Using SharePoint PowerShell, execute this command: New-SPSite http://ps2016/pwa -OwnerAlias [DOMAIN]\[USER_NAME] -Name “PWA 2016” -Template “pwa#0”  (note that the server and site collection may be different; that part is up to you)
  3. Also using SharePoint PowerShell, run this command: Enable-SPFeature pwasite -Url http://ps2016/pwa
  4. Optionally, you can enable the security UI in PWA.  Again with PowerShell: Set-SPProjectPermissionMode, which will prompt for the PWA URL and the Mode (ProjectServer)

Credit for the PowerShell commands:


Saturday, September 12, 2015

Async Blob Upload with Progress


Sorry for the bad formatting…


public async Task UploadBlobAsync(string blobContainer, string path, string name, bool zip, CancellationToken cancellationToken, IProgress<BlobTransferProgress> progress)
{
    CloudBlobContainer cloudBlobContainer = GetBlobContainer(blobContainer);
    CloudBlockBlob cloudBlockBlob = cloudBlobContainer.GetBlockBlobReference(name);

    //TODO: zip

    try
    {
        FileInfo fileInfo = new FileInfo(path);
        long fileSize = fileInfo.Length;
        long fileSizeForReporting = fileSize;

        // Small files go up in a single call
        if (fileSize < Constants.DEFAULT_BUFFER_SIZE)
        {
            using (FileStream fileStream = File.Open(path, FileMode.Open))
            {
                await cloudBlockBlob.UploadFromStreamAsync(fileStream, cancellationToken);
            }

            return;
        }

        int index = 1;
        long startPosition = 0;
        long bytesUploaded = 0;
        List<string> blockIds = new List<string>();

        do
        {
            long bufferSize = Math.Min(Constants.DEFAULT_BUFFER_SIZE, fileSize);
            byte[] buffer = new byte[bufferSize];

            using (FileStream fileStream = new FileStream(path, FileMode.Open))
            {
                fileStream.Position = startPosition;
                fileStream.Read(buffer, 0, (int)bufferSize);
            }

            // Block IDs must be base64-encoded strings of equal length
            string blockId = Convert.ToBase64String(Encoding.UTF8.GetBytes(index.ToString("d6")));
            blockIds.Add(blockId);

            if (cancellationToken.IsCancellationRequested)
            {
                return;
            }

            using (MemoryStream memoryStream = new MemoryStream(buffer))
            {
                await cloudBlockBlob.PutBlockAsync(blockId, memoryStream, null, cancellationToken);
            }

            index++;
            bytesUploaded += bufferSize;
            fileSize -= bufferSize;
            startPosition += bufferSize;

            double percentComplete = 100 * bytesUploaded / fileSizeForReporting;

            progress.Report(new BlobTransferProgress(fileSize, bytesUploaded, percentComplete));
        } while (fileSize > 0);

        // Commit the uploaded blocks, in order, to make the blob visible
        await cloudBlockBlob.PutBlockListAsync(blockIds, cancellationToken);
    }
    catch (Exception exception)
    {
        progress.Report(new BlobTransferProgress(exception.Message));
    }
}

And in my unit test, you can see the progress (plus the task wait where “I am doing other stuff”):


Remaining Bytes = 9737856 - Bytes uploaded = 262144 - Percent Complete = 2

Remaining Bytes = 9475712 - Bytes uploaded = 524288 - Percent Complete = 5

Remaining Bytes = 9213568 - Bytes uploaded = 786432 - Percent Complete = 7

Remaining Bytes = 8951424 - Bytes uploaded = 1048576 - Percent Complete = 10

Remaining Bytes = 8689280 - Bytes uploaded = 1310720 - Percent Complete = 13

Remaining Bytes = 8427136 - Bytes uploaded = 1572864 - Percent Complete = 15

Remaining Bytes = 8164992 - Bytes uploaded = 1835008 - Percent Complete = 18

Remaining Bytes = 7902848 - Bytes uploaded = 2097152 - Percent Complete = 20

Remaining Bytes = 7640704 - Bytes uploaded = 2359296 - Percent Complete = 23

Remaining Bytes = 7378560 - Bytes uploaded = 2621440 - Percent Complete = 26

Doing other stuff...
Remaining Bytes = 7116416 - Bytes uploaded = 2883584 - Percent Complete = 28

Remaining Bytes = 6854272 - Bytes uploaded = 3145728 - Percent Complete = 31

Remaining Bytes = 6592128 - Bytes uploaded = 3407872 - Percent Complete = 34

Remaining Bytes = 6329984 - Bytes uploaded = 3670016 - Percent Complete = 36

Remaining Bytes = 6067840 - Bytes uploaded = 3932160 - Percent Complete = 39

Remaining Bytes = 5805696 - Bytes uploaded = 4194304 - Percent Complete = 41

Remaining Bytes = 5543552 - Bytes uploaded = 4456448 - Percent Complete = 44

Remaining Bytes = 5281408 - Bytes uploaded = 4718592 - Percent Complete = 47

Remaining Bytes = 5019264 - Bytes uploaded = 4980736 - Percent Complete = 49

Remaining Bytes = 4757120 - Bytes uploaded = 5242880 - Percent Complete = 52

Remaining Bytes = 4494976 - Bytes uploaded = 5505024 - Percent Complete = 55

Doing other stuff...
Remaining Bytes = 4232832 - Bytes uploaded = 5767168 - Percent Complete = 57

Remaining Bytes = 3970688 - Bytes uploaded = 6029312 - Percent Complete = 60

Remaining Bytes = 3708544 - Bytes uploaded = 6291456 - Percent Complete = 62

Remaining Bytes = 3446400 - Bytes uploaded = 6553600 - Percent Complete = 65

Remaining Bytes = 3184256 - Bytes uploaded = 6815744 - Percent Complete = 68

Remaining Bytes = 2922112 - Bytes uploaded = 7077888 - Percent Complete = 70

Remaining Bytes = 2659968 - Bytes uploaded = 7340032 - Percent Complete = 73

Remaining Bytes = 2397824 - Bytes uploaded = 7602176 - Percent Complete = 76

Remaining Bytes = 2135680 - Bytes uploaded = 7864320 - Percent Complete = 78

Remaining Bytes = 1873536 - Bytes uploaded = 8126464 - Percent Complete = 81

Remaining Bytes = 1611392 - Bytes uploaded = 8388608 - Percent Complete = 83

Remaining Bytes = 1349248 - Bytes uploaded = 8650752 - Percent Complete = 86

Remaining Bytes = 1087104 - Bytes uploaded = 8912896 - Percent Complete = 89

Remaining Bytes = 824960 - Bytes uploaded = 9175040 - Percent Complete = 91

Remaining Bytes = 562816 - Bytes uploaded = 9437184 - Percent Complete = 94

Remaining Bytes = 300672 - Bytes uploaded = 9699328 - Percent Complete = 96

Remaining Bytes = 38528 - Bytes uploaded = 9961472 - Percent Complete = 99

Remaining Bytes = 0 - Bytes uploaded = 10000000 - Percent Complete = 100


Monday, August 24, 2015

Update + Azure


Hello World!

I haven’t written much for a long, long time.  Work and family are all-consuming, but during the day I find myself staring at my monitor as something builds, merges, downloads, etc.  Usually I pop over to CNN and read the news for a few minutes, but I figured I could do something more productive, so I am going to start writing again.

First, I am alive :-)  Second, you can see what I spend the bulk of my time working on here:  For our EVMS product, we are currently working on version 3.0 but I have also been spending a lot of time on Integrator for Project Server:

Nights and weekends (sparingly), I get to play on my pet projects which these days are mainly related to Microsoft’s Azure cloud platform.   A new portal for Azure is in preview and if you have a subscription, I recommend taking a look.  It is quite an improvement over the previous version. 

Here is the home page, which is fully customizable depending on what services you are consuming.  Lately I have been working on Azure Web Apps and Storage


Here is a shot of the management area for my storage account:


The Azure SDK has a TON of amazing tools and technologies to work with: everything from classic .NET technologies to Ruby, Node.js, PHP, and Python.  Plus, you can build a virtual machine running Oracle on Linux if you want to!  Microsoft has finally broken out of “only our stuff” mode.  It’s a lot of fun to play around in Azure.

Azure Storage

Azure includes a storage service (distinct from Azure SQL), which is really nice.  The system includes blobs, tables, and queues.  All three are pretty amazing.  As an example, a queue (which basically stores messages that any app, anywhere in the world, can access with guaranteed availability) holds messages up to 64 KB in size, and you can store as many as you want up to the storage account’s 500 TB capacity limit (that’s terabytes!).
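As a sketch of the queue round trip using the WindowsAzure.Storage client library (the connection string and queue name are placeholders):

```csharp
using System;
using Microsoft.WindowsAzure.Storage;
using Microsoft.WindowsAzure.Storage.Queue;

class QueueSketch
{
    static void Main()
    {
        // Placeholder connection string; substitute your storage account name and key
        CloudStorageAccount account = CloudStorageAccount.Parse(
            "DefaultEndpointsProtocol=https;AccountName=myaccount;AccountKey=...");

        CloudQueueClient queueClient = account.CreateCloudQueueClient();
        CloudQueue queue = queueClient.GetQueueReference("pictures");
        queue.CreateIfNotExists();

        // Producer: any app, anywhere, can enqueue a message (up to 64 KB)
        queue.AddMessage(new CloudQueueMessage("sync-request:family-album"));

        // Consumer: dequeue, process, then delete so the message is not redelivered
        CloudQueueMessage message = queue.GetMessage();
        if (message != null)
        {
            Console.WriteLine(message.AsString);
            queue.DeleteMessage(message);
        }
    }
}
```

The get-then-delete pattern is what gives queues their reliability: if your consumer crashes before calling DeleteMessage, the message reappears after its visibility timeout and another worker can pick it up.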

The first thing I built with Azure Storage was a picture synchronization tool.  The idea was that everybody in my family has tons of digital pictures, and we would like to make them available to everyone and also maintain a backup of the images.


Every person who has this installed can sync our pictures from the cloud to their desktop and upload new photos to add to the collection.

In the process of creating the application, I created a wrapper around Azure’s storage API called CloudStorage:


The picture synchronization tool is surprisingly small at just over 600 lines of code, most of it UI related, because the cloud storage wrapper does the bulk of the work.  Those of you familiar with mpFx (a wrapper over Microsoft’s Project Server API) will find CloudStorage easy to use.

Cloud Storage Emulator

Of course, Azure Storage is not free, but it is extremely affordable and billed on usage.  That said, I didn’t want to be charged for my test data and the various experiments I was doing.  Microsoft has an answer for that, the Cloud Storage Emulator, which ships with the Azure SDK.  Essentially it is a local version of the storage service that exposes exactly the same API as the cloud version.  Incidentally, the storage emulator is written in .NET, so you can decompile it (for instructional purposes only!) using dotPeek to see how it works.
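Because the API is identical, pointing your code at the emulator is just a connection-string change.  A sketch (the container name is a placeholder):

```csharp
using Microsoft.WindowsAzure.Storage;
using Microsoft.WindowsAzure.Storage.Blob;

class EmulatorSketch
{
    static void Main()
    {
        // The well-known shortcut connection string for the local storage emulator
        CloudStorageAccount account = CloudStorageAccount.Parse("UseDevelopmentStorage=true");

        // Same client types and calls as against the real service
        CloudBlobClient blobClient = account.CreateCloudBlobClient();
        CloudBlobContainer container = blobClient.GetContainerReference("pictures");
        container.CreateIfNotExists();
    }
}
```

Keeping the connection string in configuration means the exact same binaries run against the emulator during development and against the cloud in production.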

While developing the picture synchronization tool, I accumulated a ton of pictures in the emulator.  I thought, well, it would be really cool to be able to synchronize my emulator data to the cloud.

Azure Manager

Meet Azure Manager, a work in progress…


The tool allows me to create/update/delete blobs, queues, and tables, both in the cloud and in the emulator.  Visual Studio ships with this functionality, but you can’t learn an API by using an existing tool, so I reinvented the wheel on purpose.  Now, what Visual Studio doesn’t have is the ability to synchronize data from your emulator to the cloud, or from the cloud to the emulator.  I wanted that ability, so I wrote a sync engine that goes both ways.


You can choose the source and target for the sync.  So, if you want to update the cloud with the data in your emulator, you choose Development as the source and Cloud as the target.  Choose your options and which tables, queues, and blobs you want to sync, and it does all of the work for you.

So, most of the time my cloud usage is really low, until I decide I want to sync to the cloud.

That’s it for now.  Next time I will share a fully cloud-based product update system (imagine Windows Update) for all of my pet projects.

Happy Monday!



Content on this site is provided "AS IS" with no warranties and confers no rights. Additionally, all content on this site is my own personal opinion and does not represent my employer's view in any way.