Check out my new blog at https://shibumiware.blogspot.com

Tuesday, July 18, 2017

History, Future Plans, and Common Controls

Progress Bar
See the first post on Common Controls here

A Little History and Future Plans

Every developer has this experience more than once: you build a component, adopt an open source one, or even buy one that gets you almost there.  You tweak it to meet the requirement at hand, and then you move on to the next problem.  All the while, a nagging voice in the back of your mind tells you that "you could have done better, made it more extensible, more aesthetically pleasing."  Making all of those things happen when other tasks remain on the critical path and you have already met the requirement is over-engineering, especially with a paying customer--without their permission, that's downright unethical.  Still, and I think this is true of all the good developers I have come across, there is a natural desire to perfect your craft, make it beautiful, and make it stand the test of time.

I wrote my first piece of commercial software when I was in high school, and many apps before that, used primarily by friends.  I believe I was about 11 when I wrote my first useful piece of software. These apps were MS-DOS apps written in Pascal or QuickBasic. The inner core of the application was a big loop that painted the ASCII characters used back then to create frames, menus, inputs, reports, and whatever else you put on the screen. The loop received user input, forked off to perform various tasks like reading or writing data to a floppy drive, and then came back and started painting the character-based UI to update it based on what had occurred in the subroutine.

For those of you who weren't there to do this kind of development, you may wonder what all the weird characters in the DOS code page (see image from Wikipedia below) were used for.  Those characters, my friends, were the building blocks of MS-DOS applications that went beyond just standard letters.  You quickly learned these codes and the algorithms for drawing and updating boxes, centering text in the boxes, making a box appear highlighted, etc.  A little like CSS these days--just a little, in the sense that there were a ton of tricks you could master to make the app look good.

https://en.wikipedia.org/wiki/Box-drawing_character
box characters

At that point in computer history, there was no such thing as a GUI available to a little kid writing code on his TRS-80 in his basement. But that inner loop looked a whole lot like a message loop in Windows, so the transition to Windows development was not a big deal. Plus, I switched to a much less verbose and lower-level language--C.  Back in those days, there were few "controls" or much of anything else to help create a functioning Windows application. You built your menus and dialogs in resource files and manipulated these objects directly. You had a heap to tend with care, or you would leak like a stuck pig. There were the standard libraries and the Windows headers to include to give the linker the data it needed and the macros that made Windows programming possible. But it wasn't that hard. It was labor intensive.

Enter MFC with C++. This combination was freedom from the slavery of the message pump (well, sort of) and handed you a shortcut or a wizard to perform all of those menial jobs you had to deal with before. Everything was good.

Enter OLE, COM, and DCOM. Back to everything being hairy and incomprehensible at first, but after a while that too became comfortable. The problem was that the platform, runtime, and languages kept changing, so the idea of making the last thing you did even better seemed a little ridiculous, because who knew what was coming out of Redmond next.

Well, that was VB 6.0, which we will skip, so I don’t have vicious flashbacks.

Enter .NET.  I was at the PDC event in Florida when they announced it.  I remember thinking, "Oh great; we are starting all over again." And we were, to some degree.  I took the huge stack of DVDs (or were they CDs?) back to my hotel room and wrote my first "Hello World" in C#.  It felt like C.  I had been doing VB.NET.  I was frigging ecstatic!  The base class libraries, even in 1.0, were so wonderful in comparison to Visual Basic.  I almost called home to tell my manager that I was in love.  Thankfully, I didn't.  That would have been embarrassing for the both of us.

.NET has survived and thrived and will remain for many more years, if not decades, so now the idea of building a library of reusable, better engineered, better looking components seems reasonable. C and C++, and even MFC, remain and are updated, but as a platform, managed code has punctuated Microsoft's strategy from its inception. I remember thinking "this is for real" when SQL Server implemented a managed host inside the SQL engine. That's no small feat, and it signals a level of commitment. Now, just about everything coming out of Microsoft these days, including Azure and Office Online, is either language agnostic (finally, Microsoft has decided that OPT--Other People's Technology--is useful), .NET (C#, VB.NET, F#, and 20+ languages implemented by others), or JavaScript.

I have been writing software for just about 25 years--and I still have the desire to perfect my craft, learn more about the subject, and keep fresh.  I spend several hours a week on Channel 9 (MSDN), YouTube, or my Kindle reading, listening, or watching things about my craft.  Yet it seems inevitable for anyone who has been doing software this long that there will be a particular technology you end up specializing in. For me, there are two--back-end services, which is wonderful because it involves a great deal of diverse technologies, and Windows development.  I still love Windows development.  It's where I started.  I got to meet guys like Don Box and Charles Petzold when I worked at Microsoft. I have exchanged emails with various luminaries throughout the years, including Steve Jobs (yes, that Steve Jobs--he didn't do Windows, but he was from that world too), Mark Russinovich, Dave Probert, and others.  A close friend of mine built HTTP.sys and helped invent various pieces of Windows you all use, plus holds a bunch of patents at Microsoft.  I came from Microsoft, practically born of its culture (the original culture), and will always love Windows.  A small window into my nerdiness on this subject: I am on book two (not chapter two, BOOK two) of Windows Internals, 6th Edition, with the 7th Edition queued up on my Kindle.

Windows was cool and damn the haters, it still is.

But things are always changing in my industry, so over the past couple of years, I have gathered what I know about Windows development in the managed world and started the ShibumiWare Stack.  Aside from adding fresh content when I learn something new on the job, fixing bugs, and maintaining it, I am moving on to other topics as my primary focus--with services and other "plumbing" always being my other favorite. I see much more web development, mobile development, machine learning, cryptography, systems integration, and other areas in my future. Microsoft Dynamics is going to be a big part of what I do. I will always love to do Windows development when called upon to do so, but on my 25th anniversary as a developer, I am going off to focus on other Microsoft technologies.

So What's Next with the Common Controls Series?

I plan on completing the series on Common Controls, which I figure will end up around eight to ten posts.  There is a lot I have done and many tricks I will share along the way.

Something to Keep in Mind

There is exactly zero code in the stack that I had to purchase.  I spent maybe six years doing Windows development using DevExpress and loved it most of the time.  I missed it when I wanted hobby projects that looked great, but there was no way I was going to pay thousands of dollars.  I realized through that experience that with a little (or a lot) of work, a dedicated developer can make standard Windows development (versus WPF) look very nice indeed.  The time I spent doing "raw" Windows development helps because I know that deep down in that really cool-looking component, there is just Win32 and standard Windows development techniques.  So it doesn't overwhelm me easily.

What to Expect in the Next Common Controls Post

I think I have written a login dialog about 20 times in my career so I took all I learned and ever needed and built a credentials management framework with a common dialog that is a shape-shifter of sorts.  Its object model is programmable to allow for themes, different kinds of user names (email address, domain credentials, and others), password complexity options (and UI components that visually tell you if you are meeting those requirements), options for remembering credentials with full encryption, profile management, web content to explain the login process, and other goodies.   It has extension points for providing your own crypto implementation via interfaces.  Various UI components can be made visible or hidden depending on your needs.  It is about as complete as I can make it. 
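To illustrate the kind of crypto extension point described above, here is a minimal sketch. The names are hypothetical, not the framework's actual API: a small interface the dialog could call to protect remembered credentials, plus a default implementation built on AES with a random IV prepended to the ciphertext.

```csharp
using System;
using System.Security.Cryptography;

// Hypothetical extension point: the dialog asks this interface to protect
// remembered credentials, so consumers can swap in their own implementation.
public interface ICredentialCryptoProvider
{
    byte[] Protect(byte[] plaintext);
    byte[] Unprotect(byte[] ciphertext);
}

// A minimal default built on AES with a caller-supplied key and a random IV
// prepended to the ciphertext so Unprotect can recover it.
public sealed class AesCredentialCryptoProvider : ICredentialCryptoProvider
{
    private readonly byte[] _Key;

    public AesCredentialCryptoProvider(byte[] key)
    {
        _Key = key ?? throw new ArgumentNullException(nameof(key));
    }

    public byte[] Protect(byte[] plaintext)
    {
        using (Aes aes = Aes.Create())
        {
            aes.Key = _Key;
            aes.GenerateIV();
            using (ICryptoTransform encryptor = aes.CreateEncryptor())
            {
                byte[] cipher = encryptor.TransformFinalBlock(plaintext, 0, plaintext.Length);
                byte[] result = new byte[aes.IV.Length + cipher.Length];
                Buffer.BlockCopy(aes.IV, 0, result, 0, aes.IV.Length);
                Buffer.BlockCopy(cipher, 0, result, aes.IV.Length, cipher.Length);
                return result;
            }
        }
    }

    public byte[] Unprotect(byte[] ciphertext)
    {
        using (Aes aes = Aes.Create())
        {
            aes.Key = _Key;
            byte[] iv = new byte[aes.BlockSize / 8];
            Buffer.BlockCopy(ciphertext, 0, iv, 0, iv.Length);
            aes.IV = iv;
            using (ICryptoTransform decryptor = aes.CreateDecryptor())
            {
                return decryptor.TransformFinalBlock(ciphertext, iv.Length, ciphertext.Length - iv.Length);
            }
        }
    }
}
```

The value of the interface is that the dialog never cares which algorithm is behind it; swapping in DPAPI or a hardware-backed provider is a one-class change.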

That post is forthcoming, but I have a lot on my plate right now so I cannot tell you when.

Thanks for staying tuned!


Colby-Tait

Monday, July 10, 2017

Microsoft Dynamics 365 & A Developer's Adventure Spelunking the SDK

Version

Microsoft Dynamics 365

Don't get blinded by the skirmishes between Microsoft and SalesForce.com over the hearts and dollars of salespeople across the world: Microsoft Dynamics 365 offers a tightly integrated, full-featured CRM and ERP platform that goes beyond just customer relationship management.  Visit https://explore.dynamic forces.com/ and take a tour of some of the fantastic capabilities the platform provides. I gave up writing like a marketer in about 1997, so I won't try to restart here.

Microsoft Dynamics 365 Solution Areas  

Fortunately, there are a great many technical topics that are applicable to my daily work. Recently I was given access to an instance of Dynamics 365 with Project Services Automation and Field Management. It bears repeating: "Project Services Automation (PSA)" is an acronymic collision with "Professional Services Automation." Having two PSAs to contend with is regrettable, because I think Gartner, Forrester, and IDC spent a prodigious amount of time and money explaining the original PSA to the world, and now we have a second. Project Services Automation fulfills a different business requirement than simply CRM, and a different one than Microsoft Project Online, which is a traditional enterprise project management system.  Project Online has its place, but Project Services Automation is "end-to-end," as you can see from the quote below:

Project Service Automation in Microsoft Dynamics 365 (online) provides an end-to-end solution that helps sales and delivery teams engage customers and deliver billable projects on time and within budget. Microsoft Dynamics 365 for Project Service Automation helps you:
• Estimate, quote, and contract work
• Plan and assign resources
• Enable team collaboration
• Capture time, expense, and progress data for real-time insights and accurate invoicing

- Manage Project-Based Sales, Project Service Automation

General Resources

Here are links for those interested in the solution generally and for those who would like to dive deep into the offering:

Developer Resources

One of the most exciting aspects of being involved with the company I work for is the changing technological landscape and the variety of technologies used to accomplish a project's goals.  Working with the Dynamics platform differs from what I have been doing for the last decade or so. I have focused on enterprise project management systems, notably Microsoft Project, Project Server, and a touch of Primavera, with a focus on earned value management, API design, synchronization technologies, performance, domain-specific algorithms, and cryptography, just to name a few. Dynamics is a big product, it's cloud-based, and it has an extensive set of integration points to fulfill a broad range of structural challenges in building and integrating large-scale enterprise resource planning (ERP) systems. ERP rarely stands alone. Often, line-of-business off-the-shelf products augment and integrate with the ERP system. Even more often, internal solutions are built to satisfy a specific challenge the customer feels the market cannot meet. Putting all of these data flows, compute units, and human interfaces together is an incredible challenge.

Here are a few links to developer resources. If you are a Microsoft Partner, you may have access to a certain number of Dynamics instances, depending on your partnership level.  If you don't have this resource, you can follow the instructions here to start a 30-day trial.

The Beginning Is a Great Place to Start

After a little warm-up, a lot of reading, and getting my hands on a Dynamics 365 instance, I was ready to jump right in. I had success with Project Server's PSI (Project Server Interface) years ago by starting at the bottom and working my way toward more complicated integration scenarios. So successful, in fact, that I ended up working for Microsoft, forProject Technology, and now DeltaBahn largely because of my experience making my way through the PSI.

I have an unnatural patience for dealing with poorly written, incomplete, or flatly wrong documentation. I also won't let go of a problem until I have resolved it, learned it, digested it, and usually written my own version of whatever extensibility "product" (API) a product group provides--or at the very least, encapsulated it to increase usability, decrease complexity, and improve performance wherever possible. If API design is exposing a system's internal machinery and capability to the outside world in a reasonable, consistent, useful, extendable, and understandable fashion, then writing an API on top of another API shares the same goals, except it is done by people who have consumed the original API and found it lacking.

In other words, the API's API is the result of field experience. Furthermore, encapsulation protects the consumer from future changes in the base API and is decoupled from the product group's release cycle, so, as encapsulation API designers, we are free to extend, bend, and often (legally) reverse engineer how the first API works to provide better documentation, samples, and depth of knowledge.  In my experience dealing with APIs, particularly Microsoft's, there lies within most APIs a grand vision the developers wanted to share with the world, but extensibility is one of the first things to get cut to deliver on time and on budget.
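To make the encapsulation idea concrete, here is a minimal sketch, with all names hypothetical, of wrapping a vendor API behind your own interface so consumers are insulated from the vendor's types and release cycle:

```csharp
using System.Collections.Generic;

// Hypothetical stand-in for a vendor type we do not control.
public class VendorProjectApi
{
    public virtual string[] FetchProjectNames()
    {
        return new[] { "Alpha", "Beta" };
    }
}

// Our own interface: the shape WE want, decoupled from the vendor's.
public interface IProjectCatalog
{
    IReadOnlyList<string> GetProjectNames();
}

// Adapter: the only class that touches the vendor API. If the vendor
// changes a signature, only this class changes; consumers of
// IProjectCatalog are untouched.
public sealed class VendorProjectCatalog : IProjectCatalog
{
    private readonly VendorProjectApi _Api;

    public VendorProjectCatalog(VendorProjectApi api)
    {
        _Api = api;
    }

    public IReadOnlyList<string> GetProjectNames()
    {
        return _Api.FetchProjectNames();
    }
}
```

The adapter is also the natural place to add the better documentation, validation, and performance work the base API lacks.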

Connecting to Dynamics 365 as an Office 365 User

I can't do much with my shiny new instance of Office 365 with PSA & Field Management until I authenticate. In the early days of working with the PSI, I relied primarily on Active Directory integration, but I did write a Forms Authentication extension that enables the Forms scenario easily. Recently I have done quite a bit with Google's Blogger API, which employs OAuth 2.0, and I had done certificate-based custom transport and user-level authentication on a massive WCF project at forProject Technology, Inc., so I felt comfortable with what I had read about Dynamics' authentication.

I went to the QuickStart and the "Simplified connection quick start using Microsoft Dynamics 365" samples and started in with gusto, only to be disappointed to find that neither sample worked as advertised.  The QuickStart demonstrator has plenty of bells and whistles, whereas the simplified version does not.  This is problem number one when writing samples: have each sample do one thing and one thing only, treat each sample as a teaching opportunity, ensure the developer has a firm grasp before moving on, and never include extraneous code to support "other scenarios."  This applies to each method of the example.  In the "Simplified" version of the sample, the first call does what appears reasonable:

ServerConnection.Configuration config = serverConnect.GetServerConfiguration();

GetConfigurations

(I show listings and images throughout this portion of the post because I provide the link above to the sample and it is available in the SDK--I do this for conciseness and to illustrate points more quickly.)

The point here is that I am first shown how to read previous connection profiles from an XML file and later how to write the server configuration back to the file.  Even worse, the call to ReadConfiguration has global side effects: it modifies a class property, Configurations, with the configuration information from the XML file, and its return value is simply the count of the Configurations property.

Global side effects are bad, plus any C# developer worth their weight in, say, barbershop floor detritus can figure out how to read and write an XML file. Plus, there are much better ways to do THAT than what is demonstrated in the sample. I know from personal experience that many, if not all, samples were created by somebody other than the developers responsible for the code--one of my first tasks at Microsoft was to write sample code for the Solutions Development Kit back in the mid '90s. Anyway, end of rant.
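For contrast, here is a minimal sketch of reading and writing connection profiles without global side effects. The type names are hypothetical (the SDK's real ServerConnection.Configuration carries more fields); the point is that the load method returns its result instead of mutating shared state:

```csharp
using System.Collections.Generic;
using System.IO;
using System.Xml.Serialization;

// Hypothetical profile type standing in for the SDK's configuration class.
public sealed class ConnectionProfile
{
    public string ServerAddress { get; set; }
    public string OrganizationName { get; set; }
    public string UserName { get; set; }
}

public static class ConnectionProfileStore
{
    private static readonly XmlSerializer Serializer =
        new XmlSerializer(typeof(List<ConnectionProfile>));

    // Returns the profiles rather than stuffing them into a shared property,
    // so the caller owns the state and there are no hidden side effects.
    public static List<ConnectionProfile> Load(string path)
    {
        if (!File.Exists(path))
        {
            return new List<ConnectionProfile>();
        }

        using (FileStream stream = File.OpenRead(path))
        {
            return (List<ConnectionProfile>)Serializer.Deserialize(stream);
        }
    }

    public static void Save(string path, List<ConnectionProfile> profiles)
    {
        using (FileStream stream = File.Create(path))
        {
            Serializer.Serialize(stream, profiles);
        }
    }
}
```

A pure load/save pair like this is also trivially testable, which the sample's count-returning ReadConfiguration is not.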

In the addConfig section, the code attempts to choose the discovery URL and successfully chooses one, but it is the wrong URL and results in an error.  I spend some time in the forums, particularly on StackOverflow, and find the sample code is out of sync with the latest packages available from NuGet, so I back everything out and use the binaries that ship with the SDK.  From the URL built up by the sample, I get a "file not found" error back from the server, so I go LOOK UP the correct URL in Dynamics-->Project Services Automation-->Settings-->Customizations-->Developer Resources, and here is the punch line: the URL is nothing fancy like the one the sample code is building up; it is simply https://disco.crm.dynamics.com/XRMServices/2011/Discovery.svc.

I look at these 1,498 lines of code and think "forget it; it cannot possibly be that difficult to connect to Dynamics through my Office 365 account."  There are too many if/then/switch constructs depending on whether you are on-prem, using a Live account (???), using federated AD, or using Office 365.

Then What Happened?!

I look at a tool that ships with the SDK called the Plugin Registration Tool.  To the left is the login screen.  That's more like it.

I mentioned earlier that there are many ways to connect to Dynamics. I like OAuth, as it is usable in a web app, a Windows app, or on a mobile device. I start focusing on the parts of the various samples that deal with OAuth. I strip away the saving of configurations and the server connection constructor that is SUPPOSED to take an Organization name to work but does not (another hot topic on the forums), and do a little decompiling using dotPeek to see how it works under all the layers of "helpful" sample code.

I get the feeling the authors are embarrassed to ship something simple--as if doing just what is needed doesn't show off how smart they are. I think the ability to simplify or distill something potentially complex down to its simplest form shows both intelligence and dedication to your audience.

I worked for about twenty minutes pulling out the pieces needed to connect, authenticate, and retrieve all of the groups I belonged to in our Dynamics implementation, including my newly created developer instance.  Here is a screenshot of the resulting widget I created; it shows the groups, and upon selection, the property grid is populated with basic information about the group:

The code to accomplish this is so unbelievably simple I will provide the complete listing here.  First, I created a class called OAuthDynamicsOnlineConnector:


public class OAuthDynamicsOnlineConnector
{
    private readonly string _DiscoveryUrl;
 
    public OAuthDynamicsOnlineConnector(string discoveryUrl)
    {
        if (string.IsNullOrEmpty(discoveryUrl))
        {
            throw new ArgumentException(FoundationResources.ERROR_ARG_NULL_OR_EMPTY, nameof(discoveryUrl));
        }
 
        _DiscoveryUrl = discoveryUrl;
 
        Organizations = new List<OrganizationDetail>();
    }
 
    public List<OrganizationDetail> Organizations { get; set; }
 
    public void Connect(string userName, string password)
    {
        if (string.IsNullOrEmpty(userName))
        {
            throw new ArgumentException(FoundationResources.ERROR_ARG_NULL_OR_EMPTY, nameof(userName));
        }
        if (string.IsNullOrEmpty(password))
        {
            throw new ArgumentException(FoundationResources.ERROR_ARG_NULL_OR_EMPTY, nameof(password));
        }
 
        // Create Microsoft.Xrm.Sdk.Client.AuthenticationCredentials from the userName and password
        AuthenticationCredentials authenticationCredentials = CreateAuthenticationCredentials(userName, password);
 
        DiscoveryServiceProxy discoveryProxy = null;
 
        try
        {
            // Use the Microsoft.Xrm.Sdk.Client.ServiceConfigurationFactory to create the 
            // Microsoft.Xrm.Sdk.Client.IServiceManagement<IDiscoveryService> object from our discovery url
            IServiceManagement<IDiscoveryService> discoveryService = 
                    ServiceConfigurationFactory.CreateManagement<IDiscoveryService>(new Uri(_DiscoveryUrl));
 
            // Authenticate, which will populate the SecurityTokenResource object with a valid SecurityTokenResponse 
            // or it will throw a SecurityAccessDenied exception
            authenticationCredentials = discoveryService.Authenticate(authenticationCredentials);
 
            // Create the discovery proxy, which is combining the discoveryService object with the security metadata
            // to give you an authenticated, callable proxy for issuing calls to Dynamics
            discoveryProxy = new DiscoveryServiceProxy(discoveryService, authenticationCredentials.SecurityTokenResponse);
 
            // From here on down it is self-explanatory, resulting in a collection of OrganizationDetail objects
            RetrieveOrganizationsRequest orgRequest = new RetrieveOrganizationsRequest();
            RetrieveOrganizationsResponse response = (RetrieveOrganizationsResponse)discoveryProxy.Execute(orgRequest);
 
            OrganizationDetailCollection organizations = response.Details;
 
            if (organizations == null)
            {
                return;
            }
 
            foreach (OrganizationDetail organization in organizations)
            {
                Organizations.Add(organization);
            }
        }
        catch (SecurityAccessDeniedException accessDeniedException)
        {
            Debug.WriteLine(accessDeniedException.Message);
            throw;
        }
        catch (Exception exception)
        {
            Debug.WriteLine(exception.GetType().FullName);
            Debug.WriteLine(exception.Message);
            throw;
        }
        finally
        {
            discoveryProxy?.Dispose();
        }
    }
 
    private static AuthenticationCredentials CreateAuthenticationCredentials(string userName, string password)
    {
        AuthenticationCredentials authenticationCredentials = new AuthenticationCredentials();
        ClientCredentials clientCredentials = new ClientCredentials();
 
        clientCredentials.UserName.UserName = userName;
        clientCredentials.UserName.Password = password;
 
        authenticationCredentials.ClientCredentials = clientCredentials;
 
        return authenticationCredentials;
    }
} 

The code behind the form is equally simple:

public partial class DynamicsWorkBenchForm : Form
{
    private readonly string _DiscoveryUrl;
    private readonly string _Password;
    private readonly string _UserName;
 
    public DynamicsWorkBenchForm()
    {
        InitializeComponent();
    }
 
    public DynamicsWorkBenchForm(string userName, string password, string discoveryUrl)
    {
        if (string.IsNullOrEmpty(userName))
        {
            throw new ArgumentException(FoundationResources.ERROR_ARG_NULL_OR_EMPTY, nameof(userName));
        }
        if (string.IsNullOrEmpty(password))
        {
            throw new ArgumentException(FoundationResources.ERROR_ARG_NULL_OR_EMPTY, nameof(password));
        }
 
        _UserName = userName;
        _Password = password;
        _DiscoveryUrl = discoveryUrl;
        InitializeComponent();
    }
 
    private void DynamicsWorkBenchForm_Shown(object sender, EventArgs e)
    {
        OAuthDynamicsOnlineConnector dynamicsOnlineConnector = new OAuthDynamicsOnlineConnector(_DiscoveryUrl);
 
        ((Action)(() => dynamicsOnlineConnector.Connect(_UserName, _Password))).TryCatchMethod(this);
 
        SortableBindingList<OrganizationDetail> organizationDetails = 
            ObjectExtensions.ToSortableBindingList(dynamicsOnlineConnector.Organizations);
 
        organizationsListBox.DisplayMember = @"FriendlyName";
        organizationsListBox.DataSource = organizationDetails;
    }
 
    public void OrganizationsListBox_SelectedIndexChanged(object sender, EventArgs e)
    {
        propertyGrid.SelectedObject = organizationsListBox.SelectedItem;
    }
}

Wrap-Up

I will continue writing about Dynamics (setup, cryptography, extension methods, and anything else that strikes my fancy) over time. I am going to be working quite a bit in this area, and I find the documentation lacking and, as a colleague or two of mine said, "underwhelming."  Please do leave a comment if you would like to see a particular topic covered, and I will see what I can do to help.

I was going to explain in some detail what a SecurityTokenResponse is, how it works, and where the OAuth specification provides detailed information about it--it is central to how the mechanism works, plus... it involves cryptography, specifically 192-bit Triple DES!


Colby-Tait

Tuesday, July 04, 2017

Happy 4th of July 2017

Colby & Parker

Quality Nerd Time Together

My son and I collaborated on this video--the first time for both of us combining music, video slices, and images into a single video.

We used Audacity, GIMP, LAME, and Microsoft Encoder.

Enjoy!

And Here is our Masterpiece...


Everyone have a safe and happy holiday!


Colby-Tait

Saturday, July 01, 2017

Testing 1-2.3

Microphone

Trying Various Microphones

The headset was reported to be too faint. It doesn't have gain control, so I pulled out my Telex M-560 from the device archive. It has all sorts of bells and whistles; it was the first microphone sanctioned for voice recognition without a headset or sound card.

Welcome to Star Trek

It certainly doesn't look like any microphone I have seen since I bought it over a decade ago: Telex M-560

I think this will be perfect for videos, BUT you can still hear background noise from a server sitting about three feet to my left.  I need to move it so I can turn the gain up a bit more.  Until then, give the volume a tick up or two when you are viewing videos from the channel.

Test Video

Who knows, I may have to borrow Parker's fancy microphone, but I think the only significant problem is the server fans.  I never noticed the sound anymore until it was pointed out.  Thanks for the feedback.  Start using the comments so you can see if I am addressing any problems!  I get too much email as it is, but I am not complaining.


Colby-Tait

Thursday, June 29, 2017

ShibumiWare Common Controls: Progress Bar

Progress Bar

Introduction

I have been working on a set of controls for years. At a previous company, I had access to DevExpress' suite of controls and loved working with them, but for a hobbyist, they are simply too expensive. Work on the controls is organic. Each time I run into a user interface problem and the out-of-the-box .NET controls don't do the job, I look for an open source project first. Unfortunately, I have found many open source projects lacking. Also, I prefer to write to the control through an API versus using design-time features. I also love composition; some of my controls are easily combined to create other controls--also something I found difficult to do with open source projects. Not that I haven't used any open source. I have an HTML Editor built by a former Microsoft Consulting Services friend who made a decent control; it just needs modernizing. That is on my list, and since it began as open source, I will release it accordingly.

Control Library

Tonight I made my first video for YouTube. My son is getting into it. I bought him a new computer, a high-gain condenser microphone with a studio stand, and gave him a decent set of headphones. I decided I better catch up to him so I can help him out in this new endeavor. Of course, our choice of topics is unique.

The first control I cover is the ShibumiWare Common Control Progress Bar. For those who have been around Windows development for a while, you are likely all too familiar with the built-in progress bar's shortcomings. It was quite an event when Microsoft added continuous and marquee rendering in addition to block rendering. The first thing I wanted that Windows' version couldn't give me was control over the border: I wanted to change the border's color and thickness, and to have both a normal and a selected border.

I will show the general architecture of the control library in a later post, but for now, understand that each control inherits from SWBaseControl, which provides many features common to all controls, including borders. The next thing was getting rid of the green color. It is a beautiful color, but in many of my apps, it sticks out and isn't consistent with the look-and-feel.




Part I of Breaking Down the Code

I have to admit that I had a long day and I am actually tired, even though it is only 1:45 AM.   I work strange hours and I do my hobby projects on an even stranger schedule. I am going to give you just a taste, and actually the crux, of solving the paint problem.  I wanted a different background than white.  I wanted colors other than green.  I wanted borders, as mentioned earlier (and not covered in this post), and after a while, I figured out how to do gradients. 

The Progress Bar control inherits from the Windows progress bar.  The following code is critical to even starting to gain control over the paint operation:

public void SetStyle(ProgressBarStyle style)
{
    if (style != ProgressBarStyle.Marquee)
    {
        SetStyle(ControlStyles.UserPaint, true);
    }
}

This has the effect of telling the framework, "hey, painting is done by this control, so don't do anything."  Overriding OnPaint is where the bulk of the work is done:

protected override void OnPaint(PaintEventArgs e)
{
    if (_PaintNormal)
    {
        return;
    }
 
    if (OverrideBackground)
    {
        PrepareClientRectangle(BackColor);
    }
 
    if (Value == 0 || Maximum == 0)
    {
        return;
    }
 
    using (Image image = new Bitmap(Width, Height))
    {
        using (Graphics graphics = Graphics.FromImage(image))
        {
            Rectangle rect = new Rectangle(0, 0, Width, Height);
 
            if (ProgressBarRenderer.IsSupported)
            {
                ProgressBarRenderer.DrawHorizontalBar(graphics, rect);
            }
 
            rect.Width = (int)Math.Ceiling(rect.Width * ((double)Value / Maximum));
 
            if (rect.Width == 0)
            {
                rect.Width = 1;
            }
 
            using (LinearGradientBrush brush = new LinearGradientBrush(rect, BackColor, ProgressMarkerColor, GradientMode))
            {
                graphics.FillRectangle(brush, 0, 0, rect.Width, rect.Height);
            }
 
            e.Graphics.DrawImage(image, rect, rect, GraphicsUnit.Pixel);
        }
    }
}

I won't explain in detail tonight, but I can easily disable the advanced painting and revert to the normal behavior.  If I want something besides the gray background, I set OverrideBackground = true:

public void PrepareClientRectangle(Color backColor)
{
    if (backColor == default(Color))
    {
        backColor = BackColor;
    }
 
    using (Graphics graphics = CreateGraphics())
    {
        using (SolidBrush brush = new SolidBrush(backColor))
        {
            graphics.FillRectangle(brush, Location.X, Location.Y, Width, Height);
        }
    }
}

The rest is self-explanatory, but I will say that there are some rectangles that don't need to be there, and I call an overload for filling the rectangle when a much simpler one would do; those are items left over from some experiments I did, and I left the code alone to remind me how to pick up that original train of thought if I need to.
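To make the moving parts concrete, here is a hypothetical consumer-side sketch. The class name `ShibumiProgressBar` is an assumption (the post never names the control), while `SetStyle`, `OverrideBackground`, and `ProgressMarkerColor` are the members discussed in this post:

```csharp
// Hypothetical usage sketch -- the control's class name is assumed.
ShibumiProgressBar progressBar = new ShibumiProgressBar();
progressBar.SetStyle(ProgressBarStyle.Blocks);    // anything but Marquee enables UserPaint
progressBar.OverrideBackground = true;            // paint our own background color
progressBar.BackColor = Color.Black;
progressBar.ProgressMarkerColor = Color.DodgerBlue;
progressBar.Maximum = 100;
progressBar.Value = 42;                           // fill width becomes ceil(Width * 42 / 100)
```

The gradient then runs from BackColor to ProgressMarkerColor across the filled portion, per the LinearGradientBrush in OnPaint.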


Wrap-Up

As you can see from the folder structure image above, I have quite a few controls I could cover. I chose the progress bar because I was working with it tonight on another hobby project and figured out something new I wanted, so it was at the forefront of my mind. The reason I did this post, mainly, was to accompany my first YouTube video, so when I see Parker next, I can show him that I am following his lead. I think he will like that.


Colby-Tait

Thursday, June 22, 2017

More on Setup & Updates: Programmatically Accessible .NET Version Information + Some History

Version
Related Posts: Automatic Updates - Tips & Tricks

Introduction

Off and on for years, I have been involved in developing setup and servicing technology for various products. One of my first projects, when I was a wee 20-year-old working for a startup in Redmond, WA, required an early version of "Office Update."

The Microsoft main campus was just a few blocks from our first "office," which doubled as the home of the company's owner and his wife and as our work area. Later, we moved into real office space in downtown Redmond, and we were the bee's knees; but that's another story. It was a tremendous time of learning the trade from the inside out. I was blessed (Glenn Minch, Garin Pangburn, Ken Inglis, and Adrian Jenkins, I am in your debt).

The company, Critical Path Technical Services, Inc., landed a contract with the Office Product Group (Building 18 for you 'Softies' out there) at Microsoft to develop an "Office Solution Automatic Update Service."  Keep in mind this was circa 1995 and, if you have forgotten how unsophisticated the Internet was, take a walk down memory lane with the image below of Microsoft's main site around that time.

It wasn't much.  That year I hand-coded CPTS' first website using Notepad.exe, FTP, and a book I had bought from Barnes & Noble about HTML.

The Call

Microsoft's Website Circa 1995 Courtesy of Microsoft Corporation
 

There is nothing like getting a call from the Mothership offering money to accelerate the pace of learning. Essentially, the idea was this: Microsoft was pushing Office automation heavily in those days, particularly the use of Visual Basic for Applications, which premiered in 1995. Microsoft wanted to combine the Internet and Office so that Office ISVs (Independent Software Vendors) would have direct access to the burgeoning body of knowledge regarding VBA and Office solution development.

The problem was MSDN shipped on CD (no, not DVD) every quarter and there was very little by way of online content to help the ISV except for a few boards, chat channels, and a tiny number of dedicated websites.

To make a very long story short, CPTS was contracted to build an Internet-based Solutions Delivery Platform--or, as we would think of it today, Office Update for developers. The idea was a grand vision. Microsoft hosted an FTP site, and CPTS was contracted to write an automatic download system. The system would look for updates and new solution starters, provide a web-based menu of technical assets the developer could choose to download, and take care of downloading and installing the solution starter (think template with a code-behind). Not unlike the first MSDN downloader, an ActiveX control would create a modal Windows dialog that did the actual download to the local disk. I hand-crafted the download engine in C++ while another developer worked on the UI portion. The download engine was straight sockets and WinINet programming. Many of you may not know that WinINet.dll is the grandfather of all Internet-related DLLs in Windows. A simple search shows browsers, Google Drive, Excel, and 40 other processes running on my machine that take a dependency on WinINet.dll. Talk about first-generation.

WinINet.Dll

That was my introduction to servicing and setup because each download was completely self-contained, installed itself into Office through the VBE object, and provided an updating mechanism to detect and install or update VBA solutions in the various Office products. It was some kind of strange, let me say that--but I thoroughly enjoyed writing the download engine (C and C++ remain my favorite languages) using an API that is vastly closer to the wire than what we .NET developers are accustomed to today. I can also say that, before anybody else at Microsoft had an automatic update service, the Office Product Group and CPTS built one.

A Fascination Developed

After that experience, I developed a fascination with software that upgraded itself. General setup technology, too, became a favorite topic. For better or worse (probably worse), I wrote the setup technology for Project Office v3.0 at Pacific Edge and EVMS forProject v1.0 - v2.0 at forProject Technologies. I always felt that out-of-the-box setup technologies were geared toward simple end-user products, not enterprise products. I suppose this impression grew out of my years at Microsoft, where each product, ranging from Microsoft SQL Server to SharePoint, had an in-house setup system with very little, if any, shared concepts or code. These days, with WiX and others, there is a better platform, but if you know what you are doing, you can build a much better setup experience for the user by combining off-the-shelf tooling with custom setup work.

Usually, when setup is mentioned in product planning, nobody raises their hand to volunteer to take on that part of the product, but there I was, gleefully, hand raised way up, ready to take on the challenge. I always took it on, and I am not sure why. There must be some deep psychological reason in me that whispers "set it up."

You see, the other thing about setup is that it's the first thing the customer experiences with your product. If your setup experience is horrible, there is a pretty good chance the product also has issues. Notice the various customer experience programs and other telemetry Microsoft and other software companies use to gather end-user experience data--you can see how important it is to get it right.

Enough Story Telling

I have enough content regarding my adventures at Microsoft as the youngest product manager in the Office Product Group, plus the start-ups I was involved with after, to write post after post. This post is really about something I was working on this past weekend on one of my geeky hobby projects. I mentioned an earlier post, "Automatic Updates - Tips & Tricks". That post didn't come out of the blue. Aside from the setup work I did at previous companies, I have always kept one side-project alive, in one form or another, regarding setup and servicing. For a while it was QuickPatch, which is a fairly complete piece of work, but I am dissatisfied with it (plus the user interface really bothers me).

Nowadays, under the banner of my new hobby company ShibumiWare, I have developed a system for building, packaging, deploying, setting up, and servicing my hobby products. In fact, I have become so engrossed in the effort that all of my hobby product development has stalled out while I finish this work. Unlike QuickPatch, I don't have a catchy name for it, but it's the best work I have done in this area to date. Everything goes very slowly at ShibumiWare because I love my day job, have kids, and continually have this medical thing to deal with, but I plow on.

.NET Version Information

Version detection is one of the first things done: check whether your product's prerequisites are present and provide options to download or otherwise obtain the necessary dependencies before installing the product. I mentioned in the Tips & Tricks post that a baseline is selected for your products to run.

In my case, I chose .NET 2.0 as the baseline version for the installer and update system. The products and utilities are individually written in later versions of .NET, but to even get a .NET-based installer to run, you have to pick a baseline. .NET 2.0 is a good baseline for several reasons: it is broadly distributed for down-version operating systems like Windows XP; it ships with later operating systems like Windows 7, 8, and 10; and even if it didn't come from Microsoft, there stands a good chance that some other product installed on the customer's computer required .NET 2.0, so it's there already. Regardless, my setup flow begins with a C++ "bootstrapper" that doesn't require .NET whatsoever. Its purpose is to determine whether .NET 2.0 is installed. If it isn't, the bootstrapper provides a link to installation guidance for the user and halts the installation process. Bootstrapping provides an excellent user experience.
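My real bootstrapper is native C++, but the check it performs can be sketched in a few lines, shown here in C# for readability. The registry key below is the one .NET 2.0 setup writes (a DWORD value `Install = 1`); the class name `DotNet2Detector` is purely hypothetical:

```csharp
using Microsoft.Win32;

// Hypothetical sketch of the bootstrapper's detection logic (the real one
// is native C++). .NET 2.0 setup records itself under this registry key
// with Install = 1 (DWORD).
public static class DotNet2Detector
{
    private const string Net20Key =
        @"SOFTWARE\Microsoft\NET Framework Setup\NDP\v2.0.50727";

    public static bool IsDotNet20Installed()
    {
        using (RegistryKey key = Registry.LocalMachine.OpenSubKey(Net20Key))
        {
            object install = (key == null) ? null : key.GetValue("Install");
            return install is int && (int)install == 1;
        }
    }
}
```

If the check fails, the bootstrapper's only job is to point the user at installation guidance and stop; nothing else in the installer runs.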

Beyond setup, as mentioned, my hobby products and tools require different versions of .NET. Mostly I stick to version 4.5.1 these days, but in my setup builder system, I provide a mechanism for choosing prerequisites, including .NET. I was working on this part of the system this weekend and found that the 2.0 base class libraries do not contain a complete list of the versions of .NET. Of course not--this is 2.0, so it has no knowledge of future versions of .NET. I started thinking about a class that would have all of the versions (subject to new releases, which I would have to add--that's no problem), plus information about what's new in each version; and finally, a higher-level source of information for each major grouping of .NET versions. 1.0 to 3.5 comprise one group, while 4.0 and beyond comprise another group--the distinction is that the version of the CLR changed between groups.

Custom Attributes

First things first: custom attributes. For those of you who think you aren't familiar with attributes, you probably are--you just don't know it. Here is a pretty common example of importing an external function from a specific DLL:

[DllImport("kernel32.dll", EntryPoint = "TerminateProcess")]
[return: MarshalAs(UnmanagedType.Bool)]
public static extern bool TerminateProcess([In] IntPtr hProcess, uint uExitCode);

DllImport is just one of the attributes in use here and is defined exactly the way a custom attribute is defined. I have included a complete listing of DllImport's implementation here. I think of attributes as declarative markup for code, like XML tags. Attributes have properties that are accessed by the runtime or your own custom framework to direct how the code is executed or prepared for, or to enforce run-time or compile-time directives, among other things.

In short, custom attributes are very powerful, as you can see from the DllImport listing. So, as I started thinking about a class that would encapsulate .NET version information, the first thing that popped into my mind was the use of attributes in two ways: one at the class level to provide version grouping information, and a second at the property level to provide specific information about each .NET version. In this way, I could read the custom attributes and provide the user with contextual information about the various versions. This ability would come in handy when defining a user interface whereby a user selects a specific .NET version; the user could be presented with additional information about the version directly in the user interface.

The Version Description Attribute

Attributes inherit directly or indirectly from System.Attribute, which in turn implements the _Attribute interface. When you define an attribute, it is important to apply an attribute to the attribute itself: the AttributeUsage attribute:

[AttributeUsage(AttributeTargets.All, Inherited = false, AllowMultiple = true)]

Attributes receive their property information through constructors, so everything between the open parenthesis and the closing parenthesis above can be thought of as a constructor and its parameters. Just as with regular constructors, you can provide overloads and optional parameters. Let's think about the AttributeUsage attribute above. The first parameter is an enum value that defines the "scope" of the attribute. In other words, it indicates where the attribute is valid to apply, such as a class, property, struct, assembly, and so on. There are sixteen values to choose from. The next parameter, Inherited = false, indicates that classes derived from classes carrying this attribute do not inherit the attribute. The default value is true because you would expect a "base" anything to be overridden explicitly by the derived class rather than either forcing it to be present or burying its presence in the inheritance. The third parameter is important for our scenario: AllowMultiple = true. The default is false because typically an attribute is self-contained, and applying it twice to a member seems strange--except when the attribute is purely an informational attribute, which we will see an example of soon.

Here is the implementation of the VersionDescriptionAttribute:

[AttributeUsage(AttributeTargets.Property)]
public class VersionDescriptionAttribute : Attribute
{
    public VersionDescriptionAttribute(string description)
    {
        Description = description;
    }
 
    public string Description { get; set; }
}

The first thing you notice is the application of the AttributeUsage attribute, which allows this attribute to be applied only to properties; it is inherited by derived classes (the default value), and it allows only one application per property (again, the default value). Applied, the attribute is used like this:

[VersionDescription(DotNetVersionDescriptions._1_0_0)]
public static Version _1_0_0 { get; set; }

A rule in developing attributes is that all parameters must be a constant value, a typeof expression, or an array creation expression. In other words, you cannot have computed values as inbound data on the parameters. If you could, I would opt to put the version description text in a resource file, but that resolves to a call to ResourceManager.GetObject, which is a computation. The specific rules are here.

The first two components of the rule are easily understood. The third requires a little explanation. It simply means that a parameter can take the form of type[] {x, y, z} as long as x, y, and z are constants. The following example is acceptable: string[] {"Addition", "Subtraction", "Division"}. There are means by which "dynamic" attributes are achievable. I leave this topic to the reader to noodle over.
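One such approach, sketched below under assumed names: the attribute stores only a constant key, and the code that reads the attribute resolves the key against a runtime source (a resource file, a dictionary, a database) at the moment it is read. Both `DescriptionKeyAttribute` and `DescriptionResolver` are hypothetical names for illustration:

```csharp
using System;
using System.Collections.Generic;

// Hypothetical sketch: the attribute argument stays a compile-time constant,
// and the "dynamic" part happens when the attribute is read at runtime.
[AttributeUsage(AttributeTargets.Property)]
public class DescriptionKeyAttribute : Attribute
{
    public DescriptionKeyAttribute(string key) { Key = key; }
    public string Key { get; set; }
}

public static class DescriptionResolver
{
    // Stand-in for ResourceManager.GetObject or any other computed source.
    private static readonly Dictionary<string, string> Store =
        new Dictionary<string, string>
        {
            { "NET_1_0", "First version of the .NET Framework" }
        };

    // Resolve the constant key to its runtime value; fall back to the key.
    public static string Resolve(string key)
    {
        string value;
        return Store.TryGetValue(key, out value) ? value : key;
    }
}
```

The reflection code that walks the properties would simply call Resolve on each attribute's Key instead of using the attribute text directly.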

A final note for those not steeped in C# concerns variable naming constraints. You probably noticed that version 1.0.0 is _1_0_0 because the following rules apply:

  1. The first character of a variable name must be either a letter, an underscore character (_), or the at symbol (@)
  2. Subsequent characters may be letters, underscore characters, or numbers
  3. The use of C# keywords as variable names is prohibited, although a variable name may contain a keyword, subject to rules #1 and #2.
Variables and Expressions - https://msdn.microsoft.com/en-us/library/gg615485(v=vs.88).aspx
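A quick illustration of the rules, including the `@` escape from rule #1, which lets a keyword serve as an identifier (the demo class and method names are purely for demonstration):

```csharp
using System;

static class NamingRulesDemo  // hypothetical demo class
{
    public static string Demo()
    {
        // int 1_0_0;          // illegal: identifiers cannot start with a digit
        Version _1_0_0 = new Version(1, 0, 0, 0); // legal: leading underscore (rule #1)
        int @int = 7;          // legal: '@' lets the keyword 'int' serve as a name
        int maxInt = @int + 1; // legal: a name may *contain* a keyword (rule #3)
        return _1_0_0 + " " + maxInt;
    }
}
```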

VersionAdditionalLinksAttribute

public static partial class DotNetVersions
{
    [AttributeUsage(AttributeTargets.Class, AllowMultiple = true)]
    public class VersionAdditionalLinksAttribute : Attribute
    {
        public VersionAdditionalLinksAttribute(string majorVersionGroup, string additionalInformationLink)
        {
            MajorVersionGroup = majorVersionGroup;
            AdditionalInformationLink = additionalInformationLink;
        }
 
        public string MajorVersionGroup { get; set; }
        public string AdditionalInformationLink { get; set; }
    }
}

After the detailed description of the first attribute, I think what's going on with this attribute is straightforward. One thing to take note of: the AttributeUsage attribute restricts this attribute to classes, and more than one may be applied to a class. As mentioned earlier, I want to provide detailed information about the two major groups of .NET versions: those that run on the original CLR and those that run on the second version of the CLR. Here is an application of the attribute:

[VersionAdditionalLinks(DotNetVersionDescriptions.CLR, DotNetVersionDescriptions.CLR_LINK)]
[VersionAdditionalLinks(DotNetVersionDescriptions.CLR2, DotNetVersionDescriptions.CLR2_LINK)]
public static partial class DotNetVersions

Putting It All Together

Let's start with the end result and walk backwards--you may figure out how it was done given what I have shown you thus far. The following is output from a unit test that exercises all of the major pieces of the solution: provide additional-information links for the two groups of .NET versions, provide a brief description of each version, and provide sane output from all of the _X_X_X variables for use in user interfaces (such as binding the version list to a combo box).

The Output


Major Release Collection = v1.1 - v3.x
Major Release Link = https://msdn.microsoft.com/library/ms171868(v=vs.90).aspx
Major Release Collection = v4.0 - v4.x
Major Release Link = https://docs.microsoft.com/en-us/dotnet/framework/whats-new/index

Version = 1.0.0.0
Additional Information =
- First version of the .NET Framework.
Version = 1.1.0.0
Additional Information =
- ASP.NET and ADO.NET updates
- Side-by-side execution
Version = 2.0.0.0
Additional Information =
- Generics
- ASP.NET additions
Version = 3.0.0.0
Additional Information =
- WPF, WCF, WF, CardSpace
Version = 3.5.0.0
Additional Information =
- AJAX-enabled websites
- LINQ
- Dynamic data
Version = 4.0.0.0
Additional Information =
- Expanded base class libraries
- Cross-platform development with Portable Class Library
- MEF, DLR, code contracts
Version = 4.5.0.0
Additional Information =
- Support for Windows Store apps
- WPF, WCF, WF, ASP.NET updates
Version = 4.5.1.0
Additional Information =
- Support for Windows Phone Store apps
- Automatic binding redirection
- Performance and debugging improvements
Version = 4.5.2.0
Additional Information =
- New APIs for transactional systems and ASP.NET
- System DPI resizing in Windows Forms controls
- Profiling improvements
- ETW and stress logging improvements
Version = 4.6.0.0
Additional Information =
- Compilation using .NET Native
- ASP.NET Core 5
- Event tracing improvements
- Support for page encodings
Version = 4.6.1.0
Additional Information =
- Support for X509 certificates containing ECDSA
- Always Encrypted support for hardware protected keys in ADO.NET
- Spell checking improvements in WPF
Version = 4.6.2.0
Additional Information =
- Cryptography enhancements, including support for X509 certificates containing FIPS 186-3 DSA, support for persisted-key symmetric encryption, SignedXml support for SHA-2 hashing, and increased clarity for inputs to Elliptic Curve Diffie-Hellman key derivation routines.
- For Windows Presentation Foundation (WPF) apps, soft keyboard support and per-monitor DPI.
- ClickOnce support for the TLS 1.1 and TLS 1.2 protocols.
- Support for converting Windows Forms and WPF apps to UWP apps.
Version = 4.7.0.0
Additional Information =
- Support for the level of TLS support provided by the operating system.
- Ability to configure default message security settings for TLS1.1 or TLS1.2.
- Improved reliability of the DataContractJsonSerializer.
- Improved reliability of serialization and deserialization with WCF applications.
- Ability to extend the ASP.NET object cache.
- Support for a touch/stylus stack based on WM_POINTER Windows messages instead of the Windows Ink Services Platform (WISP) for WPF applications.
- Use of Windows' Print Document Package API for printing in WPF applications.
- Enhanced high DPI and multi-monitor support for Windows Forms applications running on Windows 10 Creators Update.

Version = 1.0.0.0
Version = 1.1.0.0
Version = 2.0.0.0
Version = 3.0.0.0
Version = 3.5.0.0
Version = 4.0.0.0
Version = 4.5.0.0
Version = 4.5.1.0
Version = 4.5.2.0
Version = 4.6.0.0
Version = 4.6.1.0
Version = 4.6.2.0
Version = 4.7.0.0

The Unit Test


[TestMethod]
public void VersionsClassTests()
{
    DotNetVersions.VersionInfo versionInfo = new DotNetVersions.VersionInfo();
 
    foreach (KeyValuePair<string, string> kvp in versionInfo.MajorReleaseDictionary)
    {
        Debug.WriteLine("Major Release Collection = " +  kvp.Key);
        Debug.WriteLine("Major Release Link = " + kvp.Value);
    }
 
    Debug.WriteLine(Environment.NewLine);
 
    foreach (KeyValuePair<Version, string> kvp in versionInfo.VersionDictionary)
    {
        Debug.WriteLine("Version = " + kvp.Key);
        Debug.WriteLine("Additional Information = " + kvp.Value);
        Debug.WriteLine(Environment.NewLine);
    }
 
    Debug.WriteLine(Environment.NewLine);
 
    foreach (Version version in versionInfo.Versions)
    {
        Debug.WriteLine("Version = " + version);
    }
}

The Implementation


[VersionAdditionalLinks(DotNetVersionDescriptions.CLR, DotNetVersionDescriptions.CLR_LINK)]
[VersionAdditionalLinks(DotNetVersionDescriptions.CLR2, DotNetVersionDescriptions.CLR2_LINK)]
public static partial class DotNetVersions
{
static DotNetVersions()
{
    _4_7_0 = new Version(4, 7, 0, 0);
    _4_6_2 = new Version(4, 6, 2, 0);
    _4_6_1 = new Version(4, 6, 1, 0);
    _4_6_0 = new Version(4, 6, 0, 0);
    _4_5_2 = new Version(4, 5, 2, 0);
    _4_5_1 = new Version(4, 5, 1, 0);
    _4_5_0 = new Version(4, 5, 0, 0);
    _4_0_0 = new Version(4, 0, 0, 0);
    _3_5_0 = new Version(3, 5, 0, 0);
    _3_0_0 = new Version(3, 0, 0, 0);
    _2_0_0 = new Version(2, 0, 0, 0);
    _1_1_0 = new Version(1, 1, 0, 0);
    _1_0_0 = new Version(1, 0, 0, 0);
}
 
[VersionDescription(DotNetVersionDescriptions._1_0_0)]
public static Version _1_0_0 { get; set; }

[VersionDescription(DotNetVersionDescriptions._1_1_0)]
public static Version _1_1_0 { get; set; }

[VersionDescription(DotNetVersionDescriptions._2_0_0)]
public static Version _2_0_0 { get; set; }

[VersionDescription(DotNetVersionDescriptions._3_0_0)]
public static Version _3_0_0 { get; set; }

[VersionDescription(DotNetVersionDescriptions._3_5_0)]
public static Version _3_5_0 { get; set; }

[VersionDescription(DotNetVersionDescriptions._4_0_0)]
public static Version _4_0_0 { get; set; }

[VersionDescription(DotNetVersionDescriptions._4_5_0)]
public static Version _4_5_0 { get; set; }

[VersionDescription(DotNetVersionDescriptions._4_5_1)]
public static Version _4_5_1 { get; set; }

[VersionDescription(DotNetVersionDescriptions._4_5_2)]
public static Version _4_5_2 { get; set; }

[VersionDescription(DotNetVersionDescriptions._4_6_0)]
public static Version _4_6_0 { get; set; }

[VersionDescription(DotNetVersionDescriptions._4_6_1)]
public static Version _4_6_1 { get; set; }

[VersionDescription(DotNetVersionDescriptions._4_6_2)]
public static Version _4_6_2 { get; set; }

[VersionDescription(DotNetVersionDescriptions._4_7_0)]
public static Version _4_7_0 { get; set; }
 
public class VersionInfo
{
    public VersionInfo()
    {
        MajorReleaseDictionary = new Dictionary<string, string>();
        VersionDictionary = new Dictionary<Version, string>();
        Versions = new List<Version>();
 
        Type parentType = GetType().DeclaringType;
 
        Debug.Assert(parentType != null,
                        DotNetVersionDescriptionErrors.ASSERT_PARENT_TYPE_NOT_NULL);
 
        List<VersionAdditionalLinksAttribute> linksAttributes = parentType.GetCustomAttributes()?
                                                                            .OfType<VersionAdditionalLinksAttribute>()
                                                                            .ToList();
 
        if (linksAttributes == null)
        {
            throw new InvalidOperationException(DotNetVersionDescriptionErrors.ERROR_MALFORMED_VERSION_INFO);
        }
 
        foreach (VersionAdditionalLinksAttribute linksAttribute in linksAttributes)
        {
            MajorReleaseDictionary.Add(linksAttribute.MajorVersionGroup, linksAttribute.AdditionalInformationLink);
        }
 
        PropertyInfo[] properties = parentType.GetProperties();
 
        foreach (PropertyInfo propertyInfo in properties)
        {
            object versionInfoObject = propertyInfo.GetValue(null, null);
 
            if (versionInfoObject == null)
            {
                continue;
            }
 
            Version version = new Version(versionInfoObject.ToString());
            Versions.Add(version);
 
            VersionDescriptionAttribute versionDescriptionAttribute =
                (VersionDescriptionAttribute) propertyInfo.GetCustomAttributes(typeof(VersionDescriptionAttribute)).FirstOrDefault();
 
            if (versionDescriptionAttribute == null)
            {
                throw new InvalidOperationException(DotNetVersionDescriptionErrors.ERROR_MALFORMED_VERSION_INFO);
            }
 
            VersionDictionary.Add(version, versionDescriptionAttribute.Description);
        }
    }
 
    public Dictionary<string, string> MajorReleaseDictionary { get; set; }
    public Dictionary<Version, string> VersionDictionary { get; set; }
    public List<Version> Versions { get; set; }
}

Wrap-Up

I hope you enjoyed a little meandering down memory lane with our look back at the Internet in 1995, plus some of my own personal experiences. I hope I taught a few of you something about custom attributes, and, in the end, I hope the DotNetVersions classes are useful to others.

A link to a ZIP archive with everything you need to add the components to your solution follows. You may use this as you see fit. If you have questions, leave a comment.


Colby-Tait

Disclaimer

Content on this site is provided "AS IS" with no warranties and confers no rights. Additionally, all content on this site is my own personal opinion and does not represent my employer's view in any way.