The Case For Modeling Part 1

The regular readers of this blog know that my database modeling book was cancelled when Microsoft pulled the rug out from under M and the repository. I did a lot of good writing about modeling in general, so I wanted to put some of it up here on the blog, since the cancellation is official now. This is the first part of Chapter 1.

While I don’t want to spend a lot of time on a history lesson, some background information is necessary to make sure we are all on the same page. If you, the reader, have spent a lot of time watching the world of modeling pass us all by, then you might be able to safely skip the rest of this chapter.

The rest of you: stick around. It's a story of wonder and woe. You'll laugh, you'll cry. It's better than Cats. You'll read it again and again.

What led to this

In all seriousness, the path that led to Microsoft’s latest software modeling effort is very interesting. It cuts to the core of business computer programming - how do I model my business process in code?

It is a problem that developers have faced since we were punching tape. In the end, it is all zeros and ones, so how do we make tools that make it easier to describe moving a box from point A to point B? That’s the heart of software modeling.

To break it down, let's start with a description of the actual problem we face - moving from the model to the code. Then we will inventory some of the tools that have been tried over the years - many of which have led to SQL Modeling. Finally, we'll look over CASE and the 800-pound gorilla: UML.

A description of the problem

Imagine you are the new CIO of a distribution company. Your company, let’s call it M Logistics, has no software. Yes, that’s what I said. They have no software at all. They use this weird stuff made out of ground up trees. They call it paper. You call it inefficient.

You are going to need to get some software in there, and you are going to need to do it fast. There are no off-the-shelf solutions available (more on that later) so you are going to have to build it.

Get out the whiteboard. Most will admit that starting with a design is the best way. There are questions that need to be asked.

What goes into a distribution company, information-wise? The truck shows up with boxes. You unload the boxes and perhaps sort them in different ways. Then you put them in different trucks and send them off to a store.

“Fair enough,” you might think. “We will start with a truck containing boxes.” The drawing probably looks like Figure 1-1.

Figure 1-1: The whiteboard - truck contains boxes.

Great! Good start. Over the next few hours you map out the rest of the relationships between objects in your scope - or your domain model - and the whiteboard will be filled with goodness.

Oh, but wait. You can't ship the whiteboard to the central office, so you will probably need to make a document. A word processor will solve that problem for the text, and some diagramming software for the model, as in Figure 1-2.

Figure 1-2: The document - truck contains boxes.

Nice work! Now you can package it up and send it to Central Office. All you have to do now is wait for approval.

Moving to code

The day has come and approval from Central Office has arrived (along with your new business cards, sweet!) It is time to start coding. Fire up Visual Studio, reference the carefully reviewed document, and start sketching out the domain model like the following listing.

using System.Collections.Generic;

namespace Warehouse
{
    // A truck contains boxes, so the navigation property is a collection
    public class Truck
    {
        public List<Box> Boxes { get; set; }
    }

    public class Box
    {
    }
}

That's a good start. We can work on the rest later. Now, we all know that the C# classes aren't everything; we need a database to back them up. It is time to fire up the RDBMS of choice and model up a database. Depending on your tool, it should look something like Figure 1-3.

Figure 1-3: The database - trucks contain boxes.

We are certainly getting somewhere. We have a domain model. We have a database. We … need some way to hook them together. Alright, enter your ORM of choice, or maybe roll up your sleeves and roll your own.

Eventually the decision is to use Entity Framework. Fine - but you still end up with quite a few lines of code mapping the database to the domain model:

public partial class Truck : EntityObject
{
    public static Truck CreateTruck(global::System.Int32 id)
    {
        Truck truck = new Truck();
        truck.Id = id;
        return truck;
    }

    [EdmScalarPropertyAttribute(EntityKeyProperty=true, IsNullable=false)]
    [DataMemberAttribute()]
    public global::System.Int32 Id
    {
        get
        {
            return _Id;
        }
        set
        {
            if (_Id != value)
            {
                OnIdChanging(value);
                ReportPropertyChanging("Id");
                _Id = StructuralObject.SetValidValue(value);
                ReportPropertyChanged("Id");
                OnIdChanged();
            }
        }
    }
    private global::System.Int32 _Id;
    partial void OnIdChanging(global::System.Int32 value);
    partial void OnIdChanged();

    [EdmRelationshipNavigationPropertyAttribute("Model1", "TruckBox", "Box")]
    public EntityCollection<Box> Boxes
    {
        get
        {
            return ((IEntityWithRelationships)this).RelationshipManager
                .GetRelatedCollection<Box>("Model1.TruckBox", "Box");
        }
        set
        {
            if (value != null)
            {
                ((IEntityWithRelationships)this).RelationshipManager
                    .InitializeRelatedCollection<Box>("Model1.TruckBox", "Box", value);
            }
        }
    }
}

And now, finally, we can get around to actually writing the software. That is a lot of steps. Something is rotten in Denmark, methinks.

The gap is wider than you think

There are two things that are clearly wrong here.

First, you had to concern yourself with object/relational mapping. Aren’t we past that? Isn’t the concept well enough understood that we don’t need to put the code in books anymore?

Second, you touched the basic model five times. That’s far too many. Why can’t we use the model we start with as the domain model, the database and the ORM?

Mapping code to models

The first of these problems, object/relational mapping or O/RM, has been solved many times. We saw above how it is a problem that has to be solved every single time we build a data-driven piece of software.

The solutions range from open source products with a wide community following to expensive, rarely used commercial products that solve very specific issues.

The core problem is that gap

The reason there are so many different solutions is the range of that gap between the objects and the database. The example at M Logistics is a fairly straightforward one, but there are numerous more sophisticated examples.

  • In a many to many situation, should the ORM model the joining table? What if it has data associated with it? (A sketch of that case follows this list.)
  • What happens if there are several different databases?
  • Also, sometimes some of the data silos in an entity model aren’t databases at all.
  • We haven’t even talked about the pattern of the domain model. Repository? Lazy loading?
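To make the first bullet concrete, here is a minimal sketch of the joining-table case, reusing the Truck and Box classes from earlier. The TruckBox entity and its LoadedAt column are invented for illustration, not part of the M Logistics model:

using System;
using System.Collections.Generic;

public class Truck
{
    public int Id { get; set; }
    public List<TruckBox> Loads { get; set; }
}

public class Box
{
    public int Id { get; set; }
}

// The joining table carries its own data (when the box was loaded),
// so the mapper can no longer hide it behind a simple Truck.Boxes
// collection - it has to surface TruckBox as a first-class entity.
public class TruckBox
{
    public Truck Truck { get; set; }
    public Box Box { get; set; }
    public DateTime LoadedAt { get; set; }
}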

The more issues an O/RM handles, the more sophisticated the software doing the mapping must be. The more sophisticated the software is, the fewer people will want to use it. This leads to fragmentation in the software market. It also leads to confused users.

Various solutions

There are all kinds of object/relational mappers out there. Eric Nelson has a great inventory in his Entity Framework talk, which points out the following items:

  • NHibernate - a port of Hibernate, originally an open source Java tool. Well used and mature.
  • EntitySpaces - a commercial tool that moves a lot of SQL functionality into the C# code.
  • LLBLGenPro - another commercial tool that generates code for you to do the mapping.
  • DevForce - a newer O/RM mostly for the web space.
  • XPO - a DevExpress product, eXpress Persistent Objects. It is a complete abstraction layer.
  • Lightspeed - a Mindscape product that has a stunning visual interface.
  • Open Access - the Telerik entry that depends on convention to avoid reflection.

This list is just the Microsoft development space, too. There are lots of O/RMs for Java and other platforms; Hibernate, NHibernate's ancestor, is the best known of them.

Microsoft’s tries

One would imagine that, since all of these products were developed to fix a weakness in the Microsoft development space, Microsoft would eventually step in and fix it themselves. Well, they have tried. In fact, they have been trying all along, but let's just start with the .NET era.

Typed datasets were the first .NET implementation of an ORM back in 2002, and they are still pretty cool for small projects. They effectively represented an in-memory copy of the data that would be committed to the RDB on demand. They became unwieldy for large projects, though.
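As a minimal sketch of that pattern using a plain DataSet and data adapter (a typed dataset just layers generated, strongly typed wrappers on top of this; the table, column, and connection string here are hypothetical):

using System;
using System.Data;
using System.Data.SqlClient;

class TypedDataSetSketch
{
    static void Main()
    {
        var adapter = new SqlDataAdapter(
            "SELECT Id, Name FROM Trucks",
            "Data Source=.;Initial Catalog=MLogistics;Integrated Security=True");

        // Fill an in-memory copy of the data.
        var trucks = new DataTable("Trucks");
        adapter.Fill(trucks);

        // Work against the copy, disconnected from the server...
        foreach (DataRow row in trucks.Rows)
        {
            if (row["Name"] != DBNull.Value)
                row["Name"] = ((string)row["Name"]).Trim();
        }

        // ...then commit the changes back to the RDB on demand.
        // The command builder generates the UPDATE commands for us.
        new SqlCommandBuilder(adapter);
        adapter.Update(trucks);
    }
}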

ObjectSpaces was going to be the ‘next big thing’ as part of .NET 1.1 and 2.0 but never saw the light of day. I am not sure what happened there. Maybe we can get the scoop sometime.

The Microsoft Business Framework was, I think, part of what would become the Dynamics group, and grew out of the SharePoint models. I think. Anyway, it doesn't matter, because it didn't ship either.

Then there was WinFS. If you haven't heard about that one, I'll tell you over a beer sometime. Never shipped.

Finally, in 2007, lambda expressions were added to the languages and LINQ was introduced. LINQ did actually ship, and it is currently the best way to query a model of a database in your code for the whole Microsoft stack. This book isn't about LINQ, but if you don't know it you should learn it. It is a very powerful technology.

The nice thing about LINQ is that it is datasource agnostic – exactly what we needed. It can talk directly to a SQL Server database, yes, but it can also talk to another object model (like a normalized Data Access Layer) or an Entity Model built in EDMX.
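As a minimal sketch of that agnosticism, here is the same query shape running over in-memory objects - swap the list for a LINQ to SQL table or an Entity Framework ObjectSet and the query itself stays the same (the Box.Weight property is invented for the example):

using System;
using System.Collections.Generic;
using System.Linq;

class Box { public int Weight { get; set; } }

class Program
{
    static void Main()
    {
        // An in-memory data source; LINQ doesn't care where it came from.
        var boxes = new List<Box>
        {
            new Box { Weight = 10 },
            new Box { Weight = 55 },
            new Box { Weight = 23 }
        };

        var heavyBoxes = from b in boxes
                         where b.Weight > 20
                         orderby b.Weight descending
                         select b;

        foreach (var box in heavyBoxes)
            Console.WriteLine(box.Weight);
    }
}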

So that is where we leave the history lesson on O/RM. The next significant move by Microsoft is SQL Modeling, which is the topic of the rest of the book. First, though, let’s talk about the second topic – modeling the software.

Modeling with CASE and the gorilla in the room

A lot of people have tried to solve the modeling problem too. Remember how the model got redone five times? Most people agree that needs to get better.

The Unified Modeling Language

The 800-pound gorilla in the room is the UML. Three people who have forgotten more about software design than I will ever know - Grady Booch, James Rumbaugh, and Ivar Jacobson - got together one day. They left their egos at the door, took their separate design methods, and built this system for diagramming software.

This system is the Unified Modeling Language, or UML, and it is the most complete and comprehensive way available to describe how a piece of software works. I can't even do it justice here, so I will just direct you to Martin Fowler's great books on the subject - especially UML Distilled. I don't leave home without it.

But the UML has two problems. First, it isn’t code. Since it is so very comprehensive, it can be converted to code, but you have to completely fill out the model, which usually takes longer than just writing the code! (As an aside, I often write the domain model in the language of choice, and then convert it to the UML.)

The other problem with the UML is that it doesn't inherently have any knowledge of the database. The project still has the O/RM problem from the last section. Though the UML is expressive enough that a schema can be inferred from it, it is sometimes too expressive, and the resulting schema is only as good as the tools.

Many tools have attempted to solve this, from the simple Microsoft Visio to the mighty IBM Rational Rose. None of them do it well, because that is not really what the UML is designed for – it is not an O/RM. You are forced to write yet another model, like the venerable Object Role Model, or even just an Entity Model, and then manually do the mapping.

There is clearly room for a simpler yet better modeling pattern here. We may have created a system in the UML that has become too unwieldy to use. Yet just writing on whiteboards prevents remote team access, and is tough to distribute for review.

<whatever> Driven Design

A current trend is to use part of the architectural pattern as the design medium. For instance, with Test Driven Design, or TDD, you use unit tests to describe the expected outcomes of user stories, code until the tests pass, and then refactor.
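As a minimal sketch of that rhythm (NUnit-style, against the Truck and Box classes from earlier; the LoadBox method is hypothetical and doesn't exist yet - writing the failing test first is the point):

using NUnit.Framework;

[TestFixture]
public class TruckTests
{
    // Red: this test is written before LoadBox exists, and describes the
    // expected outcome of the "load a box onto a truck" user story.
    // Green: implement LoadBox until the test passes. Then refactor.
    [Test]
    public void LoadBox_AddsBoxToTheTruck()
    {
        var truck = new Truck();
        var box = new Box();

        truck.LoadBox(box);

        Assert.AreEqual(1, truck.Boxes.Count);
    }
}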

TDD is a great pattern, and I use it a lot. However, it is tough to review, and nigh impossible to generate formal documentation from. It almost has to be used in conjunction with something else.

Another alternative is Design by Contract, sometimes called Contract Driven Design or CDD. It works well in large teams, can be used with TDD, and can generate the UML if you need it to. It has a steep learning curve, though, and is often overkill for small projects.

There are a host of others; you probably already have your favorite. Few of them do the O/RM well, and most of them are too tough to use for 10,000 foot view types of modeling.

Especially for forms-over-data applications (which constitute well over half of the projects we all develop), there needs to be a new solution - one that is easy to learn, distribute, and review, and that translates well to code and the database. SQL Modeling is that solution.

Mustard Dip for Pretzels

1/2 Cup Dijon mustard

1/4 Cup Mayo

1/4 Cup Chevre (softened in the microwave)

2 green onions, chopped fine

1/2 tsp Celery Salt

 

Mix until smooth, eat with pretzels.

A week of neat security stuff


This week, I’ll be doing three neat security events, and you are invited!

Wednesday morning, I’ll be speaking at the Central Ohio ISSA about Windows Identity Foundation, OpenID and Claims Based Authentication. Details are here. This is the topic description:

“Escalation of privilege is based on a model of security that is driven by roles and groups for a given application. I am in the Administrator role, the Accounting group contains your username. What if instead you carried a token with a verifiable set of claims about your identity? One that is encrypted, requires no round trip to an authorization server, and can be coded against in a native API? Would that bring more security to our government and medical applications? Or is it just as full of holes as everything else? Join Bill in checking out Claims Based Security via Windows Identity Foundation, and see if it fixes problems or is the problem.”
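To give a flavor of what coding against claims in a native API looks like, here is a minimal sketch using WIF's object model (assuming the usual Microsoft.IdentityModel plumbing is configured; everything else is omitted):

using System;
using System.Threading;
using Microsoft.IdentityModel.Claims;

class ClaimsDemo
{
    static void DumpClaims()
    {
        // In a WIF-enabled application the current principal carries the
        // claims parsed from the incoming token - reading them requires
        // no round trip to an authorization server.
        var identity = Thread.CurrentPrincipal.Identity as IClaimsIdentity;
        if (identity == null)
            return;

        foreach (Claim claim in identity.Claims)
            Console.WriteLine("{0}: {1}", claim.ClaimType, claim.Value);
    }
}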

That evening (wshew!) I’ll be giving a presentation on high-security locks at the Columbus Locksport International meeting at the Columbus Idea Foundry.  You can sign up here. Please RSVP if you are coming, because we need to plan for a crowd if we have one.  I’ll be covering security pins, and the idea behind sidebar locks.

Then, Friday, I’ll be at B-Sides Cleveland giving the WIF talk again.  It’s at the House of Blues, and I’ll be talking at 10AM.  The conference is sold out, though.  Too bad - it sounds like an awesome lineup, and I am just floored to be among them. Freaking ReL1K is speaking – he built the Social Engineer’s Toolkit for crying out loud. I’m truly honored.  I am looking forward to this.

CodeMash v2.0.1.1

 

Another CodeMash is in the books, and all kinds of new stuff was in the offing for me this year.  But first, I would be remiss if I didn't thank Jason Gilmore, Brian Prince, and especially Jim Holmes (along with the rest of the board) for uncompromising management of simply the best conference on this topic. Period. Not the best for the money, not the best given the constraints of space - it is simply the best code-centric conference on the planet.

I owe a lot of people a lot of links and information on a lot of topics.

First and foremost, I was delighted to be asked to speak again, and was pleased to have Matthew Groves join me for a discussion on Monodroid.  We had 100 people join us for a look at how Monodroid came to be and what the future holds.

Then Matt took us for a tour of his excellent Stock Tracker application (shown left), converted from Windows Mobile.  There were a number of good points made all around, and generally a good time was had by all.

The Monodroid documentation contains nearly everything that you need to know to get programming.  The tutorials are the best starting point, and provide the templates for all of the major use cases. Matt's application is on GitHub - please feel free to get it and mess around.  It's a good app.  I'll have BabyTrak up here in a couple of months.
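If you haven't seen it, the tutorial template boils down to something like this minimal activity (sketched from memory, so treat the details as approximate):

using Android.App;
using Android.OS;
using Android.Widget;

namespace HelloMonodroid
{
    // MainLauncher = true makes this the activity Android starts first.
    [Activity(Label = "Hello Monodroid", MainLauncher = true)]
    public class MainActivity : Activity
    {
        protected override void OnCreate(Bundle bundle)
        {
            base.OnCreate(bundle);

            // Build the UI in code to keep the sketch self-contained,
            // rather than inflating a layout resource.
            var text = new TextView(this) { Text = "Hello from Monodroid!" };
            SetContentView(text);
        }
    }
}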

The Locksport openspace was a rousing success.  About 40 people were taught to pick, and about that many more stopped me in the halls and told me that they would like to have been there.  I was frankly astonished by the turnout, and would have brought five times as many picks if I had known about the interest - all 15 of the sets I brought were sold.

For those looking for more information:

The Locksport International site has a lot of good links to publications and whatnot.  Deviant Ollam's book, Practical Lock Picking, is excellent - he is the guy who wrote the presentation that I gave (twice). The best community is online at Lockpicking101, and they have an IRC channel too.  If you need to order picks, look at LockPickShop - Red does an awesome job.  The 14 piece set is on sale right now and is a great learner's set!

Finally, if you are in the Columbus area please join us at the Columbus branch of Locksport International.  We have a Meetup group - just go sign up and you'll get the locations for each meeting.  You can attend for free, but if you want a membership card and to participate in competitions, it's $20 a year.

And last but not least, I got a ton of comments on the jam band.  Lots of questions too.  Yes, I was a professional musician for many years.  I taught at a lot of area band camps, like Upper Arlington and Teays Valley.  I played in a Dixieland band in London, Ohio called the Lower London Street Dixieland Jazz Band and Chamber Music Society for nearly ten years. I haven't played in quite a while, and I have to say it was a lot of fun.  Hope to do it again next year.

All in all, an awesome conference.  Again, I was a net producer of content rather than a consumer of content, and that’s OK.  I still learned a ton just by chatting with friends old and new, and picked up information about the hip new technologies that the cool kids are using by osmosis.

Hope to see everyone at DevLink!

SQL Server Developer Tools Part 1: Adding an existing project

 

I recently needed to add some instrumentation to the DotNetNuke code base, and decided to use the new SQL Server Developer Tools to load the database in as a project and manage it there.  I had written some internal documentation for the project while it was still under wraps, and was glad to see it so strong when it was released for public consumption.  I thought seeing how I used the application might give someone out there a hand.

Since we have an existing project, we have an existing database. Fortunately, SSDT has an app for that. You can add a new SQL Server Database Project, which will effectively take a snapshot of the database and expose it to Visual Studio as T-SQL scripts. These are the scripts that will eventually make up the development base for the software.

We will start by creating a new project. Right click on the solution file and select New Project … The project selection dialog appears, and if you click Data, you get the SQL Server Database Project template. Name the project and move forward.

This project represents everything that is part of a database file in SQL Server. There is a Properties folder in the project that will show you all of the database-level properties - usually handled by SSMS. This is just one of many examples of SSDT bringing the DBA and developer closer together. Operational properties such as the filegroup and transaction details are at least available for viewing by the developer, and alterable locally. Permissions still hold, so you as a developer have to be set up to change these kinds of details on the production system. At least you can alter them locally and see what works without a call to the DBA.

The new project is empty. In order to get the existing database into the project, an import needs to occur. Right click on the new project and select Import Objects and Settings. Select the local database and pass in the appropriate credentials. I selected the DotNetNuke database from my developer's instance, but you should select whatever you want to incorporate.

The Import Database Schema Wizard has all of the options that define how you will interact with the database once it is in the project. I like the default settings, which define a folder for each schema, and a folder within for each database item type.

There is an option to import the SQL Server permission structure, but I find that most projects don’t use that. My DotNetNuke project uses the SQL Membership Provider, however, so there is a mapping between the login structure of the database and the Users table of the membership provider. For that reason, I do turn on Import Permissions.

Once the values are set, just follow the steps:

  1. Make a new connection
  2. Click Start
  3. Watch the magic happen
  4. Click Finish
  5. Let’s see what we have

What we have here is everything that the database has to offer, in T-SQL Scripts. This is important. Every change that is made can be included in a DAC, because there is source control and an understood level of alteration. Changes are known by the system, and go into a pot to turn over to operations, or to be reviewed by the DBAs. Changes are not made by altering the database anymore.


Taking a look at the DotNetNuke database project, you’ll see the main schema (dbo) broken into the familiar Tables, Views, sprocs and functions. The project also has scripts for three other database members – Scripts, Security and Storage.

Storage is just the database definition. In the case of this project, it is simply:

ALTER DATABASE [$(DatabaseName)]
    ADD FILE (NAME = [DotNetNuke],
        FILENAME = '$(DefaultDataPath)$(DatabaseName).mdf',
        SIZE = 9216 KB,
        FILEGROWTH = 1024 KB)
    TO FILEGROUP [PRIMARY];
It's part of the completeness, but not something that you will alter a lot. The Scripts directory is another that you'll change once and forget about - it contains the pre- and post-build scripts for the deployment of the database. Every time you build and push to the actual database server, these scripts will be run.

The Security folder is fascinating. It contains all of the roles and related schemas, and the associated authorizations. If you have a project that secures database assets in the database management system, this could be awesome.

The meat is in the schema folder, called 'dbo' in the DNN example. This is where the scripts you would expect to see in a project like this are held, and where we will be doing the majority of our work. Each entity in the database is scripted separately here, and we can modify or add to our heart's content, and deploy separately.

Set up some new entities

The first thing needed for the instrumentation being added is a table for some timing data. Start by right clicking on the Tables folder and selecting 'Add New …'. Notice the nice Visual Studio integration that shows the asset types which can be added. Select a Table. Name the table 'Instrumentation' in the Add New Item dialog and click OK. There is a base template for a table; go ahead and change it for the new fields:

CREATE TABLE [dbo].[Instrumentation]
(
    InstrumentationId int NOT NULL IDENTITY (1, 1) PRIMARY KEY,
    ClassName varchar(64) NOT NULL,
    MemberName varchar(64) NOT NULL,
    ElapsedSeconds bigint NULL,
    Exception text NULL
)
There is a Commit button that is so tempting at this point, but it isn't the button you want right now. Commit peeks at the declarative information in the script - effectively doing a pre-parse - and tries to make appropriate changes to the target database. It is more or less like the Table Designer in SSMS at this point, using the references to concepts in the database to make decisions rather than just running the T-SQL as coded.

Click the Execute SQL button in the text window's toolbar to run the CREATE TABLE statement and save the table to the database defined in the project properties. (You can read more about that in the Deployment section below.) In this way, the database is effectively disposable. At any time, you can get a copy of the scripts from Team Foundation Server and generate a whole new copy of the database, minus reference data. For right now, though, we are just adding this one table.

Great, that gives us a place to put the data. The next step is a way to get it there. Right click on the Stored Procedures folder and select Stored Procedure from the context menu. Just like with the table above, Visual Studio provides a base template. I changed it to the code below:

CREATE PROCEDURE [dbo].[AddInstrumentationEvent]
    @ClassName varchar(64),
    @MemberName varchar(64),
    @ElapsedSeconds bigint = 0,
    @Exception text = ''
AS
INSERT INTO Instrumentation
(
    ClassName,
    MemberName,
    ElapsedSeconds,
    Exception
)
VALUES
(
    @ClassName,
    @MemberName,
    @ElapsedSeconds,
    @Exception
)

RETURN 0
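With the table and procedure in place, the instrumentation calls from the DotNetNuke code can be plain ADO.NET. Here is a minimal sketch - the helper class and the connection-string handling are hypothetical, not part of the DNN code base:

using System.Data;
using System.Data.SqlClient;

public static class Instrumentation
{
    public static void LogEvent(string connectionString, string className,
        string memberName, long elapsedSeconds, string exception)
    {
        using (var connection = new SqlConnection(connectionString))
        using (var command = new SqlCommand("dbo.AddInstrumentationEvent", connection))
        {
            // Map each argument to the stored procedure's parameters.
            command.CommandType = CommandType.StoredProcedure;
            command.Parameters.AddWithValue("@ClassName", className);
            command.Parameters.AddWithValue("@MemberName", memberName);
            command.Parameters.AddWithValue("@ElapsedSeconds", elapsedSeconds);
            command.Parameters.AddWithValue("@Exception", exception ?? string.Empty);

            connection.Open();
            command.ExecuteNonQuery();
        }
    }
}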
I need to make a quick shout-out here for the IntelliSense. It's expected, I suppose, that this would support full IntelliSense integration, but I was shocked at how comfortable I personally found it to use. I do not care to write SQL code because it is so cumbersome. Having that modern development experience talked about in the introduction makes a big difference.

That’s all we got for new features – it’s a short paper after all. Clearly any entity that SQL Server supports can easily be added to the project, stored in source control, and managed like the rest of the application. What’s more, it can be tested and refactored like any other part of the application.

Red, green, refactor

After further review, it was decided that Instrumentation wasn't a good enough name for the table, since it doesn't accurately represent what is actually put in the rows of the table. Instead, the name InstrumentationEvents should be used, so we need to rename the table.

Right click on the table name in the CREATE statement of Instrumentation.sql and select Refactor -> Rename. Change the name to InstrumentationEvents and click Next. Notice, as shown in the preview, that SSDT got it right. The preview is very helpful: it finds all of the consuming database members, and lets you determine which of them to apply the change to. Even in the stored procedure, the pluralization is correct, changing AddInstrumentation to AddInstrumentationEvent rather than AddInstrumentationEvents. That trailing s might not seem like much to some people, but it can make a big difference in a convention-over-configuration system.

Rename isn't the only refactoring available in SSDT; there are also T-SQL specific features. If you are working in DotNetNuke, open up the GetUsersByUserName.sql script in the Stored Procedures folder. It's a little overdeveloped and has too much UI logic in it, but it works for this example.

Line 31 has a SELECT * on it, and frankly this procedure is too slow as it is. We don't need to add a table scan. The refactor menu has an Expand Wildcards option, and I recommend its use here. Right click on the asterisk and select Refactor -> Expand Wildcards, then click on SELECT * in the treeview. The Preview Changes dialog will now show us how the sproc will look with the wildcard expanded. Click Apply to have the changes applied to the procedure.

Don't overlook the various features of the Visual Studio IDE that can now be used to manage the T-SQL code. For instance, consider that GetUsersByUserName.sql file. Right click on the vw_Users token on line 10 and select Go To Definition to be taken to the view definition in the database project. The view doesn't help us much, because we want to see the table in question. Scroll down in the view to the FROM statement, right click on Users, and select Go To Definition again to see the Users table.

As expected, you can then right click on the Users table name and Find All References to see the 44 places that the Users table is used in the database project. The usefulness of this in a legacy environment can’t be overestimated. Finding your way around a project this easily, digging from the core code to the database and navigating using the built-in features will significantly reduce time spent in getting up to speed on existing applications.

What's more, code-level refactoring isn't the only thing that the data modeling group is pushing for in SSDT. There is project-level refactoring available as well, which is a step in the right direction of whole-project management. Across-the-board changes to software code are something that Visual Studio already excels at, and SSDT is working toward providing the same kinds of features for database projects.

For instance, right click on the database project and select Refactor from that context menu. Aside from the wildcard expansion seen in the code-level refactoring, note the Rename Server/Database Reference option. It's a whole-project way to change references that would be managed in configuration files for a C# project, but need to be controlled with refactoring in T-SQL.

Refactoring is an important part of software development. Though it has been available in third party tools for a while, having a standardized experience that is so tightly integrated with Visual Studio will make a big difference to the average developer. While unit testing integration with Visual Studio still isn't there for T-SQL, this is another step in the right direction toward that modern development experience we have been talking about.

Recipe: The Spanner

 

Bastard child of the Screwdriver and the Shirley Temple.

  • 1 shot Absolut Vodka
  • 1 shot Mandarin Orange Juice (just pour it right out of the jar of oranges)
  • 4 slices of Mandarin orange
  • Fill the glass with Sprite

Preferably put it in a Pleasure Island Jazz Company 8oz shot glass.

Enjoy!


And yes, no ice.  Make sure the ingredients are cold.

Thanks to Matt Groves for the name.

Visual Basic.net For Dummies usage of Northwind database

There are a lot of questions about Visual Basic.NET 2005 For Dummies and Visual Basic.NET 2008 For Dummies and the use of Northwind for the samples.  When I wrote the majority of that book in 2004, Northwind was still a common sample database for SQL Server.  When I updated it in 2007, it was still provided as a sample.  Since then, it has been totally replaced by AdventureWorks. Now, finding the sample is hard, and installing it is even harder.

I was going to write a large post on how to do the install, but Pinalkumar Dave did such an awesome job on his blog that I don't have to.  Here is the link:

http://blog.sqlauthority.com/2007/06/15/sql-server-2005-northwind-database-or-adventureworks-database-samples-databases-part-2/

You can get the samples still, from this link:

http://www.microsoft.com/downloads/en/details.aspx?familyid=06616212-0356-46a0-8da2-eebc53a68034&displaylang=en

I hope this helps.  I will be updating the VB book series after I am done with Programming Data, and will change the data samples to use AdventureWorks, or the latest and greatest at that time, if it changes again.

I'll be speaking at DogFoodCon

I'll be speaking at the 2010 DogFood Conference that Microsoft puts on here in Columbus.  Danilo Castilo runs it (largely) and it is pretty cool - a neat community event on a budget.

It's a cool collection of talks about the newest Microsoft products and how people are using them.  Thus the name: 'DogFood' from the phrase 'eating your own dog food.'

I'll be speaking with Mario Fulan about using ADFS 2.0 to cross domain borders.  If you don't already know Mario, he is a beast - one of like ten Certified SharePoint Masters in the whole freakin universe or something.  He has forgotten more about SharePoint than I will ever learn.  I do know Windows Identity Foundation a little bit though, so that's what I'll be covering.

The conference is at www.dogfoodcon.com and is selling out really fast.  If you are interested in the hot new stuff, check it out and get registered while you can.  It's next month - November 4 and 5.

On the death of Quadrant.

It's common knowledge that I have been following Oslo / SQL Server Modeling Services very closely.  I am working on a book on the topic, and have posted a number of blog entries.  The speaking circuit has been good to me too, and I have given my Software Modeling With ASCII talk five or six times already this year.

My focus has been on M, but today we are talking about Quadrant.  Quadrant is part of a trio of tools that includes M (a language for building data and domain models) and Modeling Services (a set of common models and a repository).  Quadrant itself is a tool for interacting visually with SQL Server databases.

I've been watching Quadrant for over a year now, and I had a lot of questions about its viability in the marketplace.  As a data management tool, it was underpowered, but as a data browsing tool, it was overpowered.  When I eventually came to realize that it could be a domain model browser, my interest was piqued.  Since you could define the quadrants using M, it would be effectively possible to build comprehensive data management 'dashboards' in Quadrant, and use them in a power user role.

Over time, however, I began to realize that this was an edge case.  The business users and data managers that need these solutions will still find development in M too time-consuming, and the professional developers who would be asked to help them would just rather work in C#.  It will end up that the business users will go back to Access and Excel, data managers will just use SQL Management Studio, and professional developers will use Windows Forms or XAML in C#.

Apparently, Microsoft saw this too.

I am sad to see Quadrant go.  It was a beautiful application, and could have been the foundation for a number of very, very cool tools.  Hopefully the Data team will find another use for the technology.

It should be noted that SQL Server Modeling is not necessarily dead at all.  Models can still be built with M, and exposed to the world in OData.  Applications can still be built against this model using Visual Studio, and the data can still be managed using SQL Management Studio.  

The loss of Quadrant doesn't impact this vision, and I hope that Microsoft realizes this and continues down the path toward an enterprise-class repository.  It's the last piece of the puzzle that keeps large enterprises from deploying SQL Server in application-centric environments.

 

The terrorists have won

When I was a sophomore in high school, we had a unit in our World History class about the Holocaust.  Fran LaBuda, a German Jew who escaped the Nazis to the US, would stand at the door of our classroom and bark orders at us in German as we entered, using a pointer to tell us where to sit, and even pushing us around as necessary.  A militant-looking fellow (later I learned it was her son, and as gentle a guy as you could imagine in real life) escorted anyone who didn't take it seriously out of the room, rather roughly.

The point was to show us how easily we could be cowed by a force we didn't understand taking away our power of independence.  These were upper-middle-class high school students, used to getting their way.  Their parents bought them the cool clothes and looked the other way when the rules were transgressed.  They wore their egos on their shoulders like badges of honor.

But when the going got rough, they folded like a bad hand at cards.  Only one person tried to joke about the event with Mrs. LaBuda, and was taken from the room.  He was the class clown, but was nearly in tears when pushed out of the door by the enforcer.

Fast forward to today.  I was in line at the TSA’s security gate at SeaTac.  Walking up and down the line was a rather militant looking fellow yelling out in plain, though loud, English:

“If you do not take your liquids and gels out of your carry-on luggage you will not be allowed to get on your plane.  You will be escorted to enhanced screening, and there is a half day wait.”

Next to me stood a seventy-year-old woman, grey hair, a Russian Jew by her accent; tears were streaming down her face.  She was frantically digging through her plain bag looking for the satchel of toiletries that was plainly sitting on the table in front of her, unnoticed.

“Your bottles are right here,” I showed her.

“Oh, thank you son,” she sighed in relief.  “I’m just trying to get home to Florida to see my grandson.  I’m so terrified that these people will lock me up.”

She was so terrified that those people would lock her up.  People who were purportedly trying to keep us safe, but who were instead driving this woman, others, and me to tears with worry that one wrong move with the toothpaste could cost us time with loved ones, money, business, whatever.

The terrorists have won.

The goal of a 'terrorist,' and thus the name, is terror.  They don't really care, as a group, if they kill anyone, as long as the people they attack live in fear.  They state that they want to kill Americans, and then, largely, don't.  They just want us to think that they will. (Remember, while 9/11 was a huge tragedy, it doesn't make much of a mark against the numbers that have died in simple in-fighting in the Arab Alliance. The deaths weren't the point.  The after-effects were the point.)

We, as a country, as a people, as individuals, have folded.  Just like that classroom of sophomores 20 years ago, we have turned in our independence to the authorities with our papers and our shampoo.  Even the clowns in Washington, once a source of hope, are led crying from the classroom the moment the chips are down.

Please don't think your humble author is putting himself above you, the reader.  I had planned on traveling with a firearm this trip - because I can - then locking my luggage with my own locks and pretending that I am more secure than most.  I did not, fearing hassle, fearing delay, or just fearing - I'm not sure which.

I don't have a solution to suggest, dear reader.  I simply needed to lament the passing of a once-great country - the greatest of social experiments - into the waste bin of political history.  I do not believe it is within any of us to turn the social tide now, unless Atlas truly does shrug and some number of us retreat to a contemporary Galt's Gulch.  The slope of our decline is now too firmly in place.  We have lost.
